Commit aacfa79e authored by Daniel Sammon
parent 1d333d1b
BERT is a neural network language model architecture introduced by Google in 2018 (Devlin
et al. 2018). When training a BERT model, the network is trained not to predict the next
token in a sequence but to predict a masked token, as in a cloze test.
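To make the masked-token objective concrete, here is a minimal sketch of how a training example might be prepared. This is an illustration only, not code from this repository: the function `mask_tokens`, the `[MASK]` string, and the 15% masking rate follow the BERT paper's basic scheme, but the sketch omits BERT's refinements (e.g. sometimes substituting a random token instead of `[MASK]`).

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Replace a random subset of tokens with [MASK], BERT-style.

    Returns the masked sequence and a dict mapping each masked
    position to its original token; the original tokens are the
    prediction targets during training (the cloze objective).
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok       # remember what the model must predict
            masked.append(MASK)    # hide the token from the model
        else:
            masked.append(tok)
    return masked, targets

# Example: mask a toy sentence and inspect the targets.
tokens = "the cat sat on the mat".split()
masked, targets = mask_tokens(tokens, mask_prob=0.3)
```

The model then sees `masked` as input and is scored only on how well it recovers the tokens stored in `targets`, rather than on next-token prediction as in a conventional language model.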
## Notebook for assignment
The assignment required GPU power that was better suited to Google Colab.
The notebook for this assignment can be found at this [link](