Framing Future
Let’s explore the limitations of GPT-3 and estimate the cost of training it.
What happens if a machine learning model is trained on a massive amount of text without supervision? Is that enough to make a computer understand human language? GPT-3 is a powerful language model for natural language processing tasks, created by OpenAI. It is trained on data from millions of webpages and other sources. The early demos are impressive, but can it solve any language task?
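To make the “few-shot learning” idea from [1] concrete: instead of fine-tuning, GPT-3 is simply shown a few worked examples inside its prompt and asked to continue the text. The sketch below builds such a prompt in Python; the translation task, the example pairs, and the helper name are illustrative assumptions, not OpenAI’s actual API:

```python
# A minimal sketch of the few-shot prompting idea from Brown et al. [1]:
# the model is conditioned on a handful of labeled examples in the prompt,
# then asked to continue the text for an unlabeled query.
# Task, examples, and function name are illustrative assumptions.

def build_few_shot_prompt(examples, query):
    """Concatenate labeled examples and an unlabeled query into one prompt."""
    lines = [f"English: {en}\nFrench: {fr}" for en, fr in examples]
    lines.append(f"English: {query}\nFrench:")  # model completes this line
    return "\n\n".join(lines)

examples = [
    ("Good morning.", "Bonjour."),
    ("Thank you very much.", "Merci beaucoup."),
]
prompt = build_few_shot_prompt(examples, "See you tomorrow.")
print(prompt)
```

The model’s continuation of the final line serves as the answer; no task-specific training or gradient updates are involved.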
TIMESTAMPS
0:00 What is GPT-3?
1:32 How much does it cost?
2:05 What can you create with GPT-3?
5:27 Miscellaneous
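As a rough, assumption-heavy sketch of the cost question covered at 1:32: take the total training compute reported for the largest GPT-3 model in [1] and divide by an assumed GPU throughput, utilization, and hourly price. The throughput, utilization, and price figures below are assumptions, so treat the result as an order-of-magnitude estimate only:

```python
# Back-of-envelope training-cost estimate for GPT-3 175B.
# The total-compute figure is from Brown et al. [1]; everything else
# (GPU throughput, utilization, hourly price) is an assumption.

total_flops = 3.14e23          # training compute reported in [1]
peak_flops_per_gpu = 125e12    # V100 mixed-precision peak, ~125 TFLOPS
utilization = 0.30             # assumed fraction of peak actually achieved
price_per_gpu_hour = 1.50      # assumed cloud price in USD

gpu_hours = total_flops / (peak_flops_per_gpu * utilization) / 3600
cost = gpu_hours * price_per_gpu_hour
print(f"~{gpu_hours:,.0f} GPU-hours, ~${cost:,.0f}")
```

Under these assumptions the estimate lands in the millions of dollars, which matches the order of magnitude of commonly cited public estimates.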
[1] Brown, Tom B., et al. “Language Models Are Few-Shot Learners.” arXiv preprint arXiv:2005.14165 (2020).
[3] Vaswani, Ashish, et al. “Attention Is All You Need.” arXiv preprint arXiv:1706.03762 (2017).