AI Pursuit – Advancing AI Research
Steve Omohundro on GPT-3 | Numenta | Natural Language Processing | Transformers
—————————————————————————————–
The video was published under a Creative Commons Attribution license (reuse allowed) and is reposted for educational purposes.
Source: https://www.youtube.com/watch?v=0ZVOmBp29E0
In this research meeting, guest Stephen Omohundro gave a fascinating talk on GPT-3, OpenAI's massive new Natural Language Processing model. He reviewed the network architecture, training process, and results in the context of past work, and there was extensive discussion of the implications for NLP and for Machine Intelligence / AGI.
Link to GPT-3 paper: https://arxiv.org/abs/2005.14165
Link to slides from this presentation: https://www.slideshare.net/numenta/openais-gpt-3-language-model-guest-steve-omohundro
Thanks
Enjoyed this discussion a lot. Thanks for sharing.
Consensus plays a big part in what we adopt as knowledge. When we can't run the experiment ourselves and see the results, we have to rely on a consensus of trusted sources that confirm each other's results. I can see why they used Reddit, which adds a consensus filter to raise the quality score of the data they fed the model. I think figuring out how to increase data quality will help these models tremendously.
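As a minimal sketch of what such a consensus-based quality filter might look like: GPT-2's WebText corpus kept outbound Reddit links from posts with at least 3 karma, and the idea below follows that heuristic. The record format, function name, and threshold here are illustrative assumptions, not the actual OpenAI pipeline.

```python
# Hypothetical sketch of a Reddit-style "consensus" quality filter.
# Assumes records of (url, karma); the 3-karma threshold follows the
# WebText heuristic, but nothing here reproduces the real GPT-3 pipeline.

MIN_KARMA = 3  # assumed threshold

def filter_by_consensus(records, min_karma=MIN_KARMA):
    """Keep only documents whose source link cleared the karma threshold."""
    return [url for url, karma in records if karma >= min_karma]

if __name__ == "__main__":
    sample = [
        ("https://example.com/good-article", 57),
        ("https://example.com/spam", 0),
        ("https://example.com/ok-post", 3),
    ]
    print(filter_by_consensus(sample))
    # ['https://example.com/good-article', 'https://example.com/ok-post']
```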