GPT 3

GPT-3 – explained in layman's terms.



Analytics India Magazine

OpenAI researchers released a paper describing the development of GPT-3, a state-of-the-art language model with 175 billion parameters.

OpenAI's previous model, GPT-2, had 1.5 billion parameters and was the largest language model of its time. It was soon eclipsed by NVIDIA's Megatron, with 8 billion parameters, and then by Microsoft's Turing NLG, with 17 billion. Now OpenAI turns the tables by releasing a model roughly 10x larger than Turing NLG.
Current NLP systems still largely struggle to learn from only a few examples. With GPT-3, the researchers show that scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even reaching competitiveness with prior state-of-the-art fine-tuning approaches.
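To make "few-shot" concrete: rather than fine-tuning the model's weights for each task, the task is described entirely inside the prompt with a handful of worked examples, and the model completes the pattern. The sketch below only builds such a prompt string; the translation task and the `build_few_shot_prompt` helper are illustrative assumptions, not anything specified in the article or the paper.

```python
# Illustrative sketch of few-shot prompting (assumed example, not from the article):
# the task description and a few demonstrations are packed into the prompt,
# and the model is asked to complete the final, unanswered line.

def build_few_shot_prompt(examples, query):
    """Concatenate demonstration pairs and a new query into one prompt string."""
    lines = ["Translate English to French:"]  # task description
    for english, french in examples:
        lines.append(f"{english} => {french}")  # worked demonstrations
    lines.append(f"{query} =>")  # the model would complete this line
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "peppermint",
)
print(prompt)
```

The whole "training signal" here lives in the prompt text, which is why scaling up the model (rather than gathering task-specific labels) is what improves performance in this setting.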

Read more here:
OpenAI’s GPT-3 Can Now Generate The Code For You – https://bit.ly/30GxSHM
GPT-3 Is Amazing—And Overhyped – https://bit.ly/2E8XC7Z

#GPT3 #NLP #OPENAI #ARTIFICIALINTELLIGENCE