GPT-3

GPT-3 vs Human Brain



Lex Fridman

GPT-3 has 175 billion parameters (loosely analogous to synapses). The human brain has roughly 100 trillion synapses. How much would it cost to train a language model the size of the human brain?
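
A back-of-the-envelope sketch, not a definitive answer: the numbers below assume training cost scales linearly with parameter count, use the roughly $4.6M GPT-3 cloud-training estimate from [2], and extrapolate the ~16-month algorithmic-efficiency doubling time measured in [3].

# Rough cost estimate for a brain-scale (100-trillion-parameter) language model.
# Assumptions: cost scales linearly with parameters, GPT-3 cost ~$4.6M to train
# on cloud GPUs (estimate from [2]), and algorithmic efficiency keeps doubling
# roughly every 16 months (trend from [3]).

GPT3_PARAMS = 175e9        # GPT-3 parameter count
BRAIN_SYNAPSES = 100e12    # rough human-brain synapse count
GPT3_COST_USD = 4.6e6      # estimated GPT-3 training cost, per [2]
DOUBLING_MONTHS = 16       # efficiency doubling time, per [3]

scale = BRAIN_SYNAPSES / GPT3_PARAMS      # ~571x more parameters
cost_now = GPT3_COST_USD * scale          # naive linear scaling
print(f"Naive cost at today's efficiency: ${cost_now / 1e9:.1f} billion")

# Project forward, assuming the efficiency trend from [3] continues to hold.
for years_ahead in (4, 8, 12):
    halvings = years_ahead * 12 / DOUBLING_MONTHS
    cost = cost_now / 2 ** halvings
    print(f"In {years_ahead} years: roughly ${cost / 1e6:,.0f} million")

Under those assumptions the naive figure today comes out near $2.6 billion, and the efficiency trend alone would cut that by an order of magnitude every four to five years.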

REFERENCES:

[1] GPT-3 paper: Language Models are Few-Shot Learners
https://arxiv.org/abs/2005.14165

[2] OpenAI’s GPT-3 Language Model: A Technical Overview
https://lambdalabs.com/blog/demystifying-gpt-3/

[3] Measuring the Algorithmic Efficiency of Neural Networks
https://arxiv.org/abs/2005.04305