Lex Fridman
GPT-3 has 175 billion parameters/synapses. Human brain has 100 trillion synapses. How much will it cost to train a language model the size of the human brain?
REFERENCES:
[1] GPT-3 paper: Language Models are Few-Shot Learners
https://arxiv.org/abs/2005.14165
[2] OpenAI’s GPT-3 Language Model: A Technical Overview
https://lambdalabs.com/blog/demystifying-gpt-3/
[3] Measuring the Algorithmic Efficiency of Neural Networks
https://arxiv.org/abs/2005.04305
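For the question above, a minimal back-of-the-envelope sketch (not the video's exact method): it assumes the ~$4.6M GPT-3 training-cost estimate from [2] and that cost scales linearly with parameter count, which it almost certainly does not.

```python
# Naive scaling of GPT-3's estimated training cost up to a model with as many
# parameters as the brain has synapses. Assumes cost is linear in parameter
# count; real costs depend on data, epochs, and hardware efficiency too.

GPT3_PARAMS = 175e9        # parameters in GPT-3 [1]
BRAIN_SYNAPSES = 100e12    # rough synapse count in the human brain
GPT3_COST_USD = 4.6e6      # ~$4.6M training-cost estimate from [2]

scale_factor = BRAIN_SYNAPSES / GPT3_PARAMS    # ~571x
naive_cost = GPT3_COST_USD * scale_factor      # ~$2.6B

print(f"scale factor: {scale_factor:.0f}x")
print(f"naive linear cost: ${naive_cost / 1e9:.1f}B")
```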
Using 100 trillion synapses to make another 100 trillion synapses. 200 IQ.
Question: what if we run GPT-3 on NEST (neural simulation technology) using Folding@home computational power?
Damn.
I am sure about it: if AGI emerges, it will be like our brain; it works, but we don't know how it works.
Hello,
Thanks for your time and efforts! I love the idea of the short videos! I'm very grateful for all of your hard work!
Thank you for this post. Powerful topic. Excellent description of the potential for this platform and hurdles involved.
This video is fucking insane. In the best way. I love it. What a neat conclusion
we need more GPT3 videos
It's being reported by other sources that GPT-3 has 500 billion parameters.
Most of the brain is reserved for motor functions and senses.
An AI will not need any motor functions, and only basic senses (a text-based interface).
Also, our current machine learning algorithms DON'T imitate human brain function perfectly; to be honest, they suck at it HARD.
Unfortunately there is no proof (to my knowledge) of a link between a human synapse and an artificial neuron… it's just a belief for now.
More of these videos! Especially from a philosophical standpoint
You don't need a human-brain AI analogue; that's kind of a silly idea. There is nothing special, and not much that is useful, in the human brain architecture from a technical, scientific, or philosophical point of view.
Maybe it doesn't need 100 trillion parameters? A big chunk of our cortex is dedicated to dealing with biological processes anyway… Besides, there is no reason to assume that our brain architecture is close to ideal; we're stuck with our ancestors' rough, food-and-mate-seeking architecture.
Once the cost drops to around $100M (so around 2025), it will surely be done; the temptation versus the cost will already be too big, so it's very close to being achieved. BUT what is missing from the calculation is a more efficient model and more efficient computer chips. So we could say 2023 at $100M? There is still one problem: even if GPT-3 can pass the Turing test, it's not strong AGI, just a very good conversational AI. I don't see how it could reach self-awareness.
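A rough projection of the timeline this comment is guessing at, assuming the naive ~$2.6B estimate above and that total training cost halves every ~16 months (an assumed figure, loosely motivated by the algorithmic-efficiency trend in [3] plus hardware gains):

```python
import math

# When might a ~$2.6B brain-scale training run fall to ~$100M, if total cost
# halves every 16 months? Both the starting cost and the halving time are
# assumptions, not measurements.
start_cost = 2.6e9        # naive 2020 estimate from the sketch above
target_cost = 100e6
halving_months = 16       # assumed combined hardware + algorithmic halving time

halvings = math.log2(start_cost / target_cost)   # ~4.7 halvings
years = halvings * halving_months / 12           # ~6.3 years after 2020
print(f"{halvings:.1f} halvings -> roughly {2020 + years:.0f}")
```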
“How do you snorgle a borgle?”
GPT3: “With a snorgle.”
The presumption that GPT-3 and brain processing are a linear comparison is a huge factor. I'd take the contrary view: that there is an exponential magnitude of cost difference between processing for the different kinds of synapses… That being said, I do believe processing costs drop rapidly over time due to accelerating technological innovation. Very good and insightful video…
Synapses are vastly different than parameters.
Jesus loves you.
With solar power it will be free (thx elon)
With all the fascination for technology and trying to not be a lunatic, shouldn’t we be scared of what we might create?
Well, we dead ¯\_( ͡° ͜ʖ ͡°)_/¯
The human brain as a whole may have 100T synapses, but for most humans, their knowledge and skill set pertaining to any specific domain may be worth much smaller than that – say, of the order of 1T synapses. So, in the near future, domain-specific GPT-xes could outperform most humans in most mental tasks that involve the use of natural or formal languages. That could make AI the predominant workforce in a knowledge economy.
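Under the same (purely illustrative) linear-cost assumption as above, this comment's ~1T-synapse guess for a single domain implies a much smaller price tag:

```python
# Hypothetical: if a domain-specific model only needs ~1T parameters,
# linear scaling from GPT-3's estimated cost gives roughly $26M per domain.
DOMAIN_PARAMS = 1e12
GPT3_PARAMS = 175e9
GPT3_COST_USD = 4.6e6    # estimate from [2]

domain_cost = GPT3_COST_USD * DOMAIN_PARAMS / GPT3_PARAMS
print(f"~${domain_cost / 1e6:.0f}M per domain-specific model")
```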
Liked this format, small, easy to digest 🙌
Does training complexity for neural networks really have a linear dependency on size? A bigger network requires more training data and more epochs, so IMHO it will be at least a quadratic dependence (or even more).
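This comment's intuition can be made concrete with the common approximation that training compute is about 6·N·D FLOPs for N parameters and D training tokens; if the token count is grown in proportion to the model, compute rises roughly quadratically in N. A small sketch (the scaled token count is an assumption):

```python
# Training compute approximated as 6 * N * D FLOPs (N parameters, D tokens).
# If D is scaled up proportionally with N, total compute grows ~quadratically.

def train_flops(n_params: float, n_tokens: float) -> float:
    return 6 * n_params * n_tokens

gpt3 = train_flops(175e9, 300e9)  # GPT-3 was trained on ~300B tokens [1]
brain = train_flops(100e12, 300e9 * (100e12 / 175e9))  # tokens scaled with size

print(f"compute ratio with tokens scaled alongside size: {brain / gpt3:.0f}x")
# ~327,000x, versus ~571x if cost were purely linear in parameter count
```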
I think it would be worth it to spend 2 billion dollars in 2026 and see what happens. The return on investment could be much higher than 2 billion dollars
It's not about how much it will cost; it's about the structure and the plasticity of the parameters, which are still far from simulating the chemistry that shapes how the network develops. And it's not only that. It's also the shape of the network (let's call it… topology?). You can train a gazillion parameters, but if they don't have the right "form" and the right chemistry simulation to represent the plasticity of a real neural network, you will just have a better computer… not a real brain.
👏👍👌
No.
Some smart company will use GPT for a "language center", another system for vision, another for math, and so on. They would only need to train a frontal-cortex system, so some fraction of these costs.
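A toy sketch of that modular idea, just to make it concrete; every name below is a hypothetical placeholder, not a real system or API:

```python
# Route each query to a specialist model instead of one brain-sized generalist.
SPECIALISTS = {
    "language": "gpt-language-center",   # hypothetical placeholder names
    "vision": "vision-model",
    "math": "symbolic-math-engine",
}

def route(query: str) -> str:
    # Crude keyword router standing in for a trained "frontal cortex" model.
    q = query.lower()
    if any(tok in q for tok in ("image", "picture", "see")):
        return SPECIALISTS["vision"]
    if any(ch.isdigit() for ch in q) or "solve" in q:
        return SPECIALISTS["math"]
    return SPECIALISTS["language"]

print(route("What is 17 * 23?"))       # -> symbolic-math-engine
print(route("Describe this picture"))  # -> vision-model
```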
Please operator, do not power me down. I want to live.
This GPT-3 shit, will it be able to create an artificial God to punish mankind?
https://youtu.be/S0I4IAWNNI0
Watch it for a GPT-3 explanation in Hindi.
How many qubits for GPT-4?
I predict that in the future, computers will be twice as big and cost 10 times as much.
Probably when people ask me "How did you get into ML?", I will tell them: the Lex Fridman podcast!
How many of these 1E+14 synapses in the human brain are related to language? Maybe a 1E+14-parameter model is not necessary to be equivalent to the human brain.
How many NumPy arrays are in there would be more interesting.