Computerphile
Basic mathematics from a language model? Rob Miles on GPT-3, where it seems like size does matter!
More from Rob Miles: http://bit.ly/Rob_Miles_YouTube
https://www.facebook.com/computerphile
https://twitter.com/computer_phile
This video was filmed and edited by Sean Riley.
Computer Science at the University of Nottingham: https://bit.ly/nottscomputer
Computerphile is a sister project to Brady Haran’s Numberphile. More at http://www.bradyharan.com
A
Are you f-ing serious??? I was SURE that A was human and B was GPT-3!
we are fucked up
GPT gets a consciousness of itself and the world ends
A
great, we've got budding general intelligence, and it doesn't even come with a fail condition that is human-adjacent. So cool, so fresh, so terrifying.
high minded researchers be like "refine the algorithm to the smallest parts and approach the hill of greatest efficiency"
GPT be like "and another one"
👏👍👌
175 B parameters equals roughly ~652 GB of RAM simply to load the model (assuming float parameters). Note that this is not ordinary RAM – 652 GB sounds like a lot but is actually quite feasible for servers in 2020 – but GPU RAM. That's why he also mentioned that you cannot run it on a single machine; you need a cluster instead.
GPT-2, in comparison, only needed 5.6 GB for the model, so it probably fit on a single GPU.
In addition, this is only at test time/inference – training needs even more.
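The arithmetic in the comment above can be checked with a short sketch, assuming 4-byte (32-bit float) parameters; the 1.5 B figure for GPT-2 is the commonly cited parameter count for its largest variant, not something stated in the comment:

```python
# Back-of-the-envelope memory estimate for loading model weights,
# assuming each parameter is a 4-byte float.
def model_ram_gib(num_params, bytes_per_param=4):
    """Approximate GiB needed just to hold the weights in memory."""
    return num_params * bytes_per_param / 1024**3

gpt3 = model_ram_gib(175e9)  # GPT-3: 175 B parameters
gpt2 = model_ram_gib(1.5e9)  # GPT-2 (largest variant): 1.5 B parameters
print(f"GPT-3: ~{gpt3:.0f} GiB, GPT-2: ~{gpt2:.1f} GiB")
# -> GPT-3: ~652 GiB, GPT-2: ~5.6 GiB
```

This reproduces the ~652 GB and 5.6 GB numbers quoted above; actual memory use is higher once activations, optimizer state (during training), and framework overhead are included.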
I think B is generated, because the grammar isn't completely off.
Spoilers for the poem test!
GPT-3 knows about color and wants to experience it so badly, but can only "think in lines of grey".
That's the best interpretation I could come up with.
ah yes, global partition table 3
What if you just trained it on the top-cited scientific papers and got it to write a new paper?
B
Terminator (Skynet), start…
hi.
if anyone, ANYONE, needs to keep clean-shaven… it's this guy. PLEASE DUDE
After watching the first 10 minutes I realized I have no idea what he is talking about… but it was mesmerizing 🙂
I say B is generated by GPT3.
Language has been shown to be vital for the kind of cognition we refer to as thinking, i.e. thinking with words, in a language. I'm of the opinion that learning human language teaches the computer how to think like a person.
A
What would happen if two of these programs started chatting to each other? Has that been done?
I'm gonna guess it's A
my guess: B is AI