Lex Clips
Full episode with Grant Sanderson (Aug 2020): https://www.youtube.com/watch?v=U_6AYX42gkU
Clips channel (Lex Clips): https://www.youtube.com/lexclips
Main channel (Lex Fridman): https://www.youtube.com/lexfridman
(more links below)
Podcast full episodes playlist:
https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
Podcasts clips playlist:
https://www.youtube.com/playlist?list=PLrAXtmErZgOeciFP3CBCIEElOJeitOr41
Podcast website:
https://lexfridman.com/ai
Podcast on Apple Podcasts (iTunes):
https://apple.co/2lwqZIr
Podcast on Spotify:
https://spoti.fi/2nEwCF8
Podcast RSS:
https://lexfridman.com/category/ai/feed/
Grant Sanderson is a math educator and creator of 3Blue1Brown.
Subscribe to this YouTube channel or connect on:
– Twitter: https://twitter.com/lexfridman
– LinkedIn: https://www.linkedin.com/in/lexfridman
– Facebook: https://www.facebook.com/lexfridman
– Instagram: https://www.instagram.com/lexfridman
– Medium: https://medium.com/@lexfridman
– Support on Patreon: https://www.patreon.com/lexfridman
What I found amazing with GPT-3 was when it correctly predicted the answer to a medical question, and not only predicted it but even gave a rational explanation for its answer. So there is not much "black box" about it; it's interpretable. A lot of people said that "AI won't replace physicians, it will just assist them," but after seeing GPT-3, which wasn't even designed for medical use, answer a medical question like that, I think AI that can actually replace physicians is much, much closer than we think.
hi lex
I seriously hope that the internet is not used as a means of gathering data for serious production AI.
If it is, then people will soon find ways to seed the internet with data that throws the models off.
I guess you could call those seeds a sort of inception…
Math requires sequential reasoning. GPT-3 seems to have super-human intuition, but nothing else.
Man, I watched like 6 of the clips with this guest and only now realized it's actually 3Blue1Brown, ahahah.
Seems like GPT-3 is on the right track for modelling humans if it is good at communication inferences but not so much at math.
This pattern is the 369 code.
248751
I feel I should keep posting this: GPT is not an AI or AGI. It's a very solid language function. It will work best as a subordinate of an AGI. The supervisor function would ask GPT to generate a few different examples as a way of hypothesizing; when GPT gives back something that matches reality, the supervisor can confirm the parameters it gave GPT as a solution to the problem it faces.
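A minimal sketch of the propose-and-verify loop that comment describes. Both generate_candidates (a stand-in for a GPT-style generator) and matches_reality (the supervisor's check against ground truth) are hypothetical placeholders, not real APIs:

```python
import random
from typing import Callable, List, Optional


def generate_candidates(prompt: str, n: int) -> List[str]:
    """Stand-in for a language model: propose n candidate hypotheses."""
    return [f"{prompt} -> hypothesis {random.randint(0, 99)}" for _ in range(n)]


def supervise(prompt: str,
              matches_reality: Callable[[str], bool],
              rounds: int = 5,
              n_per_round: int = 4) -> Optional[str]:
    """Supervisor loop: ask the generator for hypotheses and return the
    first one the reality check confirms; give up after `rounds` rounds."""
    for _ in range(rounds):
        for candidate in generate_candidates(prompt, n_per_round):
            if matches_reality(candidate):
                return candidate  # confirmed hypothesis
    return None  # nothing matched reality


if __name__ == "__main__":
    # Toy reality check: accept any hypothesis whose number is even.
    check = lambda h: int(h.split()[-1]) % 2 == 0
    print(supervise("explain the observation", check))
```

The point of the sketch is only the division of labor: the language model proposes, and a separate supervisor confirms or rejects against reality.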
very cool stuff
It's like looking at the fractures in a crystal: the unstoppable force meets an immovable object, the infinite and the finite.
Quanta Magazine, "How Close Are Computers to Automating Mathematical Reasoning?", August 27, 2020:
https://www.quantamagazine.org/how-close-are-computers-to-automating-mathematical-reasoning-20200827/
An AI Scientist… interesting
no matter how nerdy and genius you are,
having a brick body is a must
Watching Grant explain things to Lex is like watching a person explain math to a dog.