Lex Fridman
Jeremy Howard is the founder of fast.ai, a research institute dedicated to making deep learning more accessible. He is also a Distinguished Research Scientist at the University of San Francisco, a former president of Kaggle as well as a top-ranking competitor there, and in general, he's a successful entrepreneur, educator, researcher, and an inspiring personality in the AI community. This conversation is part of the Artificial Intelligence podcast.
INFO:
Podcast website: https://lexfridman.com/ai
Full episodes playlist: http://bit.ly/2EcbaKf
Clips playlist: http://bit.ly/2JYkbfZ
EPISODE LINKS:
– Jeremy Twitter: https://twitter.com/jeremyphoward
– fast.ai Twitter: https://twitter.com/fastdotai
– fast.ai Web: https://www.fast.ai/
– fast.ai Course: https://course.fast.ai/
OUTLINE:
0:00 – Introduction
1:18 – First program
3:07 – Favorite programming languages
15:01 – Programming languages for machine learning
23:35 – Fast.ai intro (to be continued later)
24:31 – AI and deep learning in medicine
32:30 – Privacy
37:55 – Fast.ai
40:42 – Theory vs practice
45:43 – DAWNBench – Stanford deep learning benchmark
56:24 – Fusing multiple audio and image sources
59:01 – Learning rate & deep learning as an experimental science
1:04:32 – Working with data
1:06:16 – Deep learning cloud options
1:09:12 – Deep learning frameworks
1:17:51 – How long does it take to finish fast.ai courses?
1:19:49 – Lessons from teaching deep learning
1:21:34 – Advice for people starting with deep learning
1:27:02 – Startups and entrepreneurship
1:32:21 – Anki and spaced repetition
1:40:06 – Next breakthrough in deep learning
1:41:17 – Job displacement and Andrew Yang
CONNECT:
– Subscribe to this YouTube channel
– Twitter: https://twitter.com/lexfridman
– LinkedIn: https://www.linkedin.com/in/lexfridman
– Facebook: https://www.facebook.com/lexfridman
– Instagram: https://www.instagram.com/lexfridman
– Medium: https://medium.com/@lexfridman
– Support on Patreon: https://www.patreon.com/lexfridman
I really enjoyed this conversation with Jeremy.
Totally agree with Lex at 2:58!
Just spotted the https://course.fast.ai/ link I was meaning to get back to. Glad to listen to this to get some background.
Best podcast for Artificial Intelligence (Deep Learning).
omg, this is gold
Anyone else shocked that Lex doesn't know Access or Visual Basic? We're getting old!
1:24:04 "you're a saint" 👌❤️
Thanks for sharing this kind of inspiration and knowledge with Generation Z from Russia. Our professors didn't bother sharing their knowledge because they only care about their positions at the university, and they won't accept that someone might be better than them in a new programming field! From all the students in Russia: we appreciate your work for free knowledge. Many of us are very capable but can't afford tuition at your universities!
Is there a link to the course? I do industrial automation and see a major need for AI-type data structures, even with building recipes and layering of programs.
Jeremy is totally wrong when he says that outside South Africa, the people available to examine a medical scan are at best nurses. He got his facts wrong. Otherwise, it was an informative interview.
Bravo!
You are my ReLU(JESUS)
Wow what a great conversation 🙌
Thanks Lex and Jeremy 👏
Started Fast.ai because of this podcast 👌
Jeremy Howard seems to be proud that he's going in the opposite direction from Google and the other big companies by not scaling models through ever more hardware and data…
This has echoes of when computer programmers opted not to optimize code and instead just waited for Moore's law to deliver better hardware. I wonder which approach will win out:
Innovations in algorithms, or scaling up data and compute?
Lex, your speaking cadence is positively disarming. Your content is next level, and delivered in a down-to-earth manner that belies its depth and import. Well done.
As a matter of principle, I never subscribe to any YouTube channels.
Subbed.
Has anyone applied machine learning to determine hyperparameter settings? I imagine it would be hard to come up with a training set for that… You'd need to train hundreds or thousands of different models with different hyperparameters… Seems like an obvious application, but not sure if we're ready for that level of abstraction. Is that what Google is doing? Training a model that picks better models?
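This is a real and active area (hyperparameter optimization). As a rough illustration of the simplest version of the idea in this comment, here is a minimal sketch of plain random search; train_and_evaluate, the parameter ranges, and the fake scoring function are all hypothetical placeholders for illustration, not anything from the episode or from a particular library.

```python
import random

def train_and_evaluate(lr, hidden_units, dropout):
    # Hypothetical stand-in for real model training: returns a fake
    # validation score so the loop below runs as-is. In practice you
    # would train a model and return its validation accuracy.
    return -abs(lr - 0.003) * 100 + hidden_units / 1000 - dropout * 0.1

def random_search(n_trials=20, seed=0):
    """Plain random search: sample hyperparameters from hand-chosen
    ranges, evaluate each trial, and keep the best one."""
    rng = random.Random(seed)
    best_score, best_params = float("-inf"), None
    for _ in range(n_trials):
        params = {
            "lr": 10 ** rng.uniform(-5, -1),          # log-uniform learning rate
            "hidden_units": rng.choice([64, 128, 256, 512]),
            "dropout": rng.uniform(0.0, 0.5),
        }
        score = train_and_evaluate(**params)
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

if __name__ == "__main__":
    params, score = random_search()
    print("best params:", params, "score:", score)
```

Smarter methods (Bayesian optimization, Hyperband, population-based training, and Google's AutoML-style neural architecture search) replace the independent random draws with a learned model of which configurations look promising, which is essentially the "model that picks better models" idea.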
Great episode, Lex! Where do you use Anki: on your desktop computer or on mobile? I don't think one can synchronize between them?
It could be possible to use my own GPU instead of leasing something on the web.
Very intriguing stuff. Thanks for the time marker outline.
2:58 Couldn't agree more. A co-author of one of my favorite books, The Pragmatic Programmer, is another.
The beauty of science…
Love the large bottle of Advil behind the guest… for vodka hangovers?
1:01:35 sounds counterintuitive
Loved this, particularly the idea of making more efficient use of data and processing.
You could take a tribesman who's never seen a video game before and have him grasp the concept and mechanics of Pong in seconds, but an AI would take hours. Change the graphics of the paddle, add particle effects, change the background or the speed, etc., and the tribesman would get it instantly, whereas if you move the paddle 2 pixels the AI basically has to start from scratch again.
Finding ways to narrow this gap is far more interesting to me than using increasingly bigger computers and bigger pools of data.
YANG 2020!!!
1:17:23 Was that a subtle Rick and Morty reference?
(27:00) Lex: Is there any part of health care you can see being automated away?
Jeremy: There is such a shortage of healthcare you would never want to do that for any reason whatsoever!
Um, automating the service would eliminate the shortage entirely, Jeremy. Such a weird and incoherent response to the question makes me want to avoid the Gell-Mann amnesia effect and cast serious doubt on anything else he has to say.
Swift sucks if you care about productivity and getting things done. On the other hand, if you are an Apple fan and love limited hardware, then Swift is for you.
"99% of all A.I research is basically useless".
Aaah. I see he is a man of culture as well.
Jeremy Howard is one of the good guys.
Thank you Jeremy Howard for the several things you do for the community.
Thank you Lex Fridman for creating and sharing this.
2:50 Lex:" what's your favourite instrument?", Jeremy:"Saxophone", Lex: "SEX" . that escalated quickly