Lex Fridman
The talks at the Deep Learning School on September 24/25, 2016 were amazing. I clipped out individual talks from the full live streams and provided links to each below in case that’s useful for people who want to watch specific talks several times (like I do). Please check out the official website (http://www.bayareadlschool.org) and full live streams below.
Having read, watched, and presented deep learning material over the past few years, I have to say that this is one of the best collections of introductory deep learning talks I’ve yet encountered. Here are links to the individual talks and the full live streams for the two days:
1. Foundations of Deep Learning (Hugo Larochelle, Twitter) – https://youtu.be/zij_FTbJHsk
2. Deep Learning for Computer Vision (Andrej Karpathy, OpenAI) – https://youtu.be/u6aEYuemt0M
3. Deep Learning for Natural Language Processing (Richard Socher, Salesforce) – https://youtu.be/oGk1v1jQITw
4. TensorFlow Tutorial (Sherry Moore, Google Brain) – https://youtu.be/Ejec3ID_h0w
5. Foundations of Unsupervised Deep Learning (Ruslan Salakhutdinov, CMU) – https://youtu.be/rK6bchqeaN8
6. Nuts and Bolts of Applying Deep Learning (Andrew Ng) – https://youtu.be/F1ka6a13S9I
7. Deep Reinforcement Learning (John Schulman, OpenAI) – https://youtu.be/PtAIh9KSnjo
8. Theano Tutorial (Pascal Lamblin, MILA) – https://youtu.be/OU8I1oJ9HhI
9. Deep Learning for Speech Recognition (Adam Coates, Baidu) – https://youtu.be/g-sndkf7mCs
10. Torch Tutorial (Alex Wiltschko, Twitter) – https://youtu.be/L1sHcj3qDNc
11. Sequence to Sequence Deep Learning (Quoc Le, Google) – https://youtu.be/G5RY_SUJih4
12. Foundations and Challenges of Deep Learning (Yoshua Bengio) – https://youtu.be/11rsu_WwZTc
Full Day Live Streams:
Day 1: https://youtu.be/eyovmAtoUx0
Day 2: https://youtu.be/9dXiAecyJrY
Go to http://www.bayareadlschool.org for more information on the event, speaker bios, slides, etc. Huge thanks to the organizers (Shubho Sengupta et al.) for making this event happen.
CONNECT:
– If you enjoyed this video, please subscribe to this channel.
– AI Podcast: https://lexfridman.com/ai/
– Show your support: https://www.patreon.com/lexfridman
– LinkedIn: https://www.linkedin.com/in/lexfridman
– Twitter: https://twitter.com/lexfridman
– Facebook: https://www.facebook.com/lexfridman
– Instagram: https://www.instagram.com/lexfridman
– Slack: https://deep-mit-slack.herokuapp.com
Can anyone load the slides of this presentation?
Good explanation
I have a feeling that I accelerated the video… since he speaks so fast
This is a great talk, really helped with my university project
27:02 How are the gradients computed?
1) Gradients of the loss with respect to the activation
2) Gradients of the mean/sum of activations with respect to the input image
cs231n
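The two gradients described above can be sketched numerically. This is my own toy illustration, not code from the talk: a single linear layer with ReLU standing in for a network, the "image" being a 4-pixel vector, and all names (`W`, `x`) chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4)        # toy "input image" with 4 pixels
W = rng.standard_normal((3, 4))   # one linear layer with 3 units (stand-in for a net)

z = W @ x                         # pre-activations
a = np.maximum(z, 0.0)            # ReLU activations
loss = a.mean()                   # scalar: mean of the activations

# Backward pass by hand:
# 1) gradient of the loss w.r.t. the activations (mean -> 1/n per unit)
da = np.full_like(a, 1.0 / a.size)
# 2) gradient of that mean w.r.t. the input image (ReLU mask, then W^T)
dz = da * (z > 0)
dx = W.T @ dz

# Sanity check against a finite-difference estimate
eps = 1e-6
num = np.array([
    (np.maximum(W @ (x + eps * np.eye(4)[i]), 0.0).mean() - loss) / eps
    for i in range(4)
])
print(np.allclose(dx, num, atol=1e-4))  # True
```

The same pattern (scalar activation summary, backprop to the pixels) is what saliency-map visualizations in cs231n are built on.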
At 1:19:40. Isn't that what Google is trying to do right now?
slides: https://docs.google.com/presentation/d/1Q1CmVVnjVJM_9CDk3B8Y6MWCavZOtiKmOLQ0XB7s9Vg/edit#slide=id.p
Pro-Tip: 0.75x speed. You're welcome 🙂
Hi from Brazil!! Thank you for sharing this. It is really good material for researchers and for anyone starting out in this area.
i study natural sciences what am i doing here
omg! He has done a great job in explaining this entirely new field of research in an hour!
Great video. I came here after completing 4 video sessions by Lex (MIT 6.S094). Thanks for compiling these videos.
Very recommendable! Good overview!
Ok, I disagree with how he answered the woman's question. There is a way to choose a minimal number of layers based on the complexity of the data: it depends on convexity vs. non-convexity in the data space, and non-convexity requires a minimum of 2 hidden layers to represent. This is why, before you dig into deep learning and all the modern stuff, a good grounding in universal approximation theory and in NNs as universal approximators is critical!
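The classic concrete case behind this kind of argument is XOR: data that no single linear threshold unit can separate, but that a network with one hidden layer handles easily. A minimal sketch of my own (hand-picked weights, not learned, and not from the talk):

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])  # XOR labels: not linearly separable

def step(z):
    return (z > 0).astype(int)

# No hidden layer: one attempt with a single linear threshold unit.
# No choice of weights can fit XOR; this one misclassifies (1,1).
w, b = np.array([1.0, 1.0]), -0.5
linear_pred = step(X @ w + b)          # [0, 1, 1, 1]

# One hidden layer of two units (an OR unit and an AND unit),
# then an output unit computing "OR and not AND" = XOR.
H = step(X @ np.array([[1.0, 1.0], [1.0, 1.0]]) + np.array([-0.5, -1.5]))
mlp_pred = step(H @ np.array([1.0, -2.0]) - 0.5)

print(linear_pred.tolist())  # [0, 1, 1, 1] -> fails
print(mlp_pred.tolist())     # [0, 1, 1, 0] -> matches XOR
```

Whether one or two hidden layers suffice for a given decision region is exactly the kind of question universal approximation results speak to; the point here is only that the geometry of the data sets a lower bound on the architecture.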
These people did not understand anything, but they liked him.
he is one of the wingmakers
thanks frid
I love listening to things I cannot understand