Daniel Bourke
For the last three years, the State of AI Report has been published as a snapshot of what’s happened in the field of artificial intelligence over the past 12 months. This is my review/walkthrough of the 2020 version.
A big shout out to Nathan and Ian for their huuuuuuge efforts putting this together!
See the full State of AI Report 2020 here: https://www.stateof.ai
Connect:
Web – https://www.mrdbourke.com
Email updates – https://www.mrdbourke.com/newsletter
Timestamps:
0:00 – Intro and hello
1:31 – AI report review start
2:12 – AI definitions
3:00 – SECTION 1: Research
3:37 – Transformers taking over NLP
5:50 – Universities starting AI-based degrees
6:35 – Only 15% of papers published share their code/data
8:20 – PyTorch outpacing TensorFlow in research papers
11:12 – Bigger models, datasets and compute budgets drive performance
15:40 – Increased performance costing more for incremental improvement
18:13 – Deep learning is getting more efficient
21:11 – AI for conversations is getting better
23:02 – Machine translation for code (Python to C++)
25:32 – Many algorithms starting to beat human baseline for NLP on GLUE test
27:04 – Using Transformers for computer vision
30:11 – AI performs incredibly well on mammography tasks across two regions (US and UK)
31:30 – Causal inference in ML
34:30 – ML for synthesizing new molecules
36:26 – AI starts to read DNA-encoded molecules
39:39 – AI generates tennis matches between any tennis players you want
40:46 – Transformers being used for object detection
41:46 – AI which learns from its dreams
44:26 – Really efficient on-device computer vision models
45:04 – Evolving machine learning algorithms from scratch (AutoML Zero)
46:20 – Federated learning is now booming
47:16 – Privacy-preserving ML
48:35 – Using Gaussian Processes for estimating model uncertainty
50:11 – SECTION 2: Talent
50:12 – Many top companies stealing AI professors
52:02 – Abu Dhabi opens the world’s first AI university
54:03 – Many Chinese AI PhDs depart China for other countries
54:52 – US-based companies and institutions dominate NeurIPS and ICML (ML conferences)
56:13 – Three times more AI job postings than views for AI roles
56:58 – TensorFlow and Keras have more job postings on LinkedIn than PyTorch
57:18 – SECTION 3: Industry
59:06 – INTERMISSION
59:44 – AI predicting metabolic response to food
1:00:45 – FDA acknowledges lack of policy for AI-driven systems
1:01:54 – Less than 1% of AI-based medical imaging studies are high-quality
1:02:48 – First reimbursement approval for deep learning-based medical imaging
1:05:16 – Human-driven mileage in California still well above autonomous driving mileage
1:10:36 – Supervised ML improvements seem to follow an S-curve
1:12:13 – A new kind of approach to self-driving cars
1:17:17 – New AI-first chips (competition for NVIDIA)
1:19:41 – The rise of MLOps
1:21:52 – Computer vision for auto insurance claims
1:23:45 – Using NLP to detect money laundering and terrorist schemes on the web
1:25:17 – Robots in warehouses are picking millions of items per month
1:26:23 – HuggingFace’s open-source NLP work is driving NLP’s explosion
1:28:09 – SECTION 4: Politics
1:29:59 – Creating a search engine for faces
1:33:29 – GPT-3 outputs biased predictions like GPT-2
1:33:49 – US military adopting deep reinforcement learning techniques
1:35:52 – Fighter pilots vs AI pilots
1:38:28 – Google’s People and AI guidebook talks about fairness, interpretability, privacy
1:40:46 – China fronts big cash for chip manufacturing
1:45:00 – A call to tackle climate change, food waste, generating new battery technologies and more with ML
1:45:45 – SECTION 5: Predictions
1:49:22 – A special guest appears
#artificialintelligence #ai #machinelearning
Two questions:
1. What was your favourite AI innovation of the past year?
2. What's your prediction for 2021?
The first one 😀
2 hours of masterpiece 🔥
woohoo!
this whole year is like Halloween. lol
Early!
7th
I was so sad after finishing the Zero to Mastery course with Daniel Bourke's voice. When I started other courses and the lecturer's voice didn't match, I stayed sad until I found your channel. Alllllrighty then!
you accepted cookies! C++ IS MUCH FASTER, AND IF AI CONVERTS PYTHON CODE TO C++, THAT'S FREAKING COOL! WILD WILD WILD
Future dan
Happy Halloween sir
It’s amazing the bullshit you get fed when you google it, that’s what you mean right?
Background look like qxq lecture
You are my best teacher brother!! You understand how everyone is!! I'm also a typing fan.. One day for sure I'll meet you with almond butter in my hand 😉
These are great. I can see them become a highly anticipated yearly event.
This is excellent! (So proud of my friend Chip Huyen being a part of this).
Favorite AI innovation: GPT-3. Have been using it since July and amazed by its possibilities.
Prediction for 2021: ML models will require less data to create better recommendation algorithms faster.
Thank you for such a comprehensive breakdown!
@Daniel Bourke Do you have a Discord or something similar?
Great video, thanks for the breakdown. You give some nice additions to the content in the slides
One of the best 1hr+ video I have seen in a long time. Keep going, Daniel!!!
you are so passionate about ML. thanks for your effort
As somebody who has just begun learning AI and Machine Learning (Stanford ML Course) you going through this really helps me to understand what I am reading. Cheers bro!
Awesome! Thanks, Daniel!
I am also starting to get into MLOps.
Upvote for MLOps video!
Very valuable content, I could never have made it through this summary alone…
Many thanks for sharing, Daniel. I'm kind of worried that PyTorch is getting more popular vs TF, as I am now taking some ML courses which are mostly based on the latter for CNNs etc 🙁 It actually surprised me, since I believed most courses were TF-based! Do you guys think this will really be a turning point?
Hey buddy, will you explain how to do the math part of learning machine learning?
I'm with you brother … keep rocking…
I would love to see a video from you about MLOps. Keep up the good work, good sir!
I love how you mispronounce Huawei. 😀
But in the job market, are there still more jobs in CV than NLP (even with GPT-3)? Is it just because CV has more applications than NLP?
Nice, clear explanations. By the way, which Chrome extension do you use for your new tab?
9:40 New papers are implemented in tensorflow/addons on GitHub. It's the former tf.contrib. The addons community is friendly, but not many people know about the repo. It seems to me that speeding up your code, i.e. refactoring it into CUDA C++, is easier with PyTorch. This may be appealing to researchers who want to train their own custom large language models. And don't forget that Facebook AI also sponsors research projects, whereas Google mainly puts its money into subsidiaries.
Great video. More content on MLOps! YES!