Google Cloud Tech
Dale’s Blog → https://goo.gle/3xOeWoK
Classify text with BERT → https://goo.gle/3AUB431
Over the past five years, Transformers, a neural network architecture, have completely transformed state-of-the-art natural language processing. Want to translate text with machine learning? Curious how an ML model could write a poem or an op-ed? Transformers can do it all. In this episode of Making with ML, Dale Markowitz explains what transformers are, how they work, and why they're so impactful. Watch to learn how you can start using transformers in your app!
Chapters:
0:00 – Intro
0:51 – What are transformers?
3:18 – How do transformers work?
7:41 – How are transformers used?
8:35 – Getting started with transformers
Watch more episodes of Making with Machine Learning → https://goo.gle/2YysJRY
Subscribe to Google Cloud Tech → https://goo.gle/GoogleCloudTech
#MakingwithMachineLearning #MakingwithML
The pointless background music is so irritating, I gave up
Thanks for the video. You mentioned that GPT-3 was trained on 45 terabytes of text, but I have seen much smaller numbers, like 570 GB. Can you give me a reference for the training data size? I am working on a project and would like to cite the correct number. Thanks!
I want to ask: when you translate a sentence from one language to another, it's fair enough to know how the sentence will read in the target language.
However, the bigger question is: did we completely lose the grammar of the sentence? Grammar works differently in each language.
What's the point of translating a sentence from one language to another if the grammatical sense of the language is lost?
Send me a cloud server so I can see its power
When I saw this title, I was hoping to better understand the mathematical workings of transformers such as matrices and the like. Maybe you could do a follow-up video explaining mathematically how transformers work.
thank you for your time
I wish someone could explain the concrete math behind transformers and attention
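For anyone else wondering about the math: the video doesn't go into it, but the core of attention is a small, concrete formula. Here's a rough sketch (my own illustration, not from the video) of scaled dot-product self-attention in NumPy, where the same input is projected into queries, keys, and values, and every word mixes in information from every other word. The matrix shapes and weight names here are just example choices.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project the same input X into queries, keys, and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    # Each word scores its relevance to every other word (all pairs at once)
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mix of value vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))             # 5 "words", 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8): one updated vector per word
```

This is the "Attention(Q, K, V) = softmax(QKᵀ/√d_k)V" equation from the original Transformer paper; real models run many of these "heads" in parallel and stack the layers.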
Stooooooppp with the backing tracks!!!!!!!
When I was a kid, I understood that the trouble with translation was literally translating words without contextual/sequential awareness. I knew it was important to distinguish between synonyms. I imagined a button that would generate the translation output, then let you highlight the words that didn't make sense or that you wanted improved, and regenerate the translation. This type of NLP probably existed before I wrote my first "hello world" (15+ years ago)!
They have to add that silly Optimus Prime image on every video about this subject? Are we 6 year olds? I used to love the TV show. When I was six. Then I grew up.
And that's how ChatGPT was born
thank u mam❤🔥
phenomenal video
At minute 2:32 (Approx.)
What are GPUs?
I've heard of CPUs but not GPUs.
1:00 Neural Network
1:08 CNN
1:40 RNN
3:35 Positional Encoding
4:21 Attention
5:58 Self-Attention
Transformers. More than meets the eye
just sneaking in a bit of THE MESSAGE along the way…
Very good high-level explanation of the major innovations that come along with transformers, but I want to stress that large language models using the Transformer approach pose a risk as well. These models show more and more unexplainable phenomena, which can pose big risks to society. The arms race that has now started between Microsoft and Google is not a good sign for deploying this innovation safely.
How did you sync your talking cadence to the background music?
Great summary of the Transformers technology!
My only criticism: the background music got annoying after 3-4 minutes, but that might just be me.
How did you condense so many pieces of information in such a short time? This video is on a next level, I loved it!
Optimum Pride Æ Æ Æ Æ Æ
Dr. Ashish Vaswani is a pioneer and nobody is talking about him. He is a scientist from Google Brain and the first author of the paper that introduced TRANSFORMERS, which are the backbone of all other recent models.
Self-attention is better than attention because it is aware of the connections between all the words in a sentence?
why optimus prime?
yyyyyyyy
I have more respect for Google after watching this video. Not only did they provide their engineers with the funding to research, but they also let other companies like OpenAI use said research. And they are opening up the knowledge to the general public with these video series.
2:58 😂😂 how big? Really big
woww, she's good at explaining things
3:34
Very well explained. This video is a must-watch for anyone who wants to demystify the latest LLM technology. Wondering if this could be made into a more generic video with a quick high-level intro on neural networks for those who aren't in the field. I bet there are millions out there who want a basic understanding of how ChatGPT/Bard/Claude work without an in-depth technical deep dive.
OMG the BEST transformers video EVER!