GPT-3

GPT-3 is no longer the largest: Google's trillion-parameter model



Przemek Chojecki

A new paper from Google introduces the Switch Transformer, a sparsely activated model scaled to 1.6 trillion parameters: https://arxiv.org/pdf/2101.03961.pdf – the largest to date. Because each token is routed to only one expert sub-network, only a small fraction of those parameters is active for any given input.
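The core idea behind scaling to that size is top-1 expert routing: a small router picks a single expert feed-forward block per token, so compute per token stays roughly constant as the parameter count grows. Below is a toy NumPy sketch of that routing step, not the paper's implementation; the shapes, expert count, and `switch_route` helper are illustrative assumptions.

```python
import numpy as np

def switch_route(tokens, router_w, experts):
    """Toy top-1 (Switch-style) expert routing sketch.

    Each token is sent to the single expert with the highest router
    probability, so only one expert's parameters are used per token.
    `experts` is a list of callables standing in for expert FFN blocks.
    """
    logits = tokens @ router_w                      # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)       # softmax over experts
    choice = probs.argmax(axis=1)                   # top-1 expert per token
    out = np.empty_like(tokens)
    for e, expert in enumerate(experts):
        mask = choice == e
        if mask.any():
            # scale the expert output by its router probability
            out[mask] = expert(tokens[mask]) * probs[mask, e:e + 1]
    return out, choice

# Illustrative sizes only: 16 tokens, dim 8, 4 experts.
rng = np.random.default_rng(0)
d, n_experts = 8, 4
tokens = rng.normal(size=(16, d))
router_w = rng.normal(size=(d, n_experts))
experts = [lambda x, w=rng.normal(size=(d, d)): x @ w for _ in range(n_experts)]

out, choice = switch_route(tokens, router_w, experts)
print(out.shape, choice.shape)
```

Note that total parameters grow linearly with the number of experts, while each token still touches only one expert's weights – this is how the model reaches 1.6 trillion parameters without a proportional jump in per-token compute.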

VentureBeat coverage: https://venturebeat.com/2021/01/12/google-trained-a-trillion-parameter-ai-language-model/

*** Check out my Data Science Job course here: https://datasciencerush.thinkific.com/courses/data-science-job

#gpt3 #t5 #transformer