GPT-3

What is GPT-3? What is creativity as per GPT-3? Is GPT-3 smarter than humans? Who created it?



Kimavi

#ComputerScience #AI #GPT-3.

Welcome to Kimavi Videos (Landscape Mode).


What is GPT-3? What is creativity as per GPT-3? Is GPT-3 smarter than humans? Who created it?

GPT-3 is one of the most interesting and important AI systems ever produced

GPT-3 spent months learning natural language by analyzing Wikipedia, blogs, social media, and much of the internet

GPT-3 can do many things that humans can do: it can generate tweets, write poetry, summarize emails, answer questions, and even write code

What is Creativity as per GPT-3?

The text below was generated by a machine, and it reads as well as human writing

GPT-3 says:

I think creative expression is a natural byproduct of growing up in a diverse world

The more diverse the world is, the more you get exposed to different people, to different opportunities, to different places and to different challenges

And the more diverse that is, the more likely you’ll be to be able to put the dots together to form something new. And in many ways, I think if you want to be creative, you have to go for it. If you want to be a writer, you have to write, if you want to be a musician, you have to create music, if you want to be a comedian, you have to create comedy, and the more you create, the more likely it is that you’ll stumble onto some interesting stuff

Is GPT-3 smarter than the humans who created it?

Let’s learn more about GPT-3 to better understand it.

In his July 29, 2020, review in The New York Times, Farhad Manjoo said that GPT-3—which can generate computer code and poetry, as well as prose—is not just “amazing”, “spooky”, and “humbling”, but also “more than a little terrifying”

Daily Nous presented a series of articles by nine philosophers on GPT-3. Australian philosopher David Chalmers described GPT-3 as “one of the most interesting and important AI systems ever produced”

A review in Wired said that GPT-3 was “provoking chills across Silicon Valley”

An article in Towards Data Science stated that GPT-3 was trained on hundreds of billions of words and is capable of coding in CSS, JSX, Python, and other languages

Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses deep learning to produce human-like text
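"Autoregressive" means the model predicts the next token from the tokens that came before it, one step at a time, feeding each prediction back in as context. Here is a minimal, illustrative sketch of that idea in Python, using a toy bigram word counter rather than a neural network; all names and the tiny corpus are hypothetical, and this is not how GPT-3 is actually implemented:

```python
import random
from collections import Counter, defaultdict

def train_bigram(text):
    """Count how often each word follows each other word."""
    words = text.split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, length=5, seed=0):
    """Autoregressively sample: each new word depends only on the words so far."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break  # no known continuation
        words = list(followers)
        weights = [followers[w] for w in words]
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

model = train_bigram("the cat sat on the mat the cat ran")
print(generate(model, "the"))
```

GPT-3 does the same next-token loop, but with a 175-billion-parameter transformer instead of a frequency table, which is what lets it produce fluent paragraphs rather than short word chains.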

It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory

GPT-3’s full version has a capacity of 175 billion machine learning parameters
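To get a feel for that scale: storing 175 billion parameters at 2 bytes each (16-bit floating point, a common choice for large models) takes roughly 350 GB before counting any working memory. A quick back-of-the-envelope check in Python:

```python
params = 175e9          # 175 billion parameters
bytes_per_param = 2     # 16-bit (half-precision) floats
gigabytes = params * bytes_per_param / 1e9
print(f"{gigabytes:.0f} GB just to store the weights")  # 350 GB
```

That is far more than any single consumer GPU holds, which is one reason GPT-3 runs in data centers rather than on personal machines.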

….

….

Thank you from Kimavi. Please visit us at Kimavi.com for more landscape videos