GPT-3

GPT-J-6B – Most advanced GPT-3 Alternative for AI Text Generation – Intro & Demo



1littlecoder

GPT-3 was all over the news for its human-like text generation capabilities. Shame it was never open-sourced by OpenAI (!). Hence EleutherAI, a collective of researchers, set out to create a true open-source GPT-3 alternative, and GPT-J-6B is the product of those efforts.

GPT-J-6B is a 6 billion parameter, autoregressive text generation model trained on The Pile.
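Here is a minimal sketch of what "autoregressive text generation" looks like in practice, assuming you load GPT-J-6B through the Hugging Face transformers library (model id "EleutherAI/gpt-j-6B" is an assumption; the official demo and Colab below use Mesh Transformer JAX instead):

# Minimal sketch, not the official Mesh Transformer JAX setup.
# Assumes the transformers library and the Hub model id "EleutherAI/gpt-j-6B".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "EleutherAI is"
inputs = tokenizer(prompt, return_tensors="pt")

# Autoregressive generation: the model produces one token at a time,
# conditioning each new token on the prompt plus everything generated so far.
output_ids = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.9,
    max_new_tokens=50,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Note that the full model weights are several gigabytes, so running this locally needs a machine with plenty of RAM (or a GPU); the web demo linked below is the quickest way to try it.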

Useful Links:
✅ Mesh Transformer JAX – https://github.com/kingoflolz/mesh-transformer-jax
✅ GPT-J-6B: 6B JAX-Based Transformer Blogpost by Aran Komatsuzaki – https://arankomatsuzaki.wordpress.com/2021/06/04/gpt-j/
✅ Web Demo of GPT-J-6B for Text Generation – https://6b.eleuther.ai/
✅ Colab (Python) Notebook – https://colab.research.google.com/github/kingoflolz/mesh-transformer-jax/blob/master/colab_demo.ipynb#scrollTo=nvlAK6RbCJYg
✅JAX – https://jax.readthedocs.io/en/latest/notebooks/quickstart.html

Related Videos:

🎥 AI Text Generation with GPT-3 OpenSource Alternative GPT-Neo Model using Hugging Face Hub
https://www.youtube.com/watch?v=0PuVk6c8Ua8

🎥 AI-Generated Blog Content with GPT-Neo (GPT-3 Alternative) + Gradio | Python ML Web App – https://youtu.be/d_xRYyy2LFM