22.06.21
A new, open alternative to the giant language model GPT-3 has been released: EleutherAI's GPT-J-6B. It has 6 billion parameters and was trained on The Pile using JAX. It wrote the following about itself:
"Written with the backdrop of the fast-changing era of AI, GPT-3 is a new, open-source project that focuses on building a more human-like open-source text generator. GPT-3 aims to break the GPT (Generative PreTexts from the Chinese Room) and language models are very limited in modeling languages and the creativity of imagination. We are bringing it to mainstream level and promising a huge impact in machine translation, question answering, and code generation.""
Hmm, that's not perfect, is it? A sick parrot? Have a go here.
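If you'd rather poke at it locally than in a browser demo, something like the sketch below should work. Caveat: the checkpoint name `EleutherAI/gpt-j-6B` and Hugging Face `transformers` support are assumptions on my part, not part of the announcement, and the full-precision weights need on the order of 24 GB of memory.

```python
# Minimal sketch: sampling from GPT-J-6B via Hugging Face transformers.
# Assumes a transformers version with GPT-J support and the
# "EleutherAI/gpt-j-6B" checkpoint on the Hub (both assumptions,
# not confirmed by the announcement above).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

prompt = "GPT-J is"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; temperature controls how adventurous
# (or parrot-like) the output gets.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```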
🛎️ Why this matters: The singularity is still far off, but at least it's open.