Grok-1: Open Release of 314 Billion Parameter Language Model

Grok-1, a 314-billion-parameter language model developed by xAI, is now openly available. It is a raw base model from the pretraining phase, not fine-tuned for any specific task such as dialogue. The weights and network architecture are released under the Apache 2.0 license and are available on GitHub. The model uses a Mixture-of-Experts architecture, with 25% of the weights active on a given token. It was trained from scratch on a custom stack built on JAX and Rust, with pretraining concluding in October 2023.
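The "25% of weights active per token" figure follows from sparse expert routing: a gating network picks a small subset of expert networks for each token, so only that subset's parameters run. As a rough illustration, routing each token to 2 of 8 equally sized experts activates about a quarter of the expert parameters. The sketch below is a toy top-k MoE layer in JAX; it is illustrative only and does not reflect Grok-1's actual architecture, dimensions, or expert count, all of which are assumptions here.

```python
import jax
import jax.numpy as jnp

def moe_layer(x, gate_w, expert_ws, k=2):
    """Toy Mixture-of-Experts layer (illustrative, not Grok-1's design).

    x:         (d_model,) token representation
    gate_w:    (d_model, n_experts) gating weights
    expert_ws: (n_experts, d_model, d_model) per-expert weight matrices
    """
    # Gating network scores each expert for this token.
    logits = x @ gate_w
    # Keep only the top-k experts; the rest stay inactive.
    topk_vals, topk_idx = jax.lax.top_k(logits, k)
    # Normalize the selected experts' scores into mixing weights.
    weights = jax.nn.softmax(topk_vals)
    # Run only the selected experts and blend their outputs.
    out = jnp.zeros_like(x)
    for w, i in zip(weights, topk_idx):
        out = out + w * (x @ expert_ws[i])
    return out

# With k=2 of n=8 experts, the fraction of expert parameters
# touched per token is 2/8 = 0.25, matching the 25% figure.
```

In a full model, this kind of layer typically replaces the feed-forward block inside each transformer layer, which is how total parameter count can grow far beyond the per-token compute cost.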
