Grok-1 is a 314-billion-parameter Mixture-of-Experts (MoE) model released by xai-org on GitHub and implemented in JAX. It is designed for a wide range of AI tasks and supports features such as activation sharding and 8-bit quantization, though running it requires substantial GPU resources. The project uses a SentencePiece tokenizer, and the model weights must be downloaded separately before use. Grok-1 is licensed under Apache 2.0, which is intended to facilitate access for researchers and developers. For more details, see the xai-org repository on GitHub.
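To make the MoE idea concrete, here is a minimal sketch of top-k expert routing, the mechanism such architectures use to activate only a few experts per token. All names, shapes, and the choice of top-2 routing are illustrative assumptions for this sketch, not Grok-1's actual internals.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Illustrative top-k Mixture-of-Experts routing (not Grok-1's real code).

    x:         (d_model,) input token activation
    gate_w:    (d_model, n_experts) router weights
    expert_ws: list of (d_model, d_model) per-expert weight matrices
    """
    logits = x @ gate_w                       # one router score per expert
    top = np.argsort(logits)[-top_k:]         # indices of the top-k experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                      # softmax over the selected experts only
    # Output is the probability-weighted sum of the chosen experts' outputs;
    # the other experts are never evaluated, which is the MoE efficiency win.
    return sum(p * (x @ expert_ws[i]) for p, i in zip(probs, top))

rng = np.random.default_rng(0)
d_model, n_experts = 8, 4
y = moe_forward(rng.normal(size=d_model),
                rng.normal(size=(d_model, n_experts)),
                [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)])
print(y.shape)  # (8,)
```

Only `top_k` of the `n_experts` matrices are touched per token, which is how a 314B-parameter model keeps its per-token compute far below its total size.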