This repository contains JAX example code for loading and running the Grok-1 open-weights model.

Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints`; see [Downloading the weights](#downloading-the-weights).

Then, run

```shell
pip install -r requirements.txt
python run.py
```

The script loads the checkpoint and samples from the model on a test input.

Due to the large size of the model (314B parameters), a machine with enough GPU memory is required to test the model with the example code. The implementation of the MoE layer in this repository is not efficient; it was chosen to avoid the need for custom kernels while still validating the correctness of the model.
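
To make that trade-off concrete, below is a minimal sketch of a naive top-2 MoE layer in JAX, in the spirit of the "correct but not efficient" approach described above: every expert processes every token and the router's gate weights mix the results, so no gather/scatter kernels are needed. All names and shapes are illustrative assumptions, not the repository's actual code.

```python
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8  # Grok-1 mixes 8 experts
TOP_K = 2        # 2 experts are used per token

def naive_moe(x, router_w, expert_ws):
    """x: [tokens, d]; router_w: [d, NUM_EXPERTS]; expert_ws: [NUM_EXPERTS, d, d]."""
    logits = x @ router_w                       # [tokens, NUM_EXPERTS]
    top_vals, _ = jax.lax.top_k(logits, TOP_K)  # the two largest logits per token
    # Keep only the top-2 logits (ties may keep more; acceptable for a sketch).
    masked = jnp.where(logits >= top_vals[:, -1:], logits, -jnp.inf)
    gates = jax.nn.softmax(masked, axis=-1)     # zero weight outside the chosen experts
    # The naive part: run ALL experts on ALL tokens, then mix by the gates.
    # Correct and kernel-free, but roughly NUM_EXPERTS / TOP_K times more compute
    # than dispatching each token only to its selected experts.
    expert_out = jnp.einsum("td,edf->etf", x, expert_ws)  # [E, tokens, d]
    return jnp.einsum("te,etf->tf", gates, expert_out)    # [tokens, d]

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (4, 16))                        # 4 toy tokens, d = 16
router_w = jax.random.normal(key, (16, NUM_EXPERTS))
expert_ws = jax.random.normal(key, (NUM_EXPERTS, 16, 16))
print(naive_moe(x, router_w, expert_ws).shape)             # (4, 16)
```

An efficient implementation would route each token only to its two selected experts, which typically requires custom gather/scatter kernels, exactly what this repository avoids.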

# Model Specifications

Grok-1 is currently designed with the following specifications:

- **Parameters:** 314B
- **Architecture:** Mixture of 8 Experts (MoE)
- **Experts Utilization:** 2 experts used per token
- **Layers:** 64
- **Attention Heads:** 48 for queries, 8 for keys/values
- **Embedding Size:** 6,144
- **Tokenization:** SentencePiece tokenizer with 131,072 tokens
- **Additional Features:**
  - Rotary embeddings (RoPE)
  - Supports activation sharding and 8-bit quantization
- **Maximum Sequence Length (context):** 8,192 tokens
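
For quick reference, the list above can be captured in code. The field names below are shorthand of my own (the repository does not ship such a dataclass), and the derived head width assumes the common convention that the embedding is split evenly across query heads:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Grok1Specs:
    parameters: int = 314_000_000_000
    num_experts: int = 8
    experts_per_token: int = 2
    num_layers: int = 64
    num_query_heads: int = 48
    num_kv_heads: int = 8         # fewer heads for keys/values than for queries
    embedding_size: int = 6_144
    vocab_size: int = 131_072     # SentencePiece tokenizer
    max_seq_len: int = 8_192      # maximum context length

s = Grok1Specs()
# Assuming head_dim = embedding_size / num_query_heads (a convention, not stated
# above): 6144 / 48 = 128, and each KV head serves 48 / 8 = 6 query heads.
print(s.embedding_size // s.num_query_heads)  # 128
print(s.num_query_heads // s.num_kv_heads)    # 6
```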

# Downloading the weights

You can download the weights using a torrent client and this magnet link:

```
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```

Or using IPFS CID (blake3):

```
bafyb4id5xdepghz5pvplhylgz7scjkoefh4d34j4kdw2w5hzrblv3uwbw4
```

Or directly using HuggingFace:

```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install huggingface_hub[hf_transfer]
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
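
If you prefer to script the download, the equivalent fetch can be done from Python via `huggingface_hub`'s `snapshot_download`. This is a sketch mirroring the CLI command above, not code from this repository; check the parameters against your installed `huggingface_hub` version:

```python
from huggingface_hub import snapshot_download

# Mirrors the CLI invocation above: pull only ckpt-0/* into ./checkpoints.
# Recent huggingface_hub versions write real files into local_dir by default.
snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns=["ckpt-0/*"],
    local_dir="checkpoints",
)
```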

# License

The code and associated Grok-1 weights in this release are licensed under the Apache 2.0 license.