Mirror of https://github.com/xai-org/grok-1.git, synced 2025-04-03 18:00:10 +03:00

Update README.md

commit e54ab18216 (parent 7050ed204b), changing README.md (68 lines)

# Grok-1

     ██████╗ ██████╗  ██████╗ ██╗  ██╗
    ██╔════╝ ██╔══██╗██╔═══██╗██║ ██╔╝
    ██║  ███╗██████╔╝██║   ██║█████╔╝
    ██║   ██║██╔══██╗██║   ██║██╔═██╗
    ╚██████╔╝██║  ██║╚██████╔╝██║  ██╗
     ╚═════╝ ╚═╝  ╚═╝ ╚═════╝ ╚═╝  ╚═╝

This repository contains JAX example code for loading and running the Grok-1 open-weights model.

Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints`; see [Downloading the weights](#downloading-the-weights).
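
Once downloaded, the layout should look like this (a sketch of the paths named above; the contents of `ckpt-0` are whatever the download provides):

```
checkpoints/
└── ckpt-0/
    └── ... (checkpoint files)
```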

Then, run

```shell
# Install system prerequisites
sudo apt update
sudo apt install python3 python3-pip git curl dos2unix
# Fetch the code and its Python dependencies
git clone https://github.com/xai-org/grok.git
cd grok
sudo pip3 install -r requirements.txt
# Normalize line endings, make the entry script executable, and run it
dos2unix grok
chmod +x grok
./grok
```

to test the code.

The script loads the checkpoint and samples from the model on a test input.
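
In outline, such a script follows a load-then-sample pattern. The sketch below is a self-contained toy of that pattern only; the model, shapes, and names are invented for illustration and are not this repository's API:

```python
import jax
import jax.numpy as jnp

VOCAB, EMBED = 128, 16

def load_params(key):
    # Stand-in for loading real weights from checkpoints/ckpt-0.
    k1, k2 = jax.random.split(key)
    return {
        "embed": 0.02 * jax.random.normal(k1, (VOCAB, EMBED)),
        "unembed": 0.02 * jax.random.normal(k2, (EMBED, VOCAB)),
    }

def logits_fn(params, tokens):
    # A real model would run 64 MoE transformer layers here; this toy
    # just averages token embeddings and projects back to the vocabulary.
    h = params["embed"][tokens].mean(axis=0)
    return h @ params["unembed"]

def greedy_sample(params, prompt, max_new_tokens):
    # Repeatedly pick the highest-probability next token.
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        next_token = int(jnp.argmax(logits_fn(params, jnp.array(tokens))))
        tokens.append(next_token)
    return tokens

params = load_params(jax.random.PRNGKey(0))
print(greedy_sample(params, prompt=[1, 2, 3], max_new_tokens=5))
```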

Due to the large size of the model (314B parameters), a machine with enough GPU memory is required to test the model with the example code. The implementation of the MoE layer in this repository is not efficient; it was chosen to avoid the need for custom kernels to validate the correctness of the model.
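
To see why a kernel-free MoE layer is simple but wasteful, consider the naive pattern below (a minimal sketch, not the repository's implementation): every expert processes every token, and the router's top-2 gates then discard most of that work.

```python
import jax
import jax.numpy as jnp

NUM_EXPERTS, TOP_K, DIM, HIDDEN = 8, 2, 32, 64

def init(key):
    kr, kw1, kw2 = jax.random.split(key, 3)
    return {
        "router": 0.02 * jax.random.normal(kr, (DIM, NUM_EXPERTS)),
        "w1": 0.02 * jax.random.normal(kw1, (NUM_EXPERTS, DIM, HIDDEN)),
        "w2": 0.02 * jax.random.normal(kw2, (NUM_EXPERTS, HIDDEN, DIM)),
    }

def naive_moe(params, x):  # x: [tokens, DIM]
    gates = jax.nn.softmax(x @ params["router"], axis=-1)   # [T, E]
    # Keep only each token's top-k gate values; zero out the rest.
    kth_largest = jnp.sort(gates, axis=-1)[:, -TOP_K][:, None]
    gates = jnp.where(gates >= kth_largest, gates, 0.0)
    gates = gates / gates.sum(axis=-1, keepdims=True)
    # Every expert processes every token: the inefficiency in question.
    h = jax.nn.gelu(jnp.einsum("td,edh->eth", x, params["w1"]))
    y = jnp.einsum("eth,ehd->etd", h, params["w2"])          # [E, T, D]
    # Gate and combine; 6 of the 8 expert outputs per token are discarded.
    return jnp.einsum("te,etd->td", gates, y)

params = init(jax.random.PRNGKey(0))
print(naive_moe(params, jnp.ones((4, DIM))).shape)  # (4, 32)
```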

# Model Specifications

Grok-1 is currently designed with the following specifications (see the config sketch after the list):

- **Parameters:** 314B
- **Architecture:** Mixture of 8 Experts (MoE)
- **Experts Utilization:** 2 experts used per token
- **Layers:** 64
- **Attention Heads:** 48 for queries, 8 for keys/values
- **Embedding Size:** 6,144
- **Tokenization:** SentencePiece tokenizer with 131,072 tokens
- **Additional Features:**
  - Rotary embeddings (RoPE)
  - Supports activation sharding and 8-bit quantization
- **Maximum Sequence Length (context):** 8,192 tokens
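
These specifications can be restated as a small config object (a sketch; the field names are mine, not the repository's):

```python
# Grok-1 specifications from the list above, as a hypothetical config object.
from dataclasses import dataclass

@dataclass(frozen=True)
class Grok1Spec:
    n_params: int = 314_000_000_000   # 314B total parameters
    n_experts: int = 8                # Mixture of 8 Experts (MoE)
    experts_per_token: int = 2        # 2 experts used per token
    n_layers: int = 64
    n_query_heads: int = 48
    n_kv_heads: int = 8               # 8 heads for keys/values
    d_embed: int = 6_144
    vocab_size: int = 131_072         # SentencePiece tokenizer
    max_seq_len: int = 8_192          # context length
    rope: bool = True                 # rotary embeddings
    int8_quantization: bool = True    # plus activation sharding

spec = Grok1Spec()
# With 2 of 8 experts active, only a quarter of the expert weights
# participate in any single token's forward pass.
print(spec.experts_per_token / spec.n_experts)  # 0.25
```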

# Downloading the weights

You can download the weights using a torrent client and this magnet link:

```
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```

or directly using [HuggingFace 🤗 Hub](https://huggingface.co/xai-org/grok-1):

```
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install huggingface_hub[hf_transfer]
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
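
The same download can also be scripted from Python with `huggingface_hub`'s `snapshot_download` (a sketch mirroring the CLI call above, not a command from this repository):

```python
# Fetch only the ckpt-0 directory into ./checkpoints.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns="ckpt-0/*",   # skip everything except the checkpoint
    local_dir="checkpoints",
)
```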

# License

The code and associated Grok-1 weights in this release are licensed under the Apache 2.0 license. The license only applies to the source files in this repository and the model weights of Grok-1.