Update README.md

# Grok-1

This repository contains JAX example code for loading and running the Grok-1 open-weights model.

Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints` - see [Downloading the weights](#downloading-the-weights)

Then, run

```shell
pip install -r requirements.txt
python run.py
```

to test the code.

The implementation of the MoE layer in this repository is not efficient.
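
For intuition, here is a minimal, purely illustrative sketch of such a "dense" MoE dispatch in JAX. It is not the layer used in this repository, and the shapes, names, and top-2-of-8 routing are assumptions for the example: every expert runs on every token and the router weights mask out the unused outputs, which is simple and kernel-free but wastes most of the compute.

```python
# Illustrative only - NOT the MoE layer shipped in this repository.
# Dense dispatch: all experts process all tokens; the top-k router
# weights zero out the unused experts afterwards.
import jax
import jax.numpy as jnp

def dense_moe(x, router_w, expert_w, top_k=2):
    # x: [tokens, d_model], router_w: [d_model, n_experts],
    # expert_w: [n_experts, d_model, d_model] (toy single-matmul "experts")
    logits = x @ router_w                                  # [tokens, n_experts]
    top_vals, top_idx = jax.lax.top_k(logits, top_k)       # route each token to top_k experts
    gates = jax.nn.softmax(top_vals, axis=-1)              # [tokens, top_k]
    mask = jnp.zeros_like(logits).at[
        jnp.arange(x.shape[0])[:, None], top_idx].set(gates)   # [tokens, n_experts]
    expert_out = jnp.einsum("td,edf->tef", x, expert_w)        # every expert, every token
    return jnp.einsum("te,tef->tf", mask, expert_out)          # keep only the routed outputs

# toy usage
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
out = dense_moe(jax.random.normal(k1, (4, 16)),
                jax.random.normal(k2, (16, 8)),      # 8 experts, as in the spec below
                jax.random.normal(k3, (8, 16, 16)))
print(out.shape)  # (4, 16)
```

An efficient implementation would instead gather only the tokens routed to each expert, which typically requires custom scatter/gather kernels.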

# Model Specifications

Grok-1 is currently designed with the following specifications:

- **Parameters:** 314B
- **Architecture:** Mixture of 8 Experts (MoE)
- **Tokenization:** SentencePiece tokenizer with 131,072 tokens
- **Additional Features:**
  - Rotary embeddings (RoPE) - see the sketch after this list
  - Supports activation sharding and 8-bit quantization
- **Maximum Sequence Length (context):** 8,192 tokens
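
As a companion to the rotary-embedding bullet above, here is a minimal RoPE sketch. It is illustrative only - the head dimension, base frequency, and function name are assumptions, not taken from this repository. RoPE rotates each pair of query/key channels by an angle proportional to the token position, so the attention dot product encodes relative offsets.

```python
# Minimal RoPE sketch - illustrative only; dims and base are assumed.
import jax.numpy as jnp

def rope(x, base=10000.0):
    # x: [seq_len, head_dim] with head_dim even
    seq_len, head_dim = x.shape
    inv_freq = 1.0 / (base ** (jnp.arange(0, head_dim, 2) / head_dim))  # [head_dim/2]
    angles = jnp.arange(seq_len)[:, None] * inv_freq[None, :]           # [seq_len, head_dim/2]
    cos, sin = jnp.cos(angles), jnp.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]                                     # even/odd channel pairs
    rotated = jnp.stack([x1 * cos - x2 * sin,                           # 2-D rotation per pair
                         x1 * sin + x2 * cos], axis=-1)
    return rotated.reshape(seq_len, head_dim)

# Queries and keys are rotated like this before attention; no learned
# position table is needed for the full context length.
print(rope(jnp.ones((8192, 128))).shape)  # (8192, 128)
```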

# Downloading the weights
You can download the weights using a torrent client and this magnet link:

```
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```

or directly using [HuggingFace 🤗 Hub](https://huggingface.co/xai-org/grok-1):

```
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install huggingface_hub[hf_transfer]
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
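
If you prefer to script the download, the same files can be fetched with the `huggingface_hub` Python API instead of the CLI - a short sketch reusing the repo id and target paths from the command above:

```python
# Sketch: same download as the CLI command above, via the Python API.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns="ckpt-0/*",   # only fetch the checkpoint directory
    local_dir="checkpoints",     # where run.py expects ckpt-0 to live
)
```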

# License

The license only applies to the source files in this repository and the model weights of Grok-1.