Compare commits

3 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| SmallTeddy | 226084a2fc | feat: add | 2024-03-18 17:58:24 +08:00 |
| SmallTeddy | 519516f252 | docs: add chinese document and fix origin | 2024-03-18 17:51:38 +08:00 |
| SmallTeddy | 4488f77ad4 | docs: add chinese document and fix origin | 2024-03-18 17:39:06 +08:00 |
3 changed files with 49 additions and 30 deletions

.gitignore (12 lines changed)

@@ -1,2 +1,10 @@
checkpoints/*
!checkpoints/README.md
# Logs
logs
*.log
*.local
# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store

README.md (34 lines changed)

@@ -1,9 +1,10 @@
# Grok-1
+**English Document** [Chinese Document](./README_CN.md)
This repository contains JAX example code for loading and running the Grok-1 open-weights model.
-Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints` - see [Downloading the weights](#downloading-the-weights)
+Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints`.
Then, run
```shell
pip install -r requirements.txt
python run.py
```
@@ -18,38 +19,15 @@ The script loads the checkpoint and samples from the model on a test input.
Due to the large size of the model (314B parameters), a machine with enough GPU memory is required to test the model with the example code.
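For a rough sense of scale (an unofficial back-of-envelope estimate, not a requirement stated by the repository), the weights alone occupy about a byte per parameter at 8-bit precision:

```python
# Back-of-envelope weight-memory estimate for a 314B-parameter model.
# Illustrative figures only, not official hardware requirements.
n_params = 314e9

for dtype, bytes_per_param in [("int8", 1), ("bf16", 2), ("fp32", 4)]:
    gib = n_params * bytes_per_param / 2**30
    print(f"{dtype}: ~{gib:,.0f} GiB for the weights alone")

# int8: ~292 GiB, bf16: ~585 GiB, fp32: ~1,170 GiB. Activations,
# KV cache, and framework overhead come on top of this.
```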
The implementation of the MoE layer in this repository is not efficient. The implementation was chosen to avoid the need for custom kernels to validate the correctness of the model.
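For intuition only, here is a minimal JAX sketch of this kernel-free style of MoE: every expert is evaluated densely for every token and the top-k outputs are mixed, which avoids custom scatter/gather kernels at the cost of wasted compute. All names and shapes are hypothetical, not the repository's actual layer:

```python
import jax
import jax.numpy as jnp

def naive_moe(x, w_router, w_experts, k=2):
    """Dense mixture-of-experts forward pass (illustrative sketch).

    x:         [tokens, d_model] input activations
    w_router:  [d_model, n_experts] routing weights
    w_experts: [n_experts, d_model, d_model] one weight matrix per expert
    """
    logits = x @ w_router                  # [tokens, n_experts]
    gate, idx = jax.lax.top_k(logits, k)   # each [tokens, k]
    gate = jax.nn.softmax(gate, axis=-1)

    # Evaluate ALL experts for ALL tokens: the inefficiency noted above.
    all_out = jnp.einsum("td,edf->tef", x, w_experts)

    # Pick each token's top-k expert outputs and mix them with the gates.
    picked = jnp.take_along_axis(all_out, idx[:, :, None], axis=1)
    return jnp.sum(gate[:, :, None] * picked, axis=1)  # [tokens, d_model]
```

A production implementation would instead dispatch each token only to its selected experts, which is what typically requires custom kernels.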
# Model Specifications
Grok-1 is currently designed with the following specifications:
- **Parameters:** 314B
- **Architecture:** Mixture of 8 Experts (MoE)
- **Experts Utilization:** 2 experts used per token
- **Layers:** 64
- **Attention Heads:** 48 for queries, 8 for keys/values
- **Embedding Size:** 6,144
- **Tokenization:** SentencePiece tokenizer with 131,072 tokens
- **Additional Features:**
- Rotary embeddings (RoPE)
- Supports activation sharding and 8-bit quantization
- **Maximum Sequence Length (context):** 8,192 tokens
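For reference, the specifications above can be collected into a single config object; the class and field names below are made up for this sketch and are not identifiers from the repository:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Grok1Spec:
    """Grok-1 hyperparameters as listed above (hypothetical field names)."""
    n_params: int = 314_000_000_000  # 314B total parameters
    n_experts: int = 8               # mixture-of-experts width
    experts_per_token: int = 2       # top-2 expert routing
    n_layers: int = 64
    n_query_heads: int = 48          # attention heads for queries
    n_kv_heads: int = 8              # attention heads for keys/values
    d_model: int = 6_144             # embedding size
    vocab_size: int = 131_072        # SentencePiece tokenizer entries
    max_seq_len: int = 8_192         # maximum context length
```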
-# Downloading the weights
+## Downloading the weights
You can download the weights using a torrent client and this magnet link:
-```
+```shell
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```
or directly using [HuggingFace 🤗 Hub](https://huggingface.co/xai-org/grok-1):
```
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install huggingface_hub[hf_transfer]
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
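The same selective download can also be scripted from Python via `huggingface_hub`'s `snapshot_download`; a minimal sketch assuming the CLI flags above map directly onto its parameters:

```python
from huggingface_hub import snapshot_download

# Fetch only the ckpt-0/* files from xai-org/grok-1 into ./checkpoints,
# mirroring the CLI invocation above.
snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns="ckpt-0/*",
    local_dir="checkpoints",
)
```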
-# License
+## License
The code and associated Grok-1 weights in this release are licensed under the
Apache 2.0 license. The license only applies to the source files in this
repository and the model weights of Grok-1.

README_CN.md (new file, 33 lines)

@@ -0,0 +1,33 @@
# grok-1
[English Document](./README.md) **Chinese Document**
This repository contains `JAX` example code for loading and running the `Grok-1` open-weights model.
Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints`.
Then, run
```shell
pip install -r requirements.txt
python run.py
```
to test the code.
The script loads the checkpoint and samples from the model on a test input.
Due to the large size of the model (314B parameters), a machine with enough GPU memory is required to test the model with the example code.
The implementation of the `MoE` layer in this repository is not efficient. The implementation was chosen to avoid the need for custom kernels to validate the correctness of the model.
## Downloading the weights
You can download the weights using a `torrent` client and this magnet link:
```shell
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```
## License
The code and associated `Grok-1` weights in this release are licensed under the `Apache 2.0` license.
The license only applies to the source files in this repository and the model weights of `Grok-1`.