Compare commits


5 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Szymon Tworkowski | d6d9447e2d | Update huggingface link | 2024-03-18 11:40:01 -07:00 |
| Lve Lvee | 7207216386 | Create .gitignore for checkpoints (#149): ignore the checkpoints files | 2024-03-18 11:01:17 -07:00 |
| Seth Junot | 310e19eee2 | Corrected checkpoint dir name, download section link | 2024-03-18 09:39:02 -07:00 |
| Gareth Paul Jones (GPJ) | 1ff4435d25 | Update README with Model Specifications (#27): added an overview of the model as discussed in response to #14. Adding more info on the model specs before folks proceed to download the checkpoints should help them ensure they have the necessary resources to effectively utilize Grok-1. | 2024-03-18 09:36:24 -07:00 |
| Szymon Tworkowski | b0e77734fe | Make download instruction more clear (#155) | 2024-03-18 09:11:17 -07:00 |
3 changed files with 30 additions and 49 deletions

.gitignore

@@ -1,10 +1,2 @@
# Logs
logs
*.log
*.local
# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
checkpoints/*
!checkpoints/README.md

README.md

@@ -1,10 +1,9 @@
# Grok-1
**English Document** [Chinese Document](./README_CN.md)
This repository contains JAX example code for loading and running the Grok-1 open-weights model.
Make sure to download the checkpoint and place `ckpt-0` directory in `checkpoints`.
Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints` - see [Downloading the weights](#downloading-the-weights)
Then, run
```shell
@@ -19,15 +18,38 @@ The script loads the checkpoint and samples from the model on a test input.
Due to the large size of the model (314B parameters), a machine with enough GPU memory is required to test the model with the example code.
The implementation of the MoE layer in this repository is not efficient. The implementation was chosen to avoid the need for custom kernels to validate the correctness of the model.
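To illustrate why this trade-off needs no custom kernels, here is a minimal, hypothetical JAX sketch of a dense top-2-of-8 MoE forward pass (not the code in this repository; all names and shapes are illustrative): every expert processes every token and a gating mask keeps only the selected outputs, which is simple to verify but wasteful.
```python
import jax
import jax.numpy as jnp

def naive_moe(x, w_gate, expert_w1, expert_w2, k=2):
    """Dense MoE forward pass: easy to check for correctness, but inefficient.

    x: [tokens, d_model], w_gate: [d_model, n_experts],
    expert_w1: [n_experts, d_model, d_ff], expert_w2: [n_experts, d_ff, d_model]
    """
    logits = x @ w_gate                               # [tokens, n_experts]
    top_vals, top_idx = jax.lax.top_k(logits, k)      # choose k experts per token
    weights = jax.nn.softmax(top_vals, axis=-1)       # renormalise over the chosen experts
    gates = jnp.zeros_like(logits).at[
        jnp.arange(x.shape[0])[:, None], top_idx
    ].set(weights)                                    # [tokens, n_experts], mostly zeros
    # Every expert runs on every token; the gate simply discards unused outputs.
    hidden = jax.nn.gelu(jnp.einsum("td,edf->etf", x, expert_w1))
    expert_out = jnp.einsum("etf,efd->etd", hidden, expert_w2)
    return jnp.einsum("te,etd->td", gates, expert_out)
```
A routed implementation would dispatch each token only to its two selected experts, but that is exactly the kind of custom gather/scatter work this repository deliberately avoids.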
## Downloading the weights
# Model Specifications
Grok-1 is currently designed with the following specifications:
- **Parameters:** 314B
- **Architecture:** Mixture of 8 Experts (MoE)
- **Experts Utilization:** 2 experts used per token
- **Layers:** 64
- **Attention Heads:** 48 for queries, 8 for keys/values
- **Embedding Size:** 6,144
- **Tokenization:** SentencePiece tokenizer with 131,072 tokens
- **Additional Features:**
- Rotary embeddings (RoPE)
- Supports activation sharding and 8-bit quantization
- **Maximum Sequence Length (context):** 8,192 tokens
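A rough, back-of-the-envelope reading of these specifications (an estimate, not an official requirement) shows why "enough GPU memory" is substantial: the weights alone take about two bytes per parameter in bf16, or about one byte with 8-bit quantization.
```python
# Rough weight-memory estimate from the parameter count alone; it ignores
# activations, the KV cache, and framework overhead.
PARAMS = 314e9  # 314B parameters

for label, bytes_per_param in [("bf16", 2), ("8-bit quantized", 1)]:
    gib = PARAMS * bytes_per_param / 2**30
    print(f"{label}: ~{gib:.0f} GiB for the weights")  # ~585 GiB and ~292 GiB
```
Either way, the checkpoint cannot fit on a single commodity GPU, consistent with the note above that a machine with enough GPU memory is required.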
# Downloading the weights
You can download the weights using a torrent client and this magnet link:
```shell
```
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```
## License
or directly using [HuggingFace 🤗 Hub](https://huggingface.co/xai-org/grok-1):
```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install huggingface_hub[hf_transfer]
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
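The same download can also be scripted through the `huggingface_hub` Python API rather than the CLI; a minimal sketch, assuming the package installed by the `pip install` line above:
```python
from huggingface_hub import snapshot_download

# Fetch only the ckpt-0 directory into ./checkpoints, mirroring the
# huggingface-cli invocation above.
snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns="ckpt-0/*",
    local_dir="checkpoints",
    local_dir_use_symlinks=False,
)
```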
# License
The code and associated Grok-1 weights in this release are licensed under the
Apache 2.0 license. The license only applies to the source files in this

README_CN.md

@@ -1,33 +0,0 @@
# grok-1
[English Document](./README.md) **Chinese Document**
This repository contains `JAX` example code for loading and running the `Grok-1` open-weights model.
Make sure to download the checkpoint and place the `ckpt-0` directory in `checkpoints`.
Then, run
```shell
pip install -r requirements.txt
python run.py
```
to test the code.
The script loads the checkpoint and samples from the model on a test input.
Due to the large size of the model (314B parameters), a machine with enough GPU memory is required to test the model with the example code.
The implementation of the `MoE layer` in this repository is not efficient. The implementation was chosen to avoid the need for custom kernels to validate the correctness of the model.
## Downloading the weights
You can download the weights using a `torrent` client and this magnet link:
```shell
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```
## License
The code and associated `Grok-1` weights in this release are licensed under the `Apache 2.0` license.
The license only applies to the source files in this repository and the model weights of `Grok-1`.