Mirror of https://github.com/xai-org/grok-1.git (synced 2024-11-27 05:59:52 +03:00)
Compare commits: 929a1d2ce8...e5ea38bde9

Commits in this comparison (2):
- e5ea38bde9
- 4e2e30bd6f
This comparison moves the **Maximum Sequence Length (context)** entry earlier in the specifications list, from after the additional-features block to just after **Experts Utilization:**

@@ -25,6 +25,7 @@ Grok-1 is currently designed with the following specifications:
 - **Parameters:** 314B
 - **Architecture:** Mixture of 8 Experts (MoE)
 - **Experts Utilization:** 2 experts used per token
+- **Maximum Sequence Length (context):** 8,192 tokens
 - **Layers:** 64
 - **Attention Heads:** 48 for queries, 8 for keys/values
 - **Embedding Size:** 6,144
@@ -32,7 +33,6 @@ Grok-1 is currently designed with the following specifications:
 - **Additional Features:**
   - Rotary embeddings (RoPE)
   - Supports activation sharding and 8-bit quantization
-- **Maximum Sequence Length (context):** 8,192 tokens
 
 # Downloading the weights
 
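For quick reference, the specification values touched by this README diff can be gathered into a small configuration sketch. The `GrokSpec` dataclass below is purely illustrative (its name and field names are assumptions, not classes from this repository); it only records the figures quoted in the hunks above.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class GrokSpec:
    """Illustrative summary of the Grok-1 figures listed in the diff above.

    Not a class from the grok-1 repository; names and structure are assumptions.
    """
    n_params: int = 314_000_000_000   # Parameters: 314B
    n_experts: int = 8                # Architecture: Mixture of 8 Experts (MoE)
    experts_per_token: int = 2        # Experts Utilization: 2 experts used per token
    max_seq_len: int = 8_192          # Maximum Sequence Length (context): 8,192 tokens
    n_layers: int = 64                # Layers: 64
    n_query_heads: int = 48           # Attention Heads: 48 for queries
    n_kv_heads: int = 8               # Attention Heads: 8 for keys/values
    embedding_size: int = 6_144       # Embedding Size: 6,144
    rotary_embeddings: bool = True    # Rotary embeddings (RoPE)
    activation_sharding: bool = True  # Supports activation sharding
    int8_quantization: bool = True    # Supports 8-bit quantization


if __name__ == "__main__":
    spec = GrokSpec()
    # Arithmetic note: 6,144 / 48 = 128, i.e. the listed embedding size is
    # consistent with 128-dimensional query heads (an inference, not a value
    # stated in the diff itself).
    assert spec.embedding_size == spec.n_query_heads * 128
    print(spec)
```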
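The "Mixture of 8 Experts (MoE)" and "2 experts used per token" lines describe top-2 expert routing. Below is a minimal, generic sketch of that idea in JAX; it is not Grok-1's implementation, and every name in it (`top2_moe`, the single-matrix toy "experts", the gating details) is an assumption made for illustration.

```python
import jax
import jax.numpy as jnp


def top2_moe(x, gate_w, expert_ws, k=2):
    """Toy top-k MoE layer: each token is routed to k of the available experts.

    x:         [tokens, d_model] token activations
    gate_w:    [d_model, n_experts] router weights
    expert_ws: [n_experts, d_model, d_model] one linear map per (toy) expert
    """
    logits = x @ gate_w                           # router scores, [tokens, n_experts]
    scores, idx = jax.lax.top_k(logits, k)        # keep only the k best experts per token
    gates = jax.nn.softmax(scores, axis=-1)       # renormalize over the chosen experts
    chosen = expert_ws[idx]                       # [tokens, k, d_model, d_model]
    outs = jnp.einsum("td,tkde->tke", x, chosen)  # run each token through its k experts
    return jnp.einsum("tk,tke->te", gates, outs)  # gate-weighted mix of expert outputs


if __name__ == "__main__":
    key = jax.random.PRNGKey(0)
    d_model, n_experts, tokens = 16, 8, 4
    kx, kg, ke = jax.random.split(key, 3)
    x = jax.random.normal(kx, (tokens, d_model))
    gate_w = jax.random.normal(kg, (d_model, n_experts))
    expert_ws = jax.random.normal(ke, (n_experts, d_model, d_model))
    y = top2_moe(x, gate_w, expert_ws)            # -> shape (4, 16)
    print(y.shape)
```

A production MoE layer would additionally handle expert capacity limits, load-balancing losses, and expert parallelism; this sketch only shows the per-token top-2 selection and mixing.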