---
license: apache-2.0
library_name: grok
pipeline_tag: text-generation
tags:
- grok-1
---
# Grok-1

This repository contains the weights of the Grok-1 open-weights model. You can find the code in the [GitHub Repository](https://github.com/xai-org/grok-1/tree/main).

# Download instructions
Clone the repo and download the `int8` checkpoint into the `checkpoints` directory by running:

```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
# Quote the extra so shells such as zsh don't expand the brackets as a glob.
pip install "huggingface_hub[hf_transfer]"
# hf_transfer is only used when HF_HUB_ENABLE_HF_TRANSFER=1 is set in the environment.
huggingface-cli download xai-org/grok-1 --repo-type model --include "ckpt-0/*" --local-dir checkpoints --local-dir-use-symlinks False
```
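
If you prefer to script the download, the same files can be fetched through the `huggingface_hub` Python API. A minimal sketch, mirroring the CLI command above (same `ckpt-0/*` filter and `checkpoints` target):

```python
# Download only the int8 checkpoint files (same filter as --include above).
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns="ckpt-0/*",
    local_dir="checkpoints",
)
```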

Then, you can run:

```shell
pip install -r requirements.txt
python run.py
```

You should then see output from the language model.
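
If `run.py` instead fails at startup, a quick thing to check is whether the checkpoint landed where the script expects it. A hedged sketch, assuming the `checkpoints/ckpt-0` layout created by the download step above:

```python
# Sanity check: confirm the checkpoint directory exists and is non-empty
# before paying the startup cost of loading a 314B-parameter model.
from pathlib import Path

ckpt = Path("checkpoints/ckpt-0")
if not (ckpt.is_dir() and any(ckpt.iterdir())):
    raise SystemExit("No checkpoint at checkpoints/ckpt-0; rerun the download step.")
print("Checkpoint found:", ckpt)
```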

Due to the large size of the model (314B parameters), a multi-GPU machine is required to test the model with the example code.
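
To make that requirement concrete, here is a back-of-the-envelope estimate (assuming 1 byte per parameter for the `int8` checkpoint, and ignoring activations and runtime overhead):

```python
# Rough memory needed just to hold the weights (int8 = 1 byte per parameter).
n_params = 314e9        # 314B parameters
bytes_per_param = 1     # int8 checkpoint
gib = n_params * bytes_per_param / 2**30
print(f"~{gib:.0f} GiB for the weights alone")  # ~292 GiB, beyond any single GPU
```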

p.s. we're hiring: https://x.ai/careers