---
license: apache-2.0
pipeline_tag: text-generation
tags:
  - grok-1
---

# Grok-1

This repository contains the weights of the Grok-1 open-weights model. You can find the inference code in the [GitHub repository](https://github.com/xai-org/grok-1).

## Download instructions

Clone the repo and download the int8 checkpoint into the `checkpoints` directory by executing these commands in the repo root directory:

```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install "huggingface_hub[hf_transfer]"
huggingface-cli download xai-org/grok-1 --repo-type model --include "ckpt-0/*" --local-dir checkpoints --local-dir-use-symlinks False
```
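The `--include "ckpt-0/*"` filter restricts the download to the checkpoint files. As a small illustration of how such a glob-style pattern selects paths (the file names below are hypothetical, not an actual repo listing):

```python
import fnmatch

# Hypothetical repo listing; only names under ckpt-0/ should match the filter.
files = [
    "ckpt-0/tensor00000_000",
    "ckpt-0/tensor00001_000",
    "README.md",
    "tokenizer.model",
]
selected = [f for f in files if fnmatch.fnmatch(f, "ckpt-0/*")]
print(selected)  # only the two checkpoint shards match
```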

Then, you can run:

```shell
pip install -r requirements.txt
python run.py
```

You should see output from the language model.

Due to the model's size (314B parameters), a multi-GPU machine is required to run it with the example code.
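A back-of-the-envelope estimate shows why a single GPU is not enough (a hypothetical helper, not part of this repo; using 1 GB = 1e9 bytes):

```python
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to hold the weights, in GB."""
    return n_params * bytes_per_param / 1e9

# 314B parameters stored as int8 (1 byte each) -> ~314 GB for the weights
# alone, before activations or KV cache, hence the multi-GPU requirement.
print(weight_memory_gb(314e9, 1))  # 314.0
```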

p.s. we're hiring: https://x.ai/careers