---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- llm
- ggml
---
# GGML converted versions of [Mosaic's](https://huggingface.co/mosaicml) MPT Models

## Converted Models:
| Name   | Based on |  Type | Container |
|-|-|-|-|
| [mpt-7b-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-f16.bin) |  [mpt-7b](https://huggingface.co/mosaicml/mpt-7b) | fp16 | GGML |
| [mpt-7b-q4_0.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-q4_0.bin) |  [mpt-7b](https://huggingface.co/mosaicml/mpt-7b) | int4 | GGML |
| [mpt-7b-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-q4_0-ggjt.bin) |  [mpt-7b](https://huggingface.co/mosaicml/mpt-7b) | int4 | GGJT |
| [mpt-7b-chat-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-chat-f16.bin) |  [mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) | fp16 | GGML |
| [mpt-7b-chat-q4_0.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-chat-q4_0.bin) |  [mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) | int4 | GGML |
| [mpt-7b-chat-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-chat-q4_0-ggjt.bin) |  [mpt-7b-chat](https://huggingface.co/mosaicml/mpt-7b-chat) | int4 | GGJT |
| [mpt-7b-instruct-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-instruct-f16.bin) |  [mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) | fp16 | GGML |
| [mpt-7b-instruct-q4_0.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-instruct-q4_0.bin) |  [mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) | int4 | GGML |
| [mpt-7b-instruct-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-instruct-q4_0-ggjt.bin) |  [mpt-7b-instruct](https://huggingface.co/mosaicml/mpt-7b-instruct) | int4 | GGJT |
| [mpt-7b-storywriter-f16.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-storywriter-f16.bin) |  [mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) | fp16 | GGML |
| [mpt-7b-storywriter-q4_0.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-storywriter-q4_0.bin) |  [mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) | int4 | GGML |
| [mpt-7b-storywriter-q4_0-ggjt.bin](https://huggingface.co/LLukas22/mpt-7b-ggml/blob/main/mpt-7b-storywriter-q4_0-ggjt.bin) |  [mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) | int4 | GGJT |

⚠️ Caution ⚠️: mpt-7b-storywriter is still under development!
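The Container column matters when picking a loader: GGML and GGJT files begin with different magic numbers. A minimal sketch for checking which container a downloaded `.bin` file uses, assuming the standard magics `0x67676d6c` ("ggml") and `0x67676a74` ("ggjt"), both stored as a little-endian `uint32` at offset 0:

```python
import struct

# Standard ggml container magics (little-endian uint32 at file offset 0).
MAGICS = {
    0x67676D6C: "GGML",  # "ggml": unversioned container
    0x67676A74: "GGJT",  # "ggjt": versioned, mmap-friendly container
}

def container_type(header: bytes) -> str:
    """Identify the container format from the first four bytes of a model file."""
    (magic,) = struct.unpack("<I", header[:4])
    return MAGICS.get(magic, "unknown")

# Usage:
# with open("mpt-7b-q4_0-ggjt.bin", "rb") as f:
#     print(container_type(f.read(4)))
```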

## Usage 

### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python):

#### Installation
Via pip: `pip install llm-rs huggingface_hub`

#### Run inference
```python
from llm_rs import Mpt
from huggingface_hub import hf_hub_download

# Download the model
hf_hub_download(repo_id="LLukas22/mpt-7b-ggml", filename="mpt-7b-q4_0-ggjt.bin", local_dir=".")

# Load the model
model = Mpt("mpt-7b-q4_0-ggjt.bin")

# Generate
print(model.generate("The meaning of life is"))
```
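To see every converted file in the repo before downloading one, `huggingface_hub` can list the repository contents. A small sketch (the `model_binaries` helper is hypothetical, not part of any library) that keeps only the `.bin` model files from a listing:

```python
def model_binaries(filenames):
    """Keep only the .bin model files from a Hugging Face repo file listing."""
    return sorted(name for name in filenames if name.endswith(".bin"))

# Usage (requires `huggingface_hub` and network access):
# from huggingface_hub import list_repo_files
# print(model_binaries(list_repo_files("LLukas22/mpt-7b-ggml")))
```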

### Rust via [Rustformers/llm](https://github.com/rustformers/llm): 

#### Installation
```
git clone --recurse-submodules git@github.com:rustformers/llm.git
cargo build --release
```

#### Run inference
```shell
cargo run --release -- mpt infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```

### C via [GGML](https://github.com/ggerganov/ggml)
The `GGML` example only supports the GGML container type, so use the plain GGML files from the table above, not the `-ggjt` ones.

#### Installation

```shell
git clone https://github.com/ggerganov/ggml
cd ggml
mkdir build && cd build
cmake ..
make -j4 mpt
```

#### Run inference

```shell
./bin/mpt -m path/to/model.bin -p "The meaning of life is"
```