Update README.md
README.md
CHANGED
@@ -10,9 +10,7 @@ tags:
 
 # chronos-13b-v2
 
-This is the
-
-Only use this version for further quantization or if you would like to run in full precision, as long as you have the VRAM required.
+This is the 4bit GPTQ of **chronos-13b-v2** based on the **LLaMA v2** model. It works with Exllama and AutoGPTQ.
 
 This model is primarily focused on chat, roleplay, storywriting, with good reasoning and logic.
 
@@ -27,7 +25,7 @@ Your instruction or question here.
 Not using the format will make the model perform significantly worse than intended.
 
 ## Other Versions
-[
+[Original FP16 Model](https://huggingface.co/elinas/chronos-13b-v2)
 
 [GGML Versions provided by @TheBloke]()
 
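Since the updated card states the checkpoint is a 4bit GPTQ that works with Exllama and AutoGPTQ, here is a minimal loading sketch using AutoGPTQ. The repo id and the safetensors assumption below are illustrative guesses, not taken from this commit; substitute the actual quantized repo name.

```python
# Minimal sketch: load a 4-bit GPTQ checkpoint of chronos-13b-v2 with AutoGPTQ.
# NOTE: "elinas/chronos-13b-v2-GPTQ" is an assumed repo id for illustration only.
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

model_id = "elinas/chronos-13b-v2-GPTQ"  # assumption: adjust to the real quantized repo

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",       # GPTQ inference runs on a CUDA device
    use_safetensors=True,  # assumption: weights shipped as safetensors
)

prompt = "Your instruction or question here."
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For best results the prompt should still follow the instruction format described earlier in the card, since skipping it makes the model perform significantly worse than intended.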