Update README.md

README.md CHANGED

@@ -57,7 +57,7 @@ perplexity -m cognitivecomputations/dolphin-2.9-llama3-8b-q5_K_M.gguf -b 32 -c 5
 # Final estimate: PPL = 9.9277 +/- 0.08261
 ```
 
-So yes this model edit does increase the perplexity :(. Perhaps if we didn't edit so many layers it would be better.
+So yes, this model edit does increase the perplexity :(. Perhaps if we didn't edit so many layers it would be better. It still seems better than fine-tuning (in the case of early dolphin versions).
 
 ---
 license: llama3
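For context on the figure quoted in the diff: perplexity is the exponential of the mean per-token negative log-likelihood. Below is a minimal sketch of that relationship, not llama.cpp's actual implementation; the `nlls` values are hypothetical and chosen so the result lands near the reported `PPL = 9.9277`.

```python
import math

# Perplexity = exp(mean negative log-likelihood), using natural logs,
# averaged over the evaluated tokens.
def perplexity(token_nlls):
    return math.exp(sum(token_nlls) / len(token_nlls))

# Hypothetical per-token NLLs: a PPL near 9.93 corresponds to a mean
# NLL of about ln(9.9277) ~= 2.295.
nlls = [2.30, 2.29, 2.30, 2.29]
print(perplexity(nlls))
```

This is why a small rise in average NLL (e.g. from an aggressive layer edit) shows up as a multiplicative increase in PPL.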