wassname committed
Commit d1562a2
1 Parent(s): c5e9f24

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -50,11 +50,11 @@ We want the good ending, not the bad one.
 
 ```sh
 perplexity -m lmstudio-community/Meta-Llama-3-8B-Instruct-Q6_K.gguf -b 32 -c 512 -f wiki.test.raw
-# Final estimate: Final estimate: PPL = 7.5588 +/- 0.05599
+# Final estimate: PPL = 7.5588 +/- 0.05599
 perplexity -m wassname/meta-llama-3-8b-instruct-extra_helpfull_Q6_K.gguf -b 32 -c 512 -f wiki.test.raw
 # Final estimate: PPL = 9.0920 +/- 0.06815
 perplexity -m cognitivecomputations/dolphin-2.9-llama3-8b-q5_K_M.gguf -b 32 -c 512 -f wiki.test.raw
-# Final estimate: PPL = ?
+# Final estimate: PPL = 9.9277 +/- 0.08261
 ```
 
 So yes this model edit does increase the perplexity :(. Perhaps if we didn't edit so many layers it would be better.
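For a sense of scale, a quick back-of-the-envelope calculation (using only the PPL figures quoted in the README) gives each model's relative perplexity increase over the base Instruct model. Note the dolphin figure uses a different quantization (q5_K_M vs Q6_K), so it is not a perfectly controlled comparison:

```python
# Perplexity figures quoted in the README above (wiki.test.raw).
base = 7.5588     # Meta-Llama-3-8B-Instruct-Q6_K
edited = 9.0920   # wassname extra_helpfull edit, Q6_K
dolphin = 9.9277  # dolphin-2.9-llama3-8b, q5_K_M

def rel_increase_pct(ppl: float, baseline: float = base) -> float:
    """Percent increase in perplexity relative to the baseline model."""
    return (ppl / baseline - 1.0) * 100.0

print(f"edited model: +{rel_increase_pct(edited):.1f}% PPL")   # -> +20.3% PPL
print(f"dolphin:      +{rel_increase_pct(dolphin):.1f}% PPL")  # -> +31.3% PPL
```

So the edit costs roughly a fifth more perplexity than the base model, while the (differently quantized) dolphin fine-tune sits higher still.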