Dampfinchen committed
Commit 9135b39
1 Parent(s): 89b01e3

Replace old generation_config.json with latest.


The old generation_config.json does not specify <|eot_id|> (token id 128009) as a stop token. As a result, the model cannot stop generating when run with transformers, which leads to low scores on the leaderboard v2.0 evaluation. (More info here: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard/discussions/815 )
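For context, this is a minimal sketch of the workaround the fix makes unnecessary: passing <|eot_id|> explicitly to generate() instead of relying on generation_config.json. The repo id below is a placeholder assumption; any Llama-3-style tokenizer that maps <|eot_id|> to 128009 behaves the same.

# Minimal sketch, assuming a Llama-3-style checkpoint; the repo id is a placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # assumption for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

input_ids = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Say hello."}],
    add_generation_prompt=True,
    return_tensors="pt",
)

# With only 128001 (<|end_of_text|>) as the stop token, generation runs past the
# assistant turn; also listing 128009 (<|eot_id|>) lets it stop where intended.
output = model.generate(
    input_ids,
    max_new_tokens=64,
    eos_token_id=[128001, tokenizer.convert_tokens_to_ids("<|eot_id|>")],
)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))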

Files changed (1)
  1. generation_config.json +6 -3
generation_config.json CHANGED
@@ -1,6 +1,9 @@
 {
-  "_from_model_config": true,
   "bos_token_id": 128000,
-  "eos_token_id": 128001,
-  "transformers_version": "4.40.2"
+  "eos_token_id": [128001, 128009],
+  "do_sample": true,
+  "temperature": 0.6,
+  "max_length": 4096,
+  "top_p": 0.9,
+  "transformers_version": "4.40.0.dev0"
 }
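A quick way to confirm that transformers picks up the updated file is to load the generation config directly. The repo id below is a placeholder, not this model's actual path.

# Minimal sketch, assuming a placeholder repo id, to inspect the loaded config.
from transformers import GenerationConfig

repo_id = "Dampfinchen/<model-name>"  # placeholder; substitute the actual repo
gen_config = GenerationConfig.from_pretrained(repo_id)
print(gen_config.eos_token_id)  # expected after this commit: [128001, 128009]
print(gen_config.do_sample, gen_config.temperature, gen_config.top_p)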