feat: add eos_token_id to generation_config.json (needed by vLLM inference) — 1c19c18 (verified) — wxsm committed on Aug 16
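This commit reflects a common vLLM requirement: vLLM reads `eos_token_id` from `generation_config.json` to know when to stop decoding, and without it generation can run to the token limit. A minimal sketch of such a file follows; the token id shown is a placeholder, since the real value depends on the model's tokenizer, and any other fields the repo's actual file carries are omitted here:

```json
{
  "eos_token_id": 2
}
```

The field may also be a list of ids when the chat template uses an extra end-of-turn token in addition to the tokenizer's EOS token.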
Fix "InternLM2ForCausalLM does not support Flash Attention 2.0 yet" (#3) — 743a544 (verified) — czczup, kosung committed on Jul 7