fix: resolve RuntimeError from in-place operation when using transformers for training (#18) · 82c7bc9 · verified · czczup, kingsley01 · committed on Sep 25
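A minimal sketch (an assumption, not the repository's actual code) of the class of error this commit addresses: PyTorch autograd raises a RuntimeError when a tensor saved for the backward pass is modified in place, and the usual fix is to rewrite the offending operation out of place.

```python
import torch

# Illustrative reproduction (hypothetical, not the model's code):
# sigmoid saves its output for the backward pass, so mutating that
# output in place invalidates the autograd graph.
x = torch.ones(3, requires_grad=True)
y = x.sigmoid()

# In-place version that fails at backward time:
#   y += 1
#   y.sum().backward()  # RuntimeError: ... modified by an inplace operation

# Out-of-place rewrite keeps the saved tensor intact:
z = y + 1
z.sum().backward()
print(x.grad)  # gradient flows through sigmoid without error
```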
feat: add eos_token_id to generation_config.json (needed by vLLM inference) (#12) · 989a689 · verified · czczup, wxsm · committed on Aug 22
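vLLM reads `eos_token_id` from `generation_config.json` to know when to stop decoding, so the field must be present. A fragment carrying it might look like the sketch below; the token id `2` is a hypothetical placeholder, as the real value comes from the model's tokenizer.

```json
{
  "eos_token_id": 2
}
```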
fix: "InternLM2ForCausalLM does not support Flash Attention 2.0 yet" error (#3) · 743a544 · verified · czczup, kosung · committed on Jul 7