
SongComposer

SongComposer is a large language model (LLM) based on InternLM2 for lyric and melody composition in song generation.

We release the SongComposer series in two versions:

  • SongComposer_pretrain: the pretrained SongComposer, initialized from InternLM2, which acquires basic knowledge of lyrics and melody.
  • SongComposer_sft: the finetuned SongComposer for instruction-following song generation, including lyric-to-melody, melody-to-lyric, song continuation, and text-to-song.

Import from Transformers

To load the SongComposer_pretrain model using Transformers, use the following code:

from transformers import AutoTokenizer, AutoModel

ckpt_path = "Mar2Ding/songcomposer_pretrain"
# Load the tokenizer and model; trust_remote_code is required for SongComposer's custom model code.
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# Move the model to the GPU and run it in half precision.
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()
# The prompt encodes each syllable together with its pitch, duration, and rest tokens, with notes separated by "|".
prompt = '<bop> Total 7 lines. The first line:可,<D4>,<137>,<79>|惜,<D#4>,<137>,<79>|这,<F4>,<137>,<88>|是,<F4>,<121>,<79>|属,<F4>,<121>,<79>|于,<D#4>,<214>,<88>|你,<D#4>,<141>,<79>|的,<D4>,<130>,<79>|风,<C4>,<151>,<79>|景,<A#3> <F3>,<181><137>,<79>\n'
model.inference_pretrain(prompt, tokenizer)
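
The finetuned checkpoint can be loaded in the same way. The snippet below is a minimal sketch: the repository name Mar2Ding/songcomposer_sft, the instruction-style prompt, and the inference method name are assumptions made by analogy with the pretrained example above, not confirmed API.

from transformers import AutoTokenizer, AutoModel

ckpt_path = "Mar2Ding/songcomposer_sft"  # assumed repository name for the SFT weights
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()
# A natural-language instruction, e.g. asking the model to compose a melody for given lyrics.
prompt = 'Compose a melody for the following lyrics: ...'
# The method name below is an assumption, chosen by analogy with inference_pretrain above.
model.inference(prompt, tokenizer)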


Open Source License

The code is licensed under Apache-2.0, while the model weights are fully open for academic research and also permit free commercial use.
