EMusicGen

Model weights for generating emotion-conditioned melodies in ABC notation.
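As context, ABC notation encodes a tune as plain text: a few header fields (index, title, meter, unit note length, key) followed by the notes. A minimal hand-written example, purely illustrative and not an output of this model:

X:1
T:Example tune
M:4/4
L:1/8
K:C
C D E F G A B c | c B A G F E D C |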

Demo

https://www.modelscope.cn/studios/monetjoe/EMusicGen

Maintenance

git clone git@hf.co:monetjoe/EMusicGen
cd EMusicGen
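
If SSH keys are not configured, the same repository can also be cloned over HTTPS (standard Hugging Face remote; Git LFS is needed to fetch the weight files):

git lfs install
git clone https://huggingface.co/monetjoe/EMusicGen
cd EMusicGen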

Fine-tuning results

Dataset   Min eval loss
VGMIDI    0.23854530873296725
EMOPIA    0.26802811984950936
Rough4Q   0.2299637847539768

(Per-dataset loss-curve plots are not reproduced here.)

Usage

from modelscope import snapshot_download

# Download the EMusicGen weights from ModelScope; returns the local snapshot path
model_dir = snapshot_download("monetjoe/EMusicGen")
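
To see what was fetched, a minimal sketch that lists the snapshot contents (the file names depend on the release and are not specified by this card):

import os
from modelscope import snapshot_download

model_dir = snapshot_download("monetjoe/EMusicGen")
# Print every file in the downloaded snapshot directory
for name in sorted(os.listdir(model_dir)):
    print(os.path.join(model_dir, name))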

Mirror

https://www.modelscope.cn/models/monetjoe/EMusicGen

Cite

@article{Zhou2024EMusicGen,
  title     = {EMusicGen: Emotion-Conditioned Melody Generation in ABC Notation},
  author    = {Monan Zhou and Xiaobing Li and Feng Yu and Wei Li},
  month     = {Sep},
  year      = {2024},
  publisher = {GitHub},
  version   = {0.1},
  url       = {https://github.com/monetjoe/EMusicGen}
}

Reference

[1] Wu, S., & Sun, M. (2023). TunesFormer: Forming Tunes with Control Codes. ArXiv, abs/2301.02884.
