---
license: mit
---

# EMusicGen

Model weights for generating ABC-notation melodies conditioned on emotion.

## Demo

## Maintenance

```bash
git clone git@hf.co:monetjoe/EMusicGen
cd EMusicGen
```

## Fine-tuning results

| Dataset | Loss curve                                                                               | Min eval loss         |
| :-----: | :--------------------------------------------------------------------------------------: | :-------------------: |
| VGMIDI  | ![](https://www.modelscope.cn/models/monetjoe/EMusicGen/resolve/master/vgmidi/loss.jpg)  | `0.23854530873296725` |
| EMOPIA  | ![](https://www.modelscope.cn/models/monetjoe/EMusicGen/resolve/master/emopia/loss.jpg)  | `0.26802811984950936` |
| Rough4Q | ![](https://www.modelscope.cn/models/monetjoe/EMusicGen/resolve/master/rough4q/loss.jpg) | `0.2299637847539768`  |

## Usage

```python
from modelscope import snapshot_download

model_dir = snapshot_download("monetjoe/EMusicGen")
```

## Mirror

## Cite

```bibtex
@article{Zhou2024EMusicGen,
  title     = {EMusicGen: Emotion-Conditioned Melody Generation in ABC Notation},
  author    = {Monan Zhou and Xiaobing Li and Feng Yu and Wei Li},
  month     = {Sep},
  year      = {2024},
  publisher = {GitHub},
  version   = {0.1},
  url       = {https://github.com/monetjoe/EMusicGen}
}
```

## Reference

[1] [Wu, S., & Sun, M. (2023). TunesFormer: Forming Tunes with Control Codes. ArXiv, abs/2301.02884.](https://arxiv.org/pdf/2301.02884)
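The model emits melodies as ABC notation, a plain-text format whose header fields (`K:` key, `M:` meter, etc.) precede the note body. As a minimal sketch of how downstream code might separate the two — the sample tune and `parse_abc` helper are illustrative, not part of this repository's API:

```python
# Illustrative ABC tune; NOT actual model output.
abc = """X:1
T:Example
M:4/4
L:1/8
K:C
C2 E2 G2 E2 | c4 z4 |"""

def parse_abc(tune: str):
    """Split an ABC tune into header fields (single-letter keys) and the melody body."""
    header, body = {}, []
    for line in tune.splitlines():
        # Header lines look like "K:C" — a letter, a colon, then the value.
        if len(line) > 1 and line[1] == ":" and line[0].isalpha():
            header[line[0]] = line[2:].strip()
        else:
            body.append(line)
    return header, "\n".join(body)

header, melody = parse_abc(abc)
print(header["K"], header["M"])  # key and meter fields
```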