John Thickstun committed
Commit 2e04a3b
1 Parent(s): 32b7e33
readme
README.md
CHANGED
@@ -1,3 +1,13 @@
---
license: apache-2.0
---
+
+This is a Medium (360M parameter) Transformer trained for 100k steps on arrival-time encoded music from the [Lakh MIDI dataset](https://colinraffel.com/projects/lmd/). This model was trained with anticipation.
+
+# References for the Anticipatory Music Transformer
+
+The full model card is available [here](https://johnthickstun.com/assets/pdf/music-modelcard.pdf).
+
+Code for using this model is available on [GitHub](https://github.com/jthickstun/anticipation/).
+
+See the accompanying [blog post](https://crfm.stanford.edu/2023/06/14/anticipatory-music-transformer.html?idx=1#demo-example) for additional discussion of this model.
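
A minimal usage sketch, assuming the checkpoint exposes the standard `transformers` causal-LM interface; the repository id below is a placeholder for this model's Hub path, and tokenization of arrival-time encoded events plus generation utilities come from the linked `anticipation` package.

```python
# Minimal sketch: load the checkpoint via the standard causal-LM interface.
# "stanford-crfm/music-medium-100k" is a placeholder repo id; substitute the
# actual Hub path of this model repository.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("stanford-crfm/music-medium-100k")
model.eval()

# Encoding MIDI into arrival-time tokens and sampling with anticipation are
# handled by the anticipation package:
# https://github.com/jthickstun/anticipation/
```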