add community resources to README #25
opened by sam-mosaic

README.md CHANGED
````diff
@@ -89,6 +89,13 @@ from transformers import AutoTokenizer
 tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
 ```
 
+## Community-Created Resources
+
+These were not created by MosaicML, but you may find them useful. These links are not an endorsement of the creators or their content.
+
+- [Oobabooga Running MPT-7B-Storywriter](https://youtu.be/QVVb6Md6huA)
+- [NEW MPT-7B-StoryWriter CRUSHES GPT-4!](https://www.youtube.com/watch?v=O9Y_ZdsuKWQ&t=649s) - Has a long section on running locally using Oobabooga
+
 ## Example Epilogue
 
 The full text of _The Great Gatsby_ (67873 tokens) was fed to the model, followed by the text "EPILOGUE"
````