Update README.md

README.md
---
language: en
license: openrail
pipeline_tag: text-generation
---

# GPT-Neo 1.3B - Muslim Traveler

## Model Description

GPT-Neo 1.3B - Muslim Traveler is a fine-tuned version of EleutherAI's GPT-Neo 1.3B model.

## Training data

The training data consists of travel texts written by historical Muslim travelers.

### How to use

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it is run:

```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='arputtick/GPT_Neo_muslim_travel')
>>> generator("> You wake up.", do_sample=True, min_length=50)

[{'generated_text': '> You wake up"\nYou get out of bed, don your armor and get out of the door in search for new adventures.'}]
```
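You can also load the tokenizer and model yourself. The snippet below is a minimal sketch using the standard `transformers` auto classes; it assumes the `arputtick/GPT_Neo_muslim_travel` checkpoint ships the usual GPT-Neo tokenizer and weights, and the prompt and `max_new_tokens` value are only illustrative choices:

```py
>>> from transformers import AutoTokenizer, AutoModelForCausalLM

>>> tokenizer = AutoTokenizer.from_pretrained('arputtick/GPT_Neo_muslim_travel')
>>> model = AutoModelForCausalLM.from_pretrained('arputtick/GPT_Neo_muslim_travel')

>>> # Encode a prompt, sample a continuation, and decode it back to text.
>>> inputs = tokenizer("> You wake up.", return_tensors='pt')
>>> output_ids = model.generate(**inputs, do_sample=True, max_new_tokens=50)
>>> print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```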
### Limitations and Biases

GPT-Neo was trained as an autoregressive language model. This means that its core functionality is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work.
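To make the next-token behaviour concrete, here is a small sketch (assuming PyTorch and the same checkpoint as above; the prompt is just an example) that inspects the single most likely next token for a prompt:

```py
>>> import torch
>>> from transformers import AutoTokenizer, AutoModelForCausalLM

>>> tokenizer = AutoTokenizer.from_pretrained('arputtick/GPT_Neo_muslim_travel')
>>> model = AutoModelForCausalLM.from_pretrained('arputtick/GPT_Neo_muslim_travel')

>>> # The logits at the last position score every candidate next token.
>>> inputs = tokenizer("The traveler set out from", return_tensors='pt')
>>> with torch.no_grad():
...     logits = model(**inputs).logits
>>> next_token_id = int(logits[0, -1].argmax())
>>> tokenizer.decode([next_token_id])
```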
GPT-Neo was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending on your use case, GPT-Neo may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.
As with all language models, it is hard to predict in advance how GPT-Neo will respond to particular prompts, and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
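If you add an automated pre-filter ahead of human review, one crude starting point is a keyword blocklist over sampled outputs. This is only an illustrative sketch: the blocklist terms are placeholders, not part of this model card, and keyword matching will miss a great deal of problematic text:

```py
>>> from transformers import pipeline

>>> generator = pipeline('text-generation', model='arputtick/GPT_Neo_muslim_travel')

>>> # Placeholder blocklist; replace with terms relevant to your own deployment.
>>> BLOCKLIST = {'term_one', 'term_two'}

>>> outputs = generator("> You wake up.", do_sample=True, min_length=50, num_return_sequences=5)
>>> kept = [out for out in outputs
...         if not any(term in out['generated_text'].lower() for term in BLOCKLIST)]
```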
### BibTeX entry and citation info

The model was made using the following software:

```bibtex
@software{gpt-neo,
  author       = {Black, Sid and
                  Leo, Gao and
                  Wang, Phil and
                  Leahy, Connor and
                  Biderman, Stella},
  title        = {{GPT-Neo: Large Scale Autoregressive Language
                   Modeling with Mesh-Tensorflow}},
  month        = mar,
  year         = 2021,
  note         = {{If you use this software, please cite it using
                   these metadata.}},
  publisher    = {Zenodo},
  version      = {1.0},
  doi          = {10.5281/zenodo.5297715},
  url          = {https://doi.org/10.5281/zenodo.5297715}
}
```