Muennighoff committed
Commit 29e57a5
Parent(s): f8d4850
Update README.md

README.md CHANGED
@@ -130,9 +130,6 @@ widget:
 |----|-----------|
 
 
-
-
-
 # Intended uses
 
 You can use the models to perform inference on tasks by specifying your query in natural language, and the models will generate a prediction. For instance, you can ask *"Translate this to Chinese: Je t'aime."*, and the model will hopefully generate *"我爱你"*.
@@ -140,6 +137,8 @@ You can use the models to perform inference on tasks by specifying your query in
 # How to use
 
 Here is how to use the model in PyTorch:
+
+TODO: Better code with auto-precision?
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
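The README snippet in the diff is truncated at the import line, and the added TODO asks about auto-precision loading. A minimal sketch of what the completed snippet might look like, assuming a BLOOMZ-style checkpoint (`bigscience/bloomz-560m` is a placeholder assumption, not taken from the diff) and using `torch_dtype="auto"` so `transformers` loads weights in the precision stored in the checkpoint:

```python
# Sketch only: the checkpoint name is an assumption; substitute the actual model ID.
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "bigscience/bloomz-560m"  # assumed checkpoint for illustration

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# torch_dtype="auto" picks the precision stored in the checkpoint weights,
# which is one way to address the TODO about auto-precision.
model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype="auto")

# Query in natural language, as described in "Intended uses".
inputs = tokenizer("Translate this to Chinese: Je t'aime.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```

`generate` on a decoder-only model returns the prompt tokens followed by the newly generated ones, so the decoded string includes the original query.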