Update README.md
README.md (CHANGED)
````diff
@@ -91,7 +91,7 @@ You can view other LaMini-LM model series as follow. Note that not all models ar
 ### Intended use
 We recommend using the model to response to human instructions written in natural language.

-We now show you how to load and use our model using HuggingFace `
+We now show you how to load and use our model using HuggingFace `pipeline()`.

 ```python
 # pip install -q transformers
@@ -99,12 +99,12 @@ from transformers import pipeline

 checkpoint = "{model_name}"

-model = pipeline('text2text-generation', model
+model = pipeline('text2text-generation', model = checkpoint)

 input_prompt = 'Please let me know your thoughts on the given place and why you think it deserves to be visited: \n"Barcelona, Spain"'
-generated_text =
+generated_text = model(input_prompt, max_length=512, do_sample=True)[0]['generated_text']

-print("Response"
+print("Response", generated_text)
 ```

 ## Training Procedure
````
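For reference, the completed usage snippet from the updated README can be run as a self-contained script. This is a minimal sketch: `{model_name}` in the model card is a template placeholder, so the checkpoint `MBZUAI/LaMini-Flan-T5-248M` below is an assumed example from the LaMini-LM series, not something specified by this commit.

```python
# pip install -q transformers
from transformers import pipeline

# Assumed example checkpoint; the model card template fills "{model_name}" per model.
checkpoint = "MBZUAI/LaMini-Flan-T5-248M"

# The README loads this model with the text2text-generation task.
model = pipeline('text2text-generation', model=checkpoint)

input_prompt = 'Please let me know your thoughts on the given place and why you think it deserves to be visited: \n"Barcelona, Spain"'

# Sample a response capped at 512 tokens, as in the fixed README lines.
generated_text = model(input_prompt, max_length=512, do_sample=True)[0]['generated_text']

print("Response", generated_text)
```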