Update README.md
print(f'User: {question}\nAssistant: {response}')
```

### Inference with LMDeploy

Please install the **latest version** of [LMDeploy](https://github.com/InternLM/lmdeploy) for Mono-InternVL support.

```bash
git clone https://github.com/InternLM/lmdeploy.git
cd lmdeploy
pip install -e .
```

Then run the following code for inference.

```python
from lmdeploy import pipeline
from lmdeploy.vl import load_image

image = load_image('./examples/image1.jpg')
pipe = pipeline('OpenGVLab/Mono-InternVL-2B')
response = pipe(('describe this image', image))
print(response.text)
```

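The pipeline can also process several prompts in one call and accepts a `GenerationConfig` to control sampling. A minimal sketch, assuming the same Mono-InternVL-2B checkpoint and additional example images under `./examples/` (the file names and sampling values below are illustrative, not prescribed by this project):

```python
from lmdeploy import GenerationConfig, pipeline
from lmdeploy.vl import load_image

# Image paths are assumptions; point these at your own files.
images = [load_image('./examples/image1.jpg'),
          load_image('./examples/image2.jpg')]

pipe = pipeline('OpenGVLab/Mono-InternVL-2B')

# Illustrative sampling settings; tune for your use case.
gen_config = GenerationConfig(max_new_tokens=256, top_p=0.8, temperature=0.7)

# A list of (prompt, image) tuples is processed as one batch.
responses = pipe([('describe this image', img) for img in images],
                 gen_config=gen_config)
for r in responses:
    print(r.text)
```

Batching lets LMDeploy schedule the requests together, which is usually faster than looping over single calls.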
## License

This project is released under the MIT license, while InternLM2 is licensed under the Apache-2.0 license.