cuierfei committed · verified
Commit fee5dff · 1 Parent(s): 5bec332

Upload folder using huggingface_hub

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -499,7 +499,7 @@ print(sess.response.text)
 LMDeploy's `api_server` enables models to be easily packed into services with a single command. The provided RESTful APIs are compatible with OpenAI's interfaces. Below is an example of service startup:
 
 ```shell
-lmdeploy serve api_server OpenGVLab/Mini-InternVL-Chat-2B-V1-5 --model-name Mini-InternVL-Chat-2B-V1-5 --backend turbomind --server-port 23333
+lmdeploy serve api_server OpenGVLab/Mini-InternVL-Chat-2B-V1-5 --backend turbomind --server-port 23333
 ```
 
 To use the OpenAI-style interface, you need to install OpenAI:
@@ -516,7 +516,7 @@ from openai import OpenAI
 client = OpenAI(api_key='YOUR_API_KEY', base_url='http://0.0.0.0:23333/v1')
 model_name = client.models.list().data[0].id
 response = client.chat.completions.create(
-    model="Mini-InternVL-Chat-2B-V1-5",
+    model=model_name,
     messages=[{
         'role':
             'user',
@@ -546,7 +546,7 @@ TODO
 
 ## License
 
-This project is released under the MIT license, while InternLM is licensed under the Apache-2.0 license.
+This project is released under the MIT license, while InternLM2 is licensed under the Apache-2.0 license.
 
 ## Citation
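The second hunk ends mid-call at the hunk boundary, so the rest of the request is not visible here. For readers trying out the change, below is a minimal sketch of the full call, assuming the server started by the command in the first hunk is reachable at `http://0.0.0.0:23333` and accepts OpenAI-style `image_url` message content for this vision-language model; the prompt and image URL are placeholder values, not part of the commit.

```python
from openai import OpenAI

# Point the OpenAI client at the LMDeploy api_server launched in the first hunk.
# 'YOUR_API_KEY' is the placeholder from the diff; a real key is only needed if
# the server was started with key checking enabled (assumption).
client = OpenAI(api_key='YOUR_API_KEY', base_url='http://0.0.0.0:23333/v1')

# Discover the served model id instead of hard-coding it; this is the point of
# the `model=model_name` change in this commit.
model_name = client.models.list().data[0].id

response = client.chat.completions.create(
    model=model_name,
    messages=[{
        'role': 'user',
        'content': [
            # Placeholder prompt and image URL for illustration only.
            {'type': 'text', 'text': 'Describe this image.'},
            {'type': 'image_url',
             'image_url': {'url': 'https://example.com/tiger.jpeg'}},
        ],
    }],
    temperature=0.8,
    top_p=0.8)
print(response.choices[0].message.content)
```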
 
 
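The paragraph in the first hunk states that the RESTful APIs are compatible with OpenAI's interfaces. As a rough sketch of what the client above sends over the wire, the same request can also be made with plain HTTP; the `/v1/models` and `/v1/chat/completions` routes below follow the OpenAI convention and are an assumption insofar as this diff does not show the server's route table.

```python
import requests

BASE_URL = 'http://0.0.0.0:23333/v1'  # host and port from the serve command above

# The same lookup that client.models.list() performs in the snippet above.
model_id = requests.get(f'{BASE_URL}/models', timeout=10).json()['data'][0]['id']

# A plain-text chat completion in the OpenAI request format.
payload = {
    'model': model_id,
    'messages': [{'role': 'user', 'content': 'Hello, who are you?'}],
    'temperature': 0.8,
}
resp = requests.post(f'{BASE_URL}/chat/completions', json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()['choices'][0]['message']['content'])
```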