ChatGLM2 6b int8 quantized model
See K024/chatglm-q for more details.
import torch
from chatglm_q.decoder import ChatGLMDecoder, chat_template

# load the int8-quantized weights onto the GPU
device = torch.device("cuda")
decoder = ChatGLMDecoder.from_pretrained("K024/chatglm2-6b-int8", device=device)

# build a single-turn prompt and stream the generated text
prompt = chat_template([], "我是谁？")  # "Who am I?"
for text in decoder.generate(prompt):
    print(text)
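For multi-turn chat, earlier rounds can be passed as the history argument of chat_template. The sketch below continues from the example above and assumes that history is a list of (question, answer) pairs and that generate yields the progressively decoded reply; check the K024/chatglm-q repository for the exact interface of the installed version.

# assumption: history is a list of (question, answer) pairs
history = []
question = "你好"  # "Hello"
answer = ""
for text in decoder.generate(chat_template(history, question)):
    answer = text  # keep the latest (longest) decoded reply
print(answer)

# feed the finished round back in as history for the next question
history.append((question, answer))
next_prompt = chat_template(history, "我们刚才聊了什么？")  # "What did we just talk about?"
for text in decoder.generate(next_prompt):
    print(text)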
Model weights are released under the same license as ChatGLM2-6b. See MODEL LICENSE.