ZJU-KnowLM

Built upon LLaMA-13B, this version incorporates weights from a secondary full-scale pretraining pass on bilingual Chinese and English data. This additional pretraining improves the model's comprehension of Chinese. For further details, please refer to this link.
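Below is a minimal usage sketch with the Hugging Face Transformers library. The repository id `zjunlp/knowlm-13b-base-v1.0`, the dtype, and the prompt are assumptions for illustration; substitute the actual repository id of this checkpoint.

```python
# Minimal sketch: load the model and run a short Chinese generation to
# exercise the bilingual pretraining. The repo id below is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zjunlp/knowlm-13b-base-v1.0"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a 13B model on GPU
    device_map="auto",
)

prompt = "请用中文简要介绍一下大语言模型。"  # "Briefly introduce large language models in Chinese."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```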
