
Lion: Adversarial Distillation of Proprietary Large Language Models (EMNLP 2023)

arXiv: https://arxiv.org/abs/2305.12870
GitHub: https://github.com/YJiangcm/Lion

Note: To comply with the LLaMA model license, we release the Lion weights as delta weights. To obtain the full Lion model, add these deltas to the original LLaMA weights.
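
The sketch below illustrates the recovery step with the `transformers` library: load the base LLaMA checkpoint and the delta checkpoint, sum the parameters, and save the result. The local paths are placeholders, and the official recovery script in the GitHub repository above is the authoritative reference (it also covers details such as tokenizer alignment).

```python
# Minimal sketch of applying the released delta weights to a base LLaMA checkpoint.
# Paths below are placeholders; substitute your own local directories.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_path = "path/to/llama-base"        # original LLaMA weights (obtained separately)
delta_path = "path/to/lion-delta"       # released Lion delta weights
target_path = "path/to/lion-recovered"  # output directory for the merged model

# Load both checkpoints on CPU in fp16 to keep memory use moderate.
base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype=torch.float16)
delta = AutoModelForCausalLM.from_pretrained(delta_path, torch_dtype=torch.float16)

# Recover the full weights by adding the delta to the base, parameter by parameter.
base_sd = base.state_dict()
delta_sd = delta.state_dict()
for name in base_sd:
    base_sd[name] += delta_sd[name]

# Save the recovered model together with the tokenizer shipped with the delta.
base.save_pretrained(target_path)
AutoTokenizer.from_pretrained(delta_path).save_pretrained(target_path)
```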
