This model was fine-tuned with the LoRA technique on Bulgarian recipe data from: https://www.kaggle.com/datasets/auhide/bulgarian-recipes-dataset/
It is a 4-bit quantized version of the 16-bit LLaMA 2 7B model.
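A minimal loading sketch, assuming the LoRA adapter was saved in the PEFT format on top of the 4-bit-quantized LLaMA 2 7B base. The `adapter_id` value is a placeholder (this card does not state the repository ID), and loading requires access to the gated `meta-llama/Llama-2-7b-hf` weights:

```python
# Hypothetical usage sketch -- adapter_id is a placeholder, not taken from this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-hf"        # 16-bit LLaMA 2 7B base model (gated)
adapter_id = "your-username/your-adapter"   # placeholder: this repository's ID

# 4-bit quantization config (bitsandbytes), matching the 4-bit encoding described above
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_config, device_map="auto"
)
# Attach the Bulgarian-recipes LoRA adapter to the quantized base
model = PeftModel.from_pretrained(base, adapter_id)

prompt = "Рецепта за баница:"  # Bulgarian: "Recipe for banitsa:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Since the snippet needs GPU memory and gated model access, treat it as a configuration template rather than a copy-paste example.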