---
license: other
pipeline_tag: text-generation
datasets:
  - cosimoiaia/Loquace-102k
language:
  - it
---

Model Card for Model ID

An open-source LLaMA-based language model of 13B parameters, fine-tuned to follow instructions in Italian.

Model Description

This model is an open-source LLM of 13B parameters based on OpenLLaMA, an open-source replica of Meta AI's LLaMA. The model was fine-tuned to follow instructions, as proposed in Alpaca, but using the LoRA technique and a larger dataset of instruction/answer pairs in Italian, cosimoiaia/Loquace-102k.

This repository contains the model merged with the LoRA adapters obtained in the fine-tuning procedure.
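
As an illustration of how a merged checkpoint like the one in this repository is typically produced, the sketch below merges LoRA adapters into a base model with peft; the base checkpoint id and adapter path are placeholders, not values taken from this card.

```python
# Illustrative sketch of merging LoRA adapters into a base model with `peft`.
# The base checkpoint id and adapter path below are placeholders.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM

base = LlamaForCausalLM.from_pretrained(
    "openlm-research/open_llama_13b",  # assumed OpenLLaMA 13B base
    torch_dtype=torch.float16,
)
merged = PeftModel.from_pretrained(base, "path/to/lora-adapters").merge_and_unload()
merged.save_pretrained("merged-model")  # save the merged weights
```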

Uses

Direct Use

[More Information Needed]

Downstream Use [optional]

[More Information Needed]

Out-of-Scope Use

[More Information Needed]

Bias, Risks, and Limitations

[More Information Needed]

Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
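
In the meantime, here is a minimal inference sketch with transformers; the repository id placeholder and the Alpaca-style Italian prompt are assumptions, not part of this card.

```python
# Minimal inference sketch; "<this-repository-id>" is a placeholder and the
# prompt is an assumed Alpaca-style Italian format, not confirmed by the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<this-repository-id>"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "Istruzione: Spiega in poche frasi cos'è un modello linguistico.\nRisposta:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```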

Training Details

Training Data

The model was fine-tuned on cosimoiaia/Loquace-102k, a dataset of 102k question/answer pairs in Italian.
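
A quick way to inspect this dataset is with the datasets library; the split name "train" below is an assumption.

```python
# Sketch for inspecting the training data; the split name "train" is assumed.
from datasets import load_dataset

dataset = load_dataset("cosimoiaia/Loquace-102k", split="train")
print(dataset)      # number of rows and column names
print(dataset[0])   # one question/answer pair in Italian
```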

Training Procedure

Fine-tuning was performed with the LoRA approach, closely following what was done for models such as Alpaca-LoRA; a configuration sketch in code follows the hyperparameter lists below.

Training Hyperparameters

Training settings:

  • train epochs=3

  • learning_rate=3e-4

  • optimizer="adamw_hf"

  • mixed-precision training: float16

LoRA configuration:

  • r= 8

  • lora_alpha=16

  • target_modules=["q_proj","v_proj"]

  • lora_dropout=0.05

  • bias="none"

  • task_type=TaskType.CAUSAL_LM

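The settings above can be expressed, as a sketch, with peft and transformers; argument names and values not listed in the card (for example output_dir) are assumptions.

```python
# Configuration sketch matching the hyperparameters listed above; values not
# stated in the card (e.g. output_dir) are placeholders.
from peft import LoraConfig, TaskType
from transformers import TrainingArguments

lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type=TaskType.CAUSAL_LM,
)

training_args = TrainingArguments(
    output_dir="outputs",   # placeholder
    num_train_epochs=3,
    learning_rate=3e-4,
    optim="adamw_hf",
    fp16=True,              # mixed-precision float16 training
)
```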

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

  • Hardware Type: 1x NVIDIA A100 40GB
  • Hours used: 68
  • Cloud Provider: Private Infrastructure
  • Carbon Emitted: 7.34 kg CO2 eq.
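
As a rough cross-check of the figure above (not from the card), the estimate is consistent with an assumed average GPU power draw and grid carbon intensity:

```python
# Back-of-the-envelope check; power draw and carbon intensity are assumptions.
gpu_power_kw = 0.4        # assumed average draw of one A100 40GB, in kW
hours = 68                # from the card
kg_co2_per_kwh = 0.27     # assumed grid carbon intensity

print(gpu_power_kw * hours * kg_co2_per_kwh)  # ≈ 7.3 kg CO2 eq.
```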

Technical Specifications [optional]

Model Architecture and Objective

[More Information Needed]

Compute Infrastructure

[More Information Needed]

Hardware

[More Information Needed]

Software

[More Information Needed]

Citation [optional]

BibTeX:

[More Information Needed]

APA:

[More Information Needed]

Glossary [optional]

[More Information Needed]

More Information [optional]

[More Information Needed]

Model Card Authors [optional]

Stefano Scotta (stefano.scotta@rai.it)

Model Card Contact

stefano.scotta@rai.it