---
license: llama2
language:
  - el
base_model: meta-llama/Llama-2-7b-hf
library_name: transformers
---

# Llama2 7B for Greek: 100 target vocabulary size + Align target vocabulary initialization + MTP training

This model is built on top of Llama2 7B and adapted for Greek using 30K target-language sentences sampled from CC-100.

## Model Details

- **Vocabulary:** This model extends the base vocabulary with 100 additional target-language (Greek) tokens.
- **Target vocabulary initialization:** The embedding and LM-head weights for the new target tokens were initialized with Align initialization (a simplified sketch of this idea follows the list).
- **Training:** The model was then further pre-trained on 30K target-language sentences sampled from CC-100, using the MTP training strategies introduced in the paper.
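For intuition, the sketch below shows the general pattern behind this kind of adaptation in `transformers`: the vocabulary is extended, the embedding matrix is resized, and each new row is seeded from existing source embeddings rather than left random. It uses a simple mean-of-subwords initialization as a stand-in for the paper's Align method, and `new_tokens` holds two illustrative Greek words rather than the 100 tokens actually added to this model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

base_id = "meta-llama/Llama-2-7b-hf"
src_tok = AutoTokenizer.from_pretrained(base_id)  # original vocabulary, kept intact
tgt_tok = AutoTokenizer.from_pretrained(base_id)  # copy that receives the new tokens
model = AutoModelForCausalLM.from_pretrained(base_id)

new_tokens = ["παράδειγμα", "γλώσσα"]  # illustrative; the released model adds 100 Greek tokens
tgt_tok.add_tokens(new_tokens)
model.resize_token_embeddings(len(tgt_tok))

emb = model.get_input_embeddings().weight
lm_head = model.get_output_embeddings().weight
with torch.no_grad():
    for tok in new_tokens:
        new_id = tgt_tok.convert_tokens_to_ids(tok)
        # Split the new token with the *original* tokenizer and seed its
        # embedding and LM-head rows with the mean of the subword rows.
        # (The paper's Align initialization is more informed than this mean.)
        sub_ids = src_tok.encode(tok, add_special_tokens=False)
        emb[new_id] = emb[sub_ids].mean(dim=0)
        lm_head[new_id] = lm_head[sub_ids].mean(dim=0)
```

After an initialization of this kind, the released model is further pre-trained on the 30K CC-100 sentences using the MTP strategies described above.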

### Model Description

- **Language:** Greek
- **License:** Llama 2 Community License Agreement
- **Fine-tuned from model:** meta-llama/Llama-2-7b-hf

## Model Sources

## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "atsuki-yamaguchi/Llama-2-7b-hf-el-30K-align-mtp"

# Load the Greek-adapted model together with its extended tokenizer.
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```