---
license: mit
license_link: https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE
language:
  - en
pipeline_tag: text-generation
tags:
  - nlp
  - code
---

## Model Summary

MobiLlama-05B is a small language model with 0.5 billion parameters. It was trained on the Amber data sources ([Amber-Dataset](https://huggingface.co/datasets/LLM360/AmberDatasets)).

## How to Use

MobiLlama-05B has been integrated into the development version (4.37.0.dev) of `transformers`. Until the official version is released through pip, do one of the following:

  * When loading the model, pass `trust_remote_code=True` as an argument to the `from_pretrained()` function.

  * Update your local `transformers` to the development version: `pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers`. This command is an alternative to cloning and installing from source.

The installed `transformers` version can be verified with `pip list | grep transformers`.
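Putting the steps above together, a minimal loading sketch might look like the following. The Hub repo id `MBZUAI/MobiLlama-05B` is an assumption based on the model name; check the model page for the exact identifier.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repo id; adjust to the actual model page if it differs.
MODEL_ID = "MBZUAI/MobiLlama-05B"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load MobiLlama-05B and complete the given prompt."""
    # trust_remote_code=True lets the custom modeling code shipped with
    # the checkpoint run (the first option listed above).
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (downloads the checkpoint on first run):
# print(generate("Write a haiku about mountains."))
```

On a development `transformers` install, `trust_remote_code=True` is only strictly needed if the architecture is not yet bundled with the library; passing it is harmless either way.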

## Intended Uses

Given the nature of the training data, the MobiLlama-05B model is best suited to prompts in the QA format, the chat format, and the code format.
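For illustration only, since the card does not spell out exact templates, prompts in these three formats might be built like this (all template strings below are assumptions, not documented formats):

```python
# Hypothetical prompt templates -- the model card does not specify exact formats.

def qa_prompt(question: str) -> str:
    """Plain question-answering format."""
    return f"Instruct: {question}\nOutput:"


def chat_prompt(user_turn: str) -> str:
    """Single-turn chat format."""
    return f"User: {user_turn}\nAssistant:"


def code_prompt(docstring: str) -> str:
    """Code format: start a function stub and let the model continue."""
    return f'def solve():\n    """{docstring}"""\n'


print(qa_prompt("What is the capital of France?"))
```

If generations look poor, the prompt template is the first thing to vary; base models of this size are typically sensitive to formatting.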