
Maxine-7B-0401-stock, an xtraordinary 7B model

03-22-2024 - To date, louisbrulenaudet/Pearl-34B-ties is the "Best 🤝 base merges and moerges model of around 30B" on the Open LLM Leaderboard.

Configuration

models:
    - model: OpenPipe/mistral-ft-optimized-1227
    - model: MTSAIR/multi_verse_model
    - model: rwitz/experiment26-truthy-iter-0
    - model: MaziyarPanahi/Calme-7B-Instruct-v0.2
merge_method: model_stock
base_model: OpenPipe/mistral-ft-optimized-1227
dtype: bfloat16
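
The merge can be reproduced with mergekit. A minimal sketch, assuming mergekit is installed from PyPI and the configuration above is saved as config.yaml (both the file name and the output directory are illustrative):

!pip install -qU mergekit
!mergekit-yaml config.yaml ./Maxine-7B-0401-stock

mergekit-yaml writes the merged weights and tokenizer to the output directory, which can then be loaded as shown in the Usage section below or pushed to the Hub.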

Usage

!pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "louisbrulenaudet/Maxine-7B-0401-stock"
messages = [{"role": "user", "content": "What is a large language model?"}]

# Build the chat prompt from the model's chat template
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Load the model in half precision, spread across available devices
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Sample up to 256 new tokens with temperature, top-k and top-p sampling
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
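
For tighter control over generation, or to load the weights in bfloat16 as stored, the model can also be loaded directly rather than through a pipeline. A minimal sketch; the generation parameters are illustrative:

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "louisbrulenaudet/Maxine-7B-0401-stock"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "What is a large language model?"}]

# Tokenize the chat template directly and move the ids to the model's device
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))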

Citing & Authors

If you use this code in your research, please cite it using the following BibTeX entry.

@misc{louisbrulenaudet2024,
  author =       {Louis Brulé Naudet},
  title =        {Maxine-7B-0401-stock, an xtraordinary 7B model},
  year =         {2024},
  howpublished = {\url{https://huggingface.co/louisbrulenaudet/Maxine-7B-0401-stock}},
}

Feedback

If you have any feedback, please reach out at louisbrulenaudet@icloud.com.
