---
license: apache-2.0
library_name: transformers
tags:
  - juanako
  - UNA
  - cybertron
  - fbl
  - mlx
datasets:
  - fblgit/tree-of-knowledge
  - Open-Orca/SlimOrca-Dedup
  - allenai/ultrafeedback_binarized_cleaned
---

# mlx-community/una-cybertron-7b-v2-bf16-4bit-mlx

This model was converted to MLX format from [`fblgit/una-cybertron-7b-v2-bf16`](https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16). Refer to the [original model card](https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16) for more details on the model.
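For reference, conversions like this are typically produced with mlx-lm's `convert` utility. The snippet below is a sketch of that workflow, not the exact command used for this repo; the output path and the 4-bit quantization settings are assumptions inferred from the repository name.

```python
# Sketch of how a 4-bit MLX conversion of the original model is typically produced.
# The exact parameters used for this repo are not documented here.
from mlx_lm import convert

convert(
    hf_path="fblgit/una-cybertron-7b-v2-bf16",     # original Hugging Face model
    mlx_path="una-cybertron-7b-v2-bf16-4bit-mlx",  # local output directory (hypothetical name)
    quantize=True,                                 # enable weight quantization
    q_bits=4,                                      # 4-bit, inferred from the repo name
)
```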

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/una-cybertron-7b-v2-bf16-4bit-mlx")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
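
For instruction-style prompts, the tokenizer returned by `load` wraps the underlying Hugging Face tokenizer, so a chat template can be applied before generation. This is a minimal sketch assuming the repo ships a chat template; if it does not, format the prompt manually.

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/una-cybertron-7b-v2-bf16-4bit-mlx")

# Build a chat-formatted prompt; assumes the tokenizer provides a chat template.
messages = [{"role": "user", "content": "Explain what MLX is in one sentence."}]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```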