---
license: apache-2.0
language:
- en
base_model: louisbrulenaudet/Maxine-34B-stock
datasets:
- cognitivecomputations/Dolphin-2.9
- teknium/OpenHermes-2.5
- m-a-p/CodeFeedback-Filtered-Instruction
- cognitivecomputations/dolphin-coder
- cognitivecomputations/samantha-data
- microsoft/orca-math-word-problems-200k
- Locutusque/function-calling-chatml
- internlm/Agent-FLAN
library_name: transformers
tags:
- mlx
- merge
- mergekit
- louisbrulenaudet/Maxine-34B-stock
- ConvexAI/Luminex-34B-v0.2
- fblgit/UNA-34BeagleSimpleMath-32K-v1
- chemistry
- biology
- math
pipeline_tag: text-generation
model-index:
- name: Maxine-34B-stock
  results:
  - task:
      type: text-generation
    metrics:
    - name: Average
      type: Average
      value: 77.28
    - name: ARC
      type: ARC
      value: 74.06
    - name: GSM8K
      type: GSM8K
      value: 72.18
    - name: Winogrande
      type: Winogrande
      value: 83.9
    - name: TruthfulQA
      type: TruthfulQA
      value: 70.18
    - name: HellaSwag
      type: HellaSwag
      value: 86.74
    source:
      name: Open LLM Leaderboard
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
---
<center><img src='https://i.imgur.com/dU9dUh0.png' width='500px'></center>
# mlx-community/Maxine-34B-stock
This model was converted to MLX format from [`louisbrulenaudet/Maxine-34B-stock`](https://huggingface.co/louisbrulenaudet/Maxine-34B-stock) using mlx-lm version **0.15.2**.
Refer to the [original model card](https://huggingface.co/louisbrulenaudet/Maxine-34B-stock) for more details on the model.
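For reference, a conversion of this kind can be reproduced with the `convert` helper in mlx-lm once it is installed (see below). The output directory and quantization setting in this sketch are illustrative only, not necessarily the options used for this repository.
```python
from mlx_lm import convert

# Download the original Hugging Face weights and re-export them in MLX format.
# mlx_path and quantize are example values, not the exact settings used for this repo.
convert(
    "louisbrulenaudet/Maxine-34B-stock",
    mlx_path="Maxine-34B-stock-mlx",
    quantize=False,
)
```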
## Use with mlx
```bash
pip install -U mlx-lm
python -m mlx_lm.generate --model mlx-community/Maxine-34B-stock --prompt "hello" --max-tokens 100 --temp 0.0
```
```python
from mlx_lm import load, generate

# Load the converted weights and tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/Maxine-34B-stock")

# Generate a short completion; verbose=True streams the output and prints generation stats.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
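If the converted tokenizer ships a chat template, formatting the prompt through it generally works better than passing a raw string. The snippet below is a minimal sketch under that assumption; the example question and `max_tokens` value are arbitrary.
```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Maxine-34B-stock")

# Build the prompt from a chat-style message list using the tokenizer's template
# (assumes a chat template is bundled with the converted tokenizer).
messages = [{"role": "user", "content": "Explain the difference between a list and a tuple in Python."}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)

response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
```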
## Citing & Authors
If you use this model in your research, please cite it with the following BibTeX entry.
```BibTeX
@misc{louisbrulenaudet2024,
  author = {Louis Brulé Naudet},
  title = {Maxine-34B-stock, an xtraordinary 34B model},
  year = {2024},
  howpublished = {\url{https://huggingface.co/mlx-community/Maxine-34B-stock}},
}
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com).