---
library_name: transformers
language:
- en
- fr
- it
- pt
- hi
- es
- th
- de
base_model:
- meta-llama/Llama-3.3-70B-Instruct
tags:
- exl2
- meta
- pytorch
- llama
- llama-3
license: llama3.3
---
# Llama-3.3-70B-Instruct - EXL2 2.25bpw
This is a 2.25bpw EXL2 quant of [meta-llama/Llama-3.3-70B-Instruct](https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct).
Details about the base model can be found on the model page linked above.
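
As a rough illustration, the sketch below shows one way to load an EXL2 quant with the exllamav2 Python API and run a short generation. The local model directory path is a placeholder, and the exact API surface may differ slightly between exllamav2 releases.

```python
# Minimal sketch: loading an EXL2 quant with the exllamav2 Python API.
# The model directory path is a placeholder; point it at wherever the quant
# was downloaded. API details may vary slightly between exllamav2 releases.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "/path/to/Llama-3.3-70B-Instruct-exl2"  # placeholder path

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate the cache lazily
model.load_autosplit(cache)                # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
output = generator.generate(prompt="Hello, my name is", max_new_tokens=64)
print(output)
```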
## EXL2 Version
These quants were made with exllamav2 version 0.2.4. Quants made with this version of EXL2 may not load on older versions of the exllamav2 library.
If you have problems loading these models, please update Text Generation WebUI to the latest version.
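
If you want to confirm which exllamav2 build is active in your environment before loading, a quick check like the one below can help; it assumes the package exposes a `__version__` attribute, which recent releases do.

```python
# Quick sanity check of the installed exllamav2 version (assumes the package
# exposes __version__, as recent releases do). Quants made with 0.2.4 may not
# load on older builds.
import exllamav2

print(exllamav2.__version__)  # expect 0.2.4 or newer for these quants
```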
## Perplexity Scoring
Below are the perplexity scores for the EXL2 quants; lower is better. A short note on how perplexity is computed follows the table.
| Quant Level (bpw) | Perplexity |
|-------------------|------------|
| 5.0  | 4.7932  |
| 4.5  | 4.8894  |
| 4.0  | 5.0079  |
| 3.5  | 5.3992  |
| 3.0  | 7.2686  |
| 2.5  | 10.5543 |
| 2.25 | 8.8764  |
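
For reference, perplexity is the exponential of the average negative log-likelihood the model assigns to each token of the evaluation text. The toy sketch below illustrates the calculation only; it does not reproduce the measurement setup used for the table above.

```python
import math

def perplexity(token_log_probs):
    """Perplexity from per-token natural-log probabilities:
    exp(-(1/N) * sum(log p_i)). Lower means the model fits the text better."""
    return math.exp(-sum(token_log_probs) / len(token_log_probs))

# Toy example with three token log-probabilities (not real evaluation data):
print(perplexity([-1.2, -0.4, -2.0]))  # ~3.32
```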