---
license: llama3.2
base_model: meta-llama/Llama-3.2-1B-Instruct
pipeline_tag: text-generation
quantized_by: grimjim
---
EXL2 quants of meta-llama/Llama-3.2-1B-Instruct by branch:
- 4_0 : 4.0 bits per weight
- 5_0 : 5.0 bits per weight
- 6_0 : 6.0 bits per weight
- 8_0 : 8.0 bits per weight
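To fetch a specific quant, download the matching branch. As a sketch with `huggingface-cli` (the repository id below is a placeholder; substitute this repo's actual id):

```shell
# Download only the 6.0 bpw quant by targeting its branch.
# REPO_ID is a placeholder; replace it with this repository's id.
huggingface-cli download REPO_ID --revision 6_0 --local-dir Llama-3.2-1B-Instruct-exl2-6_0
```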
The included measurement.json file can be reused to make additional EXL2 quants without repeating the calibration measurement pass.
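As a sketch, reusing measurement.json with exllamav2's conversion script might look like the following (paths and the 3.5 bpw target are illustrative, and flags should be checked against your exllamav2 version):

```shell
# Illustrative invocation of exllamav2's convert.py; adjust paths to your setup.
# -m reuses the supplied calibration measurement so the measurement pass is skipped.
python convert.py \
    -i /path/to/Llama-3.2-1B-Instruct \
    -o /path/to/workdir \
    -cf /path/to/output-3.5bpw \
    -b 3.5 \
    -m measurement.json
```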
Quantized with exllamav2 v0.2.4.