Quantized K2-V2
Collection of verified models (6 items).
This is LLM360/K2-V2-Instruct quantized with LLM Compressor using the recipe in the "recipe.yaml" file. The model is compatible with vLLM (tested with v0.12.0 on an RTX Pro 6000).
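Since the model is vLLM-compatible, it can be served with the standard `vllm serve` command. A minimal sketch is below; the repo id is a placeholder, not the confirmed name of this quantized checkpoint, and flags like the context length are illustrative:

```shell
# Serve the quantized model with vLLM's OpenAI-compatible server.
# <quantized-repo-id> is a placeholder -- substitute the actual
# Hugging Face repo id of the quantized model from this collection.
vllm serve <quantized-repo-id> \
  --max-model-len 8192
```

Once running, the server exposes an OpenAI-compatible API on port 8000 by default, so any OpenAI client can query it by pointing its base URL at `http://localhost:8000/v1`.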
Subscribe to The Kaitchup; it helps me a lot to continue quantizing and evaluating models for free. You can also buy me a Ko-fi.