
Qwen1.5-14B-Chat-8bpw-h8-exl2

This is an 8.0bpw/h8 quantized version of Qwen/Qwen1.5-14B-Chat, made with exllamav2.

To run this model, make sure you have installed the latest version of ExLlamaV2.
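As a minimal sketch of getting started (the exact flags and paths depend on your ExLlamaV2 version and local setup, and a CUDA-capable GPU is assumed):

```shell
# Install ExLlamaV2 (requires a matching PyTorch/CUDA build)
pip install exllamav2

# Download this quantized model to a local directory
huggingface-cli download DrNicefellow/Qwen1.5-14B-Chat-8bpw-h8-exl2 \
    --local-dir ./Qwen1.5-14B-Chat-8bpw-h8-exl2

# Try the interactive chat example from the ExLlamaV2 repository
git clone https://github.com/turboderp/exllamav2
python exllamav2/examples/chat.py \
    -m ./Qwen1.5-14B-Chat-8bpw-h8-exl2 -mode chatml
```

Qwen1.5 chat models use the ChatML prompt format, hence the `-mode chatml` flag in the chat example above.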

License

This project is distributed under the Tongyi Qianwen LICENSE AGREEMENT. See the LICENSE file for more information.

Feeling Generous? 😊

Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note on which one you want me to drink.

