
mistralai_Mistral-Nemo-Instruct-2407-exl2-4bpw

This is a 4.0 bpw (bits per weight) quantized version of mistralai/Mistral-Nemo-Instruct-2407, made with exllamav2.
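Below is a minimal sketch of loading the quant locally with the exllamav2 Python package. The model directory path and prompt are placeholders, and the exact generator API varies between exllamav2 releases (newer versions prefer the dynamic generator), so adjust for the version you have installed.

```python
# Minimal sketch: load this exl2 4bpw quant and run a short generation.
# Assumes the model files are already downloaded to a local directory
# (the path below is a placeholder) and a working exllamav2 install.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "/path/to/Mistral-Nemo-Instruct-2407-exl2-4bpw"  # placeholder path

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # cache is allocated as layers load
model.load_autosplit(cache)                # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

# Prompt format assumed from the Mistral instruct template.
prompt = "[INST] Give me a one-sentence summary of the Mistral NeMo model. [/INST]"
print(generator.generate_simple(prompt, settings, 200))
```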

License

This model is available under the Apache 2.0 License.

Discord Server

Join our Discord server here.

Feeling Generous? 😊

Eager to buy me a cup of $2 coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note about which one you want me to drink.

