Bartowski PRO

bartowski

AI & ML interests

Official model curator for https://lmstudio.ai/


Posts 7

Old mixtral model quants may be broken!

Recently, slaren over on llama.cpp refactored the model loader in a way that's super awesome and very powerful, but it broke support for "split tensor MoE models", which applies to older mixtral models.

You may have seen my upload of one such older mixtral model, jondurbin/bagel-dpo-8x7b-v0.2, and with the newest changes it seems to run without issue.

If you happen to run into issues with any other old mixtral models, drop a link here and I'll try to remake them with the new changes so that we can keep enjoying them :)
Regarding the latest mistral model and the GGUFs for it:

Yes, they may be subpar and may require changes to llama.cpp to support the interleaved sliding window

Yes, I got excited when a conversion worked and released them ASAP

That said, generation seems to work right now and to mimic the output from Spaces that are running the original model.

I have appended -TEST to the model names in an attempt to indicate that they are not final or perfect. But if people still feel misled and think this isn't the right thing to do, please post your thoughts (civilly) below, and I will seriously consider pulling the conversions if that's what people think is best. After all, that's what I'm here for, in service to you all!
