LoLCATs models.
#369
by jrell - opened
This one might be a bit tricky.
Paper: https://hazyresearch.stanford.edu/blog/2024-10-14-lolcats-p1
Checkpoints: https://huggingface.co/collections/hazyresearch/lolcats-670ca4341699355b61238c37
These are base models, but it would still be cool to see how they work.
Thank you for your amazing quants!🫡
As far as I can see, these are not even transformers-format models, just the bare weights without any of the metadata required to use them. So, without even knowing what architecture these are, conversion is impossible. Maybe once they are supported by transformers, llama.cpp will catch up.
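As an illustration, one quick way to check this is to list a repo's files and look for a config.json. This is only a minimal sketch; the repo id below is a placeholder for one of the checkpoints in the collection linked above.

```python
# Minimal sketch: check whether a checkpoint repo ships the metadata
# (config.json etc.) that transformers / llama.cpp conversion would need.
from huggingface_hub import list_repo_files

# Placeholder repo id -- substitute an actual checkpoint from the collection.
repo_id = "hazyresearch/<lolcats-checkpoint>"

files = list_repo_files(repo_id)
print("\n".join(files))

if "config.json" not in files:
    # Without a config.json describing the architecture, conversion scripts
    # (e.g. llama.cpp's convert_hf_to_gguf.py) have nothing to go on.
    print("No config.json found: architecture unknown, conversion not possible yet.")
```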
mradermacher changed discussion status to closed