This is a pure sub-quadratic linear-attention model with 70B parameters, linearized from the Meta Llama 3.1 70B model as the starting point.

Details on this model and how to train your own are provided at: https://github.com/HazyResearch/lolcats/tree/lolcats-scaled
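To illustrate why linear attention is sub-quadratic, the sketch below re-associates the attention product so the sequence-length-squared term never materializes. This is a minimal, self-contained illustration with a placeholder feature map (elu(x)+1), not the feature map this model actually learns; see the repository above for the real training and inference code.

```python
import numpy as np

def linear_attention(q, k, v):
    """Linear attention sketch.

    Softmax is replaced by a feature map phi, so attention becomes
    (phi(Q) phi(K)^T) V normalized row-wise. Re-associating as
    phi(Q) (phi(K)^T V) costs O(n * d * d_v) instead of O(n^2 * d).

    phi here is elu(x) + 1 (a common positive feature map), purely
    illustrative -- NOT the learned map used by this model.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    q, k = phi(q), phi(k)
    kv = k.T @ v                      # (d, d_v): summary of keys/values
    z = k.sum(axis=0)                 # (d,): normalizer term
    return (q @ kv) / (q @ z)[:, None]
```

Because the re-association is exact, this produces the same output as the naive quadratic formulation `(phi(Q) phi(K)^T) V` with row normalization, while never forming the n-by-n attention matrix.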
