RichardForests's Collections
Language Models
CV
RL
Diffusion models
3D/4D Gaussian Splatting
Multimodal
Mamba
NeRF
Transformers & MoE
(3D) Foundation Models
SSL
DL & Software DStructures
Gemma & MoE
Dora
Flash Attention in Triton
Lora variations
Parameter Efficient - LLMs
Robotics - Cross Attention
LLM Agents OS
DMs - Lighting Conditions
Flash Attention in Triton (updated Mar 19)
mosaicml/mpt-7b-instruct • Text Generation • Updated Mar 5 • 7.37k downloads • 467 likes