SmolLM-135M-Instruct-layer-pruned-90M-raw
A layer-pruned version of SmolLM-135M-Instruct.
- Layers are removed from the top of the model (keeping the final layer) to reduce the parameter count to approximately 90M.
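The pruning step above can be sketched as follows. This is a minimal illustration, not the exact procedure used for this checkpoint: `TinyStack` is a stand-in for a real decoder stack (for SmolLM, the blocks live in `model.model.layers`), and the layer counts here are illustrative, not SmolLM's.

```python
import torch
import torch.nn as nn

# Stand-in for a transformer decoder stack; a real model would hold
# attention/MLP blocks here rather than Linear layers.
class TinyStack(nn.Module):
    def __init__(self, n_layers: int, dim: int):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_layers))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
        return x

def prune_top_layers(stack: TinyStack, n_remove: int) -> TinyStack:
    """Drop n_remove layers from the top of the stack, keeping the last layer."""
    layers = stack.layers
    # Keep everything below the removed band, then re-attach the final layer.
    keep = list(layers[: len(layers) - 1 - n_remove]) + [layers[-1]]
    stack.layers = nn.ModuleList(keep)
    return stack

stack = TinyStack(n_layers=8, dim=16)
stack = prune_top_layers(stack, n_remove=3)
print(len(stack.layers))  # 5: layers 0-3 plus the original last layer
```

With a real checkpoint, the same slicing would be applied to `model.model.layers`, followed by updating `model.config.num_hidden_layers` before saving.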