# tiny-bert-ranker model card
This model is a fine-tuned version of prajjwal1/bert-tiny as part of our submission to ReNeuIR 2024.
## Model Details

### Model Description
The model is based on the pre-trained prajjwal1/bert-tiny. It is fine-tuned on a 1 GB subset of data extracted from MS MARCO's Train Triples Small.
Tiny-bert-ranker is part of our investigation into the tradeoffs between efficiency and effectiveness in ranking models. This approach does not involve BM25 score injection or distillation.
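As a minimal sketch of how such a ranker scores a query-passage pair, the snippet below builds a sequence-classification head on top of the bert-tiny architecture (2 layers, hidden size 128). The weights here are randomly initialized for illustration only; in practice you would load the fine-tuned checkpoint from the Hub, and the exact input formatting is an assumption.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Architecture matching prajjwal1/bert-tiny (2 layers, hidden size 128).
# Weights are randomly initialized here for illustration -- load the
# fine-tuned tiny-bert-ranker checkpoint for real inference.
config = BertConfig(
    vocab_size=30522,
    hidden_size=128,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=512,
    num_labels=1,  # a single relevance logit per query-passage pair
)
model = BertForSequenceClassification(config)
model.eval()

# Dummy token ids stand in for a tokenized "query [SEP] passage" input.
input_ids = torch.randint(0, config.vocab_size, (1, 32))
attention_mask = torch.ones_like(input_ids)

with torch.no_grad():
    score = model(input_ids=input_ids, attention_mask=attention_mask).logits

print(score.shape)  # one relevance score per input pair
```

Candidate passages for a query can then be sorted by this logit to produce a ranking.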
- Developed by: Team FSU at ReNeuIR 2024
- Model type: BERT-based ranking model (encoder-only)
- License: MIT
- Finetuned from model: prajjwal1/bert-tiny