MASTER: Multi-task Pre-trained Bottlenecked Masked Autoencoders are Better Dense Retrievers
Paper: https://arxiv.org/abs/2212.07841
Code: https://github.com/microsoft/SimXNS/tree/main/MASTER
Overview
This is the checkpoint obtained after pre-training on the Wikipedia corpora of NQ, TriviaQA, WebQuestions, and SQuAD. You may use this checkpoint as the initialization for fine-tuning.
Usage
To load this checkpoint for initialization, you may follow:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained('lx865712528/master-base-pretrained-wiki')
```
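Once loaded, the encoder can be used to produce dense text representations. A minimal sketch is shown below; it assumes the matching tokenizer is hosted with the checkpoint and uses the `[CLS]` token embedding as the sequence representation, which is a common choice for dense retrievers — refer to the MASTER code repository for the exact pooling and fine-tuning setup.

```python
from transformers import AutoModel, AutoTokenizer
import torch

model_name = 'lx865712528/master-base-pretrained-wiki'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

texts = [
    "who wrote the declaration of independence",
    "The Declaration of Independence was drafted by Thomas Jefferson.",
]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)

# Take the [CLS] embedding as the dense representation
# (assumed pooling; see the MASTER repository for the exact configuration).
embeddings = outputs.last_hidden_state[:, 0]
print(embeddings.shape)  # (batch_size, hidden_size)
```

These embeddings can then be compared with an inner product or cosine similarity to score query-passage pairs, but for retrieval quality the checkpoint is intended to be fine-tuned first.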