---
language:
- en
tags:
- retrieval
- document-rewriting
datasets:
- irds:msmarco-passage
library_name: transformers
---
A DeepCT model based on `bert-base-uncased` and trained on MS MARCO. This is a version of [the checkpoint released by the original authors](http://boston.lti.cs.cmu.edu/appendices/arXiv2019-DeepCT-Zhuyun-Dai/outputs/marco.zip), converted to PyTorch format and ready for use in PyTerrier.
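## Usage
Below is a minimal sketch of how this checkpoint might be used for document rewriting and indexing in PyTerrier. It assumes the `pyterrier_deepct` plugin exposes a `DeepCT` transformer class that loads (or can be pointed at) this checkpoint; check the plugin's documentation for the exact class name and parameters.
```python
# Minimal sketch: rewriting and indexing MS MARCO passages with DeepCT in PyTerrier.
# The pyterrier_deepct class name and its default checkpoint are assumptions;
# consult the plugin documentation for the current API.
import pyterrier as pt
import pyterrier_deepct  # PyTerrier DeepCT plugin

if not pt.started():
    pt.init()

dataset = pt.get_dataset('irds:msmarco-passage')

# DeepCT predicts a per-term importance score and rewrites each passage,
# repeating terms in proportion to their predicted weight, so that a standard
# inverted index (e.g. queried with BM25) can exploit the weights.
deepct = pyterrier_deepct.DeepCT()

indexer = deepct >> pt.IterDictIndexer('./deepct-msmarco-index')
indexer.index(dataset.get_corpus_iter())
```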
## References
- [Dai19]: Zhuyun Dai, Jamie Callan. Context-Aware Sentence/Passage Term Importance Estimation For First Stage Retrieval. https://arxiv.org/abs/1910.10687
- [Macdonald20]: Craig Macdonald, Nicola Tonellotto. Declarative Experimentation in Information Retrieval using PyTerrier. In Proceedings of ICTIR 2020. https://arxiv.org/abs/2007.14271