---
license: afl-3.0
---
<strong>Example of how to load and use BOW-BERT:</strong>
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# load model
model = AutoModelForSequenceClassification.from_pretrained('dmrau/bow-bert')
# load tokenizer
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
# tokenize the query-passage pairs; tokenizer concatenates each query with its passage
inp = tokenizer(['this is a query', 'query a is this'], ['this is a passage', 'passage a is this'], return_tensors='pt')
# get estimated relevance scores
print('scores', model(**inp).logits[:, 1])
# the two pairs receive identical scores despite different
# word orders, as the model is order invariant:
# scores: [-2.9463, -2.9463]
```
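Since BOW-BERT scores a query-passage pair like any sequence-classification cross-encoder, the same call can rank several candidate passages for one query. Below is a minimal sketch of that workflow; the query and passage strings are made up for illustration, and the `padding`/`truncation` options are assumptions added for batching, not part of the original example:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained('dmrau/bow-bert')
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')

# hypothetical query and candidate passages
query = 'what causes rain'
passages = [
    'rain forms when water vapor condenses into droplets',
    'the capital of france is paris',
]

# pair the query with every candidate passage and batch them
inp = tokenizer([query] * len(passages), passages, return_tensors='pt',
                padding=True, truncation=True)
with torch.no_grad():
    scores = model(**inp).logits[:, 1]

# sort passages by estimated relevance, highest first
for score, passage in sorted(zip(scores.tolist(), passages), reverse=True):
    print(f'{score:.4f}  {passage}')
```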
<strong>Cite us:</strong>
```bibtex
@article{rau2022role,
  title={The Role of Complex NLP in Transformers for Text Ranking?},
  author={Rau, David and Kamps, Jaap},
  journal={arXiv preprint arXiv:2207.02522},
  year={2022}
}
```