---
license: afl-3.0
---

Example of how to load and use BOW-BERT:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# load model
model = AutoModelForSequenceClassification.from_pretrained('dmrau/bow-bert')
# load tokenizer
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')

# tokenize queries and passages and concatenate each pair
inp = tokenizer(['this is a query', 'query a is this'],
                ['this is a passage', 'passage a is this'],
                return_tensors='pt')
# print the estimated relevance scores
print('scores', model(**inp).logits[:, 1])

# outputs identical scores for different word orders,
# as the model is order invariant:
# scores: [-2.9463, -2.9463]
```
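The order invariance above can be illustrated without loading the model: a bag-of-words representation discards token positions, so permuted inputs produce identical token multisets. A minimal sketch using Python's `collections.Counter`, with whitespace tokenization standing in for BERT's WordPiece tokenizer:

```python
from collections import Counter

def bag_of_words(text):
    # whitespace tokenization as a simplified stand-in for WordPiece
    return Counter(text.lower().split())

# permuted word order yields the same bag of words,
# so an order-invariant model assigns the same score
print(bag_of_words('this is a query') == bag_of_words('query a is this'))  # True
```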

Cite us:

```bibtex
@article{rau2022role,
  title={The Role of Complex NLP in Transformers for Text Ranking?},
  author={Rau, David and Kamps, Jaap},
  journal={arXiv preprint arXiv:2207.02522},
  year={2022}
}
```