macaw-11b
Model description
Macaw (Multi-angle c(q)uestion answering) is a ready-to-use model capable of general question answering, showing robustness outside the domains it was trained on. It has been trained in a "multi-angle" fashion, which means it can handle a flexible set of input and output "slots" (question, answer, multiple-choice options, context, and explanation).
Macaw was built on top of T5 and comes in three sizes: macaw-11b, macaw-3b, and macaw-large, as well as an answer-focused version, macaw-answer-11b, featured on various leaderboards.
See https://github.com/allenai/macaw for more details.
Intended uses & limitations
How to use
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("allenai/macaw-11b")
model = AutoModelForSeq2SeqLM.from_pretrained("allenai/macaw-11b")
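# Ask for the answer and multiple-choice options slots, providing only the question;
# requested output slots are listed bare, while input slots carry "= value".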
input_string = "$answer$ ; $mcoptions$ ; $question$ = What is the color of a cloudy sky?"
input_ids = tokenizer.encode(input_string, return_tensors="pt")
output = model.generate(input_ids, max_length=200)
print(tokenizer.batch_decode(output, skip_special_tokens=True))
# ['$answer$ = gray ; $mcoptions$ = (A) blue (B) white (C) grey (D) black']
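Because Macaw is multi-angle, the same loaded model can be queried with other slot combinations. The snippet below is an illustrative sketch, reusing the tokenizer and model from above; the question text and the requested $explanation$ slot are not taken from this card, and the full list of supported angles is documented in the GitHub repository.
input_string = "$answer$ ; $explanation$ ; $question$ = Why do we see lightning before we hear thunder?"
input_ids = tokenizer.encode(input_string, return_tensors="pt")
output = model.generate(input_ids, max_length=200)
print(tokenizer.batch_decode(output, skip_special_tokens=True))
# Expected to produce filled-in $answer$ and $explanation$ slots in the same "$slot$ = value" format.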
BibTeX entry and citation info
@article{Tafjord2021Macaw,
  title={General-Purpose Question-Answering with {M}acaw},
  author={Oyvind Tafjord and Peter Clark},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.02593}
}