Dataset Card for "trivia_qa"
Dataset Summary
TriviaQA is a reading comprehension dataset containing over 650K question-answer-evidence triples. TriviaQA includes 95K question-answer pairs authored by trivia enthusiasts and independently gathered evidence documents, six per question on average, that provide high-quality distant supervision for answering the questions.
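A minimal loading sketch using the Hugging Face datasets library (the configuration names are listed under Data Instances below; streaming is optional and simply avoids the multi-gigabyte download when inspecting a few rows):

```python
from itertools import islice
from datasets import load_dataset

# Stream the "rc" configuration so that looking at a few rows does not
# require downloading the full archive (about 2.67 GB, see Data Instances).
ds = load_dataset("mandarjoshi/trivia_qa", "rc", split="train", streaming=True)

for example in islice(ds, 3):
    print(example["question"], "->", example["answer"]["value"])
```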
Supported Tasks and Leaderboards
Languages
English.
Dataset Structure
Data Instances
rc
- Size of downloaded dataset files: 2.67 GB
- Size of the generated dataset: 16.02 GB
- Total amount of disk used: 18.68 GB
An example of 'train' looks as follows.
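A heavily abridged sketch of one such instance, reconstructed from the dataset viewer preview; long context strings and anything hidden by truncation are elided with "..." or filled with clearly marked illustrative values:

```python
# Abridged reconstruction of one rc train row (question tc_3 in the viewer
# preview). "..." marks elided text; comments flag illustrative values.
example = {
    "question": "Where in England was Dame Judi Dench born?",
    "question_id": "tc_3",
    "question_source": "http://www.triviacountry.com/",
    "entity_pages": {
        "doc_source": ["TagMe", "TagMe"],
        "filename": ["England.txt", "Judi_Dench.txt"],
        "title": ["England", "Judi Dench"],
        "wiki_context": ["...", "..."],  # full Wikipedia article texts
    },
    "search_results": {
        "description": ["Judi Dench, Actress: Skyfall. Judi Dench was born in York, ...", "..."],
        "filename": ["...", "..."],
        "rank": [0, 1],  # illustrative
        "title": ["...", "..."],
        "url": ["...", "..."],
        "search_context": ["...", "..."],  # full web page texts
    },
    "answer": {
        "aliases": ["York, UK", "York UA", "Eoforwic", "..."],
        "normalized_aliases": ["york uk", "york ua", "eoforwic", "..."],
        "matched_wiki_entity_name": "...",
        "normalized_matched_wiki_entity_name": "...",
        "normalized_value": "york",
        "type": "WikipediaEntity",  # illustrative; the usual answer type
        "value": "York",
    },
}
```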
rc.nocontext
- Size of downloaded dataset files: 2.67 GB
- Size of the generated dataset: 126.27 MB
- Total amount of disk used: 2.79 GB
An example of 'train' looks as follows.
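Assuming the nocontext configurations keep the same schema but leave the evidence fields empty (consistent with the much smaller generated size of 126 MB versus 16 GB for rc), the same row would look roughly like this:

```python
# Same row as the rc example, under the assumption that rc.nocontext keeps
# identical fields with empty evidence containers. "..." marks elided text.
example = {
    "question": "Where in England was Dame Judi Dench born?",
    "question_id": "tc_3",
    "question_source": "http://www.triviacountry.com/",
    "entity_pages": {"doc_source": [], "filename": [], "title": [], "wiki_context": []},
    "search_results": {"description": [], "filename": [], "rank": [],
                       "title": [], "url": [], "search_context": []},
    "answer": {
        "aliases": ["York, UK", "York UA", "Eoforwic", "..."],
        "normalized_aliases": ["york uk", "york ua", "eoforwic", "..."],
        "matched_wiki_entity_name": "...",
        "normalized_matched_wiki_entity_name": "...",
        "normalized_value": "york",
        "type": "WikipediaEntity",
        "value": "York",
    },
}
```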
unfiltered
- Size of downloaded dataset files: 3.30 GB
- Size of the generated dataset: 29.24 GB
- Total amount of disk used: 32.54 GB
An example of 'validation' has the same shape as the rc example shown above.
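The unfiltered configurations differ in coverage rather than structure: they retain questions whose gathered evidence documents are not guaranteed to contain the answer, which makes them the setting intended for open-domain QA (see the TriviaQA paper).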
unfiltered.nocontext
- Size of downloaded dataset files: 632.55 MB
- Size of the generated dataset: 74.56 MB
- Total amount of disk used: 707.11 MB
An example of 'train' has the same shape as the rc.nocontext example shown above, with the evidence containers likewise left empty.
Data Fields
The data fields are the same among all splits.
rc
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
  - `doc_source`: a `string` feature.
  - `filename`: a `string` feature.
  - `title`: a `string` feature.
  - `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
  - `description`: a `string` feature.
  - `filename`: a `string` feature.
  - `rank`: an `int32` feature.
  - `title`: a `string` feature.
  - `url`: a `string` feature.
  - `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
rc.nocontext
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
  - `doc_source`: a `string` feature.
  - `filename`: a `string` feature.
  - `title`: a `string` feature.
  - `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
  - `description`: a `string` feature.
  - `filename`: a `string` feature.
  - `rank`: an `int32` feature.
  - `title`: a `string` feature.
  - `url`: a `string` feature.
  - `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
unfiltered
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
  - `doc_source`: a `string` feature.
  - `filename`: a `string` feature.
  - `title`: a `string` feature.
  - `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
  - `description`: a `string` feature.
  - `filename`: a `string` feature.
  - `rank`: an `int32` feature.
  - `title`: a `string` feature.
  - `url`: a `string` feature.
  - `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
unfiltered.nocontext
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
  - `doc_source`: a `string` feature.
  - `filename`: a `string` feature.
  - `title`: a `string` feature.
  - `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
  - `description`: a `string` feature.
  - `filename`: a `string` feature.
  - `rank`: an `int32` feature.
  - `title`: a `string` feature.
  - `url`: a `string` feature.
  - `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
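Note that in the loaded dataset the `aliases` through `value` fields above are nested under the `answer` dictionary. A quick way to confirm the schema, assuming the Hugging Face datasets library (rc.nocontext is the smallest configuration, so it is convenient for this check):

```python
from datasets import load_dataset

# Load the smallest configuration and inspect the declared features;
# the printed schema mirrors the listing above.
ds = load_dataset("mandarjoshi/trivia_qa", "rc.nocontext", split="validation")
print(ds.features)

row = ds[0]
print(row["question"])
# The alias/value fields sit under the "answer" dictionary.
print(row["answer"]["value"], row["answer"]["aliases"][:5])
```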
Data Splits
name | train | validation | test |
---|---|---|---|
rc | 138384 | 18669 | 17210 |
rc.nocontext | 138384 | 18669 | 17210 |
unfiltered | 87622 | 11313 | 10832 |
unfiltered.nocontext | 87622 | 11313 | 10832 |
Dataset Creation
Curation Rationale
Source Data
Initial Data Collection and Normalization
Who are the source language producers?
Annotations
Annotation process
Who are the annotators?
Personal and Sensitive Information
Considerations for Using the Data
Social Impact of Dataset
Discussion of Biases
Other Known Limitations
Additional Information
Dataset Curators
Licensing Information
The University of Washington does not own the copyright of the questions and documents included in TriviaQA.
Citation Information
@article{2017arXivtriviaqa,
  author = {{Joshi}, Mandar and {Choi}, Eunsol and {Weld}, Daniel and {Zettlemoyer}, Luke},
  title = "{TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension}",
  journal = {arXiv e-prints},
  year = 2017,
  eid = {arXiv:1705.03551},
  pages = {arXiv:1705.03551},
  archivePrefix = {arXiv},
  eprint = {1705.03551},
}
Contributions
Thanks to @thomwolf, @patrickvonplaten, @lewtun for adding this dataset.
Homepage: nlp.cs.washington.edu
Repository: github.com
Paper: TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension
Leaderboard: CodaLab Leaderboard