Dataset preview (truncated):

| tokens (sequence) | ner_tags (sequence) |
|---|---|
| ["Sunday", "the", "interview", "with", "Bob", "Shapiro", "."] | [15, 0, 0, 0, 1, 2, 0] |
| ["Newsnight", "returns", "to", "duo", "action", "tonight", "."] | [31, 0, 0, 0, 0, 17, 0] |
| ["The", "co-hosts", "are", "Kira", "Phillips", "and", "Rick", "Sanchez", "."] | [0, 0, 0, 1, 2, 0, 1, 2, 0] |
| ["and", "they", "are", "in", "Atlanta", "."] | [0, 0, 0, 0, 9, 0] |
| ["Well", "tonight", "a", "Wal", "-", "Mart", "controversy", "."] | [0, 17, 0, 7, 8, 8, 0, 0] |
| … | … |
Dataset Card for ontonotes_english
Dataset Summary
This is a preprocessed version of what I assume is OntoNotes v5.0.
Instead of sentences being stored in files, the files are unpacked and each sentence is now a row. Fields were also renamed to match conll2003.
The data comes from a private repository, which in turn got its data from another public repository whose location is unknown :)
Since the data in all of these repositories carried no license (the creator of the private repository told me so), there should be no licensing issues. But bear in mind that I give no guarantees that this is the real OntoNotes; it may differ from it.
Supported Tasks and Leaderboards
- Named Entity Recognition on Ontonotes v5 (English)
- Coreference Resolution on OntoNotes
- Semantic Role Labeling on OntoNotes
Languages
English
Dataset Structure
Data Instances
{
'tokens': ['Well', ',', 'the', 'Hundred', 'Regiments', 'Offensive', 'was', 'divided', 'into', 'three', 'phases', '.'],
'ner_tags': [0, 0, 29, 30, 30, 30, 0, 0, 0, 27, 0, 0]
}
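The integer `ner_tags` in the instance above index into the BIO label list given under Data Fields. A minimal sketch of decoding them back to label strings (the label list is copied verbatim from the card's tag set; `decode_tags` is a hypothetical helper, not part of the dataset tooling):

```python
# BIO label list, copied from the card's ClassLabel definition.
NER_LABELS = [
    "O", "B-PERSON", "I-PERSON", "B-NORP", "I-NORP", "B-FAC", "I-FAC",
    "B-ORG", "I-ORG", "B-GPE", "I-GPE", "B-LOC", "I-LOC",
    "B-PRODUCT", "I-PRODUCT", "B-DATE", "I-DATE", "B-TIME", "I-TIME",
    "B-PERCENT", "I-PERCENT", "B-MONEY", "I-MONEY",
    "B-QUANTITY", "I-QUANTITY", "B-ORDINAL", "I-ORDINAL",
    "B-CARDINAL", "I-CARDINAL", "B-EVENT", "I-EVENT",
    "B-WORK_OF_ART", "I-WORK_OF_ART", "B-LAW", "I-LAW",
    "B-LANGUAGE", "I-LANGUAGE",
]

def decode_tags(tag_ids):
    """Turn a list of integer tag ids into their BIO label strings."""
    return [NER_LABELS[i] for i in tag_ids]

example = {
    "tokens": ["Well", ",", "the", "Hundred", "Regiments", "Offensive",
               "was", "divided", "into", "three", "phases", "."],
    "ner_tags": [0, 0, 29, 30, 30, 30, 0, 0, 0, 27, 0, 0],
}
for token, label in zip(example["tokens"], decode_tags(example["ner_tags"])):
    print(token, label)
```

So "the Hundred Regiments Offensive" decodes to a B-EVENT/I-EVENT span and "three" to B-CARDINAL. (With the `datasets` library loaded, the equivalent mapping is available via the feature's `int2str`.)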
Data Fields
- tokens (`List[str]`): `words` in the original dataset.
- ner_tags (`List[ClassLabel]`): `named_entities` in the original dataset. The BIO tags for named entities in the sentence.
  - tag set: `datasets.ClassLabel(num_classes=37, names=["O", "B-PERSON", "I-PERSON", "B-NORP", "I-NORP", "B-FAC", "I-FAC", "B-ORG", "I-ORG", "B-GPE", "I-GPE", "B-LOC", "I-LOC", "B-PRODUCT", "I-PRODUCT", "B-DATE", "I-DATE", "B-TIME", "I-TIME", "B-PERCENT", "I-PERCENT", "B-MONEY", "I-MONEY", "B-QUANTITY", "I-QUANTITY", "B-ORDINAL", "I-ORDINAL", "B-CARDINAL", "I-CARDINAL", "B-EVENT", "I-EVENT", "B-WORK_OF_ART", "I-WORK_OF_ART", "B-LAW", "I-LAW", "B-LANGUAGE", "I-LANGUAGE"])`
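Because the tags follow the BIO scheme (B- opens an entity, I- continues it, O is outside), entity spans can be recovered with a small helper. A sketch under that assumption; `bio_to_spans` is a hypothetical name, not part of this dataset's tooling:

```python
def bio_to_spans(tokens, labels):
    """Group BIO-labelled tokens into (entity_text, entity_type) spans."""
    spans, current, etype = [], [], None
    for token, label in zip(tokens, labels):
        if label.startswith("B-"):
            # A B- tag always starts a new span, closing any open one.
            if current:
                spans.append((" ".join(current), etype))
            current, etype = [token], label[2:]
        elif label.startswith("I-") and current:
            current.append(token)
        else:
            # O tag (or stray I-) closes any open span.
            if current:
                spans.append((" ".join(current), etype))
            current, etype = [], None
    if current:
        spans.append((" ".join(current), etype))
    return spans

tokens = ["Well", ",", "the", "Hundred", "Regiments", "Offensive",
          "was", "divided", "into", "three", "phases", "."]
labels = ["O", "O", "B-EVENT", "I-EVENT", "I-EVENT", "I-EVENT",
          "O", "O", "O", "B-CARDINAL", "O", "O"]
print(bio_to_spans(tokens, labels))
# → [('the Hundred Regiments Offensive', 'EVENT'), ('three', 'CARDINAL')]
```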
Data Splits
train, validation, and test
Dataset Creation
Curation Rationale
[More Information Needed]
Source Data
Initial Data Collection and Normalization
[More Information Needed]
Who are the source language producers?
[More Information Needed]
Annotations
Annotation process
[More Information Needed]
Who are the annotators?
[More Information Needed]
Personal and Sensitive Information
[More Information Needed]
Considerations for Using the Data
Social Impact of Dataset
[More Information Needed]
Discussion of Biases
[More Information Needed]
Other Known Limitations
[More Information Needed]
Additional Information
Dataset Curators
[More Information Needed]
Licensing Information
No license
Citation Information
@inproceedings{pradhan-etal-2013-towards,
title = "Towards Robust Linguistic Analysis using {O}nto{N}otes",
author = {Pradhan, Sameer and
Moschitti, Alessandro and
Xue, Nianwen and
Ng, Hwee Tou and
Bj{\"o}rkelund, Anders and
Uryupina, Olga and
Zhang, Yuchen and
Zhong, Zhi},
booktitle = "Proceedings of the Seventeenth Conference on Computational Natural Language Learning",
month = aug,
year = "2013",
address = "Sofia, Bulgaria",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W13-3516",
pages = "143--152",
}
Contributions
Thanks to the author of the private repository who uploaded this dataset.