Can't load the dataset

#2
by WarmAI - opened

Can't load it with

from datasets import load_dataset
dataset = load_dataset("togethercomputer/RedPajama-Data-1T-Sample")
File .venv/lib/python3.10/site-packages/datasets/table.py:2110, in cast_array_to_feature(array, feature, allow_number_to_str)
   2109     return array_cast(array, feature(), allow_number_to_str=allow_number_to_str)
-> 2110 raise TypeError(f"Couldn't cast array of type\n{array.type}\nto\n{feature}")

TypeError: Couldn't cast array of type
struct<short_book_title: string, publication_date: int64, url: string, title: string>
to
{'timestamp': Value(dtype='timestamp[s]', id=None), 'yymm': Value(dtype='string', id=None), 'arxiv_id': Value(dtype='string', id=None), 'language': Value(dtype='string', id=None), 'url': Value(dtype='string', id=None)}
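(The mismatch appears to come from different RedPajama subsets carrying differently-shaped `meta` records: the books-style struct above vs. the arxiv-style schema it's being cast to. A minimal illustration of why one fixed schema can't hold both; the field values here are placeholders:)

```python
# meta records shaped like the two schemas in the traceback
# (values are placeholders, only the field names matter)
book_meta = {
    "short_book_title": "placeholder",
    "publication_date": 2020,
    "url": "placeholder",
    "title": "placeholder",
}
arxiv_meta = {
    "timestamp": "placeholder",
    "yymm": "placeholder",
    "arxiv_id": "placeholder",
    "language": "placeholder",
    "url": "placeholder",
}

# The field sets differ, so casting both to a single fixed struct fails.
print(set(book_meta) == set(arxiv_meta))  # -> False
```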

I unsuccessfully tried it with a few older versions of Datasets, and with Python 3.7, as well.

Thanks for the report, this should work now! You may need to pass force_redownload=True the first time:

from datasets import load_dataset
dataset = load_dataset("togethercomputer/RedPajama-Data-1T-Sample", force_redownload=True)

print(dataset["train"][0])

It seems force_redownload=True doesn't exist, and download_mode=DownloadMode.FORCE_REDOWNLOAD didn't work either, so I deleted the cache manually.
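For reference, deleting the cache by hand can be scripted; this is only a sketch (the cache root below is the default location, and the exact directory name for this dataset varies by datasets version, so inspect your own ~/.cache/huggingface/datasets first):

```python
import shutil
from pathlib import Path

def clear_cached_dataset(cache_root: Path, dataset_dir: str) -> bool:
    """Remove one cached dataset directory if it exists.

    Returns True if something was removed. On a default install the
    cache root is ~/.cache/huggingface/datasets; the dataset_dir name
    must be read from your own cache, since it varies across versions.
    """
    target = cache_root / dataset_dir
    if target.is_dir():
        shutil.rmtree(target)
        return True
    return False

# Example (directory name is an assumption; check your cache first):
# clear_cached_dataset(Path.home() / ".cache" / "huggingface" / "datasets",
#                      "<this dataset's cache dir>")
```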
Now I get a different error:

KeyError                                  Traceback (most recent call last)
File .venv/lib/python3.10/site-packages/datasets/builder.py:1610, in GeneratorBasedBuilder._prepare_split_single(self, gen_kwargs, fpath, file_format, max_shard_size, split_info, check_duplicate_keys, job_id)
  1609 _time = time.time()
-> 1610 for key, record in generator:
  1611     if max_shard_size is not None and writer._num_bytes > max_shard_size:

File ~/.cache/huggingface/modules/datasets_modules/datasets/togethercomputer--RedPajama-Data-1T-Sample/ad13e6c92c3498589dcba4783636203a0eea84adbf1e4b01f7e82eb5f3db0e3d/RedPajama-Data-1T-Sample.py:105, in RedPajama1TSample._generate_examples(self, filepaths)
   102 else:
   103     yield key, {
   104         "text": data["text"],
--> 105         "meta": data["meta"],
   106     }
   107 key += 1

KeyError: 'meta'
File .venv/lib/python3.10/site-packages/datasets/builder.py:1646, in GeneratorBasedBuilder._prepare_split_single(self, gen_kwargs, fpath, file_format, max_shard_size, split_info, check_duplicate_keys, job_id)
   1644     if isinstance(e, SchemaInferenceError) and e.__context__ is not None:
   1645         e = e.__context__
-> 1646     raise DatasetGenerationError("An error occurred while generating the dataset") from e
   1648 yield job_id, True, (total_num_examples, total_num_bytes, writer._features, num_shards, shard_lengths)

DatasetGenerationError: An error occurred while generating the dataset
Generating train split: 366259 examples [00:33, 24646.12 examples/s]
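For what it's worth, the KeyError suggests some records in the sample lack a meta field entirely. A defensive version of the script's _generate_examples (a sketch only; the actual fix pushed to the repo may differ) would fall back to an empty dict:

```python
import json

def generate_examples(lines):
    """Yield (key, example) pairs, tolerating records without a 'meta' field.

    Sketch of a defensive loader; it mirrors the shape of the RedPajama
    script's _generate_examples but is not the repo's actual code.
    """
    key = 0
    for line in lines:
        data = json.loads(line)
        yield key, {
            "text": data["text"],
            # data.get avoids the KeyError when 'meta' is missing
            "meta": data.get("meta", {}),
        }
        key += 1

rows = [
    '{"text": "a", "meta": {"url": "x"}}',
    '{"text": "b"}',  # no 'meta' key at all
]
examples = dict(generate_examples(rows))
print(examples[1]["meta"])  # -> {}
```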

I'm also seeing the KeyError: 'meta' error on the "meta": data["meta"] line.

Together org

Can you try again now with this:

from datasets import load_dataset

ds = load_dataset("togethercomputer/RedPajama-Data-1T-Sample", download_mode="force_redownload")

Thank you, it works now.

WarmAI changed discussion status to closed
