Dataset Viewer
The dataset viewer is not available for this split.
Cannot load the dataset split (in normal download mode) to extract the first rows.
Error code: NormalRowsError
Exception: DatasetGenerationError
Message: An error occurred while generating the dataset
Traceback:
Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 323, in compute
    compute_first_rows_from_parquet_response(
  File "/src/services/worker/src/worker/job_runners/split/first_rows.py", line 88, in compute_first_rows_from_parquet_response
    rows_index = indexer.get_rows_index(
  File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 631, in get_rows_index
    return RowsIndex(
  File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 512, in __init__
    self.parquet_index = self._init_parquet_index(
  File "/src/libs/libcommon/src/libcommon/parquet_utils.py", line 529, in _init_parquet_index
    response = get_previous_step_or_raise(
  File "/src/libs/libcommon/src/libcommon/simple_cache.py", line 566, in get_previous_step_or_raise
    raise CachedArtifactError(
libcommon.simple_cache.CachedArtifactError: The previous step failed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/utils.py", line 92, in get_rows_or_raise
    return get_rows(
  File "/src/libs/libcommon/src/libcommon/utils.py", line 183, in decorator
    return func(*args, **kwargs)
  File "/src/services/worker/src/worker/utils.py", line 69, in get_rows
    rows_plus_one = list(itertools.islice(ds, rows_max_number + 1))
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 1389, in __iter__
    for key, example in ex_iterable:
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/iterable_dataset.py", line 282, in __iter__
    for key, pa_table in self.generate_tables_fn(**self.kwargs):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 86, in _generate_tables
    parquet_file = pq.ParquetFile(f)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 341, in __init__
    self.reader.open(
  File "pyarrow/_parquet.pyx", line 1262, in pyarrow._parquet.ParquetReader.open
  File "pyarrow/types.pxi", line 88, in pyarrow.lib._datatype_to_pep3118
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow_hotfix/__init__.py", line 47, in __arrow_ext_deserialize__
    raise RuntimeError(
RuntimeError: Disallowed deserialization of 'arrow.py_extension_type':
storage_type = list<item: list<item: float>>
serialized = b'\x80\x04\x95L\x00\x00\x00\x00\x00\x00\x00\x8c\x1adatasets.features.features\x94\x8c\x14Array2DExtensionType\x94\x93\x94M\x88\x13K\x0c\x86\x94\x8c\x07float32\x94\x86\x94R\x94.'
pickle disassembly:
    0: \x80 PROTO      4
    2: \x95 FRAME      76
   11: \x8c SHORT_BINUNICODE 'datasets.features.features'
   39: \x94 MEMOIZE    (as 0)
   40: \x8c SHORT_BINUNICODE 'Array2DExtensionType'
   62: \x94 MEMOIZE    (as 1)
   63: \x93 STACK_GLOBAL
   64: \x94 MEMOIZE    (as 2)
   65: M    BININT2    5000
   68: K    BININT1    12
   70: \x86 TUPLE2
   71: \x94 MEMOIZE    (as 3)
   72: \x8c SHORT_BINUNICODE 'float32'
   81: \x94 MEMOIZE    (as 4)
   82: \x86 TUPLE2
   83: \x94 MEMOIZE    (as 5)
   84: R    REDUCE
   85: \x94 MEMOIZE    (as 6)
   86: .    STOP
highest protocol among opcodes = 4

Reading of untrusted Parquet or Feather files with a PyExtensionType column allows arbitrary code execution.
If you trust this file, you can enable reading the extension type by one of:
- upgrading to pyarrow >= 14.0.1, and call `pa.PyExtensionType.set_auto_load(True)`
- disable this error by running `import pyarrow_hotfix; pyarrow_hotfix.uninstall()`
We strongly recommend updating your Parquet/Feather files to use extension types derived from `pyarrow.ExtensionType` instead, and register this type explicitly. See https://arrow.apache.org/docs/dev/python/extending_types.html#defining-extension-types-user-defined-types for more details.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1995, in _prepare_split_single
    for _, table in generator:
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 86, in _generate_tables
    parquet_file = pq.ParquetFile(f)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 341, in __init__
    self.reader.open(
  File "pyarrow/_parquet.pyx", line 1262, in pyarrow._parquet.ParquetReader.open
  File "pyarrow/types.pxi", line 88, in pyarrow.lib._datatype_to_pep3118
  File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow_hotfix/__init__.py", line 47, in __arrow_ext_deserialize__
    raise RuntimeError(
RuntimeError: Disallowed deserialization of 'arrow.py_extension_type':
storage_type = list<item: list<item: float>>
serialized = b'\x80\x04\x95L\x00\x00\x00\x00\x00\x00\x00\x8c\x1adatasets.features.features\x94\x8c\x14Array2DExtensionType\x94\x93\x94M\x88\x13K\x0c\x86\x94\x8c\x07float32\x94\x86\x94R\x94.'
pickle disassembly:
    0: \x80 PROTO      4
    2: \x95 FRAME      76
   11: \x8c SHORT_BINUNICODE 'datasets.features.features'
   39: \x94 MEMOIZE    (as 0)
   40: \x8c SHORT_BINUNICODE 'Array2DExtensionType'
   62: \x94 MEMOIZE    (as 1)
   63: \x93 STACK_GLOBAL
   64: \x94 MEMOIZE    (as 2)
   65: M    BININT2    5000
   68: K    BININT1    12
   70: \x86 TUPLE2
   71: \x94 MEMOIZE    (as 3)
   72: \x8c SHORT_BINUNICODE 'float32'
   81: \x94 MEMOIZE    (as 4)
   82: \x86 TUPLE2
   83: \x94 MEMOIZE    (as 5)
   84: R    REDUCE
   85: \x94 MEMOIZE    (as 6)
   86: .    STOP
highest protocol among opcodes = 4

Reading of untrusted Parquet or Feather files with a PyExtensionType column allows arbitrary code execution.
If you trust this file, you can enable reading the extension type by one of:
- upgrading to pyarrow >= 14.0.1, and call `pa.PyExtensionType.set_auto_load(True)`
- disable this error by running `import pyarrow_hotfix; pyarrow_hotfix.uninstall()`
We strongly recommend updating your Parquet/Feather files to use extension types derived from `pyarrow.ExtensionType` instead, and register this type explicitly. See https://arrow.apache.org/docs/dev/python/extending_types.html#defining-extension-types-user-defined-types for more details.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/utils.py", line 120, in get_rows_or_raise
    return get_rows(
  File "/src/libs/libcommon/src/libcommon/utils.py", line 183, in decorator
    return func(*args, **kwargs)
  File "/src/services/worker/src/worker/utils.py", line 53, in get_rows
    ds = load_dataset(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/load.py", line 2609, in load_dataset
    builder_instance.download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1027, in download_and_prepare
    self._download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1122, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1882, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2038, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
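For local use, the opt-in workarounds quoted in the error message translate into a few lines of Python. This is a minimal sketch, assuming pyarrow >= 14.0.1 and the pyarrow_hotfix package are installed; the repository id is a placeholder, and you should only opt in for files you trust, since `arrow.py_extension_type` columns are reconstructed via pickle.

```python
# Minimal sketch of the opt-in workarounds named in the error above.
# Only apply them to Parquet files you trust.

import pyarrow as pa

# Workaround 1 (pyarrow >= 14.0.1): re-enable automatic loading of
# pickle-based Python extension types such as Array2DExtensionType.
pa.PyExtensionType.set_auto_load(True)

# Workaround 2 (alternative): remove the pyarrow_hotfix guard entirely.
# import pyarrow_hotfix
# pyarrow_hotfix.uninstall()

# With either opt-in applied, the split should load again, e.g.:
from datasets import load_dataset

ds = load_dataset("username/dataset-name", split="train")  # hypothetical repo id
print(ds[0])
```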
Need help to make the dataset viewer work? Make sure to review how to configure the dataset viewer, and open a discussion for direct support.
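The longer-term fix the error message recommends is to regenerate the Parquet files so the 2D-array column uses an extension type derived from `pyarrow.ExtensionType` rather than the deprecated pickle-based one. One possible path is sketched below; it assumes the original files are trusted, that the repository id is a placeholder, and that a recent `datasets` release writes `Array2D` features without `PyExtensionType` when the data is pushed again.

```python
# Hypothetical sketch: rewrite the dataset so its Parquet shards no longer
# carry the deprecated pickle-based extension type. Assumes a recent
# `datasets` release and that you trust the original files (see opt-in above).

import pyarrow as pa
from datasets import load_dataset

pa.PyExtensionType.set_auto_load(True)  # trust opt-in, pyarrow >= 14.0.1

ds = load_dataset("username/dataset-name", split="train")  # hypothetical repo id
ds.push_to_hub("username/dataset-name")  # re-uploads freshly written Parquet shards
```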
No dataset card yet
Downloads last month: 9