Error when loading the dataset
I used the code given on the dataset card page:

from datasets import load_dataset
dataset = load_dataset("HuggingFaceM4/WebSight")

and I get this error:
Downloading and preparing dataset None/v0.2 to file:///data/data1/syc/huggingface-dataset/HuggingFaceM4___parquet/v0.2-9c4500244f23d791/0.0.0/2a3b91fbd88a2c90d1dbbb32b460cf621d31bd5b05b934492fdef7d8d6f236ec...
Downloading data files: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 246.16it/s]
Extracting data files: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 47.23it/s]
Traceback (most recent call last):
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/builder.py", line 1873, in _prepare_split_single
writer.write_table(table)
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/arrow_writer.py", line 568, in write_table
pa_table = table_cast(pa_table, self._schema)
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/table.py", line 2290, in table_cast
return cast_table_to_schema(table, schema)
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/table.py", line 2248, in cast_table_to_schema
raise ValueError(f"Couldn't cast\n{table.schema}\nto\n{features}\nbecause column names don't match")
ValueError: Couldn't cast
image: struct<bytes: binary, path: string>
  child 0, bytes: binary
  child 1, path: string
text: string
-- schema metadata --
huggingface: '{"info": {"features": {"image": {"_type": "Image"}, "text":' + 41
to
{'image': Image(decode=True, id=None), 'text': Value(dtype='string', id=None), 'llm_generated_idea': Value(dtype='string', id=None)}
because column names don't match
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "", line 1, in
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/load.py", line 1797, in load_dataset
builder_instance.download_and_prepare(
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/builder.py", line 890, in download_and_prepare
self._download_and_prepare(
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/builder.py", line 985, in _download_and_prepare
self._prepare_split(split_generator, **prepare_split_kwargs)
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/builder.py", line 1746, in _prepare_split
for job_id, done, content in self._prepare_split_single(
File "/home/syc/anaconda3/envs/air/lib/python3.9/site-packages/datasets/builder.py", line 1891, in _prepare_split_single
raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.builder.DatasetGenerationError: An error occurred while generating the dataset
How can I fix this so I can use your dataset?
Can you try specifying the config name?
With dataset = load_dataset("HuggingFaceM4/WebSight", "v0.2")
or dataset = load_dataset("HuggingFaceM4/WebSight", name="v0.2")
or dataset = load_dataset("HuggingFaceM4/WebSight", config_name="v0.2")
(pick the one that works)?
I imagine it's because version v0.2 has one more column than version v0.1.
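For reference, a minimal sketch (not from the original thread) for checking which columns the v0.2 config is expected to have, without downloading the data, using the standard load_dataset_builder helper from the datasets library:

from datasets import load_dataset_builder

# Inspect the declared features of the v0.2 config before downloading anything.
builder = load_dataset_builder("HuggingFaceM4/WebSight", "v0.2")
print(builder.info.features)
# Per the error above, the expected columns are: image, text, llm_generated_idea.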
I've just tried your code with my local cache, like this:
dataset = load_dataset("HuggingFaceM4/WebSight", "v0.2", cache_dir="/data/data1/syc/huggingface-dataset/")
but I got the same error.
I removed my local cache and am re-downloading the dataset with your code; it will take some time.
I had the same thought based on the error message at the beginning, but I noticed the log says:
"Downloading and preparing dataset None/v0.2 to file:///data/data1/syc/huggingface-dataset/HuggingFaceM4___parquet/v0.2-9c4500244f23d791/0.0.0/2a3b91fbd88a2c90d1dbbb32b460cf621d31bd5b05b934492fdef7d8d6f236ec..."
So I think I loaded the right version but still got the above error, and I'm not sure whether your code is going to work...
I got the same error after the re-download with this code:
dataset = load_dataset("HuggingFaceM4/WebSight", "v0.2", cache_dir="/data/data1/syc/huggingface-dataset/")
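As a side note, instead of deleting the cache by hand, load_dataset also accepts a download_mode argument; a minimal sketch with the same path used above:

from datasets import load_dataset

# Force a clean re-download and rebuild, ignoring any cached copy.
dataset = load_dataset(
    "HuggingFaceM4/WebSight",
    "v0.2",
    cache_dir="/data/data1/syc/huggingface-dataset/",
    download_mode="force_redownload",
)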
Hi @9211sun,
I just (downloaded and) created the dataset from scratch with load_dataset("HuggingFaceM4/WebSight", "v0.2") and it worked fine.
What is your HF datasets library version? I am using 2.14.7. Perhaps upgrading would help?
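A quick way to check the installed version and upgrade (standard pip/datasets usage, not specific to this thread):

import datasets
print(datasets.__version__)  # e.g. 2.14.7

# then, from the shell:
# pip install -U datasets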
I am facing the same issue. I am using hf datasets library 3.1.0.
This is the code I am using:
import os
from datasets import load_dataset
current_dir = os.path.dirname(os.path.abspath(__file__))
ds = load_dataset("HuggingFaceM4/WebSight", "v0.2", cache_dir=os.path.join(current_dir, "cache_v2"))
save_path = os.path.join(current_dir, "saved_dataset")
ds.save_to_disk(save_path)
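As a small follow-up sketch (assuming the save above succeeds), the saved copy can be reloaded later with load_from_disk:

from datasets import load_from_disk

# Reload the dataset that was written with save_to_disk.
ds = load_from_disk(save_path)
print(ds)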