

UniIR: Training and Benchmarking Universal Multimodal Information Retrievers (ECCV 2024)

🌐 Homepage | 🤗 Model (UniIR Checkpoints) | 🤗 Paper | 📖 arXiv | GitHub

How to download the M-BEIR Dataset

🔔 News

  • 🔥 [2023-12-21]: Our M-BEIR Benchmark is now available for use.

Dataset Summary

M-BEIR, the Multimodal BEnchmark for Instructed Retrieval, is a comprehensive large-scale retrieval benchmark designed to train and evaluate unified multimodal retrieval models (UniIR models). The M-BEIR benchmark comprises eight multimodal retrieval tasks and ten datasets from a variety of domains and sources. Each task is accompanied by human-authored instructions, encompassing 1.5 million queries and a pool of 5.6 million retrieval candidates in total.

Dataset Structure Overview

The M-BEIR dataset is structured into five primary components: Query Data, Candidate Pool, Instructions, Qrels, and Images.

Query Data

Below is the directory structure for the query data:

query/
│
├── train/
│   ├── mbeir_cirr_train.jsonl
│   ├── mbeir_edis_train.jsonl
│   ...
├── union_train/
│   └── mbeir_union_up_train.jsonl
├── val/
│   ├── mbeir_visualnews_task0_val.jsonl
│   ├── mbeir_visualnews_task3_val.jsonl
│   ...
└── test/
    ├── mbeir_visualnews_task0_test.jsonl
    ├── mbeir_visualnews_task3_test.jsonl
    ...

train: Contains all the training data from 8 different datasets formatted in the M-BEIR style.

mbeir_union_up_train.jsonl: This file is the default training data for in-batch contrastive training, specifically designed for UniIR models. It aggregates all the data from the train directory, and datasets with relatively small sizes have been upsampled to balance the training process.

val: Contains separate files for validation queries, organized by task.

test: Contains separate files for test queries, organized by task.

Every M-BEIR query instance has at least one positive candidate and may have no negative candidates. Each line in a Query Data file represents a unique query. The structure of each query JSON object is as follows:

{
  "qid": "A unique identifier formatted as {dataset_id}:{query_id}",
  "query_txt": "The text component of the query",
  "query_img_path": "The file path to the associated query image",
  "query_modality": "The modality type of the query (text, image or text,image)",
  "query_src_content": "Additional content from the original dataset, presented as a string by json.dumps()",
  "pos_cand_list": [
    {
      "did": "A unique identifier formatted as {dataset_id}:{doc_id}"
    }
    // ... more positive candidates
  ],
  "neg_cand_list": [
    {
      "did": "A unique identifier formatted as {dataset_id}:{doc_id}"
    }
    // ... more negative candidates
  ]
}
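As an illustration, a single line of a query file can be parsed with the standard json module. The record below is a hypothetical example following the schema above, not a line from the actual dataset:

```python
import json

# Hypothetical query line following the documented schema (not real data).
line = (
    '{"qid": "8:101", "query_txt": "a dog on a beach", '
    '"query_img_path": "mbeir_images/cirr_images/train/49/train-1287-0-img0.jpg", '
    '"query_modality": "text,image", '
    '"query_src_content": "{\\"id\\": \\"train-1287-0\\"}", '
    '"pos_cand_list": [{"did": "8:1"}], "neg_cand_list": []}'
)

query = json.loads(line)
print(query["qid"])                              # formatted as dataset_id:query_id
print([c["did"] for c in query["pos_cand_list"]])
```

Note that query_src_content is itself a JSON string (produced by json.dumps), so it must be decoded with a second json.loads call if you need its fields.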

Candidate Pool

The Candidate Pool contains potential matching documents for the queries.

M-BEIR_5.6M

Within the global directory, the default retrieval setting requires models to retrieve positive candidates from a heterogeneous pool encompassing various modalities and domains. The M-BEIR's global candidate pool, comprising 5.6 million candidates, includes the retrieval corpus from all tasks and datasets.

M-BEIR_local

Within the local directory, we provide dataset-task-specific pools as M-BEIR_local. Each dataset-task-specific pool contains homogeneous candidates that originate from the original dataset.

Below is the directory structure for the candidate pool:

cand_pool/
│
├── global/
│   ├── mbeir_union_val_cand_pool.jsonl
│   └── mbeir_union_test_cand_pool.jsonl
│
└── local/
    ├── mbeir_visualnews_task0_cand_pool.jsonl
    ├── mbeir_visualnews_task3_cand_pool.jsonl
    ...

The structure of each candidate JSON object in a cand_pool file is as follows:

{
  "did": "A unique identifier for the document, formatted as {dataset_id}:{doc_id}",
  "txt": "The text content of the candidate document",
  "img_path": "The file path to the candidate document's image",
  "modality": "The modality type of the candidate (e.g., text, image or text,image)",
  "src_content": "Additional content from the original dataset, presented as a string by json.dumps()"
}
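A candidate line can be decoded the same way; again, src_content holds a nested JSON string that needs a second decoding pass. The record below is a hypothetical example following the schema above:

```python
import json

# Hypothetical candidate line following the documented schema (not real data).
line = (
    '{"did": "8:1", "txt": null, '
    '"img_path": "mbeir_images/cirr_images/train/49/train-1287-0-img0.jpg", '
    '"modality": "image", '
    '"src_content": "{\\"id\\": \\"train-1287-0-img0\\"}"}'
)

cand = json.loads(line)
# src_content was serialized with json.dumps, so decode it again to get a dict.
src = json.loads(cand["src_content"])
print(cand["did"], cand["modality"], src["id"])
```

For image-only candidates like this one, txt is null (None in Python), and img_path is relative to the dataset root.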

Instructions

query_instructions.tsv contains the human-authored instructions used within the UniIR framework; each task is accompanied by four of them. For detailed usage, please refer to the GitHub Repo.

Qrels

Within the qrels directory, you will find qrels for both the validation and test sets. These files are used to evaluate UniIR models. For detailed information, please refer to the GitHub Repo.
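The exact column layout of the qrels files is not documented here; assuming the common TREC-style four-column format (query-id, iteration, doc-id, relevance), a hypothetical parser might look like this (check the actual files before relying on it):

```python
def load_qrels(path):
    """Parse a TREC-style qrels file into {qid: {did: relevance}}.

    Assumes four whitespace-separated columns: qid, iteration, did, relevance.
    This layout is an assumption -- inspect the files in qrels/ to confirm.
    """
    qrels = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.split()
            if len(parts) != 4:
                continue  # skip blank or malformed lines
            qid, _iteration, did, rel = parts
            qrels.setdefault(qid, {})[did] = int(rel)
    return qrels
```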

How to Use

Downloading the M-BEIR Dataset

Step 1: Install Git Large File Storage (LFS)

Before you begin, ensure that Git LFS is installed on your system. Git LFS is essential for handling large data files. If you do not have Git LFS installed, follow these steps:

Download and install Git LFS from the official website. After installation, run the following command in your terminal to initialize Git LFS:

git lfs install

Step 2: Clone the M-BEIR Dataset Repository

Once Git LFS is set up, you can clone the M-BEIR repository from this page. Open your terminal and execute the following command:

git clone https://huggingface.co/datasets/TIGER-Lab/M-BEIR

Please note that the M-BEIR dataset is quite large, and downloading it can take several hours depending on your internet connection. During this time the terminal may show little activity and appear stuck; as long as there is no error message, the download is still in progress.

Decompressing M-BEIR Images

After downloading, you will need to decompress the image files. Follow these steps in your terminal:

# Navigate to the M-BEIR directory
cd path/to/M-BEIR

# Combine the split tar.gz files into one
cat mbeir_images.tar.gz.part-00 mbeir_images.tar.gz.part-01 mbeir_images.tar.gz.part-02 mbeir_images.tar.gz.part-03 > mbeir_images.tar.gz

# Extract the images from the tar.gz file
tar -xzf mbeir_images.tar.gz

Now, you are ready to use the M-BEIR benchmark.
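Once the images are extracted, the relative img_path values in the .jsonl files can be resolved against the M-BEIR root. A minimal sketch (the helper names are assumptions, not part of the official dataloader):

```python
import json
from pathlib import Path

def iter_records(jsonl_path):
    """Yield one JSON object per non-empty line of an M-BEIR .jsonl file."""
    with open(jsonl_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

def resolve_image(mbeir_root, img_path):
    """Join a relative img_path (e.g. mbeir_images/...) onto the dataset root."""
    return Path(mbeir_root) / img_path

# Example usage (paths assume the clone and extraction steps above):
# for q in iter_records("path/to/M-BEIR/query/train/mbeir_cirr_train.jsonl"):
#     if q["query_img_path"]:  # may be null for text-only queries
#         img = resolve_image("path/to/M-BEIR", q["query_img_path"])
```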

Dataloader and Evaluation Pipeline

We offer a dedicated dataloader and evaluation pipeline for the M-BEIR benchmark. Please refer to GitHub Repo for detailed information.

Citation

Please cite our paper if you use our data, model or code.

@article{wei2023uniir,
  title={UniIR: Training and Benchmarking Universal Multimodal Information Retrievers},
  author={Wei, Cong and Chen, Yang and Chen, Haonan and Hu, Hexiang and Zhang, Ge and Fu, Jie and Ritter, Alan and Chen, Wenhu},
  journal={arXiv preprint arXiv:2311.17136},
  year={2023}
}