---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: filepath
    dtype: string
  - name: sentids
    list: int32
  - name: filename
    dtype: string
  - name: imgid
    dtype: int32
  - name: split
    dtype: string
  - name: sentences_tokens
    list:
      list: string
  - name: sentences_raw
    list: string
  - name: sentences_sentid
    list: int32
  - name: cocoid
    dtype: int32
  - name: th_sentences_raw
    sequence: string
  splits:
  - name: test
    num_bytes: 819234726.0
    num_examples: 5000
  - name: validation
    num_bytes: 807387321.0
    num_examples: 5000
  - name: train
    num_bytes: 18882795327.165
    num_examples: 113287
  download_size: 20158273111
  dataset_size: 20509417374.165
---
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("patomp/thai-mscoco-2014-captions")
dataset
```
Output:
```python
DatasetDict({
    train: Dataset({
        features: ['image', 'filepath', 'sentids', 'filename', 'imgid', 'split', 'sentences_tokens', 'sentences_raw', 'sentences_sentid', 'cocoid', 'th_sentences_raw'],
        num_rows: 113287
    })
    validation: Dataset({
        features: ['image', 'filepath', 'sentids', 'filename', 'imgid', 'split', 'sentences_tokens', 'sentences_raw', 'sentences_sentid', 'cocoid', 'th_sentences_raw'],
        num_rows: 5000
    })
    test: Dataset({
        features: ['image', 'filepath', 'sentids', 'filename', 'imgid', 'split', 'sentences_tokens', 'sentences_raw', 'sentences_sentid', 'cocoid', 'th_sentences_raw'],
        num_rows: 5000
    })
})
```
A sample:
```python
dataset["validation"][0]
```
Output:
```python
{
    "image": <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=500x336 at 0x7F6C5A83F430>,
    "filepath": "COCO_val2014_000000184613.jpg",
    "sentids": [474921, 479322, 479334, 481560, 483594],
    "filename": "COCO_val2014_000000184613.jpg",
    "imgid": 2,
    "split": "val",
    "sentences_tokens": [
        ["a", "child", "holding", "a", "flowered", "umbrella", "and", "petting", "a", "yak"],
        ["a", "young", "man", "holding", "an", "umbrella", "next", "to", "a", "herd", "of", "cattle"],
        ["a", "young", "boy", "barefoot", "holding", "an", "umbrella", "touching", "the", "horn", "of", "a", "cow"],
        ["a", "young", "boy", "with", "an", "umbrella", "who", "is", "touching", "the", "horn", "of", "a", "cow"],
        ["a", "boy", "holding", "an", "umbrella", "while", "standing", "next", "to", "livestock"]
    ],
    "sentences_raw": [
        "A child holding a flowered umbrella and petting a yak.",
        "A young man holding an umbrella next to a herd of cattle.",
        "a young boy barefoot holding an umbrella touching the horn of a cow",
        "A young boy with an umbrella who is touching the horn of a cow.",
        "A boy holding an umbrella while standing next to livestock."
    ],
    "sentences_sentid": [474921, 479322, 479334, 481560, 483594],
    "cocoid": 184613,
    "th_sentences_raw": [
        "เด็กถือร่มที่มีดอกหนึ่งคันและลูบคลูบลํา",
        "ชายหนุ่มคนหนึ่งถือร่มไว้ข้างๆ ฝูงวัว",
        "เด็กหนุ่มคนหนึ่งเท้าเปล่าจับร่มจับแตรของวัว",
        "เด็กชายที่มีร่มสัมผัสแตรของวัว",
        "เด็กชายถือร่มในขณะที่ยืนถัดจากปศุสัตว์"
    ]
}
```
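Each record pairs one image with five English reference captions and their Thai translations. Below is a minimal sketch of flattening a split into one `(image, Thai caption)` pair per row, e.g. for retrieval or CLIP-style training. The flattening logic is illustrative only; the column names (`image`, `th_sentences_raw`) come from the schema above.
```python
from datasets import load_dataset

dataset = load_dataset("patomp/thai-mscoco-2014-captions")

def to_pairs(batch):
    # Expand each image into one row per Thai caption.
    images, captions = [], []
    for image, thai_caps in zip(batch["image"], batch["th_sentences_raw"]):
        images.extend([image] * len(thai_caps))
        captions.extend(thai_caps)
    return {"image": images, "caption": captions}

# batched=True lets map() return more rows than it received.
pairs = dataset["validation"].map(
    to_pairs,
    batched=True,
    remove_columns=dataset["validation"].column_names,
)
print(pairs[0]["caption"])  # a single Thai caption paired with its image
```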
## Dataset Construction
The dataset was constructed by translating the captions of the [MS COCO 2014 dataset](https://huggingface.co/datasets/HuggingFaceM4/COCO) [1] into Thai using the [NMT model](https://airesearch.in.th/releases/machine-translation-models/) released by the VISTEC-depa Thailand Artificial Intelligence Research Institute [2]. All three splits (train, validation, and test) of the translated dataset are published on the [Hugging Face Hub](https://huggingface.co/datasets/patomp/thai-mscoco-2014-captions).
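The translation step can be sketched as below. This is only an illustration of the general procedure: the dataset itself was built with the VISTEC-depa English-to-Thai model, whereas the sketch uses a generic multilingual translation model (`facebook/nllb-200-distilled-600M`) as a stand-in assumption, not the model actually used.
```python
from transformers import pipeline

# Stand-in en->th translator for illustration; substitute the VISTEC-depa
# English-to-Thai NMT model to reproduce the dataset's actual captions.
translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="eng_Latn",
    tgt_lang="tha_Thai",
)

# English reference captions (taken from the sample above).
english_captions = [
    "A child holding a flowered umbrella and petting a yak.",
    "A boy holding an umbrella while standing next to livestock.",
]

thai_captions = [out["translation_text"] for out in translator(english_captions)]
print(thai_captions)
```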
## References
[1] Tsung-Yi Lin, Michael Maire, Serge Belongie, James Hays, Pietro Perona, Deva Ramanan, Piotr Dollár, and C. Lawrence Zitnick. 2014. Microsoft COCO: Common Objects in Context. In Computer Vision – ECCV 2014, Springer International Publishing, Cham, 740–755.
[2] English-Thai Machine Translation Models. (2020, June 23). VISTEC-depa Thailand Artificial Intelligence Research Institute. https://airesearch.in.th/releases/machine-translation-models/