---
task_categories:
  - translation
  - automatic-speech-recognition
language:
  - gl
  - en
size_categories:
  - 1K<n<10K
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
      - split: validation
        path: data/validation-*
      - split: test
        path: data/test-*
dataset_info:
  features:
    - name: id
      dtype: int64
    - name: audio
      dtype:
        audio:
          sampling_rate: 16000
    - name: text_gl
      dtype: string
    - name: text_en
      dtype: string
  splits:
    - name: train
      num_bytes: 1867365673.628
      num_examples: 2742
    - name: validation
      num_bytes: 336601848
      num_examples: 496
    - name: test
      num_bytes: 143321367
      num_examples: 212
  download_size: 2338654742
  dataset_size: 2347288888.6280003
---
## Dataset Details

FLEURS-SpeechT-GL-EN is a Galician-to-English dataset for the speech translation task.

It has been compiled from Google's FLEURS dataset and contains ~10h11m of Galician audio along with its text transcriptions and the corresponding English translations.
### Preprocessing

This dataset is based on Google's FLEURS speech dataset, obtained by aligning the English and Galician data. The alignment process follows [Yasmin Moslem's FLEURS dataset processing script](https://github.com/ymoslem/Speech/blob/main/FLEURS-GA-EN.ipynb).
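The alignment can be reproduced roughly as follows. This is a minimal sketch, not the exact preprocessing notebook: it assumes the `google/fleurs` configs `gl_es` (Galician) and `en_us` (English) and joins examples on the shared FLEURS `id` field.

```python
from datasets import load_dataset

# Load the Galician (audio + transcription) and English (translation text) FLEURS configs.
# Config names "gl_es" and "en_us" are taken from the public google/fleurs dataset card.
gl = load_dataset("google/fleurs", "gl_es", split="train")
en = load_dataset("google/fleurs", "en_us", split="train")

# Map each FLEURS sentence id to its English text so Galician examples can be joined on it.
en_text_by_id = dict(zip(en["id"], en["transcription"]))

def attach_translation(example):
    # Keep the Galician audio/transcription and attach the English sentence with the same id.
    example["text_gl"] = example["transcription"]
    example["text_en"] = en_text_by_id.get(example["id"])
    return example

aligned = gl.map(attach_translation)
# Drop Galician utterances that have no English counterpart in this split.
aligned = aligned.filter(lambda ex: ex["text_en"] is not None)
```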
### English translation quality

To get a sense of the quality of the English translations with respect to the Galician transcriptions, a quality estimation (QE) model was applied (a scoring sketch follows the list below):
- QE model: Unbabel/wmt23-cometkiwi-da-xl
- Average QE score: 0.76
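A minimal sketch of how such scores can be computed with the `unbabel-comet` package; the batch size, hardware settings, and dataset repo id below are placeholders, not the settings actually used for this card:

```python
from comet import download_model, load_from_checkpoint
from datasets import load_dataset

# Download and load the reference-free QE model
# (gated on the Hub, so accepting its license and logging in is required).
model_path = download_model("Unbabel/wmt23-cometkiwi-da-xl")
model = load_from_checkpoint(model_path)

# Replace "<namespace>/FLEURS-SpeechT-GL-EN" with this dataset's actual repo id.
ds = load_dataset("<namespace>/FLEURS-SpeechT-GL-EN", split="train")

# CometKiwi is reference-free: it only needs the source (Galician) and the translation (English).
qe_inputs = [{"src": ex["text_gl"], "mt": ex["text_en"]} for ex in ds]

prediction = model.predict(qe_inputs, batch_size=8, gpus=1)
print(prediction.system_score)  # average QE score over the split
```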
## Dataset Structure

```python
DatasetDict({
    train: Dataset({
        features: ['id', 'audio', 'text_gl', 'text_en'],
        num_rows: 2742
    })
    validation: Dataset({
        features: ['id', 'audio', 'text_gl', 'text_en'],
        num_rows: 496
    })
    test: Dataset({
        features: ['id', 'audio', 'text_gl', 'text_en'],
        num_rows: 212
    })
})
```
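The splits can be loaded and inspected with the `datasets` library. The repo id below is a placeholder, to be replaced with this dataset's actual id on the Hub:

```python
from datasets import load_dataset

# Replace "<namespace>/FLEURS-SpeechT-GL-EN" with this dataset's actual repo id.
ds = load_dataset("<namespace>/FLEURS-SpeechT-GL-EN")

sample = ds["train"][0]
print(sample["text_gl"])   # Galician transcription
print(sample["text_en"])   # English translation
audio = sample["audio"]    # decoded audio: {"array": ..., "sampling_rate": 16000, ...}
print(audio["sampling_rate"], len(audio["array"]))
```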
## Citation

```bibtex
@article{fleurs2022arxiv,
  title   = {FLEURS: Few-shot Learning Evaluation of Universal Representations of Speech},
  author  = {Conneau, Alexis and Ma, Min and Khanuja, Simran and Zhang, Yu and Axelrod, Vera and Dalmia, Siddharth and Riesa, Jason and Rivera, Clara and Bapna, Ankur},
  journal = {arXiv preprint arXiv:2205.12446},
  url     = {https://arxiv.org/abs/2205.12446},
  year    = {2022}
}
```

Yasmin Moslem's preprocessing script: https://github.com/ymoslem/Speech/blob/main/FLEURS-GA-EN.ipynb
## Dataset Card Contact
Juan Julián Cea Morán (jjceamoran@gmail.com)