---
license: mit
task_categories:
- feature-extraction
- table-question-answering
- text2text-generation
size_categories:
- 100K<n<1M
language:
- en
pretty_name: SemTabNet
tags:
- information-extraction
- table-understanding
- climate
- ESG
---
# Dataset Card for SemTabNet
This dataset accompanies the following [paper](https://arxiv.org/abs/2406.19102):
```
Title: Statements: Universal Information Extraction from Tables with Large Language Models for ESG KPIs
Authors: Lokesh Mishra, Sohayl Dhibi, Yusik Kim, Cesar Berrospi Ramis, Shubham Gupta, Michele Dolfi, Peter Staar
Venue: Accepted at the NLP4Climate workshop at the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024)
```
In this paper, we propose **STATEMENTS** as a new knowledge model for storing quantitative information in a domain-agnostic, uniform structure. The task of converting a raw input (table or text) to statements is called Statement Extraction (SE). Statement extraction falls under the category of universal information extraction.
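To make the idea concrete, here is a minimal, illustrative sketch of turning a small table into uniform statement records. The exact statement schema used by SemTabNet may differ; the keys below (`subject`, `property`, `value`) are assumptions chosen only to show the domain-agnostic, uniform structure.

```python
# Illustrative sketch of statement extraction (SE) from a simple table.
# NOTE: the record keys below are assumptions for illustration, not the
# exact schema defined in the SemTabNet paper.

def table_to_statements(header_row, data_rows, subject_column=0):
    """Flatten a simple table into a uniform list of statement records."""
    statements = []
    for row in data_rows:
        subject = row[subject_column]
        for col, value in enumerate(row):
            if col == subject_column:
                continue
            statements.append({
                "subject": subject,           # entity the fact is about
                "property": header_row[col],  # column header as the property
                "value": value,               # cell content
            })
    return statements

# Example: a tiny ESG-style table.
header = ["Company", "Scope 1 emissions", "Year"]
rows = [["ACME Corp", "120 ktCO2e", "2022"]]
print(table_to_statements(header, rows))
```

Every cell becomes one self-contained record, so downstream consumers need no knowledge of the original table layout.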
- **Code Repository:** [SemTabNet repository](https://github.com/DS4SD/SemTabNet)
- **Arxiv Paper:** [Statements: Universal Information Extraction from Tables with Large Language Models for ESG KPIs](https://arxiv.org/abs/2406.19102)
- **Point of Contact:** [IBM Research DeepSearch Team](https://ds4sd.github.io)
### Data Splits
This dataset supports three tasks. The data for each task is split into training, validation, and test sets. Additionally, we provide the original annotations of the raw tables, from which all other data is constructed.
| Task | Train | Test | Valid |
| ---- | ----- | ---- | ----- |
| SE Direct | 103455 | 11682 | 5445 |
| SE Indirect 1D | 72580 | 8489 | 3821 |
| SE Indirect 2D | 93153 | 22839 | 4903 |
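The table above gives absolute counts per split; a small sketch to compute the totals and approximate split proportions per task:

```python
# Split sizes copied from the table above; compute per-task totals and
# the approximate train/test/valid proportions.

splits = {
    "SE Direct":      {"train": 103455, "test": 11682, "valid": 5445},
    "SE Indirect 1D": {"train": 72580,  "test": 8489,  "valid": 3821},
    "SE Indirect 2D": {"train": 93153,  "test": 22839, "valid": 4903},
}

totals = {task: sum(counts.values()) for task, counts in splits.items()}

for task, counts in splits.items():
    shares = {name: round(n / totals[task], 3) for name, n in counts.items()}
    print(task, totals[task], shares)
```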
### Languages
The text in the dataset is in English.
### Source and Annotations
The source of this dataset and the annotation strategy is described in the paper.
### Citation Information
Arxiv: [https://arxiv.org/abs/2406.19102](https://arxiv.org/abs/2406.19102)