---
license: apache-2.0
---

# Circular-based Relation Probing Evaluation (CRPE)

CRPE is a benchmark designed to quantitatively evaluate the object recognition and relation comprehension abilities of models. The evaluation is formulated as single-choice questions.

The benchmark consists of four splits: Existence, Subject, Predicate, and Object.

The Existence split evaluates object recognition, while the remaining splits evaluate relation comprehension by separately probing each element of the subject-predicate-object triplets in the scene graph. Some data examples are shown below.

![Data examples](crpe.jpg)

For a robust evaluation, we adopt CircularEval as our evaluation strategy. Under this setting, a question is considered correctly answered only when the model consistently predicts the correct answer in each of the N iterations, where N is the number of choices. In each iteration, a circular shift is applied to both the choices and the answer to form a new query for the model.
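The CircularEval procedure above can be sketched as follows. This is a minimal illustration, not the benchmark's actual scoring code; `model_answer_fn` is a hypothetical callable assumed to take a question and a list of choices and return the index of the model's chosen option.

```python
def circular_eval(model_answer_fn, question, choices, answer_idx):
    """CircularEval sketch: the question counts as correct only if the
    model picks the right choice under every circular shift of the
    options (N shifts for N choices)."""
    n = len(choices)
    for shift in range(n):
        # Rotate the choice list by `shift`; the index of the correct
        # answer moves by the same amount in the opposite direction.
        shifted_choices = choices[shift:] + choices[:shift]
        shifted_answer = (answer_idx - shift) % n
        if model_answer_fn(question, shifted_choices) != shifted_answer:
            return False  # one wrong iteration fails the whole question
    return True
```

A model that merely prefers a fixed option position (e.g. always answers "A") fails under this scheme, since the correct answer occupies every position across the N shifted queries.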

See our paper for more details!