Understanding Attribution Explanations

Attribution explanations highlight specific parts of a table (rows, columns, or individual cells) that are most relevant to the answer predicted by a TableQA model. These explanations help you understand which parts of the input the model considered important when making its prediction.
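
Concretely, such an explanation can be thought of as a set of relevance scores attached to row indices and (row, column) cell positions. The Python sketch below is purely illustrative: the AttributionExplanation class, its field names, and the scores are assumptions made for this walkthrough, not the output format of any particular TableQA system.

```python
from dataclasses import dataclass, field

@dataclass
class AttributionExplanation:
    """Illustrative container for a table attribution explanation.

    row_scores:  relevance of whole rows, keyed by row index
    cell_scores: relevance of individual cells, keyed by (row, column name)
    """
    row_scores: dict = field(default_factory=dict)   # e.g. {2: 0.91, 4: 0.88}
    cell_scores: dict = field(default_factory=dict)  # e.g. {(2, "Opponents"): 0.95}

    def top_rows(self, k=4):
        """Return the k most relevant row indices, highest score first."""
        return sorted(self.row_scores, key=self.row_scores.get, reverse=True)[:k]

# Hypothetical scores for the example that follows: rows whose "Opponents"
# value is 0 receive high relevance, the lone loss receives low relevance.
explanation = AttributionExplanation(
    row_scores={2: 0.91, 4: 0.88, 5: 0.86, 9: 0.84, 1: 0.10},
    cell_scores={(2, "Opponents"): 0.95, (4, "Opponents"): 0.93,
                 (5, "Opponents"): 0.92, (9, "Opponents"): 0.90},
)
print(explanation.top_rows())  # -> [2, 4, 5, 9]
```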

Table caption: 1947 Kentucky Wildcats Football Team

Statement to verify: "The Wildcats kept the opposing team scoreless in 4 games."

| Game | Date | Opponent | Result | Wildcats Points | Opponents | Record |
|------|------------|---------------|--------|-----------------|-----------|------------|
| 1 | 9999-09-20 | Ole Miss | Loss | 7 | 14 | 0 - 1 |
| 2 | 9999-09-27 | Cincinnati | Win | 20 | 0 | 1 - 1 |
| 4 | 9999-10-11 | 9 Georgia | Win | 26 | 0 | 3 - 1 , 20 |
| 5 | 9999-10-18 | 10 Vanderbilt | Win | 14 | 0 | 4 - 1 , 14 |
| 9 | 9999-11-15 | Evansville | Win | 36 | 0 | 7 - 2 |

In this example, the TableQA model has highlighted specific rows and cells to explain its prediction.

These highlights indicate that the system identified four games in which the opposing team did not score, so the statement is verified as SUPPORTED. The yellow highlighting marks the relevant rows, while the green highlighting marks the cells containing the fine-grained information needed to verify the statement.
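
The row-level evidence in this example can be reproduced with a simple filter: selecting the games where the opponents' score is 0 yields exactly the four highlighted rows. The pandas sketch below only illustrates that check on the table above (the Date and Record columns are omitted); it is not the model's actual attribution mechanism.

```python
import pandas as pd

# The example table, restricted to the columns needed for the check.
games = pd.DataFrame(
    {
        "Game": [1, 2, 4, 5, 9],
        "Opponent": ["Ole Miss", "Cincinnati", "9 Georgia", "10 Vanderbilt", "Evansville"],
        "Result": ["Loss", "Win", "Win", "Win", "Win"],
        "Wildcats Points": [7, 20, 26, 14, 36],
        "Opponents": [14, 0, 0, 0, 0],
    }
)

# Rows where the opposing team was held scoreless: the yellow-highlighted rows.
shutouts = games[games["Opponents"] == 0]
print(shutouts[["Game", "Opponent"]])
print("Number of shutouts:", len(shutouts))  # 4, so the statement is SUPPORTED
```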

By using different colors for highlighting, the system provides a more nuanced explanation: yellow indicates which rows are relevant as a whole, while green pinpoints the individual cells that carry the decisive values.

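To make the color coding concrete, a renderer only needs to know which rows carry the coarse (yellow) highlight and which cells carry the fine-grained (green) one. The helper below is a hypothetical rendering sketch, assuming an HTML table and made-up color codes; it is not part of the experiment interface.

```python
def render_cell(value, row_idx, col_name, row_highlights, cell_highlights):
    """Wrap a cell value in a background color based on its highlight level.

    Hypothetical helper: green (cell-level) takes precedence over yellow (row-level).
    """
    if (row_idx, col_name) in cell_highlights:
        color = "#b6f0b6"   # green: fine-grained evidence
    elif row_idx in row_highlights:
        color = "#fff3a3"   # yellow: relevant row
    else:
        color = "#ffffff"   # no highlight
    return f'<td style="background-color:{color}">{value}</td>'

# Example: the "Opponents" cell of game 2 is green, the rest of that row yellow.
print(render_cell(0, 2, "Opponents", {2, 4, 5, 9}, {(2, "Opponents")}))
print(render_cell("Cincinnati", 2, "Opponent", {2, 4, 5, 9}, {(2, "Opponents")}))
```
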
During the experiment, you will use these explanations to guess the model's prediction. Your task is to look at the provided explanation and guess how the model classifies the statement (SUPPORTED or NOT SUPPORTED).