---
license: cc-by-4.0
---

# SPT-ABSA

We further pre-train BERT-base via sentiment-enhanced pre-training (SPT).

- Title: An Empirical Study of Sentiment-Enhanced Pre-Training for Aspect-Based Sentiment Analysis
- Authors: Yice Zhang, Yifan Yang, Bin Liang, Shiwei Chen, Bing Qin, and Ruifeng Xu
- Venue: Findings of ACL 2023 (Long Paper)

GitHub Repository: https://github.com/HITSZ-HLT/SPT-ABSA
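
The released checkpoints are standard BERT-base encoders, so they should load with the usual Hugging Face `transformers` Auto classes. The snippet below is a minimal usage sketch (encoding only; fine-tuning on a downstream ABSA task is up to you):

```python
# Minimal sketch: load the sentiment-enhanced checkpoint as a BERT encoder.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("zhang-yice/spt-absa-bert-400k")
model = AutoModel.from_pretrained("zhang-yice/spt-absa-bert-400k")

# Encode a review sentence and inspect the contextual representations.
inputs = tokenizer("The battery life is great, but the screen is dim.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768) for BERT-base
```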

## What Did We Do?

Aspect-Based Sentiment Analysis (ABSA) is an important problem in sentiment analysis. Its goal is to recognize opinions and sentiments towards specific aspects in user-generated content. Many research efforts leverage pre-training techniques to learn sentiment-aware representations and achieve significant gains on various ABSA tasks. We conduct an empirical study of sentiment-enhanced pre-training (SPT) for ABSA to systematically investigate and analyze the effectiveness of these existing approaches.

We mainly concentrate on the following questions:

- (a) What impact do different types of sentiment knowledge have on downstream ABSA tasks?
- (b) Which knowledge integration method is most effective?
- (c) Does injecting non-sentiment-specific linguistic knowledge (e.g., part-of-speech tags and syntactic relations) into pre-training have positive impacts?

Based on our experimental investigation of these questions, we obtain a powerful sentiment-enhanced pre-trained model. It is released in two versions, zhang-yice/spt-absa-bert-400k and zhang-yice/spt-absa-bert-10k, both of which integrate three types of knowledge (a conceptual sketch of how such objectives could be combined follows this list):

- aspect words: masking aspect words and predicting them from their context.
- review rating score: rating prediction.
- syntax knowledge:
  - part-of-speech tags,
  - dependency direction,
  - dependency distance.
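
The sketch below illustrates one way these three kinds of pre-training objectives could sit on top of a BERT-base encoder as separate heads with a joint loss. It is a hypothetical illustration, not the authors' implementation: the base checkpoint name, head designs, label sizes, and equal loss weighting are all assumptions.

```python
import torch.nn as nn
from transformers import BertModel

class SPTPretrainingHeads(nn.Module):
    """Hypothetical multi-task heads for the three SPT knowledge types."""
    def __init__(self, model_name="bert-base-uncased",
                 num_pos_tags=17, num_rating_classes=5, max_dep_distance=8):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        vocab = self.bert.config.vocab_size
        # (1) aspect words: MLM-style head that predicts masked aspect tokens.
        self.aspect_mlm_head = nn.Linear(hidden, vocab)
        # (2) review rating score: sentence-level rating prediction from [CLS].
        self.rating_head = nn.Linear(hidden, num_rating_classes)
        # (3) syntax knowledge: token-level POS tags and dependency-distance buckets
        #     (dependency direction could get a similar token-level head).
        self.pos_head = nn.Linear(hidden, num_pos_tags)
        self.dep_dist_head = nn.Linear(hidden, max_dep_distance)

    def forward(self, input_ids, attention_mask,
                aspect_mlm_labels, rating_labels, pos_labels, dep_dist_labels):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        seq, cls = out.last_hidden_state, out.pooler_output
        ce = nn.CrossEntropyLoss(ignore_index=-100)  # -100 masks unlabeled positions
        loss_aspect = ce(self.aspect_mlm_head(seq).transpose(1, 2), aspect_mlm_labels)
        loss_rating = ce(self.rating_head(cls), rating_labels)
        loss_pos = ce(self.pos_head(seq).transpose(1, 2), pos_labels)
        loss_dep = ce(self.dep_dist_head(seq).transpose(1, 2), dep_dist_labels)
        # Equal weighting is an assumption; how to combine the objectives is
        # exactly the kind of question the paper studies empirically.
        return loss_aspect + loss_rating + loss_pos + loss_dep
```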

## Experimental Results

*(Experimental result figures; see the GitHub repository for details.)*