zhibinlu committed
Commit 3783fc5 · verified · 1 parent: 275095c

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -14,7 +14,7 @@ This model is a VGCN-BERT model based on [DistilBert-base-uncased](https://huggi
 
 > Much progress has been made recently on text classification with methods based on neural networks. In particular, models using attention mechanisms such as BERT have been shown to have the capability of capturing the contextual information within a sentence or document. However, their ability to capture the global information about the vocabulary of a language is more limited. The latter is the strength of Graph Convolutional Networks (GCN). In this paper, we propose the VGCN-BERT model, which combines the capability of BERT with a Vocabulary Graph Convolutional Network (VGCN). Local information and global information interact through different layers of BERT, allowing them to influence each other and to jointly build a final representation for classification. In our experiments on several text classification datasets, our approach outperforms BERT and GCN alone, and achieves higher effectiveness than that reported in previous studies.
 
-The original implementation is in my GitLab [vgcn-bert repo](https://github.com/Louis-udm/VGCN-BERT), but I recently updated the algorithm and implemented this new version for integration into HuggingFace Transformers. The new version has the following improvements:
+The original implementation is in my GitHub [vgcn-bert repo](https://github.com/Louis-udm/VGCN-BERT), but I recently updated the algorithm and implemented this new version for integration into HuggingFace Transformers. The new version has the following improvements:
 - Greatly speeds up the computation of the vocabulary graph convolutional network embedding (or word-graph embedding). Taking CoLA as an example, the new model increases training time by only 11% compared with the base model.
 - Updated subgraph selection algorithm.
 - Currently uses DistilBERT as the base model, but it is easy to migrate to other models.
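
Since the updated version is meant to plug into HuggingFace Transformers, a minimal usage sketch could look like the following. The checkpoint id `zhibinlu/vgcn-bert-distilbert-base-uncased` and the `trust_remote_code=True` loading path are assumptions for illustration; this commit does not itself confirm them.

```python
# Minimal sketch of loading a VGCN-BERT checkpoint from the Hub.
# Assumptions (not confirmed by this README): the checkpoint id below,
# and that the custom VGCN-BERT modeling code ships with the repo and
# is pulled in via trust_remote_code=True.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "zhibinlu/vgcn-bert-distilbert-base-uncased"  # hypothetical id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("VGCN-BERT fuses local and global context.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Assuming a DistilBERT-style BaseModelOutput:
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```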