Raj-Sanjay-Shah committed
Commit ece6598 · 1 Parent(s): af3dc20

Update README.md

Files changed (1)
  1. README.md +24 -4
README.md CHANGED
@@ -7,7 +7,7 @@ widget:
   - text: "Stocks rallied and the British pound <mask>."
 ---
 
-##FLANG
+## FLANG
 FLANG is a set of large language models for Financial LANGuage tasks. These models use domain-specific pre-training with preferential masking to build more robust representations for the domain. The models in the set are:\
 [FLANG-BERT](https://huggingface.co/SALT-NLP/FLANG-BERT)\
 [FLANG-SpanBERT](https://huggingface.co/SALT-NLP/FLANG-SpanBERT)\
@@ -15,11 +15,31 @@
 [FLANG-Roberta](https://huggingface.co/SALT-NLP/FLANG-Roberta)\
 [Flang-ELECTRA](https://huggingface.co/SALT-NLP/FLANG-ELECTRA)
 
-##FLANG-Roberta
+## FLANG-Roberta
 FLANG-Roberta is a pre-trained language model that uses financial keywords and phrases for preferential masking of domain-specific terms. It is built by further training the RoBERTa language model on finance-domain text, and it improves over previous models through the use of domain knowledge and vocabulary.
 
-Contact information
-
+## Citation
+Please cite the model with the following BibTeX:
+```bibtex
+@INPROCEEDINGS{shah-etal-2022-flang,
+    author = {Shah, Raj Sanjay and
+              Chawla, Kunal and
+              Eidnani, Dheeraj and
+              Shah, Agam and
+              Du, Wendi and
+              Chava, Sudheer and
+              Raman, Natraj and
+              Smiley, Charese and
+              Chen, Jiaao and
+              Yang, Diyi},
+    title = {When FLUE Meets FLANG: Benchmarks and Large Pretrained Language Model for Financial Domain},
+    booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
+    year = {2022},
+    publisher = {Association for Computational Linguistics}
+}
+```
+
+## Contact information
 Please contact Raj Sanjay Shah (rajsanjayshah[at]gatech[dot]edu) or Sudheer Chava (schava6[at]gatech[dot]edu) or Diyi Yang (diyiy[at]stanford[dot]edu) about any FLANG-Roberta-related issues and questions.
 
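As a quick sanity check on the updated card, here is a minimal sketch of querying FLANG-Roberta's masked-language head. It assumes the standard Hugging Face `transformers` fill-mask pipeline (the pipeline call is our illustration, not part of the commit) and reuses the widget prompt from the README:

```python
# Minimal sketch, assuming the standard `transformers` fill-mask pipeline.
# Model ID taken from the card: https://huggingface.co/SALT-NLP/FLANG-Roberta
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="SALT-NLP/FLANG-Roberta")

# Widget example from the README; <mask> is RoBERTa's mask token.
for pred in fill_mask("Stocks rallied and the British pound <mask>."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```

Each prediction is a dict containing the filled-in token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).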