Raj-Sanjay-Shah committed on
Commit 1edbf29 · 1 Parent(s): c19819b

Update README.md

Files changed (1): README.md (+13 −10)
README.md CHANGED
````diff
@@ -6,7 +6,8 @@ tags:
 widget:
 - text: "Stocks rallied and the British pound [MASK]."
 ---
-#FLANG
+
+## FLANG
 FLANG is a set of large language models for Financial LANGuage tasks. These models use domain specific pre-training with preferential masking to build more robust representations for the domain. The models in the set are:\
 [FLANG-BERT](https://huggingface.co/SALT-NLP/FLANG-BERT)\
 [FLANG-SpanBERT](https://huggingface.co/SALT-NLP/FLANG-SpanBERT)\
@@ -14,12 +15,13 @@ FLANG is a set of large language models for Financial LANGuage tasks. These mode
 [FLANG-Roberta](https://huggingface.co/SALT-NLP/FLANG-Roberta)\
 [Flang-ELECTRA](https://huggingface.co/SALT-NLP/FLANG-ELECTRA)
 
-#FLANG-BERT
+## FLANG-BERT
 FLANG-BERT is a pre-trained language model which uses financial keywords and phrases for preferential masking of domain specific terms. It is built by further training the BERT language model in the finance domain with improved performance over previous models due to the use of domain knowledge and vocabulary.
 
-#Citation
+## Citation
 Please cite the model with the following citation:\
-@INPROCEEDINGS{shah-etal-2022-flang,\
+```bibtex
+@INPROCEEDINGS{shah-etal-2022-flang,
 author = {Shah, Raj Sanjay and
 Chawla, Kunal and
 Eidnani, Dheeraj and
@@ -29,14 +31,15 @@ Please cite the model with the following citation:\
 Raman, Natraj and
 Smiley, Charese and
 Chen, Jiaao and
-Yang, Diyi },\
-title = {When FLUE Meets FLANG: Benchmarks and Large Pretrained Language Model for Financial Domain},\
-booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP)},\
-year = {2022},\
-publisher = {Association for Computational Linguistics}\
+Yang, Diyi },
+title = {When FLUE Meets FLANG: Benchmarks and Large Pretrained Language Model for Financial Domain},
+booktitle = {Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
+year = {2022},
+publisher = {Association for Computational Linguistics}
 }
+```
 
-#Contact information
+## Contact information
 
 Please contact Raj Sanjay Shah (rajsanjayshah[at]gatech[dot]edu) or Sudheer Chava (schava6[at]gatech[dot]edu) or Diyi Yang (diyiy[at]stanford[dot]edu) about any FLANG-BERT related issues and questions.
 
````
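The `widget` entry in the README front matter defines the example prompt shown on the model page. As a minimal sketch (not part of the commit itself, and assuming `transformers` and `torch` are installed), the same prompt can be run locally through the standard `fill-mask` pipeline:

```python
# Sketch: query FLANG-BERT with the widget prompt from the README metadata.
# Requires network access to download the model from the Hugging Face Hub.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="SALT-NLP/FLANG-BERT")
predictions = fill_mask("Stocks rallied and the British pound [MASK].")

# Each prediction is a dict with 'token_str', 'score', and 'sequence' keys.
for p in predictions:
    print(f"{p['token_str']!r}: {p['score']:.4f}")
```

`[MASK]` is the mask token for BERT-style tokenizers; RoBERTa-based variants such as FLANG-Roberta use `<mask>` instead.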