Raj-Sanjay-Shah committed · Commit c19819b · Parent(s): 9be731a

Update README.md

README.md CHANGED
@@ -6,7 +6,7 @@ tags:
 widget:
 - text: "Stocks rallied and the British pound [MASK]."
 ---
-
+# FLANG
 FLANG is a set of large language models for Financial LANGuage tasks. These models use domain-specific pre-training with preferential masking to build more robust representations for the domain. The models in the set are:\
 [FLANG-BERT](https://huggingface.co/SALT-NLP/FLANG-BERT)\
 [FLANG-SpanBERT](https://huggingface.co/SALT-NLP/FLANG-SpanBERT)\
@@ -14,10 +14,10 @@ FLANG is a set of large language models for Financial LANGuage tasks. These mode
 [FLANG-Roberta](https://huggingface.co/SALT-NLP/FLANG-Roberta)\
 [FLANG-ELECTRA](https://huggingface.co/SALT-NLP/FLANG-ELECTRA)
 
-
+# FLANG-BERT
 FLANG-BERT is a pre-trained language model that uses financial keywords and phrases for preferential masking of domain-specific terms. It is built by further training the BERT language model on finance-domain text, and it improves over previous models through its use of domain knowledge and vocabulary.
 
-
+# Citation
 Please cite the model with the following citation:\
 @INPROCEEDINGS{shah-etal-2022-flang,\
 author = {Shah, Raj Sanjay and
@@ -36,7 +36,7 @@ Please cite the model with the following citation:\
 publisher = {Association for Computational Linguistics}\
 }
 
-
+# Contact information
 
 Please contact Raj Sanjay Shah (rajsanjayshah[at]gatech[dot]edu) or Sudheer Chava (schava6[at]gatech[dot]edu) or Diyi Yang (diyiy[at]stanford[dot]edu) about any FLANG-BERT-related issues and questions.
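Since FLANG-BERT is a BERT-style masked language model and the card's widget runs a fill-mask query, a minimal usage sketch follows. It is not part of the commit above; it assumes the checkpoint works with the standard `transformers` fill-mask pipeline, mirroring the widget example from the front matter.

```python
# Minimal sketch (assumption: SALT-NLP/FLANG-BERT is compatible with the
# standard Hugging Face fill-mask pipeline, as the card's widget suggests).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="SALT-NLP/FLANG-BERT")

# Same prompt as the widget; [MASK] is the BERT-style masked token.
for prediction in fill_mask("Stocks rallied and the British pound [MASK]."):
    print(f"{prediction['token_str']!r}  score={prediction['score']:.3f}")
```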
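The card also frames the FLANG models as providers of more robust domain representations. A second sketch, again not from the commit and assuming the standard `AutoModel` API, shows how those contextual representations might be extracted for downstream financial tasks.

```python
# Minimal sketch (assumption: the checkpoint loads via the generic
# AutoTokenizer/AutoModel classes): extract token-level representations.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("SALT-NLP/FLANG-BERT")
model = AutoModel.from_pretrained("SALT-NLP/FLANG-BERT")

inputs = tokenizer("Stocks rallied and the British pound gained.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One vector per token: shape (batch, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```

The other checkpoints listed above (for example SALT-NLP/FLANG-SpanBERT or SALT-NLP/FLANG-Roberta) would presumably load the same way by swapping the model id, since AutoModel resolves the architecture from each repository's config.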