CultureBERT committed
Commit 8b29758 · Parent(s): edacb3e
Update README.md
README.md CHANGED
@@ -14,6 +14,8 @@ The model assigns one of four possible labels:
 For details on the model and its performance, see Koch and Pasch (2022). Please cite this article when using the model: <br />
 Koch, Sebastian; Pasch, Stefan (2022): CultureBERT: Fine-Tuning Transformer-Based Language Models for Corporate Culture. Available online at http://arxiv.org/abs/2212.00509.
 
+Please see the following tutorial on how to apply CultureBERT to measure corporate culture in your own text documents: https://github.com/Stefan-Pasch/CultureBERT
+
 Other References:
 
 [1] Liu, Y.; Ott, M.; Goyal, N.; Du, J.; Joshi, M.; Chen, D.; ... & Stoyanov, V. (2019). RoBERTa: A robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692.
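The tutorial added in this commit walks through applying CultureBERT to your own text documents. As a minimal sketch of what that typically looks like with the Hugging Face `transformers` pipeline (the checkpoint name below is an illustrative assumption; see the model card and the linked tutorial for the actual checkpoint names and label meanings):

```python
# Minimal sketch: classifying text documents with a fine-tuned CultureBERT
# checkpoint via the Hugging Face transformers text-classification pipeline.
# The model ID below is a hypothetical example; consult the model card and
# https://github.com/Stefan-Pasch/CultureBERT for the exact checkpoint names.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="CultureBERT/roberta-large-clan",  # hypothetical checkpoint name
)

documents = [
    "Management genuinely cares about employees and their development.",
    "Everything here is about hitting quarterly targets, nothing else.",
]

# Each result is a dict containing the predicted label and its score.
for doc, result in zip(documents, classifier(documents)):
    print(f"{result['label']} ({result['score']:.2f}): {doc}")
```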