---
license: mit
language:
- en
---

# Introduction

Emot5-large is part of the [EmoLLMs](https://github.com/lzw108/EmoLLMs) project, the first open-source large language model (LLM) series for comprehensive affective analysis with instruction-following capability.
This model is fine-tuned from the T5-large foundation model on the full AAID instruction tuning data.
It can be used for affective classification tasks (e.g. sentiment polarity or categorical emotions) and regression tasks (e.g. sentiment strength or emotion intensity).

# Ethical Consideration

Recent studies have indicated that LLMs may introduce potential bias, such as gender gaps. Incorrect predictions and over-generalization also illustrate the potential risks of current LLMs. Therefore, many challenges remain in applying the model to real-world affective analysis systems.

## Models in EmoLLMs

The EmoLLMs series comprises Emollama-7b, Emollama-chat-7b, Emollama-chat-13b, Emoopt-13b, Emobloom-7b, Emot5-large, and Emobart-large.

- **Emollama-7b**: fine-tuned from LLaMA2-7B.
- **Emollama-chat-7b**: fine-tuned from LLaMA2-chat-7B.
- **Emollama-chat-13b**: fine-tuned from LLaMA2-chat-13B.
- **Emoopt-13b**: fine-tuned from OPT-13B.
- **Emobloom-7b**: fine-tuned from Bloomz-7b1-mt.
- **Emot5-large**: fine-tuned from T5-large.
- **Emobart-large**: fine-tuned from Bart-large.

All models are trained on the full AAID instruction tuning data.

## Usage

You can use the Emot5-large model in your Python project with the Hugging Face Transformers library. Here is a simple example of how to load the model:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained('lzw1008/Emot5-large')
model = AutoModelForSeq2SeqLM.from_pretrained('lzw1008/Emot5-large', device_map='auto')
```

In this example, AutoTokenizer loads the tokenizer and AutoModelForSeq2SeqLM loads the model. The `device_map='auto'` argument automatically places the model on a GPU when one is available.
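
Once loaded, inference works like any other Transformers seq2seq model. The sketch below runs the emotion intensity prompt from the "Prompt examples" section; the generation settings (e.g. `max_new_tokens`) are illustrative choices, not values prescribed by the EmoLLMs authors:

```python
# Inference sketch using the tokenizer and model loaded above. The prompt follows
# the emotion intensity template from "Prompt examples"; the generation settings
# are illustrative defaults, not the authors' official configuration.
prompt = (
    "Task: Assign a numerical value between 0 (least E) and 1 (most E) to represent "
    "the intensity of emotion E expressed in the text.\n"
    "Text: @CScheiwiller can't stop smiling 😆😆😆\n"
    "Emotion: joy\n"
    "Intensity Score:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))  # e.g. 0.896
```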

## Prompt examples

### Emotion intensity

Task: Assign a numerical value between 0 (least E) and 1 (most E) to represent the intensity of emotion E expressed in the text.
Text: @CScheiwiller can't stop smiling 😆😆😆
Emotion: joy
Intensity Score:

>>0.896

### Sentiment strength

Task: Evaluate the valence intensity of the writer's mental state based on the text, assigning it a real-valued score from 0 (most negative) to 1 (most positive).
Text: Happy Birthday shorty. Stay fine stay breezy stay wavy @daviistuart 😘
Intensity Score:

>>0.879

### Sentiment classification

Task: Categorize the text into an ordinal class that best characterizes the writer's mental state, considering various degrees of positive and negative sentiment intensity. 3: very positive mental state can be inferred. 2: moderately positive mental state can be inferred. 1: slightly positive mental state can be inferred. 0: neutral or mixed mental state can be inferred. -1: slightly negative mental state can be inferred. -2: moderately negative mental state can be inferred. -3: very negative mental state can be inferred
Tweet: Beyoncé resentment gets me in my feelings every time. 😩
Intensity Class:

>>-3: very negative emotional state can be inferred

### Emotion classification

Task: Categorize the text's emotional tone as either 'neutral or no emotion' or identify the presence of one or more of the given emotions (anger, anticipation, disgust, fear, joy, love, optimism, pessimism, sadness, surprise, trust).
Text: Whatever you decide to do make sure it makes you #happy.
This tweet contains emotions:

>>joy, love, optimism
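
The label list in the output above can be split into individual emotions for downstream use. The snippet below is a small post-processing sketch; the comma-separated output format is assumed from the example above rather than guaranteed by the model:

```python
# Post-processing sketch: split a multi-label prediction such as
# "joy, love, optimism" into a list of emotion labels.
def parse_emotions(prediction: str) -> list[str]:
    return [label.strip() for label in prediction.split(",") if label.strip()]

print(parse_emotions("joy, love, optimism"))  # ['joy', 'love', 'optimism']
```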

The task description can be adjusted according to the specific task.
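
Since the task description, input text, and answer cue are all part of the plain-text prompt, a small helper can assemble prompts for different tasks. The helper below is an illustrative sketch modeled on the templates above; its name and structure are not part of any official EmoLLMs API:

```python
# Illustrative prompt builder; the function name, arguments, and structure are
# assumptions for convenience, modeled on the prompt templates shown above.
def build_prompt(task_description, text, extra_fields=None, answer_key="Intensity Score"):
    lines = [f"Task: {task_description}", f"Text: {text}"]
    for key, value in (extra_fields or {}).items():
        lines.append(f"{key}: {value}")
    lines.append(f"{answer_key}:")
    return "\n".join(lines)

prompt = build_prompt(
    task_description=("Evaluate the valence intensity of the writer's mental state based on the text, "
                      "assigning it a real-valued score from 0 (most negative) to 1 (most positive)."),
    text="Happy Birthday shorty. Stay fine stay breezy stay wavy @daviistuart 😘",
)
print(prompt)
```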

## License

The EmoLLMs series is licensed under MIT. For more details, please see the MIT license file.

## Citation

If you use the EmoLLMs series in your work, please cite our paper:

```bibtex
@article{liu2024emollms,
  title={EmoLLMs: A Series of Emotional Large Language Models and Annotation Tools for Comprehensive Affective Analysis},
  author={Liu, Zhiwei and Yang, Kailai and Zhang, Tianlin and Xie, Qianqian and Yu, Zeping and Ananiadou, Sophia},
  journal={arXiv preprint arXiv:2401.08508},
  year={2024}
}
```