Upload README.md with huggingface_hub
README.md CHANGED

````diff
@@ -1,9 +1,14 @@
 ---
+language:
+- en
+license: apache-2.0
 library_name: Transformers
 tags:
+- nlp
 - text-classification
-- transformers
 - argilla
+- transformers
+dataset_name: argilla/emotion
 ---
 
 <!-- This model card has been generated automatically according to the information the `ArgillaTrainer` had access to. You
@@ -15,7 +20,7 @@ This model has been created with [Argilla](https://docs.argilla.io), trained wit
 
 <!-- Provide a quick summary of what the model is/does. -->
 
-
+This is a sample model finetuned from prajjwal1/bert-tiny.
 
 ## Model training
 
@@ -23,22 +28,23 @@ Training the model using the `ArgillaTrainer`:
 
 ```python
 # Load the dataset:
-dataset = FeedbackDataset.
+dataset = FeedbackDataset.from_huggingface("argilla/emotion")
 
 # Create the training task:
-task = TrainingTask.for_text_classification(text=dataset.field_by_name("text"), label=dataset.question_by_name("
+task = TrainingTask.for_text_classification(text=dataset.field_by_name("text"), label=dataset.question_by_name("label"))
 
 # Create the ArgillaTrainer:
 trainer = ArgillaTrainer(
     dataset=dataset,
     task=task,
     framework="transformers",
-    model="bert-
+    model="prajjwal1/bert-tiny",
 )
 
 trainer.update_config({
     "logging_steps": 1,
-    "num_train_epochs": 1
+    "num_train_epochs": 1,
+    "output_dir": "tmp"
 })
 
 trainer.train(output_dir="None")
@@ -56,14 +62,20 @@ trainer.predict("This is awesome!")
 
 <!-- Provide a longer summary of what this model is. -->
 
-
+Model trained with `ArgillaTrainer` for demo purposes
 
 - **Developed by:** [More Information Needed]
 - **Shared by [optional]:** [More Information Needed]
-- **Model type:** [
-- **Language(s) (NLP):** [
-- **License:**
-- **Finetuned from model [optional]:**
+- **Model type:** Finetuned version of [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) for demo purposes
+- **Language(s) (NLP):** ['en']
+- **License:** apache-2.0
+- **Finetuned from model [optional]:** prajjwal1/bert-tiny
+### Model Sources [optional]
+
+<!-- Provide the basic links for the model. -->
+
+- **Repository:** N/A
+
 
 <!--
 ## Uses
@@ -134,7 +146,7 @@ Carbon emissions can be estimated using the [Machine Learning Impact calculator]
 
 ### Framework Versions
 
 - Python: 3.10.7
-- Argilla: 1.
+- Argilla: 1.19.0-dev
 
 <!--
 ## Citation [optional]
````