End of training
README.md ADDED

---
license: apache-2.0
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
base_model: mistralai/Mistral-7B-v0.1
model-index:
- name: lc_reddit
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lc_reddit

This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4421

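Because this repository holds a PEFT adapter rather than a full set of model weights, the base model and the adapter are loaded together at inference time. A minimal sketch using `peft`, assuming the adapter is published under the hypothetical repo id `your-username/lc_reddit` (substitute the actual path):

```python
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

# Hypothetical repo id; replace with the actual location of this adapter.
adapter_id = "your-username/lc_reddit"

# AutoPeftModelForCausalLM reads the adapter config, fetches the base model
# (mistralai/Mistral-7B-v0.1), and attaches the adapter weights to it.
model = AutoPeftModelForCausalLM.from_pretrained(
    adapter_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

inputs = tokenizer("Write a short reply:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
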
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged trainer sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 50

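The card does not record the dataset, the LoRA configuration, or any gradient accumulation, so the sketch below only pins down the values listed above and marks everything else as an assumption. It shows roughly how these hyperparameters map onto a `trl` `SFTTrainer` run with a PEFT config:

```python
from datasets import load_dataset
from peft import LoraConfig
from transformers import TrainingArguments
from trl import SFTTrainer

# Assumed dataset layout: JSONL files with a "text" column; the card does
# not say what data was used.
train_ds = load_dataset("json", data_files="train.jsonl")["train"]
eval_ds = load_dataset("json", data_files="eval.jsonl")["train"]

args = TrainingArguments(
    output_dir="lc_reddit",          # assumed
    learning_rate=2e-5,              # from the card
    per_device_train_batch_size=1,   # from the card
    per_device_eval_batch_size=1,    # from the card
    seed=42,                         # from the card
    lr_scheduler_type="cosine",      # from the card
    num_train_epochs=50,             # from the card
    eval_strategy="epoch",    # inferred: the table logs validation loss per epoch
    logging_strategy="epoch",
)

# Assumed LoRA ranks; the card records only that PEFT was used.
peft_config = LoraConfig(task_type="CAUSAL_LM", r=16, lora_alpha=32)

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-v0.1",
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    peft_config=peft_config,
    dataset_text_field="text",  # assumes the dataset exposes a "text" column
)
trainer.train()
```
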
### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.893         | 1.0   | 306   | 1.8913          |
| 2.0254        | 2.0   | 612   | 1.8820          |
| 1.7988        | 3.0   | 918   | 1.8821          |
| 1.6315        | 4.0   | 1224  | 1.8907          |
| 1.6124        | 5.0   | 1530  | 1.9035          |
| 1.7146        | 6.0   | 1836  | 1.9222          |
| 1.5807        | 7.0   | 2142  | 1.9383          |
| 1.4356        | 8.0   | 2448  | 1.9796          |
| 1.4935        | 9.0   | 2754  | 1.9838          |
| 1.3923        | 10.0  | 3060  | 2.0112          |
| 1.5712        | 11.0  | 3366  | 2.0217          |
| 1.5216        | 12.0  | 3672  | 2.0493          |
| 1.2513        | 13.0  | 3978  | 2.0731          |
| 1.2956        | 14.0  | 4284  | 2.1151          |
| 1.5635        | 15.0  | 4590  | 2.1298          |
| 1.4956        | 16.0  | 4896  | 2.1525          |
| 1.3592        | 17.0  | 5202  | 2.1751          |
| 1.1305        | 18.0  | 5508  | 2.1893          |
| 1.1396        | 19.0  | 5814  | 2.2454          |
| 1.3858        | 20.0  | 6120  | 2.2797          |
| 1.3174        | 21.0  | 6426  | 2.2689          |
| 1.5609        | 22.0  | 6732  | 2.3098          |
| 1.3431        | 23.0  | 7038  | 2.3238          |
| 1.3111        | 24.0  | 7344  | 2.3742          |
| 1.1365        | 25.0  | 7650  | 2.3727          |
| 1.3318        | 26.0  | 7956  | 2.3978          |
| 1.3297        | 27.0  | 8262  | 2.3647          |
| 1.2178        | 28.0  | 8568  | 2.3971          |
| 1.2757        | 29.0  | 8874  | 2.4292          |
| 1.236         | 30.0  | 9180  | 2.4170          |
| 1.1888        | 31.0  | 9486  | 2.4439          |
| 1.0917        | 32.0  | 9792  | 2.4225          |
| 1.1148        | 33.0  | 10098 | 2.4166          |
| 1.1907        | 34.0  | 10404 | 2.4318          |
| 1.1906        | 35.0  | 10710 | 2.4352          |
| 1.2238        | 36.0  | 11016 | 2.4471          |
| 1.1596        | 37.0  | 11322 | 2.4382          |
| 1.2184        | 38.0  | 11628 | 2.4343          |
| 1.2428        | 39.0  | 11934 | 2.4422          |
| 1.3111        | 40.0  | 12240 | 2.4397          |
| 1.2845        | 41.0  | 12546 | 2.4460          |
| 1.3173        | 42.0  | 12852 | 2.4428          |
| 1.193         | 43.0  | 13158 | 2.4430          |
| 1.1774        | 44.0  | 13464 | 2.4425          |
| 1.1868        | 45.0  | 13770 | 2.4396          |
| 1.2042        | 46.0  | 14076 | 2.4430          |
| 1.2833        | 47.0  | 14382 | 2.4398          |
| 1.2766        | 48.0  | 14688 | 2.4410          |
| 1.4958        | 49.0  | 14994 | 2.4412          |
| 1.1868        | 50.0  | 15300 | 2.4421          |

### Framework versions

- PEFT 0.11.1
- Transformers 4.41.2
- Pytorch 2.1.0+cu118
- Datasets 2.19.2
- Tokenizers 0.19.1
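These versions matter when reproducing the run, since the PEFT/TRL APIs changed across nearby releases. A small sanity check comparing the installed environment against what the card reports:

```python
# Compare installed library versions with the ones reported above.
import datasets
import peft
import tokenizers
import torch
import transformers

reported = {
    "peft": "0.11.1",
    "transformers": "4.41.2",
    "torch": "2.1.0+cu118",
    "datasets": "2.19.2",
    "tokenizers": "0.19.1",
}
installed = {
    "peft": peft.__version__,
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in reported.items():
    flag = "OK" if installed[name] == want else "MISMATCH"
    print(f"{name}: installed {installed[name]}, card reports {want} [{flag}]")
```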