---
license: cc-by-sa-4.0
tags:
- generated_from_trainer
base_model: nlpaueb/legal-bert-base-uncased
metrics:
- accuracy
- precision
- recall
model-index:
- name: case-analysis-legal-bert-base-uncased
  results: []
---
## Metrics

- loss: 1.0628
- accuracy: 0.8708
- precision: 0.8661
- recall: 0.8708
- precision_macro: 0.8180
- recall_macro: 0.6890
- macro_fpr: 0.0681
- weighted_fpr: 0.0471
- weighted_specificity: 0.8788
- macro_specificity: 0.9374
- weighted_sensitivity: 0.8708
- macro_sensitivity: 0.6890
- f1_micro: 0.8708
- f1_macro: 0.7165
- f1_weighted: 0.8586
- runtime: 13.9241
- samples_per_second: 32.2460
- steps_per_second: 4.0940



# case-analysis-legal-bert-base-uncased

This model is a fine-tuned version of [nlpaueb/legal-bert-base-uncased](https://huggingface.co/nlpaueb/legal-bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set, corresponding to the epoch-11 row of the training log below (a sketch of how the FPR/specificity aggregates can be derived follows the list):
- Loss: 1.3822
- Accuracy: 0.8263
- Precision: 0.8205
- Recall: 0.8263
- Precision Macro: 0.7085
- Recall Macro: 0.6696
- Macro Fpr: 0.0833
- Weighted Fpr: 0.0655
- Weighted Specificity: 0.8711
- Macro Specificity: 0.9243
- Weighted Sensitivity: 0.8263
- Macro Sensitivity: 0.6696
- F1 Micro: 0.8263
- F1 Macro: 0.6764
- F1 Weighted: 0.8195
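
Macro FPR, macro/weighted specificity, and macro sensitivity are not stock `sklearn.metrics` outputs. Below is a minimal sketch of one standard way to derive them from a confusion matrix (per-class one-vs-rest rates, averaged plainly for "macro" and by class support for "weighted"); the exact formulas behind the reported numbers are not documented, so treat this as an assumption:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def rate_aggregates(y_true, y_pred):
    """One-vs-rest FPR/specificity/sensitivity aggregates (assumed formulas)."""
    cm = confusion_matrix(y_true, y_pred)
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp        # predicted as class c, truly another class
    fn = cm.sum(axis=1) - tp        # truly class c, predicted as another class
    tn = cm.sum() - (tp + fp + fn)
    fpr = fp / (fp + tn)            # per-class false positive rate
    specificity = tn / (tn + fp)    # per-class true negative rate
    sensitivity = tp / (tp + fn)    # per-class recall
    support = cm.sum(axis=1)        # number of true examples per class
    return {
        "macro_fpr": fpr.mean(),
        "macro_specificity": specificity.mean(),
        "weighted_specificity": np.average(specificity, weights=support),
        "macro_sensitivity": sensitivity.mean(),
        "weighted_sensitivity": np.average(sensitivity, weights=support),
    }
```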

## Model description

This checkpoint fine-tunes LEGAL-BERT (BERT-base, uncased, pre-trained on legal corpora by the nlpaueb group) with a sequence-classification head for a case-analysis task, as the reported accuracy/precision/recall metrics imply. The gap between weighted F1 (0.8195) and macro F1 (0.6764) on the evaluation set suggests a class-imbalanced label set on which minority classes are predicted less reliably; the label schema itself is not documented here.

## Intended uses & limitations

The model is intended for classifying legal case text into the (undocumented) label set it was fine-tuned on, as in the sketch below. Because the training data is not described and macro-averaged scores trail weighted ones, expect weaker performance on under-represented classes and on text outside the fine-tuning domain.
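
A minimal inference sketch; `your-org/case-analysis-legal-bert-base-uncased` is a placeholder hub id and the input sentence is illustrative:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "your-org/case-analysis-legal-bert-base-uncased"  # placeholder id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "The appellant contends that the trial court erred in admitting the evidence."
inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])  # label names depend on the training setup
```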

## Training and evaluation data

More information needed. What the logs do pin down: training ran 224 optimizer steps per epoch at batch size 8, i.e. roughly 1,790 training examples, and the metrics block at the top reports an evaluation pass of 13.92 s at 32.25 samples/s, i.e. roughly 449 examples in that split. The source, preprocessing, and label schema of the data are not documented.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
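
A hedged reconstruction of this configuration; the dataset objects, label count, and `compute_metrics` function are placeholders, and the Adam settings listed above are the `Trainer` defaults rather than anything this sketch sets explicitly:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

args = TrainingArguments(
    output_dir="case-analysis-legal-bert-base-uncased",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumption: per-epoch eval, matching the log
    logging_strategy="epoch",
)

tokenizer = AutoTokenizer.from_pretrained("nlpaueb/legal-bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "nlpaueb/legal-bert-base-uncased",
    num_labels=7,  # placeholder: the real label count is undocumented
)

# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```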

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 224  | 1.0114          | 0.6570   | 0.6179    | 0.6570 | 0.4664          | 0.4075       | 0.1948    | 0.1482       | 0.6727               | 0.8324            | 0.6570               | 0.4075            | 0.6570   | 0.4081   | 0.6166      |
| No log        | 2.0   | 448  | 0.7650          | 0.7751   | 0.7425    | 0.7751 | 0.5566          | 0.5806       | 0.1094    | 0.0882       | 0.8406               | 0.9039            | 0.7751               | 0.5806            | 0.7751   | 0.5659   | 0.7564      |
| 0.7677        | 3.0   | 672  | 0.7342          | 0.7817   | 0.7695    | 0.7817 | 0.5674          | 0.5967       | 0.1041    | 0.0851       | 0.8515               | 0.9083            | 0.7817               | 0.5967            | 0.7817   | 0.5707   | 0.7678      |
| 0.7677        | 4.0   | 896  | 0.7968          | 0.8174   | 0.7766    | 0.8174 | 0.6036          | 0.5893       | 0.0965    | 0.0693       | 0.8368               | 0.9136            | 0.8174               | 0.5893            | 0.8174   | 0.5904   | 0.7921      |
| 0.482         | 5.0   | 1120 | 0.8171          | 0.8085   | 0.7853    | 0.8085 | 0.6366          | 0.6038       | 0.0990    | 0.0732       | 0.8346               | 0.9108            | 0.8085               | 0.6038            | 0.8085   | 0.6141   | 0.7940      |
| 0.482         | 6.0   | 1344 | 0.8910          | 0.8241   | 0.8028    | 0.8241 | 0.6660          | 0.6359       | 0.0875    | 0.0664       | 0.8606               | 0.9212            | 0.8241               | 0.6359            | 0.8241   | 0.6315   | 0.8084      |
| 0.2993        | 7.0   | 1568 | 1.0094          | 0.8040   | 0.8109    | 0.8040 | 0.6742          | 0.6774       | 0.0891    | 0.0751       | 0.8829               | 0.9217            | 0.8040               | 0.6774            | 0.8040   | 0.6742   | 0.8067      |
| 0.2993        | 8.0   | 1792 | 1.1504          | 0.8107   | 0.7968    | 0.8107 | 0.6228          | 0.6330       | 0.0897    | 0.0722       | 0.8708               | 0.9204            | 0.8107               | 0.6330            | 0.8107   | 0.6229   | 0.8016      |
| 0.1367        | 9.0   | 2016 | 1.2533          | 0.8062   | 0.8059    | 0.8062 | 0.6537          | 0.6225       | 0.0948    | 0.0742       | 0.8595               | 0.9164            | 0.8062               | 0.6225            | 0.8062   | 0.6360   | 0.8045      |
| 0.1367        | 10.0  | 2240 | 1.2516          | 0.8174   | 0.8107    | 0.8174 | 0.6621          | 0.6499       | 0.0873    | 0.0693       | 0.8701               | 0.9219            | 0.8174               | 0.6499            | 0.8174   | 0.6554   | 0.8137      |
| 0.1367        | 11.0  | 2464 | 1.3822          | 0.8263   | 0.8205    | 0.8263 | 0.7085          | 0.6696       | 0.0833    | 0.0655       | 0.8711               | 0.9243            | 0.8263               | 0.6696            | 0.8263   | 0.6764   | 0.8195      |
| 0.055         | 12.0  | 2688 | 1.4574          | 0.8018   | 0.8127    | 0.8018 | 0.6369          | 0.6443       | 0.0883    | 0.0761       | 0.8844               | 0.9216            | 0.8018               | 0.6443            | 0.8018   | 0.6399   | 0.8068      |
| 0.055         | 13.0  | 2912 | 1.6634          | 0.7884   | 0.7810    | 0.7884 | 0.6090          | 0.6042       | 0.1002    | 0.0821       | 0.8619               | 0.9126            | 0.7884               | 0.6042            | 0.7884   | 0.6042   | 0.7831      |
| 0.0431        | 14.0  | 3136 | 1.5085          | 0.8285   | 0.8077    | 0.8285 | 0.6476          | 0.6367       | 0.0850    | 0.0645       | 0.8633               | 0.9229            | 0.8285               | 0.6367            | 0.8285   | 0.6382   | 0.8166      |
| 0.0431        | 15.0  | 3360 | 1.6411          | 0.8107   | 0.7936    | 0.8107 | 0.6243          | 0.6262       | 0.0914    | 0.0722       | 0.8626               | 0.9183            | 0.8107               | 0.6262            | 0.8107   | 0.6229   | 0.8014      |
| 0.0135        | 16.0  | 3584 | 1.7483          | 0.8062   | 0.7925    | 0.8062 | 0.6201          | 0.6271       | 0.0923    | 0.0742       | 0.8647               | 0.9177            | 0.8062               | 0.6271            | 0.8062   | 0.6221   | 0.7988      |
| 0.0135        | 17.0  | 3808 | 1.7233          | 0.7973   | 0.7897    | 0.7973 | 0.6148          | 0.6263       | 0.0942    | 0.0781       | 0.8682               | 0.9164            | 0.7973               | 0.6263            | 0.7973   | 0.6201   | 0.7933      |
| 0.0066        | 18.0  | 4032 | 1.6457          | 0.8241   | 0.8042    | 0.8241 | 0.6515          | 0.6388       | 0.0879    | 0.0664       | 0.8522               | 0.9191            | 0.8241               | 0.6388            | 0.8241   | 0.6391   | 0.8115      |
| 0.0066        | 19.0  | 4256 | 1.6614          | 0.8174   | 0.7976    | 0.8174 | 0.6324          | 0.6420       | 0.0865    | 0.0693       | 0.8703               | 0.9219            | 0.8174               | 0.6420            | 0.8174   | 0.6318   | 0.8061      |
| 0.0066        | 20.0  | 4480 | 1.6997          | 0.8129   | 0.8023    | 0.8129 | 0.6435          | 0.6576       | 0.0860    | 0.0712       | 0.8759               | 0.9222            | 0.8129               | 0.6576            | 0.8129   | 0.6462   | 0.8061      |
| 0.0067        | 21.0  | 4704 | 1.6540          | 0.8218   | 0.8000    | 0.8218 | 0.6473          | 0.6380       | 0.0880    | 0.0674       | 0.8560               | 0.9195            | 0.8218               | 0.6380            | 0.8218   | 0.6356   | 0.8088      |
| 0.0067        | 22.0  | 4928 | 1.7329          | 0.8085   | 0.7945    | 0.8085 | 0.6313          | 0.6267       | 0.0930    | 0.0732       | 0.8548               | 0.9158            | 0.8085               | 0.6267            | 0.8085   | 0.6282   | 0.8011      |
| 0.0028        | 23.0  | 5152 | 1.7949          | 0.8062   | 0.8004    | 0.8062 | 0.6365          | 0.6419       | 0.0902    | 0.0742       | 0.8708               | 0.9193            | 0.8062               | 0.6419            | 0.8062   | 0.6389   | 0.8032      |
| 0.0028        | 24.0  | 5376 | 1.8086          | 0.8085   | 0.8026    | 0.8085 | 0.6387          | 0.6429       | 0.0893    | 0.0732       | 0.8715               | 0.9200            | 0.8085               | 0.6429            | 0.8085   | 0.6405   | 0.8054      |
| 0.0001        | 25.0  | 5600 | 1.8326          | 0.8085   | 0.7988    | 0.8085 | 0.6343          | 0.6251       | 0.0934    | 0.0732       | 0.8537               | 0.9155            | 0.8085               | 0.6251            | 0.8085   | 0.6287   | 0.8028      |
| 0.0001        | 26.0  | 5824 | 1.8395          | 0.8085   | 0.7988    | 0.8085 | 0.6343          | 0.6251       | 0.0934    | 0.0732       | 0.8537               | 0.9155            | 0.8085               | 0.6251            | 0.8085   | 0.6287   | 0.8028      |
| 0.0003        | 27.0  | 6048 | 1.8816          | 0.8062   | 0.8039    | 0.8062 | 0.6439          | 0.6388       | 0.0920    | 0.0742       | 0.8621               | 0.9171            | 0.8062               | 0.6388            | 0.8062   | 0.6408   | 0.8046      |
| 0.0003        | 28.0  | 6272 | 1.8956          | 0.8062   | 0.8039    | 0.8062 | 0.6439          | 0.6388       | 0.0920    | 0.0742       | 0.8621               | 0.9171            | 0.8062               | 0.6388            | 0.8062   | 0.6408   | 0.8046      |
| 0.0003        | 29.0  | 6496 | 1.8986          | 0.8062   | 0.8039    | 0.8062 | 0.6439          | 0.6388       | 0.0920    | 0.0742       | 0.8621               | 0.9171            | 0.8062               | 0.6388            | 0.8062   | 0.6408   | 0.8046      |
| 0.0           | 30.0  | 6720 | 1.8999          | 0.8085   | 0.8045    | 0.8085 | 0.6455          | 0.6413       | 0.0910    | 0.0732       | 0.8622               | 0.9177            | 0.8085               | 0.6413            | 0.8085   | 0.6429   | 0.8061      |

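Validation loss bottoms out at epoch 3 (0.7342) and climbs steadily afterward while training loss approaches zero, a pattern consistent with overfitting; the epoch-11 checkpoint reported above beats the final epoch-30 one on most metrics. A hedged sketch of how early stopping could cap such a run with the same `Trainer` setup (not part of the original configuration):

```python
from transformers import EarlyStoppingCallback, TrainingArguments

# Not part of the original run: stop once eval loss fails to improve for a
# few epochs and reload the best checkpoint at the end.
args = TrainingArguments(
    output_dir="case-analysis-legal-bert-base-uncased",
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,
    num_train_epochs=30,
)
# trainer = Trainer(..., args=args,
#                   callbacks=[EarlyStoppingCallback(early_stopping_patience=3)])
```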

### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1