---
license: apache-2.0
base_model: facebook/convnextv2-tiny-22k-384
tags:
- image-classification
- generated_from_trainer
model-index:
- name: CheXpert-5-convnextv2-tiny-384
  results: []
---

# CheXpert-5-convnextv2-tiny-384

This model is a fine-tuned version of [facebook/convnextv2-tiny-22k-384](https://huggingface.co/facebook/convnextv2-tiny-22k-384) on the CheXpert chest radiograph dataset, covering five findings: atelectasis, cardiomegaly, consolidation, edema, and pleural effusion.
It achieves the following results on the evaluation set (an inference sketch follows the list):
- Loss: 0.1009
- AUROC Atelectasis: 0.7943
- AUROC Cardiomegaly: 0.8187
- AUROC Consolidation: 0.9269
- AUROC Edema: 0.9233
- AUROC Pleural effusion: 0.9315
- Specificity Atelectasis: 0.7891
- Specificity Cardiomegaly: 1.0
- Specificity Consolidation: 0.9948
- Specificity Edema: 0.8407
- Specificity Pleural effusion: 0.8038
- Exact Match: 0.4464
- Hamming Distance: 0.1804
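
The per-finding AUROC and specificity values, together with exact match and Hamming distance, indicate a multi-label head (one independent sigmoid per finding). The sketch below shows minimal inference under that assumption; the repo id is a placeholder, since the hosting namespace is not stated in this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id -- substitute the namespace where this checkpoint is actually hosted.
MODEL_ID = "your-namespace/CheXpert-5-convnextv2-tiny-384"

processor = AutoImageProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageClassification.from_pretrained(MODEL_ID)
model.eval()

image = Image.open("chest_xray.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)

# Multi-label setup: apply an independent sigmoid per finding rather than a softmax.
probs = torch.sigmoid(logits)[0]
for idx, p in enumerate(probs.tolist()):
    label = model.config.id2label.get(idx, f"label_{idx}")
    print(f"{label}: {p:.3f}")
```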

## Model description

ConvNeXt V2-Tiny (ImageNet-22k pre-trained, 384×384 input) fine-tuned as a multi-label chest radiograph classifier for the five CheXpert competition findings: atelectasis, cardiomegaly, consolidation, edema, and pleural effusion.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 2500
- num_epochs: 6
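
The card lists hyperparameters but not the training script itself. As a hedged illustration only, this is how the listed values would map onto `transformers.TrainingArguments`; everything not listed above is left at `Trainer` defaults, which is an assumption.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. The output_dir and all unspecified options
# are assumptions -- the original training configuration is not included in this card.
training_args = TrainingArguments(
    output_dir="CheXpert-5-convnextv2-tiny-384",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=2500,
    num_train_epochs=6,
    # The Adam betas=(0.9, 0.999) and epsilon=1e-08 noted above match the optimizer defaults.
)
```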

### Training results

| Training Loss | Epoch | Step  | Validation Loss | AUROC Atelectasis | AUROC Cardiomegaly | AUROC Consolidation | AUROC Edema | AUROC Pleural effusion | Specificity Atelectasis | Specificity Cardiomegaly | Specificity Consolidation | Specificity Edema | Specificity Pleural effusion | Exact Match | Hamming Distance |
|:-------------:|:-----:|:-----:|:---------------:|:-----------------:|:------------------:|:-------------------:|:-----------:|:----------------------:|:-----------------------:|:------------------------:|:-------------------------:|:-----------------:|:----------------------------:|:-----------:|:----------------:|
| 0.0891        | 1.0   | 6120  | 0.0893          | 0.7323            | 0.8366             | 0.7020              | 0.8387      | 0.8702                 | 0.4510                  | 0.9661                   | 1.0                       | 0.6596            | 0.5392                       | 0.2616      | 0.2444           |
| 0.0854        | 2.0   | 12240 | 0.0831          | 0.7535            | 0.8556             | 0.7350              | 0.8651      | 0.8881                 | 0.7293                  | 0.9042                   | 0.9936                    | 0.7083            | 0.6259                       | 0.3571      | 0.1973           |
| 0.082         | 3.0   | 18360 | 0.0824          | 0.7683            | 0.8696             | 0.7473              | 0.8720      | 0.8961                 | 0.6956                  | 0.8196                   | 0.9881                    | 0.6087            | 0.6611                       | 0.3298      | 0.2177           |
| 0.0799        | 4.0   | 24480 | 0.0802          | 0.7749            | 0.8720             | 0.7562              | 0.8783      | 0.9005                 | 0.7450                  | 0.8831                   | 0.9608                    | 0.7341            | 0.6984                       | 0.3802      | 0.1880           |
| 0.0759        | 5.0   | 30600 | 0.0793          | 0.7795            | 0.8746             | 0.7583              | 0.8818      | 0.9030                 | 0.7277                  | 0.8948                   | 0.9711                    | 0.7618            | 0.7045                       | 0.3869      | 0.1823           |
| 0.0739        | 6.0   | 36720 | 0.0798          | 0.7787            | 0.8727             | 0.7561              | 0.8812      | 0.9031                 | 0.7461                  | 0.8921                   | 0.9690                    | 0.7487            | 0.7074                       | 0.3886      | 0.1824           |
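
The card does not say how the table's metrics were computed. As a hedged illustration only, the following sketch shows one common way to obtain per-label AUROC and specificity plus exact match and Hamming distance with scikit-learn, assuming sigmoid probabilities and a 0.5 decision threshold (the threshold is an assumption, not stated in the card).

```python
import numpy as np
from sklearn.metrics import roc_auc_score

LABELS = ["Atelectasis", "Cardiomegaly", "Consolidation", "Edema", "Pleural effusion"]

def evaluate(y_true: np.ndarray, y_prob: np.ndarray, threshold: float = 0.5) -> dict:
    """y_true, y_prob: arrays of shape (num_samples, 5) with binary targets / sigmoid scores."""
    y_pred = (y_prob >= threshold).astype(int)
    metrics = {}
    for i, name in enumerate(LABELS):
        metrics[f"auroc_{name}"] = roc_auc_score(y_true[:, i], y_prob[:, i])
        tn = np.sum((y_true[:, i] == 0) & (y_pred[:, i] == 0))
        fp = np.sum((y_true[:, i] == 0) & (y_pred[:, i] == 1))
        metrics[f"specificity_{name}"] = tn / (tn + fp)  # true negative rate
    # Exact match: fraction of samples where all five labels are predicted correctly.
    metrics["exact_match"] = float(np.mean(np.all(y_pred == y_true, axis=1)))
    # Hamming distance: fraction of individual label decisions that are wrong.
    metrics["hamming_distance"] = float(np.mean(y_pred != y_true))
    return metrics
```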


### Framework versions

- Transformers 4.41.0
- Pytorch 2.1.2
- Datasets 2.19.1
- Tokenizers 0.19.1