---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_4_with_init_sun_syl_wd_0__0020
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# whisper_4_with_init_sun_syl_wd_0__0020

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results at the end of training (training and validation metrics from the final epoch):
- Train Loss: 2.8703
- Train Accuracy: 0.0174
- Train Wermet: 0.5936
- Train Wermet Syl: 0.5685
- Validation Loss: 2.2644
- Validation Accuracy: 0.0156
- Validation Wermet: 0.6310
- Validation Wermet Syl: 0.5925
- Epoch: 19
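
The "Wermet" metrics above are word-error-rate style scores (lower is better); "Wermet Syl" is presumably the same distance computed over syllable tokens rather than words, though that tokenization is not documented here. As a minimal illustration of what a plain word error rate measures, assuming the standard Levenshtein definition (this is not the training code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over word tokens.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, one substitution in a three-word reference gives a score of 1/3, matching the scale of the values reported above.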

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0}
- training_precision: float32
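
The logged optimizer is Transformers' `AdamWeightDecay`; with `weight_decay_rate: 0` it reduces to plain Adam. As a hedged, pure-Python sketch of what one update step with the hyperparameters above computes (illustrative only, not the actual Keras optimizer implementation):

```python
def adam_step(param, grad, m, v, t,
              lr=1e-05, beta_1=0.9, beta_2=0.999,
              epsilon=1e-07, weight_decay_rate=0.0):
    """One AdamW-style scalar update; weight_decay_rate=0 gives plain Adam."""
    m = beta_1 * m + (1 - beta_1) * grad          # first-moment estimate
    v = beta_2 * v + (1 - beta_2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta_1 ** t)                 # bias correction
    v_hat = v / (1 - beta_2 ** t)
    # Decoupled weight decay is added outside the adaptive term.
    param = param - lr * (m_hat / (v_hat ** 0.5 + epsilon)
                          + weight_decay_rate * param)
    return param, m, v
```

With `lr=1e-05`, a single step moves a parameter by at most roughly the learning rate, consistent with the gradual loss curve in the results table below.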

### Training results

| Train Loss | Train Accuracy | Train Wermet | Train Wermet Syl | Validation Loss | Validation Accuracy | Validation Wermet | Validation Wermet Syl | Epoch |
|:----------:|:--------------:|:------------:|:----------------:|:---------------:|:-------------------:|:-----------------:|:---------------------:|:-----:|
| 5.3409     | 0.0111         | 1.3547       | 1.2898           | 3.9789          | 0.0114              | 0.9710            | 0.9563                | 0     |
| 4.7143     | 0.0116         | 0.8622       | 0.8228           | 3.9404          | 0.0113              | 0.9823            | 0.9735                | 1     |
| 4.6752     | 0.0117         | 0.8472       | 0.8057           | 3.9081          | 0.0114              | 0.9579            | 0.9359                | 2     |
| 4.6500     | 0.0117         | 0.8382       | 0.7945           | 3.8820          | 0.0115              | 0.9213            | 0.8856                | 3     |
| 4.6282     | 0.0118         | 0.8286       | 0.7805           | 3.8738          | 0.0114              | 0.9433            | 0.9119                | 4     |
| 4.6095     | 0.0118         | 0.8190       | 0.7696           | 3.8630          | 0.0115              | 0.9117            | 0.8698                | 5     |
| 4.5875     | 0.0119         | 0.7976       | 0.7465           | 3.8341          | 0.0116              | 0.8976            | 0.8552                | 6     |
| 4.5682     | 0.0120         | 0.7753       | 0.7227           | 3.8277          | 0.0116              | 0.9014            | 0.8653                | 7     |
| 4.5376     | 0.0121         | 0.7528       | 0.7005           | 3.7844          | 0.0118              | 0.8332            | 0.7815                | 8     |
| 4.5060     | 0.0122         | 0.7392       | 0.6844           | 3.7537          | 0.0118              | 0.8578            | 0.8152                | 9     |
| 4.4580     | 0.0124         | 0.7221       | 0.6694           | 3.7038          | 0.0120              | 0.8190            | 0.7679                | 10    |
| 4.3989     | 0.0125         | 0.7156       | 0.6636           | 3.6169          | 0.0122              | 0.7979            | 0.7429                | 11    |
| 4.3056     | 0.0128         | 0.7069       | 0.6557           | 3.5098          | 0.0125              | 0.7924            | 0.7396                | 12    |
| 4.1673     | 0.0132         | 0.7054       | 0.6584           | 3.3542          | 0.0128              | 0.7759            | 0.7240                | 13    |
| 3.9762     | 0.0138         | 0.6987       | 0.6559           | 3.1318          | 0.0133              | 0.7644            | 0.7231                | 14    |
| 3.7385     | 0.0145         | 0.6835       | 0.6448           | 2.9144          | 0.0138              | 0.7392            | 0.6955                | 15    |
| 3.5040     | 0.0152         | 0.6644       | 0.6298           | 2.7413          | 0.0142              | 0.7019            | 0.6548                | 16    |
| 3.2728     | 0.0160         | 0.6408       | 0.6101           | 2.5183          | 0.0149              | 0.6798            | 0.6363                | 17    |
| 3.0657     | 0.0167         | 0.6188       | 0.5912           | 2.3594          | 0.0153              | 0.6528            | 0.6103                | 18    |
| 2.8703     | 0.0174         | 0.5936       | 0.5685           | 2.2644          | 0.0156              | 0.6310            | 0.5925                | 19    |


### Framework versions

- Transformers 4.34.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3