kostissz committed
Commit f1e07b7 · verified · 1 parent: 96b4e77

End of training

README.md ADDED
@@ -0,0 +1,118 @@
+ ---
+ library_name: transformers
+ license: apache-2.0
+ base_model: openai/whisper-small
+ tags:
+ - generated_from_trainer
+ datasets:
+ - common_voice_17_0
+ metrics:
+ - wer
+ model-index:
+ - name: whisper-small-el
+   results:
+   - task:
+       name: Automatic Speech Recognition
+       type: automatic-speech-recognition
+     dataset:
+       name: common_voice_17_0
+       type: common_voice_17_0
+       config: el
+       split: None
+       args: el
+     metrics:
+     - name: Wer
+       type: wer
+       value: 30.64381658175081
+ ---
+ 
+ # whisper-small-el
+ 
+ This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Greek (`el`) configuration of the common_voice_17_0 dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.3865
+ - Model Preparation Time: 0.0041 s
+ - Wer: 30.6438
+ 
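+ A minimal transcription sketch with the `transformers` pipeline (the hub id `kostissz/whisper-small-el` is an assumption inferred from this repository's owner and model name, so adjust it to the actual path):
+ 
+ ```python
+ from transformers import pipeline
+ 
+ # Hypothetical hub id; replace with the actual repository path if it differs.
+ asr = pipeline("automatic-speech-recognition", model="kostissz/whisper-small-el")
+ 
+ # Any audio file readable by ffmpeg works; it is resampled to 16 kHz internally.
+ print(asr("greek_sample.wav")["text"])
+ ```
+ 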
+ ## Model description
+ 
+ whisper-small-el is [openai/whisper-small](https://huggingface.co/openai/whisper-small) fine-tuned for Greek automatic speech recognition on Common Voice 17.0.
+ 
+ ## Intended uses & limitations
+ 
+ The model is intended for transcribing Greek speech. It was evaluated only on Common Voice read speech, so accuracy on spontaneous, noisy, or otherwise out-of-domain audio may be worse than the reported WER.
+ 
+ ## Training and evaluation data
+ 
+ The model was trained and evaluated on the Greek (`el`) configuration of the common_voice_17_0 dataset; a loading sketch follows.
+ 
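+ A sketch of loading that data with `datasets` (the hub path `mozilla-foundation/common_voice_17_0` is an assumption; the card only names `common_voice_17_0` with config `el`):
+ 
+ ```python
+ from datasets import Audio, load_dataset
+ 
+ # Common Voice is gated on the Hub: accept the dataset terms and log in
+ # first (`huggingface-cli login`).
+ cv = load_dataset("mozilla-foundation/common_voice_17_0", "el")
+ 
+ # Whisper expects 16 kHz mono audio.
+ cv = cv.cast_column("audio", Audio(sampling_rate=16_000))
+ ```
+ 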
+ ## Training procedure
+ 
+ ### Training hyperparameters
+ 
+ The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
+ - learning_rate: 1e-05
+ - train_batch_size: 24
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 50
+ - training_steps: 2000
+ - mixed_precision_training: Native AMP
+ 
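+ A sketch mapping these settings onto `Seq2SeqTrainingArguments`; `output_dir`, the eval cadence, and anything else not listed above are assumptions (the 50-step cadence is inferred from the results table below):
+ 
+ ```python
+ from transformers import Seq2SeqTrainingArguments
+ 
+ training_args = Seq2SeqTrainingArguments(
+     output_dir="whisper-small-el",   # assumed; not stated in the card
+     learning_rate=1e-5,
+     per_device_train_batch_size=24,
+     per_device_eval_batch_size=8,
+     seed=42,
+     optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-8
+     lr_scheduler_type="linear",
+     warmup_steps=50,
+     max_steps=2000,
+     fp16=True,                       # "Native AMP" mixed precision
+     eval_strategy="steps",
+     eval_steps=50,                   # inferred from the 50-step rows in the table
+ )
+ ```
+ 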
+ ### Training results
+ 
+ | Training Loss | Epoch   | Step | Validation Loss | Model Preparation Time (s) | Wer     |
+ |:-------------:|:-------:|:----:|:---------------:|:--------------------------:|:-------:|
+ | 0.5511        | 0.3311  | 50   | 0.3685          | 0.0041                     | 39.3516 |
+ | 0.3132        | 0.6623  | 100  | 0.3168          | 0.0041                     | 35.6276 |
+ | 0.2709        | 0.9934  | 150  | 0.2897          | 0.0041                     | 33.4785 |
+ | 0.1634        | 1.3245  | 200  | 0.2829          | 0.0041                     | 33.1450 |
+ | 0.1551        | 1.6556  | 250  | 0.2746          | 0.0041                     | 32.5614 |
+ | 0.1559        | 1.9868  | 300  | 0.2683          | 0.0041                     | 31.8481 |
+ | 0.0818        | 2.3179  | 350  | 0.2735          | 0.0041                     | 31.3942 |
+ | 0.0808        | 2.6490  | 400  | 0.2735          | 0.0041                     | 31.9592 |
+ | 0.0799        | 2.9801  | 450  | 0.2765          | 0.0041                     | 32.4595 |
+ | 0.0451        | 3.3113  | 500  | 0.2922          | 0.0041                     | 31.4590 |
+ | 0.0436        | 3.6424  | 550  | 0.2892          | 0.0041                     | 31.0514 |
+ | 0.0436        | 3.9735  | 600  | 0.2902          | 0.0041                     | 31.3942 |
+ | 0.0241        | 4.3046  | 650  | 0.3117          | 0.0041                     | 31.2552 |
+ | 0.0212        | 4.6358  | 700  | 0.3162          | 0.0041                     | 31.0699 |
+ | 0.0226        | 4.9669  | 750  | 0.3172          | 0.0041                     | 30.8754 |
+ | 0.0127        | 5.2980  | 800  | 0.3521          | 0.0041                     | 32.5336 |
+ | 0.0125        | 5.6291  | 850  | 0.3432          | 0.0041                     | 31.1996 |
+ | 0.0123        | 5.9603  | 900  | 0.3463          | 0.0041                     | 31.4034 |
+ | 0.0077        | 6.2914  | 950  | 0.3764          | 0.0041                     | 31.0699 |
+ | 0.0071        | 6.6225  | 1000 | 0.3607          | 0.0041                     | 32.4317 |
+ | 0.0062        | 6.9536  | 1050 | 0.3698          | 0.0041                     | 30.8754 |
+ | 0.0045        | 7.2848  | 1100 | 0.3758          | 0.0041                     | 30.9588 |
+ | 0.0035        | 7.6159  | 1150 | 0.3865          | 0.0041                     | 30.6438 |
+ | 0.0038        | 7.9470  | 1200 | 0.3856          | 0.0041                     | 31.2830 |
+ | 0.0027        | 8.2781  | 1250 | 0.3800          | 0.0041                     | 30.8569 |
+ | 0.0021        | 8.6093  | 1300 | 0.3858          | 0.0041                     | 30.6901 |
+ | 0.0022        | 8.9404  | 1350 | 0.3949          | 0.0041                     | 31.1996 |
+ | 0.0017        | 9.2715  | 1400 | 0.4020          | 0.0041                     | 30.7920 |
+ | 0.0016        | 9.6026  | 1450 | 0.4061          | 0.0041                     | 30.9588 |
+ | 0.0016        | 9.9338  | 1500 | 0.4111          | 0.0041                     | 31.0514 |
+ | 0.0014        | 10.2649 | 1550 | 0.4067          | 0.0041                     | 31.1996 |
+ | 0.0013        | 10.5960 | 1600 | 0.4093          | 0.0041                     | 31.0144 |
+ | 0.0013        | 10.9272 | 1650 | 0.4112          | 0.0041                     | 30.8661 |
+ | 0.0012        | 11.2583 | 1700 | 0.4126          | 0.0041                     | 30.9680 |
+ | 0.0012        | 11.5894 | 1750 | 0.4134          | 0.0041                     | 30.9588 |
+ | 0.0012        | 11.9205 | 1800 | 0.4145          | 0.0041                     | 30.9217 |
+ | 0.0011        | 12.2517 | 1850 | 0.4155          | 0.0041                     | 30.8384 |
+ | 0.0011        | 12.5828 | 1900 | 0.4160          | 0.0041                     | 30.8939 |
+ | 0.0011        | 12.9139 | 1950 | 0.4163          | 0.0041                     | 30.8754 |
+ | 0.0011        | 13.2450 | 2000 | 0.4164          | 0.0041                     | 30.8754 |
+ 
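+ The headline loss (0.3865) and WER (30.6438) match the step-1150 row rather than the final step, which suggests the best checkpoint by eval WER was the one kept. WER here is word error rate in percent; a minimal sketch of computing it with the `evaluate` package (the card does not state which library was used, so treat this as illustrative):
+ 
+ ```python
+ import evaluate  # pip install evaluate jiwer
+ 
+ wer = evaluate.load("wer")
+ predictions = ["γεια σας κόσμε"]  # model transcripts
+ references = ["γεια σου κόσμε"]   # ground-truth transcripts
+ 
+ # 1 substitution over 3 reference words -> ~33.33
+ print(100 * wer.compute(predictions=predictions, references=references))
+ ```
+ 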
+ ### Framework versions
+ 
+ - Transformers 4.48.0
+ - Pytorch 2.5.1+cu124
+ - Datasets 3.2.0
+ - Tokenizers 0.21.0
generation_config.json ADDED
@@ -0,0 +1,254 @@
+ {
+   "alignment_heads": [[5, 3], [5, 9], [8, 0], [8, 4], [8, 7], [8, 8], [9, 0], [9, 7], [9, 9], [10, 5]],
+   "begin_suppress_tokens": [220, 50257],
+   "bos_token_id": 50257,
+   "decoder_start_token_id": 50258,
+   "eos_token_id": 50257,
+   "is_multilingual": true,
+   "lang_to_id": {
+     "<|af|>": 50327, "<|am|>": 50334, "<|ar|>": 50272, "<|as|>": 50350, "<|az|>": 50304,
+     "<|ba|>": 50355, "<|be|>": 50330, "<|bg|>": 50292, "<|bn|>": 50302, "<|bo|>": 50347,
+     "<|br|>": 50309, "<|bs|>": 50315, "<|ca|>": 50270, "<|cs|>": 50283, "<|cy|>": 50297,
+     "<|da|>": 50285, "<|de|>": 50261, "<|el|>": 50281, "<|en|>": 50259, "<|es|>": 50262,
+     "<|et|>": 50307, "<|eu|>": 50310, "<|fa|>": 50300, "<|fi|>": 50277, "<|fo|>": 50338,
+     "<|fr|>": 50265, "<|gl|>": 50319, "<|gu|>": 50333, "<|haw|>": 50352, "<|ha|>": 50354,
+     "<|he|>": 50279, "<|hi|>": 50276, "<|hr|>": 50291, "<|ht|>": 50339, "<|hu|>": 50286,
+     "<|hy|>": 50312, "<|id|>": 50275, "<|is|>": 50311, "<|it|>": 50274, "<|ja|>": 50266,
+     "<|jw|>": 50356, "<|ka|>": 50329, "<|kk|>": 50316, "<|km|>": 50323, "<|kn|>": 50306,
+     "<|ko|>": 50264, "<|la|>": 50294, "<|lb|>": 50345, "<|ln|>": 50353, "<|lo|>": 50336,
+     "<|lt|>": 50293, "<|lv|>": 50301, "<|mg|>": 50349, "<|mi|>": 50295, "<|mk|>": 50308,
+     "<|ml|>": 50296, "<|mn|>": 50314, "<|mr|>": 50320, "<|ms|>": 50282, "<|mt|>": 50343,
+     "<|my|>": 50346, "<|ne|>": 50313, "<|nl|>": 50271, "<|nn|>": 50342, "<|no|>": 50288,
+     "<|oc|>": 50328, "<|pa|>": 50321, "<|pl|>": 50269, "<|ps|>": 50340, "<|pt|>": 50267,
+     "<|ro|>": 50284, "<|ru|>": 50263, "<|sa|>": 50344, "<|sd|>": 50332, "<|si|>": 50322,
+     "<|sk|>": 50298, "<|sl|>": 50305, "<|sn|>": 50324, "<|so|>": 50326, "<|sq|>": 50317,
+     "<|sr|>": 50303, "<|su|>": 50357, "<|sv|>": 50273, "<|sw|>": 50318, "<|ta|>": 50287,
+     "<|te|>": 50299, "<|tg|>": 50331, "<|th|>": 50289, "<|tk|>": 50341, "<|tl|>": 50348,
+     "<|tr|>": 50268, "<|tt|>": 50351, "<|uk|>": 50280, "<|ur|>": 50290, "<|uz|>": 50337,
+     "<|vi|>": 50278, "<|yi|>": 50335, "<|yo|>": 50325, "<|zh|>": 50260
+   },
+   "language": "greek",
+   "max_initial_timestamp_index": 50,
+   "max_length": 448,
+   "no_timestamps_token_id": 50363,
+   "pad_token_id": 50257,
+   "prev_sot_token_id": 50361,
+   "return_timestamps": false,
+   "suppress_tokens": [
+     1, 2, 7, 8, 9, 10, 14, 25, 26, 27, 28, 29, 31, 58, 59, 60, 61, 62, 63,
+     90, 91, 92, 93, 359, 503, 522, 542, 873, 893, 902, 918, 922, 931, 1350,
+     1853, 1982, 2460, 2627, 3246, 3253, 3268, 3536, 3846, 3961, 4183, 4667,
+     6585, 6647, 7273, 9061, 9383, 10428, 10929, 11938, 12033, 12331, 12562,
+     13793, 14157, 14635, 15265, 15618, 16553, 16604, 18362, 18956, 20075,
+     21675, 22520, 26130, 26161, 26435, 28279, 29464, 31650, 32302, 32470,
+     36865, 42863, 47425, 49870, 50254, 50258, 50360, 50361, 50362
+   ],
+   "task": "transcribe",
+   "task_to_id": {
+     "transcribe": 50359,
+     "translate": 50358
+   },
+   "transformers_version": "4.48.0"
+ }
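
This config pins `language="greek"` and `task="transcribe"`, so `generate()` produces Greek transcriptions without extra arguments. A minimal sketch of that behavior (the hub id is an assumption inferred from the repository name):

```python
import numpy as np
from transformers import WhisperForConditionalGeneration, WhisperProcessor

repo = "kostissz/whisper-small-el"  # hypothetical hub id; adjust as needed
model = WhisperForConditionalGeneration.from_pretrained(repo)
processor = WhisperProcessor.from_pretrained(repo)

# Placeholder input: 1 s of silence; substitute real 16 kHz mono speech.
audio = np.zeros(16000, dtype=np.float32)

inputs = processor(audio, sampling_rate=16000, return_tensors="pt")
ids = model.generate(inputs.input_features)  # language/task come from generation_config
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```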
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:8afed759ffde7def79dbf0bc0b823d09f6808067a5c3e37eb4e7a3fa65c5e370
+ oid sha256:f05a51252267ae5d556a4915b949f562f385872d7ca471eff1ca30f9bf7b7010
  size 966995080
runs/Feb17_16-22-37_kostis/events.out.tfevents.1739837023.kostis.99292.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:308a2e4ae6ac486b75554969f581e21d63369a810d6ab68db4a307b3b11c48fc
+ size 472