# ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-2_8k
This model is a fine-tuned version of gary109/ai-light-dance_drums_pretrain_wav2vec2-base-new on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-2 dataset.
Per-epoch results on the evaluation set are reported in the training results table below.
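The base model uses a CTC head, so its per-frame predictions are turned into a label sequence by the standard CTC greedy-decoding rule: collapse consecutive repeats, then drop blanks. The sketch below illustrates that rule in plain Python; the token ids and the blank id are illustrative assumptions, not values read from this checkpoint's vocabulary.

```python
# Greedy CTC decoding sketch: how per-frame argmax ids from a
# Wav2Vec2-style CTC head become a label sequence.
# BLANK_ID and the example ids are illustrative assumptions,
# not the actual vocabulary of this checkpoint.

BLANK_ID = 0  # CTC blank token (assumed id)

def ctc_greedy_decode(frame_ids):
    """Collapse consecutive repeats, then drop blanks (standard CTC rule)."""
    out = []
    prev = None
    for i in frame_ids:
        if i != prev and i != BLANK_ID:
            out.append(i)
        prev = i
    return out

# Frames predicting e.g. "kick kick <blank> snare snare <blank> kick":
print(ctc_greedy_decode([2, 2, 0, 3, 3, 0, 2]))  # → [2, 3, 2]
```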
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 30
- num_epochs: 100.0
- mixed_precision_training: Native AMP
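With a `linear` scheduler and 30 warmup steps, the learning rate ramps from 0 to 3e-4 over the first 30 optimizer steps, then decays linearly to 0 by the final step (900 in this run, per the results table). A pure-Python sketch of that schedule, written to mirror the usual linear-with-warmup formula (stated here as an assumption, not extracted from the training script):

```python
# Linear warmup + linear decay, as configured above:
# lr rises from 0 to 3e-4 over 30 warmup steps, then decays
# to 0 by the last training step (900 total steps in this run).
LR = 3e-4
WARMUP = 30
TOTAL = 900

def lr_at(step):
    """Learning rate at a given optimizer step (assumed formula)."""
    if step < WARMUP:
        return LR * step / WARMUP
    return LR * max(0.0, (TOTAL - step) / (TOTAL - WARMUP))

print(lr_at(0))    # 0.0
print(lr_at(30))   # peak: 0.0003
print(lr_at(900))  # 0.0
```

Note also that the listed `total_train_batch_size` of 16 is simply `train_batch_size` (4) times `gradient_accumulation_steps` (4).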
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log        | 1.0   | 9    | 101.8046        | 0.98   |
| 17.4958       | 2.0   | 18   | 82.4920         | 1.0    |
| 16.2087       | 3.0   | 27   | 36.1388         | 1.0    |
| 6.2942        | 4.0   | 36   | 8.3267          | 1.0    |
| 2.0411        | 5.0   | 45   | 6.8215          | 1.0    |
| 1.554         | 6.0   | 54   | 5.3847          | 1.0    |
| 1.6215        | 7.0   | 63   | 4.4645          | 1.0    |
| 1.4962        | 8.0   | 72   | 3.2211          | 1.0    |
| 1.3825        | 9.0   | 81   | 2.5513          | 1.0    |
| 1.3443        | 10.0  | 90   | 2.8582          | 1.0    |
| 1.3443        | 11.0  | 99   | 2.5446          | 1.0    |
| 1.3096        | 12.0  | 108  | 2.0211          | 0.9956 |
| 1.3361        | 13.0  | 117  | 1.8110          | 0.9944 |
| 1.2862        | 14.0  | 126  | 1.7796          | 0.9933 |
| 1.2556        | 15.0  | 135  | 1.7301          | 0.9922 |
| 1.1959        | 16.0  | 144  | 1.4245          | 0.9989 |
| 1.1161        | 17.0  | 153  | 1.1932          | 0.5678 |
| 0.8853        | 18.0  | 162  | 1.2726          | 0.4922 |
| 0.7996        | 19.0  | 171  | 1.0841          | 0.5511 |
| 0.8165        | 20.0  | 180  | 1.4062          | 0.4411 |
| 0.8165        | 21.0  | 189  | 1.4219          | 0.3367 |
| 0.6807        | 22.0  | 198  | 1.2107          | 0.3344 |
| 0.7315        | 23.0  | 207  | 1.1420          | 0.3189 |
| 0.6203        | 24.0  | 216  | 1.0770          | 0.3778 |
| 0.6552        | 25.0  | 225  | 1.1095          | 0.3789 |
| 0.5618        | 26.0  | 234  | 1.0004          | 0.3478 |
| 0.5311        | 27.0  | 243  | 0.8811          | 0.3311 |
| 0.5391        | 28.0  | 252  | 0.8163          | 0.3678 |
| 0.5275        | 29.0  | 261  | 1.0000          | 0.3311 |
| 0.4965        | 30.0  | 270  | 0.7320          | 0.37   |
| 0.4965        | 31.0  | 279  | 0.9643          | 0.3389 |
| 0.4909        | 32.0  | 288  | 0.7663          | 0.3589 |
| 0.5218        | 33.0  | 297  | 0.9004          | 0.3489 |
| 0.4991        | 34.0  | 306  | 0.7342          | 0.38   |
| 0.4883        | 35.0  | 315  | 0.7959          | 0.3389 |
| 0.4902        | 36.0  | 324  | 0.6892          | 0.3378 |
| 0.4447        | 37.0  | 333  | 0.6480          | 0.3333 |
| 0.4458        | 38.0  | 342  | 0.6198          | 0.3333 |
| 0.4607        | 39.0  | 351  | 0.6081          | 0.3111 |
| 0.4352        | 40.0  | 360  | 0.6748          | 0.3156 |
| 0.4352        | 41.0  | 369  | 0.6885          | 0.3256 |
| 0.4286        | 42.0  | 378  | 0.6806          | 0.3333 |
| 0.4314        | 43.0  | 387  | 0.7855          | 0.3222 |
| 0.4476        | 44.0  | 396  | 0.6569          | 0.3144 |
| 0.4815        | 45.0  | 405  | 0.5389          | 0.3033 |
| 0.36          | 46.0  | 414  | 0.5550          | 0.3011 |
| 0.4516        | 47.0  | 423  | 0.5924          | 0.3144 |
| 0.3682        | 48.0  | 432  | 0.7275          | 0.3056 |
| 0.4371        | 49.0  | 441  | 0.7051          | 0.3089 |
| 0.4004        | 50.0  | 450  | 0.5669          | 0.3078 |
| 0.4004        | 51.0  | 459  | 0.5029          | 0.3178 |
| 0.3298        | 52.0  | 468  | 0.6150          | 0.32   |
| 0.4083        | 53.0  | 477  | 0.5882          | 0.33   |
| 0.4022        | 54.0  | 486  | 0.7253          | 0.3144 |
| 0.4465        | 55.0  | 495  | 0.6808          | 0.3111 |
| 0.3955        | 56.0  | 504  | 0.6002          | 0.3133 |
| 0.3877        | 57.0  | 513  | 0.7593          | 0.3056 |
| 0.3486        | 58.0  | 522  | 0.6764          | 0.3189 |
| 0.3782        | 59.0  | 531  | 0.6772          | 0.3133 |
| 0.3599        | 60.0  | 540  | 0.8846          | 0.3111 |
| 0.3599        | 61.0  | 549  | 0.9458          | 0.3233 |
| 0.3424        | 62.0  | 558  | 0.8399          | 0.3233 |
| 0.3652        | 63.0  | 567  | 0.8266          | 0.3133 |
| 0.3327        | 64.0  | 576  | 0.7813          | 0.3078 |
| 0.3603        | 65.0  | 585  | 0.8066          | 0.3156 |
| 0.3401        | 66.0  | 594  | 0.7960          | 0.3067 |
| 0.3797        | 67.0  | 603  | 0.8513          | 0.2989 |
| 0.3353        | 68.0  | 612  | 0.8319          | 0.2722 |
| 0.3909        | 69.0  | 621  | 0.8244          | 0.2878 |
| 0.3263        | 70.0  | 630  | 0.9539          | 0.3022 |
| 0.3263        | 71.0  | 639  | 1.0030          | 0.2922 |
| 0.3102        | 72.0  | 648  | 0.9875          | 0.3044 |
| 0.3577        | 73.0  | 657  | 0.9030          | 0.2978 |
| 0.2953        | 74.0  | 666  | 0.9392          | 0.2889 |
| 0.3644        | 75.0  | 675  | 0.9089          | 0.2878 |
| 0.3231        | 76.0  | 684  | 0.9264          | 0.2844 |
| 0.3078        | 77.0  | 693  | 1.0536          | 0.2911 |
| 0.4503        | 78.0  | 702  | 0.9473          | 0.2967 |
| 0.3492        | 79.0  | 711  | 0.8909          | 0.3089 |
| 0.347         | 80.0  | 720  | 0.8532          | 0.3067 |
| 0.347         | 81.0  | 729  | 0.9553          | 0.2833 |
| 0.2949        | 82.0  | 738  | 1.0111          | 0.2867 |
| 0.3447        | 83.0  | 747  | 0.9160          | 0.3011 |
| 0.2878        | 84.0  | 756  | 0.8401          | 0.2989 |
| 0.3229        | 85.0  | 765  | 0.8815          | 0.2911 |
| 0.276         | 86.0  | 774  | 0.8802          | 0.2911 |
| 0.3469        | 87.0  | 783  | 0.9121          | 0.29   |
| 0.3044        | 88.0  | 792  | 0.8934          | 0.2933 |
| 0.2885        | 89.0  | 801  | 0.8806          | 0.2967 |
| 0.3365        | 90.0  | 810  | 0.9037          | 0.2844 |
| 0.3365        | 91.0  | 819  | 0.9218          | 0.2867 |
| 0.3239        | 92.0  | 828  | 0.9228          | 0.2844 |
| 0.3219        | 93.0  | 837  | 0.9167          | 0.2844 |
| 0.2736        | 94.0  | 846  | 0.9495          | 0.2878 |
| 0.3587        | 95.0  | 855  | 0.9997          | 0.2844 |
| 0.3386        | 96.0  | 864  | 0.9977          | 0.2856 |
| 0.2895        | 97.0  | 873  | 0.9964          | 0.2889 |
| 0.3496        | 98.0  | 882  | 0.9765          | 0.2889 |
| 0.2789        | 99.0  | 891  | 0.9713          | 0.2878 |
| 0.3284        | 100.0 | 900  | 0.9687          | 0.2889 |
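The Wer column is word error rate: the word-level Levenshtein edit distance between hypothesis and reference, divided by the reference length. A minimal pure-Python sketch of that standard definition (the drum-label strings are made-up illustrations):

```python
# Word error rate: Levenshtein edit distance over words,
# normalized by reference length (standard WER definition).
def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / len(ref)

# One substitution out of four reference words -> 0.25
print(wer("kick snare kick hihat", "kick snare snare hihat"))  # 0.25
```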
### Framework versions
- Transformers 4.25.0.dev0
- Pytorch 1.8.1+cu111
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2