# Submission results

## Leaderboard - PR-AUC-macro

| # | Team | Run | PR-AUC-macro | ROC-AUC-macro | External data |
|---|------|-----|--------------|---------------|---------------|
| 1 | CP-JKU | 3_avg_ensemble | 0.154609 | 0.772913 | - |
| 2 | CP-JKU | 1_shakefaresnet | 0.148037 | 0.771685 | - |
| 3 | CP-JKU | 2_faresnet | 0.146333 | 0.75746 | - |
| 4 | AMLAG | submission2 | 0.125896 | 0.752886 | - |
| 5 | YL-UTokyo | run1 | 0.125564 | 0.753199 | - |
| 6 | AMLAG | submission1 | 0.118306 | 0.732416 | - |
| 7 | AugLi | all | 0.117466 | 0.742474 | Audioset |
| 8 | CP-JKU | 4_crnn | 0.117181 | 0.738059 | - |
| 9 | AIT-DIL | run1 | 0.112613 | 0.719132 | - |
| 10 | TaiInn (Innsbruck) | run_2 | 0.110331 | 0.718616 | - |
| 11 | TaiInn (Innsbruck) | run_3 | 0.110326 | 0.723038 | - |
| 12 | baseline | vggish | 0.107734 | 0.725821 | - |
| 13 | Taiinn (Taiwan) | run_4_fixed_vqvae+cnn | 0.107682 | 0.720728 | MSD |
| 14 | AugLi | ds-all | 0.103896 | 0.72605 | Audioset |
| 15 | Taiinn (Taiwan) | run_3_fixed_vqvae+gru | 0.103717 | 0.714068 | MSD |
| 16 | CP-JKU | 5_resnet34 | 0.102063 | 0.716824 | - |
| 17 | AugLi | crnn | 0.099903 | 0.706619 | Audioset |
| 18 | Taiinn (Taiwan) | run_2_vqvae+cnn | 0.099407 | 0.71468 | MSD |
| 19 | Taiinn (Taiwan) | run_1_vqvae+gru | 0.098424 | 0.710325 | MSD |
| 20 | AugLi | ds-5s | 0.098092 | 0.716297 | Audioset |
| 21 | AugLi | ds-1s | 0.09725 | 0.714664 | Audioset |
| 22 | TaiInn (Innsbruck) | run_4 | 0.089701 | 0.685237 | - |
| 23 | TaiInn (Innsbruck) | run_5 | 0.089121 | 0.683924 | - |
| 24 | Taiinn (Taiwan) | run_5_one_hot_gru | 0.08602 | 0.691638 | MSD |
| 25 | TaiInn (Innsbruck) | run_1 | 0.079503 | 0.699827 | - |
| 26 | MCLAB-CCU | h03 | 0.03419 | 0.501415 | - |
| 27 | MCLAB-CCU | h01 | 0.033271 | 0.489148 | - |
| 28 | MCLAB-CCU | h02 | 0.033215 | 0.493727 | - |
| 29 | baseline | popular | 0.031924 | 0.5 | - |
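The leaderboards report both macro variants (the metric is computed per tag, then averaged over tags) and micro variants (all track–tag decisions are pooled into one binary problem). As a toy illustration of the two averaging modes on hypothetical data — this is not the challenge's evaluation code — scikit-learn exposes both through its `average` parameter:

```python
import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score, f1_score

# Toy multi-label data: 4 tracks, 3 mood/theme tags (hypothetical values).
y_true = np.array([[1, 0, 0],
                   [1, 1, 0],
                   [0, 0, 1],
                   [0, 1, 1]])
y_score = np.array([[0.9, 0.2, 0.1],
                    [0.8, 0.7, 0.3],
                    [0.2, 0.1, 0.6],
                    [0.1, 0.6, 0.8]])

# Macro: compute the metric per tag (column), then average over tags.
pr_auc_macro = average_precision_score(y_true, y_score, average='macro')
roc_auc_macro = roc_auc_score(y_true, y_score, average='macro')

# Micro: pool all (track, tag) pairs into a single binary ranking problem.
pr_auc_micro = average_precision_score(y_true, y_score, average='micro')

# F-score needs hard decisions; threshold at 0.5 purely for illustration.
y_pred = (y_score >= 0.5).astype(int)
f_macro = f1_score(y_true, y_pred, average='macro')
f_micro = f1_score(y_true, y_pred, average='micro')
```

Macro averaging weights rare tags equally with common ones, which is why macro scores in the tables sit well below their micro counterparts for most runs.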

## Leaderboard - F-score-macro

| # | Team | Run | F-score-macro | External data |
|---|------|-----|---------------|---------------|
| 1 | CP-JKU | 3_avg_ensemble | 0.212419 | - |
| 2 | CP-JKU | 1_shakefaresnet | 0.208304 | - |
| 3 | CP-JKU | 2_faresnet | 0.203541 | - |
| 4 | YL-UTokyo | run1 | 0.185618 | - |
| 5 | AMLAG | submission2 | 0.182957 | - |
| 6 | AugLi | all | 0.175019 | Audioset |
| 7 | CP-JKU | 4_crnn | 0.173994 | - |
| 8 | baseline | vggish | 0.165694 | - |
| 9 | AugLi | ds-all | 0.16222 | Audioset |
| 10 | CP-JKU | 5_resnet34 | 0.157927 | - |
| 11 | AugLi | ds-5s | 0.157219 | Audioset |
| 12 | AugLi | ds-1s | 0.155207 | Audioset |
| 13 | AugLi | crnn | 0.154044 | Audioset |
| 14 | AMLAG | submission1 | 0.151891 | - |
| 15 | Taiinn (Taiwan) | run_1_vqvae+gru | 0.118294 | MSD |
| 16 | TaiInn (Innsbruck) | run_2 | 0.114185 | - |
| 17 | TaiInn (Innsbruck) | run_3 | 0.111407 | - |
| 18 | Taiinn (Taiwan) | run_4_fixed_vqvae+cnn | 0.10683 | MSD |
| 19 | TaiInn (Innsbruck) | run_1 | 0.105663 | - |
| 20 | TaiInn (Innsbruck) | run_5 | 0.101683 | - |
| 21 | TaiInn (Innsbruck) | run_4 | 0.101493 | - |
| 22 | Taiinn (Taiwan) | run_2_vqvae+cnn | 0.101266 | MSD |
| 23 | Taiinn (Taiwan) | run_3_fixed_vqvae+gru | 0.090136 | MSD |
| 24 | Taiinn (Taiwan) | run_5_one_hot_gru | 0.088374 | MSD |
| 25 | MCLAB-CCU | h01 | 0.037114 | - |
| 26 | MCLAB-CCU | h03 | 0.027733 | - |
| 27 | AIT-DIL | run1 | 0.021825 | - |
| 28 | MCLAB-CCU | h02 | 0.006186 | - |
| 29 | baseline | popular | 0.002642 | - |

## Precision vs recall (macro)

## All submissions

### AIT-DIL

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| run1 | 0.112613 | 0.719132 | 0.021825 | 0.115355 | 0.013485 | 0.132072 | 0.763171 | 0.034736 | 0.645933 | 0.017848 |

### AMLAG

Source code: https://github.com/sainathadapa/mediaeval-2019-moodtheme-detection

Paper: https://arxiv.org/abs/1911.07041

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| submission1 | 0.118306 | 0.732416 | 0.151891 | 0.135673 | 0.306015 | 0.150605 | 0.784128 | 0.152349 | 0.098133 | 0.340428 |
| submission2 | 0.125896 | 0.752886 | 0.182957 | 0.145545 | 0.39164 | 0.151706 | 0.797624 | 0.164375 | 0.10135 | 0.434691 |

### AugLi

Source code: https://github.com/amirip/AugLi-MediaEval

Paper: http://livrepository.liverpool.ac.uk/id/eprint/3056460

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| all | 0.117466 | 0.742474 | 0.175019 | 0.144178 | 0.377937 | 0.130642 | 0.7837 | 0.15352 | 0.093847 | 0.421602 |
| crnn | 0.099903 | 0.706619 | 0.154044 | 0.129976 | 0.324591 | 0.108971 | 0.750611 | 0.149403 | 0.092015 | 0.397012 |
| ds-1s | 0.09725 | 0.714664 | 0.155207 | 0.123003 | 0.34234 | 0.111449 | 0.770016 | 0.150942 | 0.093186 | 0.397012 |
| ds-5s | 0.098092 | 0.716297 | 0.157219 | 0.127849 | 0.332327 | 0.116025 | 0.769734 | 0.152561 | 0.094899 | 0.388815 |
| ds-all | 0.103896 | 0.72605 | 0.16222 | 0.134094 | 0.352494 | 0.119108 | 0.775801 | 0.154919 | 0.09567 | 0.406928 |

### CP-JKU

Source code: https://github.com/kkoutini/cpjku_dcase19, https://gitlab.cp.jku.at/shreyan/moodwalk

Paper: https://arxiv.org/abs/1911.05833

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| 1_shakefaresnet | 0.148037 | 0.771685 | 0.208304 | 0.173265 | 0.411417 | 0.177298 | 0.814717 | 0.190371 | 0.119496 | 0.467874 |
| 2_faresnet | 0.146333 | 0.75746 | 0.203541 | 0.173915 | 0.367692 | 0.168327 | 0.802426 | 0.185605 | 0.11812 | 0.432972 |
| 3_avg_ensemble | 0.154609 | 0.772913 | 0.212419 | 0.190088 | 0.400924 | 0.17785 | 0.815949 | 0.189553 | 0.11919 | 0.462718 |
| 4_crnn | 0.117181 | 0.738059 | 0.173994 | 0.141352 | 0.350112 | 0.14534 | 0.788761 | 0.164905 | 0.102799 | 0.416579 |
| 5_resnet34 | 0.102063 | 0.716824 | 0.157927 | 0.127419 | 0.375276 | 0.118358 | 0.770417 | 0.141515 | 0.085169 | 0.418165 |

### MCLAB-CCU

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| h01 | 0.033271 | 0.489148 | 0.037114 | 0.022906 | 0.411852 | 0.037396 | 0.549352 | 0.068349 | 0.036795 | 0.479905 |
| h02 | 0.033215 | 0.493727 | 0.006186 | 0.004721 | 0.121737 | 0.028148 | 0.470795 | 0.031369 | 0.019844 | 0.074828 |
| h03 | 0.03419 | 0.501415 | 0.027733 | 0.01763 | 0.389209 | 0.033155 | 0.525493 | 0.060564 | 0.032752 | 0.401507 |

### TaiInn (Innsbruck)

Source code: https://github.com/dbis-uibk/MediaEval2019

Paper: https://evazangerle.at/publication/mediaeval-19-inn/mediaeval-19-inn.pdf

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| run_1 | 0.079503 | 0.699827 | 0.105663 | 0.060806 | 0.674412 | 0.102921 | 0.758694 | 0.103483 | 0.055992 | 0.68165 |
| run_2 | 0.110331 | 0.718616 | 0.114185 | 0.066914 | 0.686569 | 0.137839 | 0.757217 | 0.103702 | 0.056172 | 0.674114 |
| run_3 | 0.110326 | 0.723038 | 0.111407 | 0.06475 | 0.70626 | 0.117635 | 0.762722 | 0.107614 | 0.058302 | 0.697911 |
| run_4 | 0.089701 | 0.685237 | 0.101493 | 0.059276 | 0.639675 | 0.105328 | 0.737998 | 0.094619 | 0.05123 | 0.618191 |
| run_5 | 0.089121 | 0.683924 | 0.101683 | 0.060113 | 0.613251 | 0.10323 | 0.741256 | 0.099332 | 0.053984 | 0.620968 |

### Taiinn (Taiwan)

Source code: https://github.com/annahung31/moodtheme-tagging

Paper: https://evazangerle.at/publication/mediaeval-19-tai/mediaeval-19-tai.pdf

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| run_1_vqvae+gru | 0.098424 | 0.710325 | 0.118294 | 0.089805 | 0.36059 | 0.113265 | 0.761646 | 0.143879 | 0.084658 | 0.478847 |
| run_2_vqvae+cnn | 0.099407 | 0.71468 | 0.101266 | 0.080929 | 0.319114 | 0.115963 | 0.764125 | 0.123313 | 0.075021 | 0.346113 |
| run_3_fixed_vqvae+gru | 0.103717 | 0.714068 | 0.090136 | 0.066307 | 0.325505 | 0.124468 | 0.76873 | 0.118401 | 0.070259 | 0.376124 |
| run_4_fixed_vqvae+cnn | 0.107682 | 0.720728 | 0.10683 | 0.079105 | 0.309367 | 0.122313 | 0.769252 | 0.152157 | 0.09192 | 0.441433 |
| run_5_one_hot_gru | 0.08602 | 0.691638 | 0.088374 | 0.063746 | 0.363041 | 0.105585 | 0.75059 | 0.120869 | 0.069062 | 0.483739 |

### YL-UTokyo

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| run1 | 0.125564 | 0.753199 | 0.185618 | 0.160227 | 0.345181 | 0.145698 | 0.797587 | 0.177358 | 0.113482 | 0.405738 |

### baseline

Source code: https://github.com/MTG/mtg-jamendo-dataset

| Run | PR-AUC-macro | ROC-AUC-macro | F-score-macro | precision-macro | recall-macro | PR-AUC-micro | ROC-AUC-micro | F-score-micro | precision-micro | recall-micro |
|-----|---|---|---|---|---|---|---|---|---|---|
| popular | 0.031924 | 0.5 | 0.002642 | 0.001427 | 0.017857 | 0.034067 | 0.513856 | 0.057312 | 0.079887 | 0.044685 |
| vggish | 0.107734 | 0.725821 | 0.165694 | 0.138216 | 0.30865 | 0.140913 | 0.775029 | 0.177133 | 0.116097 | 0.37348 |
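The `popular` baseline's ROC-AUC-macro of exactly 0.5 is what a constant per-tag prediction produces: if every track receives the same score for a given tag, tracks cannot be ranked within that tag. A minimal sketch of such a popularity baseline on hypothetical data (the actual baseline implementation is in the repository linked above, and may differ in detail):

```python
import numpy as np

# Hypothetical training labels: 5 tracks, 3 tags.
train_labels = np.array([[1, 1, 0],
                         [1, 0, 0],
                         [1, 1, 0],
                         [0, 0, 1],
                         [1, 0, 0]])

# Each tag's training-set frequency becomes its constant score.
tag_freq = train_labels.mean(axis=0)      # fraction of tracks carrying each tag

# Every test track gets the identical score vector.
n_test = 4
y_score = np.tile(tag_freq, (n_test, 1))
```

Because the score column for each tag is constant, per-tag ROC-AUC is 0.5 by definition, which matches the `popular` row in the table.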