"He Said She Said" classification challenge (2nd edition)

Give the probability that a text in Polish was written by a man. [ver. 2.0.1]

Git repo URL: git://gonito.net/petite-difference-challenge2 / Branch: master

(Browse at https://gonito.net/gitlist/petite-difference-challenge2.git/master)
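The task asks for P(male | text) for each Polish text, i.e. one probability per line. As a purely illustrative sketch (not any submitter's actual method, and trained here on made-up placeholder sentences rather than challenge data), a tiny word-level Naive Bayes producing such probabilities could look like:

```python
import math
from collections import Counter

def train(texts_male, texts_female):
    # Count word occurrences per class; vocabulary is the union of both.
    counts = {"m": Counter(), "f": Counter()}
    for t in texts_male:
        counts["m"].update(t.lower().split())
    for t in texts_female:
        counts["f"].update(t.lower().split())
    vocab = set(counts["m"]) | set(counts["f"])
    return counts, vocab

def prob_male(text, counts, vocab, prior_male=0.5):
    # Naive Bayes in the log domain with add-one smoothing.
    log_m = math.log(prior_male)
    log_f = math.log(1.0 - prior_male)
    tot_m = sum(counts["m"].values()) + len(vocab)
    tot_f = sum(counts["f"].values()) + len(vocab)
    for w in text.lower().split():
        if w not in vocab:
            continue
        log_m += math.log((counts["m"][w] + 1) / tot_m)
        log_f += math.log((counts["f"][w] + 1) / tot_f)
    # Normalise the two joint log-probabilities into P(male | text).
    return 1.0 / (1.0 + math.exp(log_f - log_m))
```

A constant output of 0.5 per line corresponds to the null model in the leaderboard below.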

Leaderboard

# | submitter | when | ver. | description [tags] | test-A Accuracy | test-A Likelihood | ×
1 | kaczla | 2020-07-02 08:55 | 2.0.1 | Polish RoBERTa (base), epoch 5, seq_len 512, active dropout [fairseq roberta-pl] | 0.74332 | 0.62110 | 30
2 | kubapok | 2020-06-19 09:29 | 2.0.1 | pl roberta large active dropout avg 12 runs | 0.74406 | 0.61949 | 23
3 | Damian Litwin | 2020-05-24 14:52 | 2.0.1 | self-made NB with probs ISI-2019-063 [probabilities] | 0.65612 | 0.52537 | 5
4 | Mikolaj Bachorz | 2020-05-24 14:17 | 2.0.0 | v5 [probabilities] | 0.64427 | 0.52134 | 7
5 | [anonymised] | 2020-06-07 12:45 | 2.0.0 | XGBoost ready-made [ready-made xgboost] | 0.60039 | 0.52000 | 5
6 | p/tlen | 2020-05-23 21:29 | 2.0.0 | null model [null-model] | 0.50000 | 0.50000 | 17
7 | Jakub Stefko | 2020-06-03 17:39 | 2.0.0 | 2nd [logistic-regression word2vec] | 0.49299 | 0.49679 | 2
8 | Ivan Novgorodtsev | 2020-06-05 10:17 | 2.0.0 | svm [ready-made svm] | 0.59623 | 0.00000 | 1
Results by tags (all scores on test-A; Accuracy and Likelihood broken down by the -C, +C and +H subsets, then overall):

#  | tags | Acc -C | Acc +C | Acc +H | Lik -C | Lik +C | Lik +H | Accuracy | Likelihood
1  | fairseq roberta-pl | 0.74244 | 0.77159 | 0.77125 | 0.62032 | 0.64656 | 0.64332 | 0.74332 | 0.62110
2  | (none) | 0.74349 | 0.76227 | 0.77125 | 0.61881 | 0.64193 | 0.63970 | 0.74406 | 0.61949
3  | roberta-xlm | 0.70042 | 0.72571 | 0.71500 | 0.58222 | 0.60181 | 0.59521 | 0.70118 | 0.58280
4  | fairseq roberta | N/A | N/A | N/A | N/A | N/A | N/A | 0.69153 | 0.57068
5  | probabilities | 0.65572 | 0.66879 | 0.64500 | 0.52524 | 0.52950 | 0.52176 | 0.65612 | 0.52537
6  | ready-made xgboost | N/A | N/A | N/A | N/A | N/A | N/A | 0.60039 | 0.52000
7  | null-model | N/A | N/A | N/A | N/A | N/A | N/A | 0.50000 | 0.50000
8  | logistic-regression word2vec | N/A | N/A | N/A | N/A | N/A | N/A | 0.49299 | 0.49679
9  | baseline | N/A | N/A | N/A | N/A | N/A | N/A | 0.50000 | 0.48990
10 | ready-made svm | N/A | N/A | N/A | N/A | N/A | N/A | 0.59623 | 0.00000
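The Likelihood column appears to be GEval's likelihood metric, i.e. the geometric mean of the probabilities a submission assigns to the true labels (equivalently, exp of the negative mean log-loss). A minimal sketch under that assumption, not GEval itself:

```python
import math

def likelihood(p_male, truth):
    # p_male: predicted P(male) per text; truth: 1 for male, 0 for female.
    # Probability assigned to the true label of each text:
    probs = [p if y == 1 else 1.0 - p for p, y in zip(p_male, truth)]
    # Geometric mean = exp(mean log-probability).
    return math.exp(sum(math.log(p) for p in probs) / len(probs))
```

Under this definition a constant 0.5 prediction scores exactly 0.5, which matches the null-model row, and a single probability of 0.0 on a true label drives the score to 0, which would explain the 0.00000 for the hard-label SVM submission.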

Graphs by parameters

Score graphs (two per parameter) are available on the challenge page for the following training parameters:

- eval_batch_size
- evaluate_during_training_steps
- num_train_epochs
- save_steps
- seq_len
- train_batch_size