Wiki Historian En
Guess the masked date in a Wikipedia article. [ver. 1.0.0]
This is the full list of all submissions; if you want to see only the best, click leaderboard. Illustrative sketches of the scoring rule (just below) and of the main approach families (after the table) are included.
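The *-Against-Interval metrics in the table score a predicted date against a gold interval rather than a single point. A minimal sketch of that rule, assuming (this is an interpretation, not the challenge's official geval implementation) that the error is zero when the prediction falls inside the interval and is the distance to the nearest endpoint otherwise:

```python
from math import sqrt

def error_against_interval(pred: float, lo: float, hi: float) -> float:
    """Zero inside [lo, hi]; otherwise distance to the nearest endpoint."""
    if pred < lo:
        return lo - pred
    if pred > hi:
        return pred - hi
    return 0.0

def mae_rmse(preds, intervals):
    errors = [error_against_interval(p, lo, hi) for p, (lo, hi) in zip(preds, intervals)]
    mae = sum(errors) / len(errors)
    rmse = sqrt(sum(e * e for e in errors) / len(errors))
    return mae, rmse

# One hit inside the interval, one miss 5 years past the upper bound:
print(mae_rmse([1950.0, 2005.0], [(1949.0, 1951.0), (1995.0, 2000.0)]))  # (2.5, ~3.54)
```

On this reading, MAE averages the interval distances while RMSE penalizes large misses more heavily, which is consistent with RMSE running a few years above MAE in every row.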
# | submitter | when | ver. | description | dev-0 MAE-Against-Interval | dev-0 RMSE-Against-Interval | test-A MAE-Against-Interval | test-A RMSE-Against-Interval
---|---|---|---|---|---|---|---|---
11 | kubapok | 2023-06-01 11:13 | 1.0.0 | test | 23.12 | 27.32 | 23.35 | 27.51
10 | Jakub Pokrywka | 2023-02-15 16:17 | 1.0.0 | test3 | 23.12 | 27.32 | 23.35 | 27.51
9 | Jakub Pokrywka | 2023-02-15 16:16 | 1.0.0 | test2 | 23.12 | 27.32 | 23.35 | 27.51
8 | kubapok | 2023-02-06 12:54 | 1.0.0 | test | 23.12 | 27.32 | 23.35 | 27.51
2 | kubapok | 2022-07-20 19:59 | 1.0.0 | hf roberta large transformer roberta | 18.97 | 24.07 | 19.13 | 24.25
4 | kubapok | 2022-07-18 20:33 | 1.0.0 | hf linear layer, regular roberta fine-tuned first on roberta challam year prediction, then on wiki historian transformer roberta | 19.12 | 24.40 | 19.28 | 24.59
5 | kubapok | 2022-07-18 20:22 | 1.0.0 | hf linear layer, roberta challam fine-tuned first on roberta challam year prediction, then on wiki historian transformer roberta roberta-challam | 19.74 | 25.00 | 20.02 | 25.31
6 | kubapok | 2022-07-14 19:18 | 1.0.0 | hf challam roberta, regression layer on top transformer roberta huggingface-transformers | 20.13 | 25.52 | 20.48 | 25.86
1 | kubapok | 2022-07-11 16:37 | 1.0.0 | hf roberta large, epoch 4 (epochs 5 and 6 with no progress) transformer roberta huggingface-transformers | 18.79 | 23.99 | 19.01 | 24.22
3 | kubapok | 2022-07-04 18:33 | 1.0.0 | hf roberta base, regression layer on top transformer roberta roberta-base huggingface-transformers | 19.28 | 24.36 | 19.46 | 24.56
13 | kubapok | 2022-07-03 16:51 | 1.0.0 | tf-idf with linear regression linear-regression tf-idf | 21.39 | 28.92 | 21.53 | 29.17
16 | kaczla | 2022-05-22 20:00 | 1.0.0 | MLM - prediction via loss over all available dates evaluation_type=loss model=roberta-base transformer huggingface-transformers transformer-encoder lm-loss | 23.56 | 31.99 | 23.69 | 32.12
14 | kaczla | 2022-05-22 20:00 | 1.0.0 | MLM - prediction via loss over all available dates evaluation_type=loss model=roberta-large transformer huggingface-transformers transformer-encoder lm-loss | 21.63 | 29.82 | 21.90 | 30.28
15 | kaczla | 2022-05-22 19:46 | 1.0.0 | MLM - prediction of only 1 token evaluation_type=mlm max_tokens=1 model=roberta-base transformer huggingface-transformers transformer-encoder mlm | 22.45 | 31.09 | 22.51 | 31.23
12 | kaczla | 2022-05-22 19:46 | 1.0.0 | MLM - prediction of only 1 token evaluation_type=mlm max_tokens=1 model=roberta-large transformer huggingface-transformers transformer-encoder mlm | 19.95 | 28.37 | 19.90 | 28.49
7 | kubapok | 2022-02-12 18:48 | 1.0.0 | mean from train baseline | 23.12 | 27.32 | 23.35 | 27.51
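For orientation, the two simplest entries are easy to reproduce in spirit. A sketch of the mean-from-train baseline (#7) and the tf-idf + linear regression submission (#13), with a toy corpus standing in for the real splits (the actual feature and model settings are not recorded in the table):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins; the real challenge ships train/dev-0/test-A splits.
train_texts = ["the treaty was signed in the spring", "the album charted worldwide"]
train_years = [1951.0, 1987.0]
test_texts = ["the senator opened the session"]

# Row #7: every test document gets the mean of the training dates.
mean_year = sum(train_years) / len(train_years)
print([mean_year for _ in test_texts])  # -> [1969.0]

# Row #13: tf-idf features fed into ordinary least squares.
model = make_pipeline(TfidfVectorizer(), LinearRegression())
model.fit(train_texts, train_years)
print(model.predict(test_texts))
```

Note that the tf-idf rows beat the mean baseline on MAE but not on RMSE, i.e. the regression gets many documents closer while occasionally missing badly.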
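The strongest rows (#1-#6) attach a regression head to RoBERTa. In Hugging Face transformers that is essentially a configuration choice; a minimal sketch, assuming dates are normalized to a small numeric range before training (the submissions' exact preprocessing, epochs, and learning rates are not given in the table):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# num_labels=1 + problem_type="regression" gives a single-output head with MSE loss.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=1, problem_type="regression"
)

texts = ["The city hosted the games that summer."]  # stand-in for an article
targets = torch.tensor([[0.62]])  # hypothetical: year scaled into [0, 1]

batch = tokenizer(texts, return_tensors="pt", truncation=True)
outputs = model(**batch, labels=targets)
outputs.loss.backward()        # MSE loss, ready for an optimizer step
print(outputs.logits.shape)    # (1, 1): one predicted (scaled) date per document
```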
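kaczla's rows (#12, #14-#16) instead use the pretrained masked LM with no fine-tuning: substitute each candidate date, mask it, and keep the date the model finds least surprising. A sketch of the evaluation_type=loss idea (the original scripts may well differ; the mlm max_tokens=1 variant instead reads the single best token straight off the mask position):

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")
model.eval()

# Hypothetical context with a date slot; the challenge supplies real articles.
template = "The bridge was completed in {}."
candidates = [str(year) for year in range(1900, 2021)]

@torch.no_grad()
def date_loss(year: str) -> float:
    """MLM loss of the context with `year` filled in and then masked out."""
    ids = tokenizer(template.format(year), return_tensors="pt").input_ids
    year_ids = tokenizer(" " + year, add_special_tokens=False).input_ids
    labels = torch.full_like(ids, -100)  # -100 = position ignored by the loss
    # Find the (contiguous) year tokens, expose them to the loss, mask the input.
    for i in range(ids.size(1) - len(year_ids) + 1):
        if ids[0, i : i + len(year_ids)].tolist() == year_ids:
            labels[0, i : i + len(year_ids)] = ids[0, i : i + len(year_ids)]
            ids[0, i : i + len(year_ids)] = tokenizer.mask_token_id
            break
    return model(input_ids=ids, labels=labels).loss.item()

print(min(candidates, key=date_loss))  # the date the model is least surprised by
```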