Challenging America year prediction

Guess the time when an excerpt was published [ver. 3.0.2]

Tags: challenging-america, diachronic
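
The task is regression: given a text excerpt, predict the (possibly fractional) year in which it was published. Submissions are ranked by root-mean-square error (RMSE) against the gold dates on the dev-0 and test-A splits; lower is better. A minimal sketch of the metric, assuming predictions and gold labels come as parallel lists of years (the function and its interface are illustrative, not part of the challenge tooling):

```python
import math

def rmse(predictions, gold):
    """Root-mean-square error between predicted and gold publication years."""
    assert len(predictions) == len(gold)
    return math.sqrt(sum((p - g) ** 2 for p, g in zip(predictions, gold)) / len(gold))

# An error of exactly 10 years on every excerpt gives an RMSE of 10,
# roughly the level of the strongest 3.0.0 entries below.
print(rmse([1920.5, 1875.0], [1930.5, 1885.0]))  # 10.0
```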

| # | submitter | when | ver. | description | tags | dev-0 RMSE | test-A RMSE |
|---|-----------|------|------|-------------|------|-----------:|------------:|
| 46 | kubapok | 2022-01-04 12:18 | 3.0.0 | hf roberta classification | | 10.79 | 11.39 |
| 7 | kubapok | 2021-12-24 14:15 | 3.0.0 | hf_roberta_base_as_in_ireland | | 10.30 | 12.02 |
| 4 | kubapok | 2021-12-23 12:09 | 3.0.0 | hf_roberta_base_as_in_ireland | | 12.44 | 12.54 |
| 45 | p/tlen | 2021-12-16 08:21 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=4, early-stopping=3, learning-rate=1.0e-6 | roberta, roberta-challam, huggingface-transformers | 11.03 | 10.80 |
| 44 | p/tlen | 2021-12-16 08:14 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=4, early-stopping=3, learning-rate=1.0e-6 | roberta, roberta-base, huggingface-transformers | 10.70 | 12.07 |
| 62 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=9 | roberta, roberta-challam, huggingface-transformers | 12.13 | 11.26 |
| 61 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=8 | roberta, roberta-challam, huggingface-transformers | 13.28 | 12.27 |
| 60 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=7 | roberta, roberta-challam, huggingface-transformers | 12.64 | 11.70 |
| 59 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=6 | roberta, roberta-challam, huggingface-transformers | 11.74 | 10.70 |
| 58 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=5 | roberta, roberta-challam, huggingface-transformers | 12.64 | 11.67 |
| 57 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=4 | roberta, roberta-challam, huggingface-transformers | 12.06 | 11.50 |
| 56 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=3 | roberta, roberta-challam, huggingface-transformers | 11.23 | 10.58 |
| 55 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=2 | roberta, roberta-challam, huggingface-transformers | 11.91 | 11.22 |
| 54 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=16 | roberta, roberta-challam, huggingface-transformers | 11.97 | 11.22 |
| 53 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=15 | roberta, roberta-challam, huggingface-transformers | 12.16 | 11.33 |
| 52 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=14 | roberta, roberta-challam, huggingface-transformers | 12.12 | 11.26 |
| 51 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=13 | roberta, roberta-challam, huggingface-transformers | 12.05 | 11.34 |
| 50 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=12 | roberta, roberta-challam, huggingface-transformers | 12.13 | 11.35 |
| 49 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=11 | roberta, roberta-challam, huggingface-transformers | 11.94 | 11.05 |
| 48 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=10 | roberta, roberta-challam, huggingface-transformers | 11.91 | 10.99 |
| 47 | p/tlen | 2021-12-15 15:15 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=1 | roberta, roberta-challam, huggingface-transformers | 11.44 | 10.83 |
| 43 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=9 | roberta, roberta-base, huggingface-transformers | 12.86 | 14.40 |
| 42 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=8 | roberta, roberta-base, huggingface-transformers | 12.07 | 13.44 |
| 41 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=7 | roberta, roberta-base, huggingface-transformers | 12.36 | 13.84 |
| 40 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=6 | roberta, roberta-base, huggingface-transformers | 13.77 | 15.21 |
| 39 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=5 | roberta, roberta-base, huggingface-transformers | 12.51 | 13.19 |
| 38 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=4 | roberta, roberta-base, huggingface-transformers | 15.97 | 16.70 |
| 37 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=3 | roberta, roberta-base, huggingface-transformers | 12.67 | 13.50 |
| 36 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=2 | roberta, roberta-base, huggingface-transformers | 14.18 | 14.46 |
| 35 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=11 | roberta, roberta-base, huggingface-transformers | 12.05 | 13.47 |
| 34 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=10 | roberta, roberta-base, huggingface-transformers | 11.33 | 12.70 |
| 33 | p/tlen | 2021-12-15 12:27 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=32, epochs=1 | roberta, roberta-base, huggingface-transformers | 14.80 | 14.81 |
| 32 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=9 | roberta, roberta-base, huggingface-transformers | 12.72 | 13.31 |
| 31 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=8 | roberta, roberta-base, huggingface-transformers | 12.91 | 13.66 |
| 30 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=7 | roberta, roberta-base, huggingface-transformers | 14.32 | 14.96 |
| 29 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=6 | roberta, roberta-base, huggingface-transformers | 14.02 | 14.43 |
| 28 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=5 | roberta, roberta-base, huggingface-transformers | 13.91 | 14.33 |
| 27 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=4 | roberta, roberta-base, huggingface-transformers | 12.34 | 12.53 |
| 26 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=3 | roberta, roberta-base, huggingface-transformers | 14.05 | 14.37 |
| 25 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=2 | roberta, roberta-base, huggingface-transformers | 16.06 | 16.21 |
| 24 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=16 | roberta, roberta-base, huggingface-transformers | 12.59 | 13.60 |
| 23 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=15 | roberta, roberta-base, huggingface-transformers | 12.65 | 13.67 |
| 22 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=14 | roberta, roberta-base, huggingface-transformers | 12.84 | 13.80 |
| 21 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=13 | roberta, roberta-base, huggingface-transformers | 13.48 | 14.30 |
| 20 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=12 | roberta, roberta-base, huggingface-transformers | 13.47 | 14.23 |
| 19 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=11 | roberta, roberta-base, huggingface-transformers | 13.44 | 14.26 |
| 18 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=10 | roberta, roberta-base, huggingface-transformers | 13.80 | 14.46 |
| 17 | p/tlen | 2021-12-15 10:08 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=1 | roberta, roberta-base, huggingface-transformers | 17.37 | 17.26 |
| 16 | p/tlen | 2021-12-14 14:22 | 3.0.0 | RoBERTa ChallAm fine-tuned, batch-size=96, epochs=16 | roberta, roberta-challam, huggingface-transformers | 11.81 | 11.63 |
| 15 | p/tlen | 2021-12-14 11:53 | 3.0.0 | RoBERTa Base fine-tuned, batch-size=96, epochs=16 | roberta, roberta-base, huggingface-transformers | 17.37 | 17.26 |
| 14 | kubapok | 2021-12-14 11:30 | 3.0.0 | hf roberta base epoch1 (fix) | | 12.60 | 13.40 |
| 12 | kubapok | 2021-12-13 12:31 | 3.0.0 | hf roberta base epoch1 | | 14.12 | 14.13 |
| 63 | kubapok | 2021-12-13 12:21 | 3.0.0 | hf challam robertabase without date (checkpoint 395000) epoch1 | | 12.40 | 11.33 |
| 6 | p/tlen | 2021-08-31 19:42 | 3.0.0 | Bare data | | 42.76 | 33.27 |
| 5 | p/tlen | 2021-08-27 18:05 | 3.0.0 | tfidf and linear regression, min-df=5 | linear-regression, scikit-learn, tf-idf | 21.75 | 21.49 |
| 3 | kaczla | 2021-08-27 15:54 | 3.0.0 | RoBERTa with linear regression on top [kubapok], model=roberta_large | roberta | 13.14 | 12.15 |
| 2 | kaczla | 2021-08-27 15:54 | 3.0.0 | RoBERTa with linear regression on top [kubapok], model=roberta_base | roberta | 11.51 | 11.63 |
| 1 | p/tlen | 2021-08-27 09:01 | 3.0.0 | mean from train | null-model, baseline | 37.46 | 29.04 |
| 13 | kubapok | 2021-07-11 16:43 | 2.0.0 | roberta large with linear regression on top | roberta | 8.02 | 8.15 |
| 11 | kubapok | 2021-07-11 16:41 | 2.0.0 | bilstm | bilstm | 13.77 | 13.95 |
| 10 | kubapok | 2021-07-11 16:40 | 2.0.0 | roberta base with linear regression on top | roberta | 10.21 | 10.19 |
| 9 | kubapok | 2021-07-11 16:38 | 2.0.0 | tfidf and linear regression | linear-regression, tf-idf | 17.03 | 17.11 |
| 8 | kubapok | 2021-07-11 16:37 | 2.0.0 | mean from train | baseline | 31.27 | 31.50 |
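
Two of the entries above are simple reference points: "mean from train" (#1, #8) always predicts the average training year, and "tfidf and linear regression" (#5, #9) fits a linear model over tf-idf features. A minimal sketch of both with scikit-learn, assuming the excerpts and years have already been loaded into parallel Python lists (the loading step is omitted; min_df=5 mirrors submission #5):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

def mean_baseline(train_years, test_texts):
    """'mean from train': predict the mean training year for every excerpt."""
    mean_year = sum(train_years) / len(train_years)
    return [mean_year] * len(test_texts)

def tfidf_baseline(train_texts, train_years, test_texts):
    """'tfidf and linear regression': linear model over tf-idf features."""
    model = make_pipeline(TfidfVectorizer(min_df=5), LinearRegression())
    model.fit(train_texts, train_years)
    return model.predict(test_texts)
```

The gap between the null model (RMSE around 29-37) and the tf-idf baseline (around 17-22) shows how much of the dating signal plain lexical features already capture.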

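The strongest 3.0.0 entries instead fine-tune RoBERTa, either the generic roberta-base checkpoint or a ChallAm model pretrained on the challenge corpus, with a single-output regression head (the "linear regression on top" entries fit a linear model on frozen RoBERTa features rather than fine-tuning). A hedged sketch of the fine-tuning setup with huggingface-transformers, mirroring the batch-size=4, learning-rate=1.0e-6 rows (#44, #45) but with placeholder data, an assumed sequence length, and no early stopping:

```python
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Placeholder data; in practice these come from the challenge's train split.
train_texts = ["An excerpt from an old newspaper.", "Another dated excerpt."]
train_years = [1887.5, 1923.0]

class YearDataset(torch.utils.data.Dataset):
    """Tokenized excerpts paired with fractional-year regression labels."""
    def __init__(self, texts, years, tokenizer):
        self.encodings = tokenizer(texts, truncation=True,
                                   padding="max_length", max_length=256)
        self.years = years

    def __len__(self):
        return len(self.years)

    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.years[i], dtype=torch.float)
        return item

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
# num_labels=1 turns the classification head into a regressor trained with
# MSE loss; substitute the ChallAm checkpoint for the roberta-challam rows.
model = AutoModelForSequenceClassification.from_pretrained("roberta-base",
                                                           num_labels=1)
args = TrainingArguments(output_dir="roberta-year",  # hypothetical output path
                         per_device_train_batch_size=4,
                         num_train_epochs=3,
                         learning_rate=1e-6)
Trainer(model=model, args=args,
        train_dataset=YearDataset(train_texts, train_years, tokenizer)).train()
```

In the epoch sweeps above, dev-0 RMSE stops improving after only a few epochs, which is presumably what the later early-stopping runs (#44, #45) exploit.
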
Submission graph: [interactive plot not reproduced here]

Graphs by parameters: batch-size, epochs [interactive plots not reproduced here]