Guess a word in a gap in historic texts

Give a probability distribution for a word in a gap in a corpus of Polish historical texts spanning 1814–2013. This is a challenge for (temporal) language models.
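
Several baseline entries in the leaderboard below (unigram models, "uniform probability") simply spread probability mass over the most frequent words. The following is a minimal sketch of such a unigram baseline, not any submitter's actual code; the training file path is a placeholder and the exact gonito/geval submission format is deliberately left out.

```python
# Minimal unigram-baseline sketch (illustration only, not a submitted solution).
# Assumption: a whitespace-tokenized training corpus at a placeholder path.
from collections import Counter

def train_unigram(corpus_path, top_n=100):
    """Count word frequencies and keep the top_n most frequent words."""
    counts = Counter()
    with open(corpus_path, encoding="utf-8") as f:
        for line in f:
            counts.update(line.split())
    total = sum(counts.values())
    # Probability mass for the kept words; the remainder covers all other words.
    dist = {word: cnt / total for word, cnt in counts.most_common(top_n)}
    rest = 1.0 - sum(dist.values())
    return dist, rest

if __name__ == "__main__":
    dist, rest = train_unigram("train.txt")   # "train.txt" is a placeholder path
    print(len(dist), "words kept, remaining probability mass:", rest)
```

The same distribution is then used for every gap, which is why these baselines sit near the uniform-probability entries on the leaderboard.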

| submitter | when | description | tags | dev-0/LogLossHashed | dev-1/LogLossHashed | test-A/LogLossHashed |
|---|---|---|---|---|---|---|
| EmEm | 2018-01-28 07:51 | trigrams_fixed | lm self-made trigram | N/A | N/A | 19.1101 |
| siulkilulki | 2018-01-24 14:39 | simple neural network, context 2 words ahead, 2 words behind | neural-network | 5.8672 | 6.0007 | 5.7395 |
| kaczla | 2018-01-17 11:20 | simple neural network - nb_of_epochs=3, batch_size=2048 | neural-network | 5.8751 | 5.9999 | 5.7839 |
| kaczla | 2018-01-16 18:52 | simple neural network - nb_of_epochs=2 | neural-network | 5.9285 | 6.0385 | 5.8193 |
| kaczla | 2018-01-16 18:13 | simple neural network - nb_of_epochs=4 | neural-network | 5.9463 | 6.0446 | 5.8514 |
| kaczla | 2018-01-16 17:17 | simple neural network - decreased batch_size | neural-network | 6.1810 | 6.2569 | 6.0581 |
| patrycja | 2018-01-15 18:11 | Bigram model, 100 best words | self-made lm bigram stupid | N/A | 6.3638 | 6.1097 |
| patrycja | 2018-01-09 18:26 | ??? | bigram lm stupid self-made | N/A | N/A | N/A |
| patrycja | 2018-01-09 18:08 | Bigram model, 100 best words | lm self-made stupid bigram | N/A | N/A | N/A |
| p/tlen | 2018-01-03 06:07 | a very simple (non-recurrent) neural network, looking one word behind and one word ahead (trained on all data), dictionary size=40000 | neural-network | 5.9766 | 6.0881 | 5.8648 |
| EmEm | 2018-01-02 18:14 | 'trigrams' | lm self-made trigram | N/A | N/A | 14.5507 |
| EmEm | 2018-01-02 17:26 | 'trigrams' |  | N/A | N/A | N/A |
| p/tlen | 2018-01-02 16:23 | a very simple (non-recurrent) neural network, looking one word behind and one word ahead | neural-network | 5.9794 | 6.0982 | 5.8990 |
| siulkilulki | 2017-12-13 14:54 | unigram with temporal info, best 100, two periods (1813, 1913), (1913, 2014) | self-made lm unigram temporal | 6.1654 | 6.1828 | 6.0816 |
| siulkilulki | 2017-12-13 14:44 | unigram with temporal info, best 100, 2 periods (1813, 1913), (1913, 2014) | lm unigram temporal self-made | 6.1717 | 6.2016 | 6.0893 |
| siulkilulki | 2017-12-13 14:41 | unigram with temporal model, 25 best | self-made | 6.2397 | 6.2592 | 6.1729 |
| kaczla | 2017-12-12 20:45 | 3-gram with pruning, best 1, best oov | ready-made kenlm lm | 6.1260 | 6.2991 | 6.1896 |
| kaczla | 2017-12-12 20:42 | 3-gram with pruning, best 2, best oov | ready-made lm kenlm | 5.9662 | 6.1685 | 6.0105 |
| kaczla | 2017-12-12 20:41 | 3-gram with pruning, best 3, best oov | ready-made kenlm lm | 5.8803 | 6.0738 | 5.9181 |
| kaczla | 2017-12-12 20:38 | 3-gram with pruning, best 5, best oov | ready-made kenlm lm | 5.8022 | 5.9837 | 5.8182 |
| kaczla | 2017-12-12 20:37 | 3-gram with pruning, best 10, best oov | ready-made kenlm lm | 5.7428 | 5.9032 | 5.7196 |
| kaczla | 2017-12-12 20:35 | 3-gram with pruning, best 15, best oov | ready-made kenlm lm | 5.7367 | 5.8767 | 5.7006 |
| kaczla | 2017-12-12 20:32 | 3-gram with pruning, best 25, best oov | ready-made lm kenlm | 5.7500 | 5.8788 | 5.7052 |
| kaczla | 2017-12-12 19:19 | 3-gram with pruning, best 1 | ready-made kenlm lm | 6.1473 | 6.3361 | 6.2166 |
| kaczla | 2017-12-12 19:17 | 3-gram with pruning, best 2 | ready-made lm kenlm | 6.1808 | 6.4349 | 6.2362 |
| kaczla | 2017-12-12 19:14 | 3-gram with pruning, best 3 | ready-made lm kenlm | 6.2590 | 6.5174 | 6.3085 |
| kaczla | 2017-12-05 21:39 | 3-gram with pruning, best 5 | ready-made lm kenlm | 6.4040 | 6.6586 | 6.4228 |
| kaczla | 2017-12-05 21:38 | 3-gram with pruning, best 10 | ready-made lm kenlm | 6.6364 | 6.8789 | 6.5879 |
| kaczla | 2017-12-05 21:35 | 3-gram with pruning, best 15 | kenlm lm ready-made | 6.7882 | 7.0033 | 6.7119 |
| kaczla | 2017-12-05 21:33 | 3-gram with pruning, best 25 | ready-made kenlm lm | 6.9749 | 7.1766 | 6.8763 |
| kaczla | 2017-12-05 21:30 | 3-gram with pruning, best 50 | lm kenlm ready-made | 7.2401 | 7.4038 | 7.1059 |
| kaczla | 2017-12-05 21:24 | 3-gram with pruning, best 100 | lm kenlm ready-made | 7.4523 | 7.6464 | 7.3087 |
| mmalisz | 2017-06-29 22:47 | order 4 |  | N/A | N/A | 6.2111 |
| mmalisz | 2017-06-29 18:38 | order 2 |  | N/A | N/A | 6.3262 |
| mmalisz | 2017-06-29 15:12 | updated source code; kenlm order=3, tokenizer.perl from Moses, best 100 results, text mode | lm kenlm ready-made | N/A | N/A | 6.1898 |
| mmalisz | 2017-06-29 15:08 | added wildcard |  | N/A | N/A | 6.1898 |
| mmalisz | 2017-06-29 12:29 | first 100 |  | N/A | N/A | Infinity |
| mmalisz | 2017-06-28 13:23 | top 100 |  | N/A | N/A | N/A |
| Durson | 2017-06-28 08:47 | test 2 | ready-made neural-network | N/A | N/A | 6.8956 |
| Durson | 2017-06-27 19:14 | first test | ready-made neural-network | N/A | N/A | 7.5236 |
| mmalisz | 2017-06-15 23:29 | first try |  | N/A | N/A | N/A |
| EmEm | 2017-05-16 04:31 | task 16 | self-made lm | N/A | N/A | 6.8056 |
| tamazaki | 2017-04-24 16:42 | unigrams, n=100, v3 | self-made lm | 6.1745 | 6.1841 | 6.0733 |
| tamazaki | 2017-04-24 16:32 | unigrams, n=100, v2 | self-made lm | 8.0610 | 8.0714 | 7.8460 |
| tamazaki | 2017-04-24 16:29 | unigrams, n=100 |  | N/A | N/A | N/A |
| tamazaki | 2017-04-24 16:24 | unigrams, n=1000 |  | 7.6808 | 7.7246 | N/A |
| tamazaki | 2017-04-24 15:14 | unigrams (correct encoding), v2 | self-made lm | 7.3661 | 7.3596 | 7.2467 |
| tamazaki | 2017-04-24 15:11 | unigrams (correct encoding) |  | N/A | N/A | N/A |
| tamazaki | 2017-04-23 17:57 | unigram (encoding problem) |  | 7.3661 | 7.3596 | 7.2467 |
| tamazaki | 2017-04-23 17:53 | unigram (encoding problem) |  | 7.3661 | N/A | N/A |
| tamazaki | 2017-04-23 17:46 | unigram (encoding problem) |  | N/A | N/A | N/A |
| tamazaki | 2017-04-23 17:43 | unigram (encoding problem) |  | N/A | N/A | N/A |
| p/tlen | 2017-04-10 06:22 | uniform probability except for comma | stupid | 6.9116 | 6.9585 | 6.9169 |
| p/tlen | 2017-04-10 06:18 | uniform probability | stupid | 6.9315 | 6.9315 | 6.9315 |
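
Several of the stronger entries above ("3-gram with pruning, best N", tagged ready-made kenlm) score gap candidates with a pruned KenLM 3-gram model and keep only the N best words. The sketch below is a rough illustration of that idea, not the submitters' code; the model file path, the candidate word list, and the normalization over only the kept candidates are all assumptions.

```python
# Sketch of scoring gap candidates with a KenLM n-gram model
# (an illustration of the "3-gram with pruning, best N" idea, not the original submission).
# Assumptions: a prebuilt model at "model.binary" and a candidate vocabulary list.
import kenlm

model = kenlm.Model("model.binary")   # placeholder path to a pruned 3-gram model

def rank_candidates(left_context, right_context, candidates, best_n=10):
    """Score each candidate inserted into the gap and keep the best_n words."""
    scored = []
    for word in candidates:
        sentence = f"{left_context} {word} {right_context}".strip()
        # KenLM returns a log10 probability for the whole string.
        scored.append((word, model.score(sentence, bos=False, eos=False)))
    scored.sort(key=lambda pair: pair[1], reverse=True)
    top = scored[:best_n]
    # Turn log10 scores into a probability distribution over the kept candidates only
    # (an assumption; any leftover mass could instead be reserved for unseen words).
    probs = [10.0 ** s for _, s in top]
    total = sum(probs)
    return {w: p / total for (w, _), p in zip(top, probs)}
```

Varying best_n trades coverage against sharpness of the distribution, which is consistent with the pattern in the table: the "best 10"/"best 15" runs score better than both "best 1" and "best 100".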