Guess a word in a gap in historic texts

Give a probability distribution for the word missing from a gap in a corpus of Polish historic texts spanning 1814-2013. This is a challenge for (temporal) language models.
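
As a concrete starting point, here is a minimal unigram baseline in the spirit of the simplest submissions logged below. It is a sketch only: the file layout (train/in.tsv, dev-0/in.tsv, dev-0/out.tsv) and the output format (per line, "word:prob" pairs plus a trailing ":rest" entry for the leftover probability mass) are assumptions about the usual Gonito conventions, so check the challenge repository for the authoritative format.

    from collections import Counter

    def unigram_distribution(train_path, top_k=100):
        # Count all tokens in the training corpus; the text is assumed to
        # sit in the last tab-separated column (an assumption, not a
        # documented fact about this challenge).
        counts = Counter()
        with open(train_path, encoding="utf-8") as f:
            for line in f:
                counts.update(line.rstrip("\n").split("\t")[-1].split())
        total = sum(counts.values())
        best = counts.most_common(top_k)
        covered = sum(c for _, c in best)
        pairs = " ".join(f"{w}:{c / total:.6f}" for w, c in best)
        # The empty word carries the probability mass of everything else.
        return f"{pairs} :{(total - covered) / total:.6f}"

    if __name__ == "__main__":
        dist = unigram_distribution("train/in.tsv")  # hypothetical path
        with open("dev-0/in.tsv", encoding="utf-8") as inp, \
             open("dev-0/out.tsv", "w", encoding="utf-8") as out:
            for _ in inp:  # same context-independent answer for every gap
                out.write(dist + "\n")

Entries such as "uniform probability" and the various unigram, bigram and trigram submissions in the log below refine exactly this kind of output.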

Paulina Lester   2019-11-30 22:48
submitted a solution:3gram outfile format fix
[anonymised]   2019-11-27 10:19
submitted a solution:Simple bigram model
Jędrzej Furmann   2019-11-27 10:04
submitted a solution:Simple trigram
Paulina Lester   2019-11-27 09:51
submitted a solution:fixed trigrams
Jędrzej Furmann   2019-11-27 09:43
submitted a solution:Simple trigram
Jędrzej Furmann   2019-11-26 16:31
submitted a solution:fixed
kubapok   2019-11-25 10:42
submitted a solution:year-aware 4 splits statistical
kubapok   2019-11-25 10:36
submitted a solution:year-aware 2 splits
[anonymised]   2019-11-20 17:07
submitted a solution:better bigram solution, nananana
Jędrzej Furmann   2019-11-19 23:02
submitted a solution:Add files via upload
Jędrzej Furmann   2019-11-19 10:16
submitted a solution:Simple trigram
Jędrzej Furmann   2019-11-19 10:02
submitted a solution:Simple trigram
Jędrzej Furmann   2019-11-19 09:56
submitted a solution:Add files via upload
Jędrzej Furmann   2019-11-19 09:36
submitted a solution:Add files via upload
[anonymised]   2019-11-18 16:30
submitted a solution:bigram solution, sialala
kubapok   2019-11-17 08:31
submitted a solution:self-made LM: 3-grams with fallback to 2-grams and 1-grams
[anonymised]   2019-11-13 16:32
submitted a solution:Simple bigram model
452107   2019-11-13 12:29
submitted a solution:My bigram guess a word solution
[anonymised]   2019-11-13 09:33
submitted a solution:Simple bigram model
kubapok   2019-11-12 06:52
submitted a solution:bigram model, equal distribution
kubapok   2019-11-11 18:14
submitted a solution:stupid solution
kubapok   2019-11-11 11:45
submitted a solution:very baseline
p/tlen   2019-05-24 09:39
submitted a solution:LM model used (applica-lm-retro-gap-transformer-bpe-bigger-preproc=minimalistic-left_to_right-lang=pl-5.3.0.0.bin)
p/tlen   2019-05-19 02:53
submitted a solution:LM model trained on 20190519 (applica-lm-retro-gap-transformer-bigger-preproc=minimalistic-left_to_right-lang=pl-5.3.0.0.bin)
p/tlen   2019-05-17 19:14
submitted a solution:LM model used (model.bin)
Gabi   2019-05-17 13:48
submitted a solution:Add preprocessed data with BPE
p/tlen   2019-05-16 10:31
submitted a solution:LM model trained on 20190516 (applica-lm-retro-gap-transformer-bpe-bigger-preproc=minimalistic-left_to_right-lang=pl-5.3.0.0.bin)
p/tlen   2019-05-14 04:53
submitted a solution:LM model trained on 20190514 (applica-lm-retro-gap-transformer-bpe-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin)
p/tlen   2019-05-11 00:49
submitted a solution:LM model used (model.bin)
p/tlen   2019-05-10 13:58
submitted a solution:LM model used (model.bin)
p/tlen   2019-05-10 01:27
submitted a solution:LM model used (model.bin)
p/tlen   2019-05-08 20:38
submitted a solution:LM model used (model.bin)
p/tlen   2019-05-08 11:24
submitted a solution:LM model used (model.bin)
p/tlen   2019-05-08 08:59
submitted a solution:LM model used (model.bin)
p/tlen   2019-04-17 11:25
submitted a solution:LM model used (model.bin)
p/tlen   2019-04-12 04:40
submitted a solution:LM model used (bi-transformer.bin)
p/tlen   2019-04-10 22:39
submitted a solution:LM model used (transformer-sumo.bin)
p/tlen   2019-04-10 12:41
submitted a solution:LM model used (bi-partially-casemarker-transformer.bin)
p/tlen   2019-04-10 09:54
submitted a solution:LM model used (bi-transformer.bin)
p/tlen   2019-04-10 00:09
submitted a solution:LM model used (model.bin)
p/tlen   2019-04-06 11:23
submitted a solution:LM model used (model.bin)
p/tlen   2019-04-06 00:39
submitted a solution:LM model used (model.bin)
p/tlen   2019-04-05 15:29
submitted a solution:LM model trained on 20190405 (applica-lm-retro-gap-transformer-frage-rvt-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin)
p/tlen   2019-04-01 13:14
submitted a solution:LM model used (model.bin)
p/tlen   2019-04-01 10:16
submitted a solution:LM model trained on 20190331 (applica-lm-retro-gap-bilstm-case-marker-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-03-31 23:09
submitted a solution:LM model trained on 20190331 (applica-lm-retro-gap-bilstm-case-marker-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-03-30 12:13
submitted a solution:LM model used (model.bin)
p/tlen   2019-03-30 05:29
submitted a solution:LM model trained on 20190330 (applica-lm-retro-gap-transformer-frage-casemarker-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin)
p/tlen   2019-03-29 02:33
submitted a solution:LM model trained on 20190329 (applica-lm-retro-gap-bilstm-frage-fixed-vocab-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-03-21 14:11
submitted a solution:per-period models combined (100/50)
p/tlen   2019-03-21 09:04
submitted a solution:per-period models combined (100/50)
p/tlen   2019-03-21 08:31
submitted a solution:two BiLSTMs, one for each 100 years
p/tlen   2019-03-20 04:14
submitted a solution:LM model trained on 20190320 (applica-lm-retro-gap-train-1864-1963-bilstm-frage-1814-1913-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-03-19 21:11
submitted a solution:LM model trained on 20190319 (applica-lm-retro-gap-train-1914-2013-bilstm-frage-1914-2013-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-03-16 19:24
submitted a solution:LM model trained on 20190316 (applica-lm-retro-gap-train-1814-1913-bilstm-frage-1814-1913-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-03-10 17:09
submitted a solution:LM model trained on 20190310 (applica-lm-retro-gap-transformer-frage-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin)
p/tlen   2019-02-22 22:52
submitted a solution:LM model used (model.bin)
p/tlen   2019-02-18 15:14
submitted a solution:LM model trained on 20190218 (applica-lm-retro-gap-retro-gap-frage-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-02-08 23:03
submitted a solution:LM model trained on 20190208 (applica-lm-train-tokenized-lowercased-shuffled-bilstm-all-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-02-07 07:58
submitted a solution:LM model trained on 20190207 (applica-lm-train-tokenized-lowercased-shuffled-bilstm-all-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin)
p/tlen   2019-02-02 09:49
submitted a solution:LM model trained on 20190202 (applica-lm-retro-gap-bilstm-word-preproc=minimalistic-bidirectional-lang=pl-5.1.9.0.bin)
p/tlen   2019-01-30 19:56
submitted a solution:LM model trained on 20190130 (applica-lm-retro-gap-transformer-word-preproc=minimalistic-left_to_right-lang=pl-5.1.9.0.bin)
p/tlen   2019-01-28 05:16
submitted a solution:LM model trained on 20190128 (applica-lm-retro-gap-transformer-word-preproc=minimalistic-left_to_right-lang=pl-5.1.9.0.bin)
p/tlen   2019-01-09 16:30
submitted a solution:LM model trained on 20190109 (applica-lm-retro-gap-bilstm-cnn-preproc=minimalistic-bidirectional-lang=pl-5.1.8.0.bin)
p/tlen   2018-12-30 05:32
submitted a solution:LM model trained on 20181230 (applica-lm-retro-gap-bilstm-preproc=minimalistic-bidirectional-lang=pl-5.1.8.0.bin)
p/tlen   2018-12-27 21:24
submitted a solution:LM model trained on 20181227 (applica-lm-retro-gap-bilstm-preproc=minimalistic-bidirectional-lang=pl-5.1.8.0.bin)
p/tlen   2018-12-27 10:00
submitted a solution:LM model trained on 20181225 (applica-lm-retro-gap-bilstm-preproc=minimalistic-bidirectional-lang=en-5.1.8.0.bin)
p/tlen   2018-09-02 20:20
submitted a solution:simple 2-layer LSTM, left-to-right
EmEm   2018-01-28 07:51
submitted a solution:trigrams_fixed
siulkilulki   2018-01-24 14:39
submitted a solution:simple neural network, context 2 words ahead 2 words behind
kaczla   2018-01-17 11:20
submitted a solution:simple neural network - nb_of_epochs=3, batch_size=2048
kaczla   2018-01-16 18:52
submitted a solution:simple neural network - nb_of_epochs=2
kaczla   2018-01-16 18:13
submitted a solution:simple neural network - nb_of_epochs=4
kaczla   2018-01-16 17:17
submitted a solution:simple neural network - decrease batch_size
patrycja   2018-01-15 18:11
submitted a solution:Bigram model, 100 best words
patrycja   2018-01-09 18:26
submitted a solution:???
patrycja   2018-01-09 18:08
submitted a solution:Bigram model, 100 best words
p/tlen   2018-01-03 06:07
submitted a solution:a very simple (non-recurrent) neural network, looking one word behind and one word ahead (trained on all data), dictionary size=40000
EmEm   2018-01-02 18:14
submitted a solution:'trigrams'
EmEm   2018-01-02 17:26
submitted a solution:'trigrams'
p/tlen   2018-01-02 16:23
submitted a solution:a very simple (non-recurrent) neural network, looking one word behind and one word ahead
siulkilulki   2017-12-13 14:54
submitted a solution:unigram with temporal info, best 100, two periods (1813, 1913) (1913, 2014)
siulkilulki   2017-12-13 14:44
submitted a solution:unigram with temporal info, best 100, 2 periods (1813, 1913) (1913, 2014)
siulkilulki   2017-12-13 14:41
submitted a solution:unigram with temporal model, 25 best
kaczla   2017-12-12 20:45
submitted a solution:3-gram with prune, best 1, best oov
kaczla   2017-12-12 20:42
submitted a solution:3-gram with prune, best 2, best oov
kaczla   2017-12-12 20:41
submitted a solution:3-gram with prune, best 3, best oov
kaczla   2017-12-12 20:38
submitted a solution:3-gram with prune, best 5, best oov
kaczla   2017-12-12 20:37
submitted a solution:3-gram with prune, best 10, best oov
kaczla   2017-12-12 20:35
submitted a solution:3-gram with prune, best 15, best oov
kaczla   2017-12-12 20:32
submitted a solution:3-gram with prune, best 25, best oov
kaczla   2017-12-12 19:19
submitted a solution:3-gram with prune, best 1
kaczla   2017-12-12 19:17
submitted a solution:3-gram with prune, best 2
kaczla   2017-12-12 19:14
submitted a solution:3-gram with prune, best 3
kaczla   2017-12-05 21:39
submitted a solution:3-gram with prune, best 5
kaczla   2017-12-05 21:38
submitted a solution:3-gram with prune, best 10
kaczla   2017-12-05 21:35
submitted a solution:3-gram with prune, best 15
kaczla   2017-12-05 21:33
submitted a solution:3-gram with prune, best 25
kaczla   2017-12-05 21:30
submitted a solution:3-gram with prune, best 50
kaczla   2017-12-05 21:24
submitted a solution:3-gram with prune, best 100
mmalisz   2017-06-29 22:47
submitted a solution:Order 4
mmalisz   2017-06-29 18:38
submitted a solution:order 2
mmalisz   2017-06-29 15:12
submitted a solution:Update source code; KenLM order=3, tokenizer.perl from Moses. Best 100 results, text mode.
mmalisz   2017-06-29 15:08
submitted a solution:added wildcard
mmalisz   2017-06-29 12:29
submitted a solution:first 100
mmalisz   2017-06-28 13:23
submitted a solution:top 100
Durson   2017-06-28 08:47
submitted a solution:test 2
Durson   2017-06-27 19:14
submitted a solution:first test
mmalisz   2017-06-15 23:29
submitted a solution:First try
EmEm   2017-05-16 04:31
submitted a solution:task 16
tamazaki   2017-04-24 16:42
submitted a solution:unigrams, n=100, v3
tamazaki   2017-04-24 16:32
submitted a solution:unigrams, n=100, v2
tamazaki   2017-04-24 16:29
submitted a solution:unigrams, n=100
tamazaki   2017-04-24 16:24
submitted a solution:unigrams, n=1000
tamazaki   2017-04-24 15:14
submitted a solution:unigrams (correct encoding) v2
tamazaki   2017-04-24 15:11
submitted a solution:unigrams (correct encoding)
tamazaki   2017-04-23 17:57
submitted a solution:Unigram (encoding problem)
tamazaki   2017-04-23 17:53
submitted a solution:Unigram (encoding problem)
tamazaki   2017-04-23 17:46
submitted a solution:Unigram (encoding problem)
tamazaki   2017-04-23 17:43
submitted a solution:Unigram (encoding problem)
p/tlen   2017-04-10 06:22
submitted a solution:uniform probability except for comma
p/tlen   2017-04-10 06:18
submitted a solution:uniform probability
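
Many of the entries above are n-gram models with fallback (e.g. "self-made LM: 3-grams with fallback to 2-grams and 1-grams"). As an illustration of the general idea only, not of any particular submission, a naive unsmoothed sketch:

    from collections import Counter

    def ngram_counts(tokens, n):
        # Count every n-token window in the training stream.
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    class BackoffTrigram:
        def __init__(self, tokens):
            self.uni = ngram_counts(tokens, 1)
            self.bi = ngram_counts(tokens, 2)
            self.tri = ngram_counts(tokens, 3)
            self.total = len(tokens)

        def prob(self, w1, w2, w3):
            # Probability of w3 after the context (w1, w2): trigram
            # estimate first, then back off to bigram, then unigram.
            if self.tri[(w1, w2, w3)] > 0:
                return self.tri[(w1, w2, w3)] / self.bi[(w1, w2)]
            if self.bi[(w2, w3)] > 0:
                return self.bi[(w2, w3)] / self.uni[(w2,)]
            return self.uni[(w3,)] / self.total  # 0 for out-of-vocabulary words

A competitive version would add smoothing and pruning (as in the KenLM-based entries) and emit the top-k candidates in the distribution format sketched at the top of the page.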