Guess a word in a gap in historic texts
Give a probability distribution for a word in a gap in a corpus of Polish historic texts spanning 1814-2013. This is a challenge for (temporal) language models. [ver. 1.0.0]
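For orientation, below is a minimal sketch of a context-free unigram baseline in the spirit of the "unigrams, n=100" and "stupid" entries near the bottom of the list. The output format it prints (space-separated word:probability pairs for each gap, with a trailing ":probability" carrying the leftover mass) is an assumption about what the LogLossHashed evaluator expects, not something stated on this page, and every name in the snippet is made up for illustration.

```python
# Hypothetical baseline sketch; function names, the toy corpus and the output
# format are assumptions for illustration, not taken from any submission above.
from collections import Counter

def train_unigram(corpus_lines):
    """Count word frequencies over the (toy) training corpus."""
    counts = Counter()
    for line in corpus_lines:
        counts.update(line.split())
    return counts

def format_distribution(counts, top_n=100):
    """Render one gap's prediction as 'word:prob ... :rest' (assumed format)."""
    total = sum(counts.values())
    top = counts.most_common(top_n)
    pairs = [f"{word}:{count / total:.6f}" for word, count in top]
    rest = 1.0 - sum(count for _, count in top) / total
    pairs.append(f":{max(rest, 0.0):.6f}")  # leftover mass for all other words
    return " ".join(pairs)

if __name__ == "__main__":
    counts = train_unigram(["nie ma wody na pustyni", "nie ma czasu"])
    # Every gap gets the same context-free distribution.
    print(format_distribution(counts, top_n=3))
```

As a rough sanity check on the metric itself: the "uniform probability" baseline at the very bottom scores exactly 6.9315 on all three sets, which equals ln(1024); this would be consistent with the metric hashing words into 1024 buckets, so a uniform guess costs -ln(1/1024) per gap. Context-free unigram entries in the table land around 6.07-6.17 on test-A, and submissions that use the surrounding words or the year of the text score noticeably better.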
This is the full list of all submissions; if you want to see only the best ones, go to the leaderboard.
# | submitter | when | ver. | description | dev-0 LogLossHashed | dev-1 LogLossHashed | test-A LogLossHashed
---|---|---|---|---|---|---|---
59 | [anonymized] | 2021-02-08 06:15 | 1.0.0 | solution self-made lm | 6.1745 | 6.1841 | 6.0733 | |
151 | [anonymized] | 2021-02-04 20:29 | 1.0.0 | ngram lm pytorch-nn | 6.9101 | N/A | 6.9123 | |
83 | [anonymized] | 2021-02-03 07:57 | 1.0.0 | updated bigram self-made lm bigram | 6.2850 | 6.3862 | 6.2673 | |
95 | [anonymized] | 2021-01-27 10:23 | 1.0.0 | TAU22 lm pytorch-nn | 6.4696 | 6.4803 | 6.4151 | |
94 | [anonymized] | 2021-01-25 00:41 | 1.0.0 | TAU22 lm pytorch-nn | 6.4696 | 6.4803 | 6.4151 | |
96 | [anonymized] | 2021-01-25 00:31 | 1.0.0 | TAU22 lm pytorch-nn | 6.4696 | 6.4803 | 6.4201 | |
167 | [anonymized] | 2021-01-16 16:29 | 1.0.0 | lets try pytorch lm pytorch-nn | 6.9606 | 6.9616 | 6.9731 | |
88 | [anonymized] | 2021-01-13 02:38 | 1.0.0 | v10 lm temporal pytorch-nn | 6.3869 | 6.3945 | 6.3330 | |
93 | [anonymized] | 2021-01-13 02:33 | 1.0.0 | v9 | 6.4401 | 6.4464 | 6.3899 | |
91 | [anonymized] | 2021-01-13 02:28 | 1.0.0 | v8 | 6.3870 | 6.3946 | 6.3335 | |
92 | [anonymized] | 2021-01-13 02:17 | 1.0.0 | Merge remote-tracking branch 'origin/TAU-020' into TAU-020 | 6.4409 | 6.4468 | 6.3899 | |
89 | [anonymized] | 2021-01-13 01:51 | 1.0.0 | following_words;x_size=100;epochs=5;lr=0.001 lm pytorch-nn | 6.3749 | 6.3775 | 6.3331 | |
253 | [anonymized] | 2021-01-13 01:41 | 1.0.0 | following_words;x_size=100;epochs=5;lr=0.001 lm pytorch-nn | 6.3749 | 6.3775 | N/A | |
109 | [anonymized] | 2021-01-13 00:00 | 1.0.0 | v7 | 6.6176 | 6.6106 | 6.5937 | |
108 | [anonymized] | 2021-01-12 23:55 | 1.0.0 | v7 | 6.6056 | 6.6039 | 6.5885 | |
104 | [anonymized] | 2021-01-12 23:43 | 1.0.0 | v7 | 6.5773 | 6.5716 | 6.5581 | |
169 | [anonymized] | 2021-01-12 22:40 | 1.0.0 | TAU22 lm pytorch-nn | 6.9961 | 6.9974 | 7.0113 | |
119 | [anonymized] | 2021-01-12 22:26 | 1.0.0 | v5 | 6.7626 | 6.7311 | 6.7022 | |
149 | [anonymized] | 2021-01-12 17:48 | 1.0.0 | run.py update lm pytorch-nn | 6.9139 | 6.9013 | 6.9054 | |
148 | [anonymized] | 2021-01-12 17:44 | 1.0.0 | nn-gap-v1.0 | 6.9139 | 6.9013 | 6.9054 | |
105 | [anonymized] | 2021-01-12 17:36 | 1.0.0 | first solution 1 epoch 1000 texts best 15 lm pytorch-nn | 6.6239 | 6.6617 | 6.5711 | |
87 | [anonymized] | 2021-01-12 16:06 | 1.0.0 | v4 | 6.3869 | 6.3945 | 6.3330 | |
90 | [anonymized] | 2021-01-12 15:56 | 1.0.0 | v3 | 6.3870 | 6.3946 | 6.3335 | |
126 | [anonymized] | 2021-01-12 09:11 | 1.0.0 | v3 | 6.7637 | 6.7738 | 6.7407 | |
150 | [anonymized] | 2021-01-12 01:33 | 1.0.0 | v3 | 6.9303 | 6.9267 | 6.9063 | |
70 | [anonymized] | 2021-01-11 22:53 | 1.0.0 | Solution lm pytorch-nn | 6.1759 | 6.3140 | 6.1656 | |
214 | [anonymized] | 2021-01-11 01:00 | 1.0.0 | v2 | 7.3623 | 7.4396 | 7.3444 | |
143 | [anonymized] | 2021-01-11 00:40 | 1.0.0 | v1+years | 6.8733 | 6.8783 | 6.8607 | |
140 | [anonymized] | 2021-01-11 00:19 | 1.0.0 | v1 | 6.8453 | 6.8709 | 6.8412 | |
82 | [anonymized] | 2021-01-09 21:10 | 1.0.0 | 2 left, 2 right context lm pytorch-nn | N/A | 6.3009 | 6.2379 | |
61 | [anonymized] | 2021-01-08 18:34 | 1.0.0 | pytorch neural ngram model (3 previous words) lm pytorch-nn | 6.1274 | 6.1896 | 6.0819 | |
63 | [anonymized] | 2021-01-06 18:37 | 1.0.0 | pytorch neural ngram model (3 previous words) lm pytorch-nn | 6.1365 | 6.1994 | 6.0920 | |
64 | [anonymized] | 2021-01-06 16:31 | 1.0.0 | pytorch neural ngram model (3 previous words) lm pytorch-nn | 6.1448 | 6.1987 | 6.0943 | |
67 | [anonymized] | 2021-01-06 15:39 | 1.0.0 | pytorch neural ngram model (3 previous words) lm pytorch-nn | 6.1803 | 6.2305 | 6.1330 | |
68 | [anonymized] | 2021-01-06 15:06 | 1.0.0 | second try pytorch neural ngram model (3 previous words) lm pytorch-nn | 6.1962 | 6.2449 | 6.1592 | |
73 | [anonymized] | 2021-01-06 14:31 | 1.0.0 | first try pytorch neural ngram model (3 previous words) lm pytorch-nn | 6.2277 | 6.2578 | 6.1803 | |
252 | [anonymized] | 2020-12-16 09:09 | 1.0.0 | first try self-made lm | N/A | N/A | N/A | |
127 | [anonymized] | 2020-12-16 08:52 | 1.0.0 | tetragram fix self-made lm tetragram | 6.7562 | 6.7703 | 6.7517 | |
128 | [anonymized] | 2020-12-16 07:47 | 1.0.0 | tetragram self-made lm tetragram | 6.7562 | 6.7703 | 6.7611 | |
74 | [anonymized] | 2020-12-16 07:16 | 1.0.0 | python bigram self-made lm bigram | 6.1865 | 6.3105 | 6.1837 | |
223 | [anonymized] | 2020-12-15 22:14 | 1.0.0 | RandLM first ready-made randlm | 31.3001 | 33.2617 | 30.1634 | |
216 | [anonymized] | 2020-12-13 14:17 | 1.0.0 | solution self-made lm trigram | N/A | N/A | 7.5152 | |
251 | [anonymized] | 2020-12-13 14:05 | 1.0.0 | change a | N/A | N/A | N/A | |
250 | [anonymized] | 2020-12-13 13:41 | 1.0.0 | add test | N/A | N/A | N/A | |
224 | [anonymized] | 2020-12-09 21:19 | 1.0.0 | bigram | N/A | N/A | Infinity | |
249 | [anonymized] | 2020-12-09 09:51 | 1.0.0 | model-size=10k self-made lm interpolation | N/A | N/A | N/A | |
248 | [anonymized] | 2020-12-09 09:47 | 1.0.0 | model-size=10k self-made lm interpolation | N/A | N/A | N/A | |
247 | [anonymized] | 2020-12-09 09:45 | 1.0.0 | model-size=10k self-made lm interpolation | N/A | N/A | N/A | |
246 | [anonymized] | 2020-12-09 09:30 | 1.0.0 | model-size=10k self-made lm interpolation | N/A | N/A | N/A | |
97 | [anonymized] | 2020-12-08 16:27 | 1.0.0 | solution self-made lm bigram | 6.4696 | 6.4797 | 6.4201 | |
153 | [anonymized] | 2020-12-08 15:50 | 1.0.0 | finally self-made lm bigram | 6.9443 | 7.0105 | 6.9236 | |
213 | [anonymized] | 2020-12-08 13:38 | 1.0.0 | please work better self-made lm bigram | 7.2346 | 7.3019 | 7.2404 | |
215 | [anonymized] | 2020-12-08 13:10 | 1.0.0 | improved bigram (well, actually not, but the code is right now) self-made lm bigram | 7.4204 | 7.4774 | 7.4397 | |
168 | [anonymized] | 2020-12-08 11:06 | 1.0.0 | bigrams solution self-made lm bigram | 6.9956 | 7.0675 | 7.0056 | |
165 | [anonymized] | 2020-12-08 09:18 | 1.0.0 | trigrams-v1.8 self-made lm trigram | 6.9468 | 6.9443 | 6.9510 | |
172 | [anonymized] | 2020-12-08 08:52 | 1.0.0 | trigrams-v1.7 | 7.0831 | 7.1273 | 7.0428 | |
175 | [anonymized] | 2020-12-07 13:16 | 1.0.0 | trigrams-v1.6 | 7.1635 | 7.2105 | 7.1446 | |
170 | [anonymized] | 2020-12-07 13:09 | 1.0.0 | trigrams-v1.5 | 7.0347 | 7.0889 | 7.0170 | |
220 | [anonymized] | 2020-12-07 12:33 | 1.0.0 | trigrams-v1.4 | 9.5093 | 9.3352 | 9.4255 | |
218 | [anonymized] | 2020-12-07 12:25 | 1.0.0 | trigrams-v1.3 | 8.5079 | 8.4386 | 8.4943 | |
219 | [anonymized] | 2020-12-07 12:17 | 1.0.0 | trigrams-v1.2 | 9.1281 | 8.9452 | 9.1356 | |
221 | [anonymized] | 2020-12-07 11:44 | 1.0.0 | trigrams | 9.6720 | 9.4952 | 9.5646 | |
69 | [anonymized] | 2020-12-04 00:27 | 1.0.0 | solution self-made lm bigram | 6.1705 | 6.3034 | 6.1610 | |
101 | [anonymized] | 2020-12-03 03:55 | 1.0.0 | v23.3 ready-made kenlm lm | 6.2619 | 6.4126 | 6.4908 | |
102 | [anonymized] | 2020-12-03 03:53 | 1.0.0 | v23-6 | 6.2715 | 6.4253 | 6.5054 | |
100 | [anonymized] | 2020-12-03 03:51 | 1.0.0 | v23-5 | 6.2619 | 6.4126 | 6.4908 | |
110 | [anonymized] | 2020-12-02 22:19 | 1.0.0 | v23-4 | 6.3706 | 6.5414 | 6.6331 | |
111 | [anonymized] | 2020-12-02 22:04 | 1.0.0 | v23-3 | 6.3755 | 6.5455 | 6.6354 | |
113 | [anonymized] | 2020-12-02 16:23 | 1.0.0 | Merge remote-tracking branch 'origin/TAU-011' into TAU-011 | 6.3739 | 6.5459 | 6.6385 | |
117 | [anonymized] | 2020-12-02 15:52 | 1.0.0 | v27 | 6.5396 | 6.6421 | 6.6733 | |
123 | [anonymized] | 2020-12-02 13:04 | 1.0.0 | Trigram self-made self-made lm trigram | 6.7004 | 6.7900 | 6.7172 | |
122 | [anonymized] | 2020-12-02 09:28 | 1.0.0 | Trigram self-made self-made lm trigram | 6.7004 | 6.7900 | 6.7172 | |
118 | [anonymized] | 2020-12-02 08:14 | 1.0.0 | v26 | 6.5515 | 6.6369 | 6.6791 | |
129 | [anonymized] | 2020-12-02 08:10 | 1.0.0 | v25 | 6.7038 | 6.7513 | 6.7635 | |
146 | [anonymized] | 2020-12-02 07:51 | 1.0.0 | v24 | 6.8643 | 6.8778 | 6.8834 | |
145 | [anonymized] | 2020-12-02 07:51 | 1.0.0 | Merge remote-tracking branch 'origin/TAU-011' into TAU-011 | 6.8643 | 6.8778 | 6.8834 | |
112 | [anonymized] | 2020-12-02 02:12 | 1.0.0 | v23 ready-made kenlm lm | 6.3738 | 6.5459 | 6.6385 | |
115 | [anonymized] | 2020-12-02 01:57 | 1.0.0 | v22 | 6.3786 | 6.5490 | 6.6404 | |
114 | [anonymized] | 2020-12-02 01:47 | 1.0.0 | v21 | 6.3733 | 6.5457 | 6.6396 | |
120 | [anonymized] | 2020-12-02 01:04 | 1.0.0 | v20 | 6.4265 | 6.6039 | 6.7040 | |
116 | [anonymized] | 2020-12-02 00:59 | 1.0.0 | v19 | 6.4001 | 6.5745 | 6.6726 | |
103 | [anonymized] | 2020-12-01 22:55 | 1.0.0 | trigram lm self-made lm trigram | 6.6292 | 6.6930 | 6.5452 | |
130 | [anonymized] | 2020-12-01 18:52 | 1.0.0 | v18 | 6.5392 | 6.6772 | 6.7758 | |
132 | [anonymized] | 2020-12-01 18:51 | 1.0.0 | v17 | 6.5426 | 6.6808 | 6.7786 | |
217 | [anonymized] | 2020-12-01 18:08 | 1.0.0 | v16 | 7.3384 | 7.4831 | 7.6228 | |
131 | [anonymized] | 2020-12-01 18:03 | 1.0.0 | v15 | 6.5407 | 6.6787 | 6.7770 | |
135 | [anonymized] | 2020-12-01 17:48 | 1.0.0 | v14 | 6.5694 | 6.7099 | 6.8102 | |
142 | [anonymized] | 2020-12-01 17:40 | 1.0.0 | v13 | 6.6046 | 6.7479 | 6.8504 | |
166 | [anonymized] | 2020-12-01 17:35 | 1.0.0 | v12 | 6.7042 | 6.8543 | 6.9623 | |
171 | [anonymized] | 2020-12-01 17:20 | 1.0.0 | v11 | 6.7770 | 6.9314 | 7.0428 | |
176 | [anonymized] | 2020-12-01 17:15 | 1.0.0 | v10 | 6.8756 | 7.0352 | 7.1509 | |
174 | [anonymized] | 2020-11-30 20:45 | 1.0.0 | v9 | 7.2720 | 7.2878 | 7.1423 | |
141 | [anonymized] | 2020-11-30 19:36 | 1.0.0 | v8 | 6.9470 | 6.9615 | 6.8461 | |
222 | [anonymized] | 2020-11-28 13:15 | 1.0.0 | v3 | 9.6603 | 9.7094 | 9.9624 | |
154 | [anonymized] | 2020-11-27 23:27 | 1.0.0 | v1 | 7.0538 | 6.9977 | 6.9289 | |
1 | kubapok | 2020-10-13 08:42 | 1.0.0 | regular roberta epoch 190 | 4.2993 | 4.5402 | 4.3285 | |
204 | kubapok | 2020-10-11 20:32 | 1.0.0 | roberta first token embedding epoch 190 | 4.3091 | 4.5435 | 4.3397 | |
207 | kubapok | 2020-10-11 20:17 | 1.0.0 | roberta first token embedding epoch 182 | 4.3449 | 4.5755 | 4.3700 | |
208 | kubapok | 2020-10-10 08:39 | 1.0.0 | regular roberta epoch 182 | 4.3448 | 4.5758 | 4.3747 | |
203 | kubapok | 2020-10-09 10:56 | 1.0.0 | roberta with first-token embedding | 4.4648 | 4.4763 | 4.2801 | |
202 | kubapok | 2020-10-08 20:21 | 1.0.0 | regular roberta | 4.4606 | 4.4763 | 4.2801 | |
201 | kubapok | 2020-06-29 21:54 | 1.0.0 | polish roberta year aware, but not so clever, further training | 4.2580 | 4.4763 | 4.2801 | |
205 | kubapok | 2020-06-22 20:02 | 1.0.0 | polish roberta fine-tuned year aware, but not so clever | 4.3035 | 4.5484 | 4.3466 | |
210 | kubapok | 2020-06-13 21:35 | 1.0.0 | polish roberta base no fine-tuning | 4.8964 | 5.1527 | 4.9802 | |
209 | kubapok | 2020-06-13 20:43 | 1.0.0 | polish roberta large no fine-tuning | 4.7162 | 4.9455 | 4.8072 | |
206 | kubapok | 2020-06-02 17:38 | 1.0.0 | polish roberta fine-tuned low lr transformer fairseq | 4.3361 | 4.5718 | 4.3698 | |
211 | kubapok | 2020-02-29 16:05 | 1.0.0 | bigger transformer, longer training | 5.0819 | N/A | 5.0309 | |
212 | kubapok | 2020-02-25 20:33 | 1.0.0 | transformer year unaware | 5.2537 | N/A | 5.1726 | |
155 | [anonymized] | 2020-01-16 18:12 | 1.0.0 | IRSTLM 3-gram lm | N/A | N/A | 6.9314 | |
156 | [anonymized] | 2020-01-15 19:39 | 1.0.0 | IRSTLM | N/A | N/A | 6.9314 | |
157 | [anonymized] | 2020-01-15 19:37 | 1.0.0 | IRSTLM | N/A | N/A | 6.9314 | |
200 | [anonymized] | 2020-01-15 19:35 | 1.0.0 | IRSTLM | N/A | N/A | NaN | |
199 | [anonymized] | 2020-01-15 19:31 | 1.0.0 | IRSTLM | N/A | N/A | NaN | |
164 | [anonymized] | 2020-01-15 19:20 | 1.0.0 | IRSTLM | N/A | N/A | 6.9315 | |
163 | [anonymized] | 2020-01-15 19:09 | 1.0.0 | IRSTLM | N/A | N/A | 6.9315 | |
160 | [anonymized] | 2020-01-15 19:04 | 1.0.0 | IRSTLM | N/A | N/A | 6.9314 | |
159 | [anonymized] | 2020-01-15 19:02 | 1.0.0 | IRSTLM | N/A | N/A | 6.9314 | |
158 | [anonymized] | 2020-01-15 18:59 | 1.0.0 | IRSTLM | N/A | N/A | 6.9314 | |
245 | [anonymized] | 2020-01-15 10:25 | 1.0.0 | IRSTLM | N/A | N/A | N/A | |
180 | [anonymized] | 2020-01-15 10:25 | 1.0.0 | IRSTLM epsilon=0 | N/A | N/A | 7.2184 | |
244 | [anonymized] | 2020-01-15 10:20 | 1.0.0 | IRSTLM | N/A | N/A | N/A | |
179 | [anonymized] | 2020-01-15 10:20 | 1.0.0 | IRSTLM epsilon=0 | N/A | N/A | 7.2184 | |
243 | [anonymized] | 2020-01-15 09:56 | 1.0.0 | IRSTLM | N/A | N/A | N/A | |
178 | [anonymized] | 2020-01-15 09:56 | 1.0.0 | IRSTLM epsilon=0.01 | N/A | N/A | 7.2184 | |
242 | [anonymized] | 2020-01-15 09:53 | 1.0.0 | IRSTLM | N/A | N/A | N/A | |
177 | [anonymized] | 2020-01-15 09:53 | 1.0.0 | IRSTLM epsilon=0.01 | N/A | N/A | 7.2184 | |
241 | [anonymized] | 2020-01-15 09:49 | 1.0.0 | IRSTLM epsilon=0.01 | N/A | N/A | N/A | |
240 | [anonymized] | 2020-01-15 09:49 | 1.0.0 | IRSTLM | N/A | N/A | N/A | |
188 | [anonymized] | 2020-01-13 14:54 | 1.0.0 | IRSTLM last result lm | N/A | N/A | 8.4198 | |
192 | [anonymized] | 2020-01-09 23:04 | 1.0.0 | IRSTLM third solution | N/A | N/A | 12.7659 | |
194 | [anonymized] | 2020-01-09 22:20 | 1.0.0 | IRSTLM third solution | N/A | N/A | 13.9740 | |
193 | [anonymized] | 2020-01-09 22:12 | 1.0.0 | IRSTLM third solution | N/A | N/A | 13.2266 | |
190 | [anonymized] | 2020-01-09 21:15 | 1.0.0 | IRSTLM second solution | N/A | N/A | 10.7553 | |
191 | [anonymized] | 2020-01-09 19:21 | 1.0.0 | IRSTLM second solution | N/A | N/A | 12.5241 | |
239 | [anonymized] | 2020-01-08 09:21 | 1.0.0 | IRSTLM first solution lm | N/A | N/A | N/A | |
238 | [anonymized] | 2019-12-10 20:35 | 1.0.0 | solution lm | N/A | N/A | N/A | |
133 | [anonymized] | 2019-11-30 22:48 | 1.0.0 | 3gram outfile format fix lm trigram | N/A | N/A | 6.8032 | |
72 | [anonymized] | 2019-11-27 10:19 | 1.0.0 | Simple bigram model lm | 6.1832 | 6.3083 | 6.1802 | |
237 | [anonymized] | 2019-11-27 10:04 | 1.0.0 | Simple trigram lm | N/A | N/A | N/A | |
49 | kubapok | 2019-11-25 10:42 | 1.0.0 | year aware 4 splits statistical | 5.8219 | N/A | 5.7973 | |
40 | kubapok | 2019-11-25 10:36 | 1.0.0 | year aware 2 splits | 5.7372 | N/A | 5.6667 | |
124 | [anonymized] | 2019-11-20 17:07 | 1.0.0 | better bigram solution, nananana lm | 6.7205 | 6.7565 | 6.7249 | |
183 | [anonymized] | 2019-11-18 16:30 | 1.0.0 | bigram solution, sialala lm | 7.3424 | 7.2223 | 7.2732 | |
38 | kubapok | 2019-11-17 08:31 | 1.0.0 | self made LM 3grams with fallback to 2grams and 1grams | 5.6790 | N/A | 5.6063 | |
198 | [anonymized] | 2019-11-13 16:32 | 1.0.0 | Simple bigram model lm | Infinity | Infinity | Infinity | |
125 | [anonymized] | 2019-11-13 12:29 | 1.0.0 | My bigram guess a word solution lm | 6.7279 | 6.7631 | 6.7309 | |
236 | [anonymized] | 2019-11-13 09:33 | 1.0.0 | Simple bigram model lm | N/A | N/A | N/A | |
66 | kubapok | 2019-11-12 06:52 | 1.0.0 | bigram model, equal distribution | N/A | N/A | 6.1264 | |
78 | kubapok | 2019-11-11 18:14 | 1.0.0 | stupid solution | N/A | N/A | 6.2078 | |
162 | kubapok | 2019-11-11 11:45 | 1.0.0 | very baseline | N/A | N/A | 6.9315 | |
21 | p/tlen | 2019-05-24 09:39 | 1.0.0 | LM model used (applica-lm-retro-gap-transformer-bpe-bigger-preproc=minimalistic-left_to_right-lang=pl-5.3.0.0.bin) attention-dropout=0.1 attention-heads=8 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=15 early-stopping-type=iteration [...] lm word-level transformer | 5.2179 | 5.4184 | 5.1606 | |
184 | p/tlen | 2019-05-19 02:53 | 1.0.0 | LM model trained on 20190519 (applica-lm-retro-gap-transformer-bigger-preproc=minimalistic-left_to_right-lang=pl-5.3.0.0.bin) best-epoch=90 bptt=50 chunksize=10000 clip=0.25 early-stopping=15 early-stopping-type=iteration epochs=100 epochs-done=90 [...] lm word-level | 7.3604 | 7.3470 | 7.3037 | |
48 | p/tlen | 2019-05-17 19:14 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 early-stopping-type=iteration [...] lm word-level transformer | 5.8551 | 5.9289 | 5.7924 | |
20 | p/tlen | 2019-05-16 10:31 | 1.0.0 | LM model trained on 20190516 (applica-lm-retro-gap-transformer-bpe-bigger-preproc=minimalistic-left_to_right-lang=pl-5.3.0.0.bin) attention-dropout=0.1 attention-heads=8 beam-search-depth=2 best-epoch=80 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 [...] lm word-level transformer left-to-right bpe | 5.2138 | 5.4159 | 5.1584 | |
23 | p/tlen | 2019-05-14 04:53 | 1.0.0 | LM model trained on 20190514 (applica-lm-retro-gap-transformer-bpe-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin) attention-dropout=0.1 attention-heads=8 beam-search-depth=2 best-epoch=75 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 [...] lm word-level transformer left-to-right bpe | 5.2746 | 5.4622 | 5.2050 | |
25 | p/tlen | 2019-05-11 00:49 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 beam-search-depth=2 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 [...] lm word-level transformer left-to-right bpe | 5.3047 | 5.4820 | 5.2355 | |
32 | p/tlen | 2019-05-10 13:58 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 beam-search-depth=2 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 [...] lm word-level transformer left-to-right bpe | 5.3522 | 5.5277 | 5.2825 | |
28 | p/tlen | 2019-05-10 01:27 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 early-stopping-type=iteration [...] lm word-level transformer | 5.3083 | 5.4859 | 5.2453 | |
27 | p/tlen | 2019-05-08 20:38 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 early-stopping-type=iteration [...] lm word-level transformer | 5.3070 | 5.4849 | 5.2377 | |
30 | p/tlen | 2019-05-08 11:24 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 beam-search-depth=0 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 [...] lm word-level transformer left-to-right bpe | 5.3200 | 5.4907 | 5.2505 | |
26 | p/tlen | 2019-05-08 08:59 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 beam-search-depth=1 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 [...] word-level transformer left-to-right bpe | 5.3071 | 5.4841 | 5.2370 | |
106 | p/tlen | 2019-04-17 11:25 | 1.0.0 | LM model used (model.bin) attention-dropout=0.1 attention-heads=8 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 early-stopping-type=iteration [...] lm word-level transformer | 6.5591 | 6.6602 | 6.5816 | |
24 | p/tlen | 2019-04-12 04:40 | 1.0.0 | LM model used (bi-transformer.bin) aggregator=MIN lm word-level transformer | 5.3190 | 5.5181 | 5.2311 | |
5 | p/tlen | 2019-04-12 04:40 | 1.0.0 | LM model used (bi-transformer.bin) aggregator=MAX lm word-level transformer | 4.9537 | 5.1992 | 4.9031 | |
4 | p/tlen | 2019-04-12 04:40 | 1.0.0 | LM model used (bi-transformer.bin) aggregator=RMS lm word-level transformer | 4.9381 | 5.1868 | 4.8886 | |
3 | p/tlen | 2019-04-12 04:40 | 1.0.0 | LM model used (bi-transformer.bin) aggregator=MEAN lm word-level transformer | 4.9128 | 5.1678 | 4.8653 | |
2 | p/tlen | 2019-04-12 04:40 | 1.0.0 | LM model used (bi-transformer.bin) aggregator=GEO lm word-level transformer | 4.9048 | 5.1608 | 4.8570 | |
139 | p/tlen | 2019-04-10 22:39 | 1.0.0 | LM model used (transformer-sumo.bin) lm word-level transformer | 6.8188 | 6.8351 | 6.8170 | |
9 | p/tlen | 2019-04-10 12:41 | 1.0.0 | LM model used (bi-partially-casemarker-transformer.bin) lm word-level transformer | 5.0075 | 5.2557 | 4.9933 | |
8 | p/tlen | 2019-04-10 09:54 | 1.0.0 | LM model used (bi-transformer.bin) lm word-level transformer | 4.9959 | 5.2376 | 4.9875 | |
22 | p/tlen | 2019-04-10 00:09 | 1.0.0 | LM model used (model.bin) lm word-level transformer right-to-left | 5.2604 | 5.4692 | 5.2021 | |
136 | p/tlen | 2019-04-06 11:23 | 1.0.0 | LM model used (model.bin) lm word-level transformer | 6.8933 | 6.7577 | 6.8104 | |
138 | p/tlen | 2019-04-06 00:39 | 1.0.0 | LM model used (model.bin) lm word-level transformer | 6.8781 | 6.7650 | 6.8168 | |
137 | p/tlen | 2019-04-05 15:29 | 1.0.0 | LM model trained on 20190405 (applica-lm-retro-gap-transformer-frage-rvt-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin) attention-dropout=0.1 attention-heads=8 best-epoch=65 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 [...] lm word-level transformer | 6.8781 | 6.7650 | 6.8168 | |
29 | p/tlen | 2019-04-01 13:14 | 1.0.0 | LM model used (model.bin) lm word-level transformer left-to-right | 5.3162 | 5.5091 | 5.2502 | |
14 | p/tlen | 2019-04-01 10:16 | 1.0.0 | LM model trained on 20190331 (applica-lm-retro-gap-bilstm-case-marker-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) lm word-level bilstm casemarker | 5.0902 | 5.3020 | 5.0304 | |
35 | p/tlen | 2019-03-31 23:09 | 1.0.0 | LM model trained on 20190331 (applica-lm-retro-gap-bilstm-case-marker-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=97 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=20 early-stopping-type=iteration enc-dropout=0.5 [...] lm word-level bilstm casemarker | 5.4263 | 5.6073 | 5.3378 | |
84 | p/tlen | 2019-03-30 12:13 | 1.0.0 | LM model used (model.bin) lm word-level transformer | 6.4247 | 6.4674 | 6.2930 | |
99 | p/tlen | 2019-03-30 05:29 | 1.0.0 | LM model trained on 20190330 (applica-lm-retro-gap-transformer-frage-casemarker-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin) attention-dropout=0.1 attention-heads=8 best-epoch=20 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 [...] lm word-level transformer | 6.6148 | 6.6728 | 6.4872 | |
12 | p/tlen | 2019-03-29 02:33 | 1.0.0 | LM model trained on 20190329 (applica-lm-retro-gap-bilstm-frage-fixed-vocab-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=103 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=20 early-stopping-type=iteration enc-dropout=0.5 [...] lm word-level bilstm | 5.0710 | 5.2996 | 5.0022 | |
6 | p/tlen | 2019-03-21 14:11 | 1.0.0 | per-period models combined (100/50) bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=8 early-stopping-type=epoch enc-dropout=0.5 enc-highways=0 [...] lm word-level bilstm | 5.0234 | 5.2742 | 4.9768 | |
7 | p/tlen | 2019-03-21 09:04 | 1.0.0 | per-period models combined (100/50) bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=8 early-stopping-type=epoch enc-dropout=0.5 enc-highways=0 [...] lm word-level bilstm | 5.0241 | 5.2751 | 4.9769 | |
10 | p/tlen | 2019-03-21 08:31 | 1.0.0 | two BiLSTMs, one for each 100 years bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=8 early-stopping-type=epoch enc-dropout=0.5 enc-highways=0 [...] lm word-level bilstm | 5.0337 | 5.2924 | 4.9956 | |
17 | p/tlen | 2019-03-20 04:14 | 1.0.0 | LM model trained on 20190320 (applica-lm-retro-gap-train-1864-1963-bilstm-frage-1814-1913-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=35 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=8 early-stopping-type=epoch enc-dropout=0.5 [...] lm word-level bilstm | 5.1606 | 5.3497 | 5.0825 | |
13 | p/tlen | 2019-03-19 21:11 | 1.0.0 | LM model trained on 20190319 (applica-lm-retro-gap-train-1914-2013-bilstm-frage-1914-2013-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=63 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=20 early-stopping-type=iteration enc-dropout=0.5 [...] lm word-level bilstm | 5.1546 | 5.3037 | 5.0229 | |
31 | p/tlen | 2019-03-16 19:24 | 1.0.0 | LM model trained on 20190316 (applica-lm-retro-gap-train-1814-1913-bilstm-frage-1814-1913-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=32 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=8 early-stopping-type=epoch enc-dropout=0.5 [...] lm word-level bilstm | 5.2199 | 5.4381 | 5.2651 | |
189 | p/tlen | 2019-03-10 17:09 | 1.0.0 | LM model trained on 20190310 (applica-lm-retro-gap-transformer-frage-preproc=minimalistic-left_to_right-lang=pl-5.2.0.0.bin) attention-dropout=0.1 attention-heads=8 best-epoch=74 bptt=50 chunksize=10000 clip=0.25 dropout=0.1 early-stopping=20 [...] lm word-level transformer | 9.5350 | 9.4320 | 9.2231 | |
19 | p/tlen | 2019-02-22 22:52 | 1.0.0 | LM model used (model.bin) lm word-level bilstm | 5.1855 | 5.3806 | 5.1346 | |
11 | p/tlen | 2019-02-18 15:14 | 1.0.0 | LM model trained on 20190218 (applica-lm-retro-gap-retro-gap-frage-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=95 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=20 early-stopping-type=iteration enc-dropout=0.5 [...] lm word-level bilstm frage | 5.0696 | 5.2951 | 5.0006 | |
37 | p/tlen | 2019-02-08 23:03 | 1.0.0 | LM model trained on 20190208 (applica-lm-train-tokenized-lowercased-shuffled-bilstm-all-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=1 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=40 enc-dropout=0.5 enc-highways=0 [...] lm word-level bilstm | 5.4361 | 5.5855 | 5.3822 | |
36 | p/tlen | 2019-02-07 07:58 | 1.0.0 | LM model trained on 20190207 (applica-lm-train-tokenized-lowercased-shuffled-bilstm-all-preproc=minimalistic-bidirectional-lang=pl-5.2.0.0.bin) best-epoch=1 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 early-stopping=10 enc-dropout=0.5 enc-highways=0 [...] lm word-level bilstm | 5.4315 | 5.5867 | 5.3780 | |
15 | p/tlen | 2019-02-02 09:49 | 1.0.0 | LM model trained on 20190202 (applica-lm-retro-gap-bilstm-word-preproc=minimalistic-bidirectional-lang=pl-5.1.9.0.bin) best-epoch=79 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 enc-dropout=0.5 enc-highways=0 epochs=80 [...] lm word-level bilstm | 5.1253 | 5.3486 | 5.0603 | |
39 | p/tlen | 2019-01-30 19:56 | 1.0.0 | LM model trained on 20190130 (applica-lm-retro-gap-transformer-word-preproc=minimalistic-left_to_right-lang=pl-5.1.9.0.bin) attention-heads=8 best-epoch=80 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 enc-dropout=0.5 enc-highways=0 [...] lm word-level transformer | 5.7237 | 5.8370 | 5.6476 | |
42 | p/tlen | 2019-01-28 05:16 | 1.0.0 | LM model trained on 20190128 (applica-lm-retro-gap-transformer-word-preproc=minimalistic-left_to_right-lang=pl-5.1.9.0.bin) attention-heads=8 best-epoch=48 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 enc-dropout=0.5 enc-highways=0 [...] lm word-level transformer | 5.7902 | 5.8972 | 5.7039 | |
34 | p/tlen | 2019-01-09 16:30 | 1.0.0 | LM model trained on 20190109 (applica-lm-retro-gap-bilstm-cnn-preproc=minimalistic-bidirectional-lang=pl-5.1.8.0.bin) best-epoch=50 bptt=35 char-emb-size=16 chunksize=10000 clip=0.25 dropout=0.3 enc-dropout=0.3 enc-highways=2 [...] lm char-n-grams bilstm | 5.4438 | 5.5773 | 5.3202 | |
16 | p/tlen | 2018-12-30 05:32 | 1.0.0 | LM model trained on 20181230 (applica-lm-retro-gap-bilstm-preproc=minimalistic-bidirectional-lang=pl-5.1.8.0.bin) best-epoch=49 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 enc-dropout=0.5 enc-highways=0 epochs=50 [...] lm word-level bilstm | 5.1365 | 5.3476 | 5.0632 | |
33 | p/tlen | 2018-12-27 21:24 | 1.0.0 | LM model trained on 20181227 (applica-lm-retro-gap-bilstm-preproc=minimalistic-bidirectional-lang=pl-5.1.8.0.bin) best-epoch=1 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 enc-dropout=0.5 enc-highways=0 epochs=1 [...] lm word-level bilstm | 5.4143 | 5.5581 | 5.3177 | |
18 | p/tlen | 2018-12-27 10:00 | 1.0.0 | LM model trained on 20181225 (applica-lm-retro-gap-bilstm-preproc=minimalistic-bidirectional-lang=en-5.1.8.0.bin) best-epoch=39 bptt=35 chunksize=10000 clip=0.25 dropout=0.5 enc-dropout=0.5 enc-highways=0 epochs=40 [...] lm | 5.2085 | N/A | 5.1303 | |
45 | p/tlen | 2018-09-02 20:20 | 1.0.0 | simple 2-layer LSTM, left-to-right epochs=1 neural-network lm lstm left-to-right | 5.8425 | 5.8765 | 5.7359 | |
196 | [anonymized] | 2018-01-28 07:51 | 1.0.0 | trigrams_fixed self-made lm trigram | N/A | N/A | 19.1101 | |
46 | [anonymized] | 2018-01-24 14:39 | 1.0.0 | simple neural network, context 2 words ahead 2 words behind neural-network | 5.8672 | 6.0007 | 5.7395 | |
47 | kaczla | 2018-01-17 11:20 | 1.0.0 | simple neural network - nb_of_epochs=3, batch_size=2048 neural-network | 5.8751 | 5.9999 | 5.7839 | |
51 | kaczla | 2018-01-16 18:52 | 1.0.0 | simple neural network - nb_of_epochs=2 neural-network | 5.9285 | 6.0385 | 5.8193 | |
52 | kaczla | 2018-01-16 18:13 | 1.0.0 | simple neural network - nb_of_epochs=4 neural-network | 5.9463 | 6.0446 | 5.8514 | |
57 | kaczla | 2018-01-16 17:17 | 1.0.0 | simple neural network - decrease batch_size neural-network | 6.1810 | 6.2569 | 6.0581 | |
65 | [anonymized] | 2018-01-15 18:11 | 1.0.0 | Bigrams model, 100 best words stupid self-made lm bigram | N/A | 6.3638 | 6.1097 | |
235 | [anonymized] | 2018-01-09 18:26 | 1.0.0 | ??? stupid self-made lm bigram | N/A | N/A | N/A | |
234 | [anonymized] | 2018-01-09 18:08 | 1.0.0 | Bigrams model, 100 best words stupid self-made lm bigram | N/A | N/A | N/A | |
53 | p/tlen | 2018-01-03 06:07 | 1.0.0 | a very simple (non-recurrent) neural network, looking one word behind and one word ahead (train on all data), dictionary size=40000 neural-network | 5.9766 | 6.0881 | 5.8648 | |
195 | [anonymized] | 2018-01-02 18:14 | 1.0.0 | 'trigrams' self-made lm trigram | N/A | N/A | 14.5507 | |
233 | [anonymized] | 2018-01-02 17:26 | 1.0.0 | 'trigrams' | N/A | N/A | N/A | |
54 | p/tlen | 2018-01-02 16:23 | 1.0.0 | a very simple (non-recurrent) neural network, looking one word behind and one word ahead neural-network | 5.9794 | 6.0982 | 5.8990 | |
60 | [anonymized] | 2017-12-13 14:54 | 1.0.0 | unigram with temporal info, best 100, two periods (1813, 1913) (1913, 2014) self-made lm temporal unigram | 6.1654 | 6.1828 | 6.0816 | |
62 | [anonymized] | 2017-12-13 14:44 | 1.0.0 | unigram with temporal info, best 100, 2 periods (1813, 1913) (1913, 2014) self-made lm temporal unigram | 6.1717 | 6.2016 | 6.0893 | |
71 | [anonymized] | 2017-12-13 14:41 | 1.0.0 | unigram with temporal model, 25 best self-made | 6.2397 | 6.2592 | 6.1729 | |
75 | kaczla | 2017-12-12 20:45 | 1.0.0 | 3-gram with prune, best 1, best oov ready-made kenlm lm | 6.1260 | 6.2991 | 6.1896 | |
56 | kaczla | 2017-12-12 20:42 | 1.0.0 | 3-gram with prune, best 2, best oov ready-made kenlm lm | 5.9662 | 6.1685 | 6.0105 | |
55 | kaczla | 2017-12-12 20:41 | 1.0.0 | 3-gram with prune, best 3, best oov ready-made kenlm lm | 5.8803 | 6.0738 | 5.9181 | |
50 | kaczla | 2017-12-12 20:38 | 1.0.0 | 3-gram with prune, best 5, best oov ready-made kenlm lm | 5.8022 | 5.9837 | 5.8182 | |
44 | kaczla | 2017-12-12 20:37 | 1.0.0 | 3-gram with prune, best 10, best oov ready-made kenlm lm | 5.7428 | 5.9032 | 5.7196 | |
41 | kaczla | 2017-12-12 20:35 | 1.0.0 | 3-gram with prune, best 15, best oov ready-made kenlm lm | 5.7367 | 5.8767 | 5.7006 | |
43 | kaczla | 2017-12-12 20:32 | 1.0.0 | 3-gram with prune, best 25, best oov ready-made kenlm lm | 5.7500 | 5.8788 | 5.7052 | |
80 | kaczla | 2017-12-12 19:19 | 1.0.0 | 3-gram with prune, best 1 ready-made kenlm lm | 6.1473 | 6.3361 | 6.2166 | |
81 | kaczla | 2017-12-12 19:17 | 1.0.0 | 3-gram with prune, best 2 ready-made kenlm lm | 6.1808 | 6.4349 | 6.2362 | |
85 | kaczla | 2017-12-12 19:14 | 1.0.0 | 3-gram with prune, best 3 ready-made kenlm lm | 6.2590 | 6.5174 | 6.3085 | |
98 | kaczla | 2017-12-05 21:39 | 1.0.0 | 3-gram with prune, best 5 ready-made kenlm lm | 6.4040 | 6.6586 | 6.4228 | |
107 | kaczla | 2017-12-05 21:38 | 1.0.0 | 3-gram with prune, best 10 ready-made kenlm lm | 6.6364 | 6.8789 | 6.5879 | |
121 | kaczla | 2017-12-05 21:35 | 1.0.0 | 3-gram with prune, best 15 ready-made kenlm lm | 6.7882 | 7.0033 | 6.7119 | |
144 | kaczla | 2017-12-05 21:33 | 1.0.0 | 3-gram with prune, best 25 ready-made kenlm lm | 6.9749 | 7.1766 | 6.8763 | |
173 | kaczla | 2017-12-05 21:30 | 1.0.0 | 3-gram with prune, best 50 ready-made kenlm lm | 7.2401 | 7.4038 | 7.1059 | |
185 | kaczla | 2017-12-05 21:24 | 1.0.0 | 3-gram with prune, best 100 ready-made kenlm lm | 7.4523 | 7.6464 | 7.3087 | |
79 | [anonymized] | 2017-06-29 22:47 | 1.0.0 | Order 4 | N/A | N/A | 6.2111 | |
86 | [anonymized] | 2017-06-29 18:38 | 1.0.0 | order 2 | N/A | N/A | 6.3262 | |
77 | [anonymized] | 2017-06-29 15:12 | 1.0.0 | Update source code; kenlm order=3 tokenizer.perl from moses. best 100 results, text mode. ready-made kenlm lm | N/A | N/A | 6.1898 | |
76 | [anonymized] | 2017-06-29 15:08 | 1.0.0 | added wildcard | N/A | N/A | 6.1898 | |
197 | [anonymized] | 2017-06-29 12:29 | 1.0.0 | first 100 | N/A | N/A | Infinity | |
232 | [anonymized] | 2017-06-28 13:23 | 1.0.0 | top 100 | N/A | N/A | N/A | |
147 | [anonymized] | 2017-06-28 08:47 | 1.0.0 | test 2 ready-made neural-network | N/A | N/A | 6.8956 | |
186 | [anonymized] | 2017-06-27 19:14 | 1.0.0 | first test ready-made neural-network | N/A | N/A | 7.5236 | |
231 | [anonymized] | 2017-06-15 23:29 | 1.0.0 | First try | N/A | N/A | N/A | |
134 | [anonymized] | 2017-05-16 04:31 | 1.0.0 | zad 16 self-made lm | N/A | N/A | 6.8056 | |
58 | [anonymized] | 2017-04-24 16:42 | 1.0.0 | unigrams, n=100, v3 self-made lm | 6.1745 | 6.1841 | 6.0733 | |
187 | [anonymized] | 2017-04-24 16:32 | 1.0.0 | unigrams, n=100, v2 self-made lm | 8.0610 | 8.0714 | 7.8460 | |
230 | [anonymized] | 2017-04-24 16:29 | 1.0.0 | unigrams, n=100 | N/A | N/A | N/A | |
229 | [anonymized] | 2017-04-24 16:24 | 1.0.0 | unigrams, n=1000 | 7.6808 | 7.7246 | N/A | |
182 | [anonymized] | 2017-04-24 15:14 | 1.0.0 | unigrams (correct encoding) v2 self-made lm | 7.3661 | 7.3596 | 7.2467 | |
228 | [anonymized] | 2017-04-24 15:11 | 1.0.0 | unigrams (correct encoding) | N/A | N/A | N/A | |
181 | [anonymized] | 2017-04-23 17:57 | 1.0.0 | Unigram (encoding problem) | 7.3661 | 7.3596 | 7.2467 | |
227 | [anonymized] | 2017-04-23 17:53 | 1.0.0 | Unigram (encoding problem) | 7.3661 | N/A | N/A | |
225 | [anonymized] | 2017-04-23 17:46 | 1.0.0 | Unigram (encoding problem) | N/A | N/A | N/A | |
226 | [anonymized] | 2017-04-23 17:43 | 1.0.0 | Unigram (encoding problem) | N/A | N/A | N/A | |
152 | p/tlen | 2017-04-10 06:22 | 1.0.0 | uniform probability except for comma stupid | 6.9116 | 6.9585 | 6.9169 | |
161 | p/tlen | 2017-04-10 06:18 | 1.0.0 | uniform probability stupid | 6.9315 | 6.9315 | 6.9315 |