Challenging America word-gap prediction
Guess a word in a gap. [ver. 3.0.0]
This is the full list of all submissions; to see only the best ones, click leaderboard.
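The PerplexityHashed columns report perplexity (lower is better). As a rough illustration of the metric — a minimal sketch, not the challenge's actual evaluator — perplexity is the exponential of the mean negative log-probability a model assigns to the gold gap words; a single gold word given probability zero drives the score to Infinity, which is why some submissions below score `Infinity`:

```python
import math

def perplexity(gold_word_probs):
    """Perplexity from the probabilities a model assigned to the gold
    gap words (illustrative sketch only, not the official evaluator)."""
    if any(p == 0.0 for p in gold_word_probs):
        return math.inf  # one zero-probability gold word -> Infinity
    nll = -sum(math.log(p) for p in gold_word_probs)
    return math.exp(nll / len(gold_word_probs))

# A model that always spreads probability uniformly over two candidates:
print(perplexity([0.5, 0.5]))  # ≈ 2.0
```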
# | submitter | when | ver. | description | dev-0 PerplexityHashed | test-A PerplexityHashed | |
---|---|---|---|---|---|---|---|
414 | s444383 | 2023-09-29 13:26 | 3.0.0 | 2 bigram | 771.30 | 804.54 | |
391 | s444383 | 2023-09-29 13:14 | 3.0.0 | first commit bigram | 664.30 | 711.96 | |
25 | Marcin Kostrzewski | 2023-09-25 10:16 | 3.0.0 | GRU, both left and right context used, embedding_size=128, hidden_size=256, gru_layers=4, epochs=10 | 161.69 | 175.95 | |
409 | Marcin Kostrzewski | 2023-09-24 23:57 | 3.0.0 | gpt-2, no fine tuning, rightcontext gpt2 | 717.83 | 772.76 | |
81 | Marcin Kostrzewski | 2023-09-24 23:30 | 3.0.0 | gpt-2, fine tuned with both left and right context, inference using left context gpt2 fine-tuned | 232.55 | 262.79 | |
727 | Marcin Kostrzewski | 2023-09-24 23:27 | 3.0.0 | gpt-2, fine tuned with both left and right context, inference using left context gpt2 fine-tuned | N/A | N/A | |
180 | Marcin Kostrzewski | 2023-07-24 20:40 | 3.0.0 | 8_1 hidden_size=128 | 296.96 | 342.72 | |
178 | Marcin Kostrzewski | 2023-07-24 20:40 | 3.0.0 | 8_1 hidden_size=512 | 295.64 | 340.18 | |
147 | s478815 | 2023-07-02 15:37 | 3.0.0 | gpt-2, left gpt2 | 247.38 | 312.41 | |
105 | s478815 | 2023-07-02 14:55 | 3.0.0 | gpt-2, fine tuning gpt2 | 230.07 | 279.51 | |
512 | Wiktor Bombola | 2023-07-02 10:53 | 3.0.0 | Task 9 hexagram neural-network bow | 1589.14 | 1620.44 | |
474 | Wiktor Bombola | 2023-07-01 08:52 | 3.0.0 | gpt-finetuned with left context left-to-right gpt2 fine-tuned | 1049.69 | 1048.83 | |
138 | Cezary | 2023-07-01 06:31 | 3.0.0 | fine tune gpt2 using left context top=15 gpt2 fine-tuned | 274.76 | 309.08 | |
421 | Jakub Eichner | 2023-06-30 19:50 | 3.0.0 | Flan T5 zero shot solution transformer t5 | 722.50 | 817.70 | |
514 | Wiktor Bombola | 2023-06-30 12:13 | 3.0.0 | Task 7 bigram model neural-network bigram | 1640.99 | 1626.02 | |
513 | Wiktor Bombola | 2023-06-30 12:11 | 3.0.0 | Task 7 bigram model neural-network bigram | 1640.99 | 1626.02 | |
285 | Adam Wojdyła | 2023-06-30 08:38 | 3.0.0 | gpt2 finetuned, top=50, l+r context, 25000 train size top=50 neural-network | 425.72 | 458.49 | |
40 | s444455 | 2023-06-29 23:48 | 3.0.0 | gpt2 | 182.00 | 203.53 | |
565 | s478840 | 2023-06-29 21:45 | 3.0.0 | s478840 | 28307.01 | 33091.12 | |
454 | Marcin Kostrzewski | 2023-06-29 19:35 | 3.0.0 | gpt-2, fine tuned with both left and right context, inference using left context gpt2 fine-tuned | 985.08 | 977.20 | |
726 | Marcin Kostrzewski | 2023-06-29 19:23 | 3.0.0 | gpt-2, fine tuned with both left and right context, inference using left context gpt2 fine-tuned | N/A | N/A | |
725 | Marcin Kostrzewski | 2023-06-29 18:25 | 3.0.0 | gpt-2, fine tuned with both left and right context, inference using left context gpt2 fine-tuned | N/A | N/A | |
724 | Wiktor Bombola | 2023-06-29 17:20 | 3.0.0 | const wildcard | 1032.10 | N/A | |
723 | Wiktor Bombola | 2023-06-29 17:04 | 3.0.0 | asdf | 2734.50 | N/A | |
722 | Wiktor Bombola | 2023-06-29 16:39 | 3.0.0 | gpt version | 2700.38 | N/A | |
42 | [anonymized] | 2023-06-29 13:13 | 3.0.0 | gpt2 finetuned gpt2 fine-tuned | 197.06 | 214.52 | |
352 | s478840 | 2023-06-28 23:21 | 3.0.0 | s478840 batch-size=8000 embed-size=150 vocab-size=30000 neural-network bigram | 507.70 | 569.03 | |
173 | s478840 | 2023-06-28 21:38 | 3.0.0 | s478840 gpt2 fine-tuned | 295.46 | 332.90 | |
119 | s478840 | 2023-06-28 19:43 | 3.0.0 | s478840 | 253.48 | 290.36 | |
473 | Wiktor Bombola | 2023-06-28 19:35 | 3.0.0 | test-calculated | 1043.67 | 1042.73 | |
511 | Wiktor Bombola | 2023-06-28 19:31 | 3.0.0 | zad8-trigram | 1043.67 | 1595.91 | |
510 | Wiktor Bombola | 2023-06-28 19:15 | 3.0.0 | trigram fixed probs !=1 | 1752.13 | 1595.91 | |
509 | Wiktor Bombola | 2023-06-28 17:46 | 3.0.0 | const wildcard | 1031.72 | 1595.91 | |
508 | Wiktor Bombola | 2023-06-28 17:21 | 3.0.0 | asdf | 1731.59 | 1595.91 | |
507 | Wiktor Bombola | 2023-06-28 16:52 | 3.0.0 | zad8-trigram-infile | 1575.26 | 1595.91 | |
506 | Wiktor Bombola | 2023-06-28 16:46 | 3.0.0 | zad8-trigram | 1575.26 | 1595.91 | |
134 | Adam Wojdyła | 2023-06-28 15:51 | 3.0.0 | s444507 top=30 neural-network | 260.29 | 303.80 | |
47 | Jakub Eichner | 2023-06-28 15:09 | 3.0.0 | GPT-2 fine tuning word gap prediction v2 top=30 | 200.54 | 226.59 | |
107 | Jakub | 2023-06-28 09:21 | 3.0.0 | Fix output | N/A | 280.90 | |
721 | Jakub | 2023-06-28 09:19 | 3.0.0 | Fix output | N/A | N/A | |
720 | Jakub | 2023-06-28 09:14 | 3.0.0 | Fixed output | N/A | N/A | |
719 | Jakub | 2023-06-28 09:11 | 3.0.0 | Fixed output | N/A | N/A | |
718 | Jakub | 2023-06-28 09:10 | 3.0.0 | Fixed output t5 | N/A | N/A | |
717 | Jakub | 2023-06-28 09:07 | 3.0.0 | Updated inference t5 | N/A | N/A | |
378 | Jakub | 2023-06-28 08:38 | 3.0.0 | Fix formatting t5 | N/A | 670.53 | |
716 | Jakub | 2023-06-28 08:37 | 3.0.0 | Fix output t5 | N/A | N/A | |
715 | Jakub | 2023-06-28 08:36 | 3.0.0 | Fix output t5 | N/A | N/A | |
714 | Jakub | 2023-06-28 08:34 | 3.0.0 | Remove double words at the beginning t5 | N/A | N/A | |
713 | Jakub | 2023-06-28 08:29 | 3.0.0 | Remove double words t5 | N/A | N/A | |
712 | Jakub | 2023-06-28 08:23 | 3.0.0 | Flan T5 results t5 | N/A | N/A | |
446 | s444465 | 2023-06-27 22:17 | 3.0.0 | s444465 bigram neural model fixed v2 batch_size=2500 embed_size=200 epochs=5 learning-rate=1.0e-3 vocab_size=40000 neural-network bigram | 787.58 | 934.09 | |
486 | s444465 | 2023-06-27 21:46 | 3.0.0 | s444465 bigram neural model fixed batch_size=2500 embed_size=200 epochs=5 learning-rate=1.0e-3 vocab_size=40000 neural-network bigram | 979.22 | 1155.04 | |
485 | s444465 | 2023-06-27 21:42 | 3.0.0 | s444465 bigram neural model fixed bigram | 979.22 | 1155.04 | |
8 | Jakub | 2023-06-27 20:07 | 3.0.0 | Updated input truncation | N/A | 85.17 | |
10 | Jakub | 2023-06-27 19:09 | 3.0.0 | Fix formatting fine-tuned roberta-base | N/A | 108.46 | |
711 | Jakub | 2023-06-27 19:08 | 3.0.0 | Truncated inputs fine-tuned roberta-base | N/A | N/A | |
20 | Jakub | 2023-06-27 17:52 | 3.0.0 | Fix out formatting v3 fine-tuned roberta-base | N/A | 140.33 | |
585 | Jakub | 2023-06-27 17:39 | 3.0.0 | Fix out formatting fine-tuned roberta-base | N/A | Infinity | |
710 | Jakub | 2023-06-27 17:36 | 3.0.0 | Fix output formatting fine-tuned roberta-base | N/A | N/A | |
709 | Jakub | 2023-06-27 17:24 | 3.0.0 | First finetuned model fine-tuned roberta-base | N/A | N/A | |
31 | [anonymized] | 2023-06-26 21:55 | 3.0.0 | gpt-2 left context out files left-to-right gpt2 transformer-decoder | 165.54 | 185.64 | |
46 | s478839 | 2023-06-25 11:29 | 3.0.0 | gpt2-fine-tuned top=30 train_dataset=50000 gpt2 fine-tuned | 200.50 | 226.54 | |
45 | s478839 | 2023-06-24 22:05 | 3.0.0 | gpt2-fine-tuned top=30 train_dataset=50000 gpt2 fine-tuned | N/A | 226.54 | |
35 | s478873 | 2023-06-24 20:29 | 3.0.0 | s478873 gpt-2-fine-tuned top_k=30 neural-network gpt2 fine-tuned | 167.13 | 195.24 | |
50 | s478839 | 2023-06-24 20:00 | 3.0.0 | gpt2-fine-tuned top=30 train_dataset=10000 gpt2 fine-tuned | N/A | 232.89 | |
521 | s478815 | 2023-06-20 21:04 | 3.0.0 | s478815 zad4-2 bigram | 1773.49 | 1829.31 | |
117 | s478815 | 2023-06-20 19:43 | 3.0.0 | s478815 trigram | 262.11 | 289.83 | |
375 | s444354 | 2023-06-17 12:25 | 3.0.0 | gpt right context | 512.49 | 650.59 | |
374 | s444354 | 2023-06-17 12:20 | 3.0.0 | gpt right context | 210.92 | 650.59 | |
373 | s444354 | 2023-06-17 12:17 | 3.0.0 | gpt right context | 229.46 | 650.59 | |
174 | 444498 | 2023-06-16 23:33 | 3.0.0 | gpt2 fine tuning | 291.98 | 333.79 | |
372 | s444354 | 2023-06-16 23:27 | 3.0.0 | gpt right context | N/A | 650.59 | |
95 | s444354 | 2023-06-16 22:55 | 3.0.0 | gpt fine tune | 230.88 | 272.09 | |
708 | s444354 | 2023-06-16 22:48 | 3.0.0 | gpt right context | N/A | N/A | |
39 | s444391 | 2023-06-16 22:01 | 3.0.0 | fine-tuned gpt2 embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network fine-tuned | 182.00 | 203.53 | |
38 | [anonymized] | 2023-06-16 10:44 | 3.0.0 | task 13-1 | 182.00 | 203.53 | |
158 | Adam Wojdyła | 2023-06-16 06:33 | 3.0.0 | gpt2 top=10 neural-network | 277.28 | 318.25 | |
115 | s444455 | 2023-06-15 21:00 | 3.0.0 | 12 | 268.65 | 287.19 | |
566 | Marcin Kostrzewski | 2023-06-15 20:58 | 3.0.0 | 444409 GPT2 gpt2 | 38887.42 | 39655.75 | |
707 | Marcin Kostrzewski | 2023-06-15 20:24 | 3.0.0 | 444409 GPT2 gpt2 | N/A | N/A | |
116 | [anonymized] | 2023-06-15 18:48 | 3.0.0 | task 12-2 | 268.65 | 287.20 | |
706 | Marcin Kostrzewski | 2023-06-15 18:29 | 3.0.0 | 444409 GPT2 | N/A | N/A | |
136 | [anonymized] | 2023-06-15 17:44 | 3.0.0 | Neural network with changed parameters batch_size=6000 embed_size=200 epochs=1 learning-rate=1.0e-2 vocab_size=2500 gpt2-large | 266.63 | 307.54 | |
37 | Jakub Adamski | 2023-06-15 13:31 | 3.0.0 | gpt2 finetuning top=50 neural-network gpt2 | 182.00 | 203.53 | |
276 | s443930 | 2023-06-15 12:02 | 3.0.0 | task 12-2-2 | 395.08 | 452.53 | |
99 | s444018 | 2023-06-15 10:47 | 3.0.0 | s444018, transformer, decoder, right context top=25 gpt2 transformer-decoder | 232.64 | 272.80 | |
211 | s444391 | 2023-06-14 22:20 | 3.0.0 | lstm lab10 submission embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 480.73 | 375.17 | |
32 | s478873 | 2023-06-14 21:20 | 3.0.0 | s478873 right-context top_k=30 neural-network gpt2 | 163.32 | 186.02 | |
122 | [anonymized] | 2023-06-14 21:06 | 3.0.0 | GPT-2 transfer-learning on reversed right context gpt2 | 271.09 | 292.21 | |
16 | Jakub Adamski | 2023-06-14 15:53 | 3.0.0 | nn trigram neural-network trigram | 118.45 | 132.60 | |
15 | s444391 | 2023-06-14 10:05 | 3.0.0 | Embeddings and bow solution embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 118.45 | 132.60 | |
705 | s444354 | 2023-06-14 08:42 | 3.0.0 | gpt right context | N/A | N/A | |
412 | s444354 | 2023-06-14 08:06 | 3.0.0 | trigram model HIDDEN-SIZE=128 embed_size=300 epochs=1 hidden_size=256 learning-rate=1.0e-4 vocab_size=40000 neural-network trigram | 760.93 | 800.64 | |
393 | s444354 | 2023-06-14 08:06 | 3.0.0 | trigram model HIDDEN-SIZE=256 embed_size=300 epochs=1 hidden_size=256 learning-rate=1.0e-4 vocab_size=40000 neural-network trigram | 684.67 | 715.52 | |
267 | s444417 | 2023-06-14 08:03 | 3.0.0 | simple bigram bigram | 408.71 | 441.15 | |
482 | s444354 | 2023-06-13 23:59 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=256 learning-rate=1.0e-4 vocab_size=40000 neural-network trigram | 1129.08 | 1136.57 | |
18 | s444354 | 2023-06-13 21:05 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=750 learning-rate=1.0e-4 vocab_size=5000 | 123.73 | 138.03 | |
121 | s478855 | 2023-06-13 20:55 | 3.0.0 | s478855 decoder right context | 250.28 | 291.73 | |
13 | s444455 | 2023-06-12 23:24 | 3.0.0 | embeddings, bov | 119.72 | 130.68 | |
376 | s444417 | 2023-06-12 20:14 | 3.0.0 | gpt, right context transformer-decoder | 635.73 | 660.25 | |
237 | Mikołaj Pokrywka | 2023-06-12 17:52 | 3.0.0 | forgot to add files | 431.38 | 397.80 | |
114 | s444356 | 2023-06-12 11:14 | 3.0.0 | transformer, decoder, only right context, fine tuning GPT-2 top=20 transformer-decoder | 268.65 | 287.19 | |
14 | [anonymized] | 2023-06-11 17:15 | 3.0.0 | task 9 | 118.45 | 132.60 | |
495 | s444465 | 2023-06-11 14:16 | 3.0.0 | 444465 simple trigram prediction | 1021.19 | 1188.03 | |
48 | s443930 | 2023-06-10 20:59 | 3.0.0 | task 12-1 | 210.11 | 231.20 | |
499 | s444465 | 2023-06-10 16:52 | 3.0.0 | s444465 bigram neural model batch_size=2500 embed_size=200 epochs=5 learning-rate=1.0e-3 vocab_size=40000 neural-network bigram | 1116.73 | 1302.80 | |
704 | s444465 | 2023-06-10 16:25 | 3.0.0 | s444465 bigram neural model | 1116.73 | N/A | |
150 | [anonymized] | 2023-06-10 15:31 | 3.0.0 | task 12 gpt2 | 266.94 | 314.41 | |
703 | s444465 | 2023-06-10 14:25 | 3.0.0 | s444465 batch_size=1000 embed_size=300 epochs=5 hidden_size=150 learning-rate=1.0e-3 vocab_size=30000 neural-network trigram | 239.80 | N/A | |
120 | Kamil Guttmann | 2023-06-10 12:13 | 3.0.0 | gpt-2, left+right context, zero-shot topk=150 gpt2 just-inference | 252.00 | 291.50 | |
86 | s444501 | 2023-06-09 19:28 | 3.0.0 | 444501 gpt2 left+right context top=50 neural-network gpt2 | 229.46 | 268.68 | |
584 | Jakub | 2023-06-09 00:34 | 3.0.0 | nn, trigram, previous and next batch-size=5000 embed-size=100 epochs=1 topk=150 vocab-size=20000 neural-network trigram gpt2 transformer-decoder | N/A | Infinity | |
583 | Jakub | 2023-06-09 00:27 | 3.0.0 | nn, trigram, previous and next batch-size=5000 embed-size=100 epochs=1 topk=150 vocab-size=20000 neural-network trigram gpt2 transformer-decoder | N/A | Infinity | |
702 | Jakub | 2023-06-09 00:25 | 3.0.0 | nn, trigram, previous and next batch-size=5000 embed-size=100 epochs=1 topk=150 vocab-size=20000 neural-network trigram gpt2 transformer-decoder | N/A | N/A | |
582 | Jakub | 2023-06-09 00:11 | 3.0.0 | nn, trigram, previous and next batch-size=5000 embed-size=100 epochs=1 topk=150 vocab-size=20000 neural-network trigram gpt2 transformer-decoder | N/A | Infinity | |
581 | Jakub | 2023-06-09 00:02 | 3.0.0 | nn, trigram, previous and next batch-size=5000 embed-size=100 epochs=1 topk=150 vocab-size=20000 neural-network trigram gpt2 transformer-decoder | 2071.99 | Infinity | |
148 | s444455 | 2023-06-08 21:47 | 3.0.0 | gpt2 top=20 neural-network gpt2 | 264.48 | 312.53 | |
91 | s444391 | 2023-06-08 21:45 | 3.0.0 | Gpt2 left-context embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 255.11 | 270.39 | |
701 | s444391 | 2023-06-08 21:44 | 3.0.0 | Gpt2 left-context embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | N/A | N/A | |
64 | s478839 | 2023-06-08 21:38 | 3.0.0 | gpt2 top=30 neural-network gpt2 | N/A | 250.27 | |
129 | s444354 | 2023-06-08 21:26 | 3.0.0 | GPT2 | 247.64 | 296.49 | |
404 | s478839 | 2023-06-08 21:10 | 3.0.0 | gpt2 top=10 neural-network gpt2 | 684.20 | 745.30 | |
155 | Mikołaj Pokrywka | 2023-06-08 19:12 | 3.0.0 | siup | 290.31 | 316.31 | |
172 | [anonymized] | 2023-06-08 19:08 | 3.0.0 | 478831 batch_size=6000 embed_size=200 epochs=1 learning-rate=1.0e-2 vocab_size=2500 | 281.29 | 331.71 | |
43 | [anonymized] | 2023-06-08 18:03 | 3.0.0 | s444421 top=10 gpt2 | 190.53 | 215.47 | |
151 | Jakub Adamski | 2023-06-08 17:36 | 3.0.0 | gpt2-left top=50 neural-network gpt2 | 266.94 | 314.51 | |
33 | s478873 | 2023-06-08 16:18 | 3.0.0 | s478873 top_k=30 neural-network gpt2 | 165.88 | 186.29 | |
75 | s444452 | 2023-06-08 16:14 | 3.0.0 | lstm lstm | 234.01 | 257.14 | |
700 | [anonymized] | 2023-06-08 15:36 | 3.0.0 | 478831 batch_size=6000 embed_size=200 epochs=1 learning-rate=1.0e-2 vocab_size=2500 gpt2 | N/A | N/A | |
275 | s443930 | 2023-06-08 14:37 | 3.0.0 | task 12-2 | 395.08 | 452.53 | |
71 | Martyna Druminska | 2023-06-08 12:25 | 3.0.0 | new | 225.75 | 252.80 | |
160 | s444386 | 2023-06-08 12:20 | 3.0.0 | gpt2 gpt2 | 286.78 | 318.75 | |
153 | s444501 | 2023-06-08 12:11 | 3.0.0 | 444501 gpt2-small left context top=50 neural-network gpt2 | 266.94 | 314.51 | |
52 | Mikołaj Pokrywka | 2023-06-08 11:07 | 3.0.0 | all done | 216.26 | 239.28 | |
699 | Martyna Druminska | 2023-06-08 11:02 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=200 | 700.81 | N/A | |
698 | Martyna Druminska | 2023-06-08 11:02 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=100 | 700.81 | N/A | |
697 | Martyna Druminska | 2023-06-08 11:02 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=1000 | 700.81 | N/A | |
70 | Martyna Druminska | 2023-06-08 11:02 | 3.0.0 | Upload files to 'dev-0' | N/A | 252.80 | |
696 | Martyna Druminska | 2023-06-08 10:35 | 3.0.0 | Upload files to 'test-A' HIDDENLAYER=200 | 700.81 | N/A | |
695 | Martyna Druminska | 2023-06-08 10:35 | 3.0.0 | Upload files to 'test-A' HIDDENLAYER=100 | 700.81 | N/A | |
694 | Martyna Druminska | 2023-06-08 10:35 | 3.0.0 | Upload files to 'test-A' HIDDENLAYER=1000 | 700.81 | N/A | |
69 | Martyna Druminska | 2023-06-08 10:35 | 3.0.0 | Upload files to 'test-A' | N/A | 252.80 | |
693 | Martyna Druminska | 2023-06-08 10:18 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=200 | 700.81 | N/A | |
692 | Martyna Druminska | 2023-06-08 10:18 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=100 | 700.81 | N/A | |
691 | Martyna Druminska | 2023-06-08 10:18 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=1000 | 700.81 | N/A | |
68 | Martyna Druminska | 2023-06-08 10:18 | 3.0.0 | Upload files to 'dev-0' | N/A | 252.80 | |
690 | Martyna Druminska | 2023-06-08 10:02 | 3.0.0 | Delete 'dev-0/out.tsv' | N/A | N/A | |
689 | Martyna Druminska | 2023-06-08 10:02 | 3.0.0 | Delete 'dev-0/out.tsv' HIDDENLAYER=200 | 700.81 | N/A | |
688 | Martyna Druminska | 2023-06-08 10:02 | 3.0.0 | Delete 'dev-0/out.tsv' HIDDENLAYER=100 | 700.81 | N/A | |
687 | Martyna Druminska | 2023-06-08 10:02 | 3.0.0 | Delete 'dev-0/out.tsv' HIDDENLAYER=1000 | 700.81 | N/A | |
686 | Martyna Druminska | 2023-06-08 10:00 | 3.0.0 | Upload files to 'dev-0' | N/A | N/A | |
685 | Martyna Druminska | 2023-06-08 10:00 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=200 | 700.81 | N/A | |
684 | Martyna Druminska | 2023-06-08 10:00 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=100 | 700.81 | N/A | |
683 | Martyna Druminska | 2023-06-08 10:00 | 3.0.0 | Upload files to 'dev-0' HIDDENLAYER=1000 | 700.81 | N/A | |
682 | Martyna Druminska | 2023-06-08 09:20 | 3.0.0 | gpt | N/A | N/A | |
681 | Martyna Druminska | 2023-06-08 09:20 | 3.0.0 | gpt HIDDENLAYER=200 | 700.81 | N/A | |
680 | Martyna Druminska | 2023-06-08 09:20 | 3.0.0 | gpt HIDDENLAYER=100 | 700.81 | N/A | |
679 | Martyna Druminska | 2023-06-08 09:20 | 3.0.0 | gpt HIDDENLAYER=1000 | 700.81 | N/A | |
678 | Martyna Druminska | 2023-06-08 09:11 | 3.0.0 | trigram model batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | N/A | N/A | |
677 | Martyna Druminska | 2023-06-08 09:11 | 3.0.0 | trigram model HIDDENLAYER=200 batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | 700.81 | N/A | |
676 | Martyna Druminska | 2023-06-08 09:11 | 3.0.0 | trigram model HIDDENLAYER=100 batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | 700.81 | N/A | |
675 | Martyna Druminska | 2023-06-08 09:11 | 3.0.0 | trigram model HIDDENLAYER=1000 batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | 700.81 | N/A | |
674 | s444501 | 2023-06-08 01:13 | 3.0.0 | 444501 gpt2-small left context zeroshot top=600 neural-network gpt2 | 226.60 | N/A | |
152 | s444501 | 2023-06-08 01:13 | 3.0.0 | 444501 gpt2-small left context zeroshot top=50 neural-network gpt2 | 266.94 | 314.51 | |
98 | s444501 | 2023-06-08 01:13 | 3.0.0 | 444501 gpt2-small left context zeroshot top=200 neural-network gpt2 | 242.33 | 272.46 | |
78 | s444501 | 2023-06-08 01:13 | 3.0.0 | 444501 gpt2-small left context zeroshot top=400 neural-network gpt2 | 231.45 | 257.93 | |
109 | s478855 | 2023-06-07 22:39 | 3.0.0 | s478855 decoder | 239.18 | 283.52 | |
673 | Martyna Druminska | 2023-06-07 21:05 | 3.0.0 | Upload files to 'test-A' | N/A | N/A | |
672 | Martyna Druminska | 2023-06-07 21:05 | 3.0.0 | Upload files to 'test-A' HIDDENLAYER=200 | 700.81 | N/A | |
671 | Martyna Druminska | 2023-06-07 21:05 | 3.0.0 | Upload files to 'test-A' HIDDENLAYER=100 | 700.81 | N/A | |
670 | Martyna Druminska | 2023-06-07 21:05 | 3.0.0 | Upload files to 'test-A' HIDDENLAYER=1000 | 700.81 | N/A | |
176 | Cezary | 2023-06-07 19:34 | 3.0.0 | gpt2 top=15 gpt2 transformer-decoder | 288.76 | 337.71 | |
21 | s409771 | 2023-06-07 19:14 | 3.0.0 | gpt2-large gpt2-large | 126.73 | 142.89 | |
125 | s444452 | 2023-06-07 18:50 | 3.0.0 | gpt2 gpt2-large | 245.00 | 292.67 | |
146 | s444415 | 2023-06-07 16:33 | 3.0.0 | gpt2 lab 12 final gpt2-large | 268.56 | 312.24 | |
145 | s444415 | 2023-06-07 16:20 | 3.0.0 | gpt2 lab 12 v3 gpt2 | Infinity | 312.24 | |
29 | s444517 | 2023-06-07 14:41 | 3.0.0 | gpt-2, only left-context gpt2 transformer-decoder | 165.36 | 183.70 | |
28 | s444018 | 2023-06-07 14:35 | 3.0.0 | s444018 top_k=25 gpt2 transformer-decoder | 162.04 | 181.24 | |
30 | s444356 | 2023-06-07 13:55 | 3.0.0 | transformer, decoder, left context top=20 transformer-decoder | 165.26 | 185.39 | |
163 | s444417 | 2023-06-07 12:38 | 3.0.0 | gpt-2, left context transformer-decoder | 273.43 | 323.60 | |
165 | Kamil Guttmann | 2023-06-07 12:22 | 3.0.0 | gpt-2, left context, zero-shot topk=150 gpt2 just-inference | 268.51 | 324.27 | |
580 | Kamil Guttmann | 2023-06-07 12:01 | 3.0.0 | gpt-2, left context, zero-shot topk=150 gpt2 just-inference | 274.54 | Infinity | |
159 | s444386 | 2023-06-07 11:00 | 3.0.0 | gpt-2 test gpt2 | N/A | 318.75 | |
200 | Cezary | 2023-06-07 06:01 | 3.0.0 | GRU lab 10 batch_size=15000 embed_size=200 epochs=1 hidden_size=190 l_context=15 learning-rate=1.0e-3 top=10 vocab_size=20000 lstm | 318.99 | 358.50 | |
327 | s444391 | 2023-06-07 02:51 | 3.0.0 | lstm lab10 submission embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 480.73 | 515.25 | |
195 | [anonymized] | 2023-06-06 22:46 | 3.0.0 | | 314.80 | 355.17 | |
118 | s444354 | 2023-06-06 22:20 | 3.0.0 | LSTM, 10 right / 10 left context window embed_size=300 hidden_size=500 learning-rate=1.0e-4 vocab-size=5000 | 273.33 | 290.31 | |
124 | s444417 | 2023-06-06 22:10 | 3.0.0 | nn lstm, epoch:1, l_ctx, r_ctx = 10 batch_size=5000 embed_size=200 vocab_size=5000 lstm | 277.70 | 292.47 | |
245 | s444354 | 2023-06-06 22:09 | 3.0.0 | LSTM, vocab-size: 500, embed-size: 300, hidden-layer:500 embed_size=300 epochs=1 hidden_size=750 learning-rate=1.0e-4 vocab_size=5000 | 374.37 | 407.04 | |
219 | s409771 | 2023-06-06 21:38 | 3.0.0 | lstm, left context lstm | 359.23 | 383.83 | |
205 | s444455 | 2023-06-06 21:18 | 3.0.0 | trigram model embed_size=300 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 324.69 | 363.34 | |
234 | s444415 | 2023-06-06 21:07 | 3.0.0 | lstm lab10 v4 lstm | 356.93 | 397.05 | |
502 | s444415 | 2023-06-06 20:40 | 3.0.0 | lstm lab10 v3 lstm | 1366.37 | 1421.55 | |
500 | s444415 | 2023-06-06 20:30 | 3.0.0 | lstm lab10 v2 lstm | 1303.55 | 1315.75 | |
57 | ked | 2023-06-06 20:10 | 3.0.0 | bidirectional lstm with left-right context batch_size=4096 embed_size=150 hidden_size=1024 learning_rate=1.0e-4 vocab_size=20000 neural-network lstm | 214.06 | 242.86 | |
217 | [anonymized] | 2023-06-06 19:32 | 3.0.0 | s444421 batch-size=1024 embed-size=200 lstm_size=150 vocab-size=1000 lstm | 344.82 | 383.04 | |
536 | [anonymized] | 2023-06-06 12:01 | 3.0.0 | lstm 1 layer 100 epochs lstm | 3042.17 | 3308.79 | |
227 | s444386 | 2023-06-06 11:30 | 3.0.0 | lstm lstm | 353.73 | 389.94 | |
221 | s444386 | 2023-06-06 11:25 | 3.0.0 | lstm lstm | 348.40 | 384.55 | |
231 | s444386 | 2023-06-06 10:10 | 3.0.0 | lstm lstm | 5137.00 | 394.92 | |
669 | s444386 | 2023-06-06 10:05 | 3.0.0 | lstm lstm | 5137.00 | N/A | |
472 | s444386 | 2023-06-06 10:00 | 3.0.0 | lstm lstm | 5137.00 | 1025.13 | |
456 | [anonymized] | 2023-06-06 05:28 | 3.0.0 | younger model test lstm | 1021.98 | 1022.34 | |
480 | [anonymized] | 2023-06-06 05:11 | 3.0.0 | lstm first try lstm | 1084.15 | 1091.10 | |
228 | Kamil Guttmann | 2023-06-05 23:30 | 3.0.0 | bilstm, left context batch-size=64 embed-size=50 epochs=1 learning-rate=1.0e-3 lstm-size=300 topk=200 vocab-size=5000 bilstm | 353.20 | 390.31 | |
239 | Kamil Guttmann | 2023-06-05 23:22 | 3.0.0 | bilstm, left context batch-size=64 embed-size=50 epochs=1 learning-rate=1.0e-3 lstm-size=250 topk=200 vocab-size=5000 bilstm | 369.57 | 403.05 | |
208 | s444517 | 2023-06-05 16:19 | 3.0.0 | lstm, one direction, left context lstm | 339.06 | 368.42 | |
401 | s444386 | 2023-06-05 15:41 | 3.0.0 | lstm lstm | 5137.00 | 741.12 | |
230 | s444386 | 2023-06-05 15:29 | 3.0.0 | lstm lstm | 5137.00 | 394.72 | |
223 | Mikołaj Pokrywka | 2023-06-04 15:12 | 3.0.0 | task 10 done | 344.75 | 386.51 | |
251 | s443930 | 2023-06-04 15:07 | 3.0.0 | tetragram tetragram | 385.42 | 416.99 | |
229 | s443930 | 2023-06-04 15:03 | 3.0.0 | task 10 context-length=10 embed-size=50 hidden-dim=25 vocab-size=1000 | 352.48 | 391.87 | |
224 | s443930 | 2023-06-04 15:03 | 3.0.0 | task 10 context-length=5 embed-size=50 hidden-dim=25 vocab-size=1000 | 351.13 | 388.41 | |
157 | s443930 | 2023-06-04 15:03 | 3.0.0 | task 10 context-length=10 embed-size=200 hidden-dim=150 vocab-size=20000 | 294.13 | 317.58 | |
233 | s444356 | 2023-06-04 09:27 | 3.0.0 | neural network, lstm batch-size=64 dropout=0.2 embed-size=100 epochs=1 hidden-size=128 learning-rate=1.0e-3 top=500 vocab-size=80000 neural-network lstm | 363.17 | 396.95 | |
23 | s478815 | 2023-06-02 19:15 | 3.0.0 | neural network | 138.20 | 149.96 | |
470 | Wiktor Bombola | 2023-06-02 12:28 | 3.0.0 | change the standardization method of output, wildcard | 1023.93 | 1024.24 | |
552 | Wiktor Bombola | 2023-06-02 11:37 | 3.0.0 | task 7, prediction using only the previous word | 5340.29 | 5773.59 | |
72 | Martyna Druminska | 2023-05-31 19:50 | 3.0.0 | Update 'gonito.yaml' | 227.23 | 253.07 | |
27 | s478873 | 2023-05-31 09:26 | 3.0.0 | s478873 batch-size=5000 embed-size=100 epochs=1 hidden_size=100 top=100 vocab-size=10000 neural-network bow | 164.88 | 180.53 | |
192 | s444452 | 2023-05-31 08:24 | 3.0.0 | bag of words neural-network bow | 344.26 | 354.14 | |
34 | Jakub Eichner | 2023-05-31 08:17 | 3.0.0 | Bag of words NN | 170.75 | 188.60 | |
90 | s444455 | 2023-05-31 00:48 | 3.0.0 | embeddings, bov embed_size=300 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 257.20 | 270.39 | |
100 | s444391 | 2023-05-31 00:45 | 3.0.0 | Embeddings and bow solution final results embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 256.80 | 273.37 | |
92 | [anonymized] | 2023-05-31 00:45 | 3.0.0 | embeddings, bov | 255.11 | 270.77 | |
103 | s444391 | 2023-05-31 00:33 | 3.0.0 | Embeddings and bow solution embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | N/A | 275.06 | |
83 | s444386 | 2023-05-30 23:04 | 3.0.0 | neural bag-of-words bow | 226.70 | 264.30 | |
259 | s444391 | 2023-05-30 22:58 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | N/A | 425.63 | |
244 | s444354 | 2023-05-30 22:55 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=750 learning-rate=1.0e-4 vocab_size=5000 | 374.37 | 407.04 | |
243 | s444391 | 2023-05-30 22:49 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | N/A | 407.04 | |
19 | s444018 | 2023-05-30 22:47 | 3.0.0 | s444018 batch-size=10000 dropout=0.3 embed-size=300 hidden-size=1000 second-hidden-size=500 vocab-size=12500 neural-network bow | 126.63 | 139.63 | |
668 | Martyna Druminska | 2023-05-30 22:37 | 3.0.0 | trigram model HIDDENLAYER=1000 batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | 700.81 | N/A | |
667 | Martyna Druminska | 2023-05-30 22:37 | 3.0.0 | trigram model HIDDENLAYER=100 batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | 700.81 | N/A | |
666 | Martyna Druminska | 2023-05-30 22:37 | 3.0.0 | trigram model HIDDENLAYER=200 batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | 700.81 | N/A | |
49 | Martyna Druminska | 2023-05-30 22:37 | 3.0.0 | trigram model batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | N/A | 232.67 | |
258 | s444391 | 2023-05-30 22:04 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 390.25 | 424.44 | |
286 | s444391 | 2023-05-30 21:42 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 423.46 | 458.77 | |
142 | s444415 | 2023-05-30 21:17 | 3.0.0 | Nn ngram model pre top=500 neural-network n-grams bow | 268.50 | 311.66 | |
126 | Jakub Adamski | 2023-05-30 21:14 | 3.0.0 | nn-trigram-tanh-bow | 264.80 | 294.01 | |
357 | s444415 | 2023-05-30 21:14 | 3.0.0 | nn model prepro neural-network n-grams bow | 525.76 | 590.78 | |
59 | s444386 | 2023-05-30 20:39 | 3.0.0 | nn with bag of words neural-network bow | 218.85 | 243.69 | |
161 | s444415 | 2023-05-30 20:16 | 3.0.0 | Nn ngram model lower vocab v2 neural-network n-grams bow | 277.11 | 318.89 | |
194 | s444415 | 2023-05-30 20:12 | 3.0.0 | Neural network bow lower vocab neural-network n-grams bow | 305.08 | 355.06 | |
216 | Jakub Adamski | 2023-05-30 19:41 | 3.0.0 | lstm | 344.56 | 380.25 | |
177 | s444386 | 2023-05-30 19:30 | 3.0.0 | bag-of-words | 264.05 | 339.02 | |
358 | s444415 | 2023-05-30 19:22 | 3.0.0 | Nn ngram model top=10 neural-network n-grams bow | 524.23 | 592.25 | |
516 | s444386 | 2023-05-30 19:20 | 3.0.0 | bag-of-words | 1691.10 | 1661.01 | |
130 | s444415 | 2023-05-30 19:13 | 3.0.0 | Nn ngram model top=600 neural-network n-grams bow | 262.42 | 300.73 | |
171 | s444415 | 2023-05-30 19:08 | 3.0.0 | Nn ngram model top=300 neural-network n-grams | 283.37 | 331.20 | |
214 | s444415 | 2023-05-30 19:04 | 3.0.0 | Nn ngram model v6 neural-network n-grams | 328.34 | 377.28 | |
295 | s444415 | 2023-05-30 19:01 | 3.0.0 | Nn ngram model v5 neural-network n-grams | 410.74 | 475.73 | |
62 | s444386 | 2023-05-30 18:09 | 3.0.0 | bag-of-words neural-network bow | 221.05 | 246.45 | |
65 | s444386 | 2023-05-30 17:59 | 3.0.0 | bag-of-words | 224.44 | 251.36 | |
85 | s444386 | 2023-05-30 17:52 | 3.0.0 | bag-of-words | 238.89 | 267.66 | |
254 | s444415 | 2023-05-30 17:48 | 3.0.0 | Nn ngram model v4 neural-network n-grams | 377.46 | 422.46 | |
111 | s444386 | 2023-05-30 17:47 | 3.0.0 | bag-of-words | 264.05 | 285.96 | |
343 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=5 embed-size=140 learning-rate=1e-3 vocab-size=20000 | 533.75 | 558.09 | |
328 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=10 embed-size=100 learning-rate=1e-3 vocab-size=10000 | 496.46 | 519.02 | |
324 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=5 embed-size=140 learning-rate=1e-4 vocab-size=20000 | 485.65 | 512.96 | |
323 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=10 embed-size=140 learning-rate=1e-3 vocab-size=20000 | 489.16 | 510.91 | |
322 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=10000 context-length=10 embed-size=140 learning-rate=1e-4 vocab-size=20000 | 485.60 | 510.39 | |
319 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=10 embed-size=300 learning-rate=1e-3 vocab-size=20000 | 481.32 | 505.13 | |
318 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=10 embed-size=140 learning-rate=1e-4 vocab-size=20000 | 478.55 | 504.85 | |
316 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=10 embed-size=140 learning-rate=1e-2 vocab-size=20000 | 483.35 | 502.12 | |
315 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=10 embed-size=30 learning-rate=1e-3 vocab-size=1000 | 478.76 | 501.18 | |
294 | s443930 | 2023-05-30 17:04 | 3.0.0 | Update outputs batch-size=5000 context-length=70 embed-size=140 learning-rate=1e-4 vocab-size=20000 | 454.83 | 473.56 | |
9 | s478846 | 2023-05-30 15:49 | 3.0.0 | neural solution neural-network n-grams bow | 96.29 | 105.79 | |
154 | s444386 | 2023-05-30 15:37 | 3.0.0 | bag-of-words | 298.25 | 316.09 | |
17 | s444356 | 2023-05-30 14:53 | 3.0.0 | neural network, bag of words batch-size=10000 context-size=25 dropout=0.3 embed-size=200 epochs=1 hidden-size=1000 hidden-size_2=500 learning-rate=1.0e-3 [...] neural-network bow | 125.69 | 134.70 | |
143 | s444386 | 2023-05-30 14:50 | 3.0.0 | bag-of-words | 279.87 | 311.92 | |
665 | Jakub Adamski | 2023-05-30 14:44 | 3.0.0 | lstm | N/A | N/A | |
170 | s444386 | 2023-05-30 14:43 | 3.0.0 | bag-of-words | 296.27 | 329.63 | |
664 | Jakub Adamski | 2023-05-30 14:40 | 3.0.0 | lstm | N/A | N/A | |
41 | Jakub Adamski | 2023-05-30 14:14 | 3.0.0 | nn-trigram | 187.98 | 210.78 | |
447 | Jakub Adamski | 2023-05-30 14:07 | 3.0.0 | nn-trigram | 935.24 | 942.35 | |
22 | s444417 | 2023-05-30 12:04 | 3.0.0 | nn, vocab_size: 1500, embed_size: 200, batch_size: 5000, epoch:1, hl, l_ctx, r_ctx = 10 batch_size=5000 embed_size=200 vocab_size=15000 neural-network | 130.41 | 143.54 | |
11 | s444501 | 2023-05-30 11:50 | 3.0.0 | s444501 neural with bagging bagging_left_ctx=25 bagging_right_ctx=25 batch_size=4000 embed_size=300 epochs=1 hidden_size=150 learning-rate=1.0e-4 ngram_left_ctx=7 [...] neural-network | 115.59 | 126.69 | |
365 | s444415 | 2023-05-30 10:50 | 3.0.0 | Nn ngram model v3 neural-network n-grams | 585.39 | 613.00 | |
51 | s444517 | 2023-05-30 10:43 | 3.0.0 | trigram, vocab_size=5000, embed_size=150, batch_size=12500 neural-network trigram | 215.12 | 234.42 | |
367 | s444415 | 2023-05-30 10:02 | 3.0.0 | Nn ngram model v2 n-grams | 587.84 | 613.49 | |
366 | s444415 | 2023-05-30 09:38 | 3.0.0 | Nn ngram model neural-network n-grams | 587.95 | 613.26 | |
398 | Jakub Adamski | 2023-05-30 09:36 | 3.0.0 | Trigram | 616.35 | 737.65 | |
403 | Jakub Adamski | 2023-05-30 07:36 | 3.0.0 | Trigram | 616.61 | 744.37 | |
236 | Mikołaj Pokrywka | 2023-05-29 15:05 | 3.0.0 | remove gonito yaml | 431.38 | 397.80 | |
235 | Mikołaj Pokrywka | 2023-05-29 15:00 | 3.0.0 | Class 8 batch-size=6400,12800 epochs=1,2,3,4 learning-rate=0.0001,0.00001 training-set=100000-lines neural-network trigram | 431.38 | 397.80 | |
455 | s444415 | 2023-05-29 13:54 | 3.0.0 | Neural network trigram neural-network trigram | 1016.11 | 1016.42 | |
213 | s444356 | 2023-05-29 10:57 | 3.0.0 | neural network, bag of words Learning-rate=1.0e-3 batch-size=2500 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=300 vocab-size=20000 neural-network bow | 331.54 | 377.06 | |
405 | Jakub Adamski | 2023-05-28 18:32 | 3.0.0 | Trigram | 650.19 | 757.07 | |
36 | s444517 | 2023-05-28 12:28 | 3.0.0 | Neural network with context(left context 20 words, 3 previous words, 1 word after, 50k lines from input file) neural-network n-grams bow | 179.79 | 198.41 | |
361 | s444354 | 2023-05-28 00:59 | 3.0.0 | neural-network bigram epochs=1 learning-rate=1.0e-3 neural-network bigram | 512.49 | 600.70 | |
56 | s444354 | 2023-05-27 22:38 | 3.0.0 | rerun with smoothing | 214.99 | 240.33 | |
663 | s444354 | 2023-05-27 22:34 | 3.0.0 | rerun with smoothing | 214.99 | N/A | |
662 | s444354 | 2023-05-27 22:22 | 3.0.0 | tetragrams-added | 214.99 | N/A | |
96 | 444498 | 2023-05-27 14:03 | 3.0.0 | lab8 neural network solution top=200 | 242.33 | 272.46 | |
76 | 444498 | 2023-05-27 14:03 | 3.0.0 | lab8 neural network solution top=400 | 231.45 | 257.93 | |
66 | 444498 | 2023-05-27 14:03 | 3.0.0 | lab8 neural network solution top=600 | 226.60 | 252.52 | |
24 | s409771 | 2023-05-26 23:32 | 3.0.0 | neural network, bag of words, l-context=10, r-context=5, batch-size=30_000, embed-size=300 neural-network bow | 152.85 | 175.20 | |
503 | 444498 | 2023-05-26 17:19 | 3.0.0 | class 7 | 1206.03 | 1421.71 | |
55 | 444498 | 2023-05-26 17:13 | 3.0.0 | trigram, tetragram | 210.92 | 239.79 | |
579 | 444498 | 2023-05-26 17:03 | 3.0.0 | trigram | Infinity | Infinity | |
12 | s478846 | 2023-05-26 13:16 | 3.0.0 | Neural network using embeddings neural-network bow | 116.84 | 130.59 | |
26 | ked | 2023-05-25 23:12 | 3.0.0 | zad9_v3 neural-network | 154.46 | 177.87 | |
282 | [anonymized] | 2023-05-25 18:40 | 3.0.0 | trigram | 397.60 | 455.94 | |
410 | ked | 2023-05-25 10:55 | 3.0.0 | simple bigram redone corr | 702.17 | 786.01 | |
156 | ked | 2023-05-25 09:32 | 3.0.0 | neural trigram fixed batch_size=512, 1024, 4096 embed_size=150 hidden_size=256, 1024 learning_rate=0.0001, 0.001 vocab_size=20000 neural-network | 300.66 | 316.80 | |
269 | ked | 2023-05-25 09:29 | 3.0.0 | neural bigram fixed batch_size=5000 embed_size=150 vocab_size=20000 neural-network | 394.97 | 446.29 | |
578 | Jakub Adamski | 2023-05-24 11:40 | 3.0.0 | Trigram | Infinity | Infinity | |
408 | Jakub Adamski | 2023-05-24 08:59 | 3.0.0 | Trigram fixed | 654.54 | 771.32 | |
94 | Jakub Eichner | 2023-05-24 08:26 | 3.0.0 | Bigram simple solution bigram challenging-america | 240.31 | 272.06 | |
553 | Jakub Adamski | 2023-05-24 07:39 | 3.0.0 | Trigram improved | 6374.19 | 6313.78 | |
253 | s444383 | 2023-05-15 10:16 | 3.0.0 | bigram | 380.26 | 420.77 | |
355 | s444383 | 2023-05-15 10:13 | 3.0.0 | bigram | 517.57 | 573.32 | |
661 | s444383 | 2023-05-15 09:29 | 3.0.0 | bigram | N/A | N/A | |
660 | s444383 | 2023-05-15 09:18 | 3.0.0 | bigram | N/A | N/A | |
144 | s478839 | 2023-05-10 23:11 | 3.0.0 | s478839 batch_size=3000 embed_size=200 epochs=1 hidden_size=100 num_of_top=10 top=100 vocab_size=15000 neural-network trigram | 272.53 | 312.09 | |
123 | s478839 | 2023-05-10 23:11 | 3.0.0 | s478839 batch_size=3000 embed_size=200 epochs=1 hidden_size=100 num_of_top=10 top=200 vocab_size=15000 neural-network trigram | 257.85 | 292.28 | |
110 | s478839 | 2023-05-10 23:11 | 3.0.0 | s478839 batch_size=3000 embed_size=200 epochs=1 hidden_size=100 num_of_top=10 top=300 vocab_size=15000 neural-network trigram | 251.88 | 284.56 | |
188 | s478873 | 2023-05-10 21:00 | 3.0.0 | s478873 EMBED_SIZE=300 batch-size=1000 embed-size=500 epochs=1 vocab-size=2000 neural-network trigram | 312.82 | 351.22 | |
187 | s478873 | 2023-05-10 21:00 | 3.0.0 | s478873 EMBED_SIZE=500 batch-size=1000 embed-size=500 epochs=1 vocab-size=2000 neural-network trigram | 314.40 | 348.95 | |
186 | s478873 | 2023-05-10 21:00 | 3.0.0 | s478873 EMBED_SIZE=200 batch-size=1000 embed-size=500 epochs=1 vocab-size=2000 neural-network trigram | 319.56 | 348.85 | |
108 | s478840 | 2023-05-10 20:25 | 3.0.0 | s478840 batch-size=1500 embed-size=250 hidden-size=100 top=200 vocab-size=20000 neural-network trigram | 246.31 | 281.23 | |
88 | s478840 | 2023-05-10 20:25 | 3.0.0 | s478840 batch-size=1500 embed-size=250 hidden-size=100 top=400 vocab-size=20000 neural-network trigram | 236.41 | 269.43 | |
82 | s478840 | 2023-05-10 20:25 | 3.0.0 | s478840 batch-size=1500 embed-size=250 hidden-size=100 top=600 vocab-size=20000 neural-network trigram | 232.15 | 263.30 | |
384 | s444354 | 2023-05-10 17:25 | 3.0.0 | trigram lr=0.0001 vocab=40000 hiddensize=[128,256] embed-size=300 embed_size=300 epochs=1 hidden_size=256 learning-rate=1.0e-4 vocab_size=40000 neural-network trigram | 640.26 | 691.03 | |
102 | s444018 | 2023-05-10 17:13 | 3.0.0 | s444018 batch-size=2000 context-size=-2 embed-size=300 hidden-size=150 learning-rate=1.0e-4 top=200 vocab-size=25000 neural-network trigram | 242.91 | 273.85 | |
80 | s444018 | 2023-05-10 17:13 | 3.0.0 | s444018 batch-size=2000 context-size=-2 embed-size=300 hidden-size=150 learning-rate=1.0e-4 top=400 vocab-size=25000 neural-network trigram | 231.35 | 258.18 | |
74 | s444018 | 2023-05-10 17:13 | 3.0.0 | s444018 batch-size=2000 context-size=-2 embed-size=300 hidden-size=150 learning-rate=1.0e-4 top=600 vocab-size=25000 neural-network trigram | 226.93 | 253.19 | |
312 | Adam Wojdyła | 2023-05-10 09:41 | 3.0.0 | trigram nn lr 0.1 emb=200 vocab=10k batch=6k hiddensize=100 batch-size=5000 embed-size=150 learningrate=1 vocab-size=15000 neural-network bigram | 431.23 | 490.64 | |
311 | Adam Wojdyła | 2023-05-10 09:41 | 3.0.0 | trigram nn lr 0.1 emb=200 vocab=10k batch=6k hiddensize=100 batch-size=5000 embed-size=150 learningrate=0_001 vocab-size=15000 neural-network bigram | 429.63 | 489.84 | |
310 | Adam Wojdyła | 2023-05-10 09:41 | 3.0.0 | trigram nn lr 0.1 emb=200 vocab=10k batch=6k hiddensize=100 batch-size=5000 embed-size=150 learningrate=0_1 vocab-size=15000 neural-network bigram | 430.41 | 487.35 | |
309 | Adam Wojdyła | 2023-05-10 09:41 | 3.0.0 | trigram nn lr 0.1 emb=200 vocab=10k batch=6k hiddensize=100 batch-size=5000 embed-size=150 learningrate=0_0001 vocab-size=15000 neural-network bigram | 427.98 | 487.02 | |
304 | Adam Wojdyła | 2023-05-10 09:41 | 3.0.0 | trigram nn lr 0.1 emb=200 vocab=10k batch=6k hiddensize=100 batch-size=5000 embed-size=150 learningrate=0_01 vocab-size=15000 neural-network bigram | 427.79 | 480.29 | |
58 | Cezary | 2023-05-10 07:51 | 3.0.0 | This is my trigram nn solution batch_size=15000 embed_size=200 epochs=1 hidden_size=190 learning-rate=1.0e-3 top=10 vocab_size=20000 neural-network trigram | 217.53 | 243.43 | |
260 | Jakub | 2023-05-10 07:49 | 3.0.0 | nn, trigram, previous and next out-embed-100 batch-size=10000 embed-size=100,500,1000 epochs=1 topk=10 vocab-size=20000 neural-network trigram | 389.81 | 430.91 | |
212 | Jakub | 2023-05-10 07:49 | 3.0.0 | nn, trigram, previous and next out-embed-500 batch-size=10000 embed-size=100,500,1000 epochs=1 topk=10 vocab-size=20000 neural-network trigram | 347.24 | 376.99 | |
189 | s478855 | 2023-05-09 21:57 | 3.0.0 | s478855 BATCH_SIZE=2000 EMBED_SIZE=100 VOCAB_SIZE=10000 neural-network trigram | 316.24 | 351.82 | |
135 | s478855 | 2023-05-09 21:57 | 3.0.0 | s478855 BATCH_SIZE=2000 EMBED_SIZE=400 VOCAB_SIZE=10000 neural-network trigram | 276.40 | 304.11 | |
127 | s478855 | 2023-05-09 21:57 | 3.0.0 | s478855 BATCH_SIZE=2000 EMBED_SIZE=200 VOCAB_SIZE=10000 neural-network trigram | 271.96 | 294.89 | |
101 | s444018 | 2023-05-09 21:26 | 3.0.0 | s444018 top=200 | 242.91 | 273.85 | |
79 | s444018 | 2023-05-09 21:26 | 3.0.0 | s444018 top=400 | 231.35 | 258.18 | |
73 | s444018 | 2023-05-09 21:26 | 3.0.0 | s444018 top=600 | 226.93 | 253.19 | |
567 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=5000 embed-size=100 epochs=1 hidden-layer=False learning-rate=1e-2 optimizer=Adam vocab-size=20000 neural-network trigram adam | 63326.44 | 74084.21 | |
270 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=5000 embed-size=100 epochs=2 hidden-layer=True learning-rate=1e-3 optimizer=Adam vocab-size=10000 neural-network trigram adam | 406.17 | 446.54 | |
249 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=5000 embed-size=100 epochs=1 hidden-layer=False learning-rate=1e-4 optimizer=Adam vocab-size=20000 neural-network trigram adam | 389.04 | 413.63 | |
247 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=5000 embed-size=100 epochs=1 hidden-layer=False learning-rate=1e-2 optimizer=Adagrad vocab-size=20000 neural-network trigram adam | 384.42 | 411.31 | |
242 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=5000 embed-size=100 epochs=1 hidden-layer=True learning-rate=1e-3 optimizer=Adam vocab-size=10000 neural-network trigram adam | 383.27 | 406.92 | |
209 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=1000 embed-size=100 epochs=1 hidden-layer=False learning-rate=1e-3 optimizer=Adam vocab-size=20000 neural-network trigram adam | 347.11 | 368.50 | |
204 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=10000 embed-size=100 epochs=1 hidden-layer=False learning-rate=1e-3 optimizer=Adam vocab-size=20000 neural-network trigram adam | 339.31 | 362.71 | |
198 | s443930 | 2023-05-09 20:18 | 3.0.0 | Class 8 batch-size=5000 embed-size=100 epochs=1 hidden-layer=False learning-rate=1e-3 optimizer=Adam vocab-size=20000 neural-network trigram adam | 334.24 | 356.58 | |
659 | Mikołaj Pokrywka | 2023-05-09 20:13 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=1 learning-rate=0.00001 training-set=100000-lines neural-network trigram | 526.98 | N/A | |
344 | Mikołaj Pokrywka | 2023-05-09 20:13 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=1 learning-rate=0.00001 training-set=100000-lines neural-network trigram | N/A | 561.23 | |
226 | Mikołaj Pokrywka | 2023-05-09 20:13 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=12800 epochs=1 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 347.67 | 388.41 | |
203 | Mikołaj Pokrywka | 2023-05-09 20:13 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=1 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 324.85 | 361.47 | |
169 | Mikołaj Pokrywka | 2023-05-09 20:13 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=2 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 292.90 | 327.95 | |
141 | Mikołaj Pokrywka | 2023-05-09 20:13 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=3 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 276.94 | 311.58 | |
133 | Mikołaj Pokrywka | 2023-05-09 20:13 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=4 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 267.06 | 300.97 | |
225 | Mikołaj Pokrywka | 2023-05-09 20:08 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=1 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 347.67 | 388.41 | |
202 | Mikołaj Pokrywka | 2023-05-09 20:08 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=1 learning-rate=0.0001 training-set=100000-lines neural-network trigram | N/A | 361.47 | |
168 | Mikołaj Pokrywka | 2023-05-09 20:08 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=2 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 292.90 | 327.95 | |
140 | Mikołaj Pokrywka | 2023-05-09 20:08 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=3 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 276.94 | 311.58 | |
132 | Mikołaj Pokrywka | 2023-05-09 20:08 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=4 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 267.06 | 300.97 | |
201 | Mikołaj Pokrywka | 2023-05-09 20:01 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=1 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 347.67 | 388.41 | |
167 | Mikołaj Pokrywka | 2023-05-09 20:01 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=2 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 292.90 | 327.95 | |
139 | Mikołaj Pokrywka | 2023-05-09 20:01 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=3 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 276.94 | 311.58 | |
131 | Mikołaj Pokrywka | 2023-05-09 20:01 | 3.0.0 | Class 8 batch-size=6400,12800 batch_size=6400 epochs=4 learning-rate=0.0001 training-set=100000-lines neural-network trigram | 267.06 | 300.97 | |
284 | [anonymized] | 2023-05-09 19:59 | 3.0.0 | trigram batch-size=2000 embed-size=200 embed_size=400 hidden_size=100 vocab-size=10000 neural-network trigram | 415.19 | 458.31 | |
281 | [anonymized] | 2023-05-09 19:59 | 3.0.0 | trigram batch-size=2000 embed-size=200 embed_size=300 hidden_size=100 vocab-size=10000 neural-network trigram | 418.78 | 455.83 | |
273 | [anonymized] | 2023-05-09 19:59 | 3.0.0 | trigram batch-size=2000 embed-size=200 embed_size=200 hidden_size=100 vocab-size=10000 neural-network trigram | 415.57 | 449.20 | |
207 | Jakub Eichner | 2023-05-09 19:48 | 3.0.0 | Trigram Neural Network Solution batch_size=5000 context_size=2 embed_size=200 epochs=2 hidden_size=100 learning-rate=2.0e-3 top_k=200 vocab_size=20000 neural-network | 323.47 | 368.39 | |
137 | Kamil Guttmann | 2023-05-09 17:59 | 3.0.0 | nn, trigram, predict from next two tokens batch-size=5000 embed-size=100 epochs=1 hidden_size=500 topk=150 vocab-size=20000 neural-network trigram | 287.54 | 307.65 | |
113 | Kamil Guttmann | 2023-05-09 17:59 | 3.0.0 | nn, trigram, predict from next two tokens batch-size=5000 embed-size=100 epochs=1 hidden_size=1024 topk=150 vocab-size=20000 neural-network trigram | 265.70 | 287.08 | |
112 | Kamil Guttmann | 2023-05-09 17:59 | 3.0.0 | nn, trigram, predict from next two tokens batch-size=5000 embed-size=100 epochs=1 hidden_size=2048 topk=150 vocab-size=20000 neural-network trigram | 259.62 | 286.56 | |
104 | Kamil Guttmann | 2023-05-09 17:59 | 3.0.0 | nn, trigram, predict from next two tokens batch-size=5000 embed-size=100 embed_size=256 epochs=1 hidden_size=2048 topk=150 vocab-size=20000 neural-network trigram | 250.85 | 275.80 | |
563 | Jakub Adamski | 2023-05-09 13:51 | 3.0.0 | Trigram regression | 10006.89 | 10041.36 | |
210 | s444476 | 2023-05-08 22:26 | 3.0.0 | trigram nn done HIDDENLAYER=100 | 341.24 | 373.09 | |
196 | s444476 | 2023-05-08 22:26 | 3.0.0 | trigram nn done HIDDENLAYER=1000 | 319.69 | 355.85 | |
308 | s444452 | 2023-05-08 16:48 | 3.0.0 | neural trigrams EMBDEDSIZE=200 | 463.17 | 486.35 | |
307 | s444452 | 2023-05-08 16:48 | 3.0.0 | neural trigrams EMBDEDSIZE=150 | 461.59 | 485.85 | |
306 | s444452 | 2023-05-08 16:48 | 3.0.0 | neural trigrams EMBDEDSIZE=100 | 462.03 | 485.76 | |
63 | s478846 | 2023-05-08 12:50 | 3.0.0 | Neural trigram solution HIDDENLAYERSIZE=2000 neural-network trigram | 231.52 | 248.42 | |
60 | s478846 | 2023-05-08 12:50 | 3.0.0 | Neural trigram solution HIDDENLAYERSIZE=500 neural-network trigram | 227.52 | 244.86 | |
252 | s444356 | 2023-05-07 21:22 | 3.0.0 | neural network, trigram Learning-rate=1.0e-3 batch-size=1000 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=50 vocab-size=20000 neural-network trigram | 365.74 | 417.79 | |
222 | s444356 | 2023-05-07 21:22 | 3.0.0 | neural network, trigram Learning-rate=1.0e-3 batch-size=1000 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=100 vocab-size=20000 neural-network trigram | 339.86 | 385.95 | |
199 | s444356 | 2023-05-07 21:22 | 3.0.0 | neural network, trigram Learning-rate=1.0e-3 batch-size=1000 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=200 vocab-size=20000 neural-network trigram | 317.91 | 357.01 | |
190 | s444356 | 2023-05-07 21:22 | 3.0.0 | neural network, trigram Learning-rate=1.0e-3 batch-size=5000 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=300 vocab-size=20000 neural-network trigram | 313.50 | 353.34 | |
184 | s444356 | 2023-05-07 21:22 | 3.0.0 | neural network, trigram Learning-rate=1.0e-3 batch-size=1000 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=300 vocab-size=20000 neural-network trigram | 302.20 | 346.43 | |
183 | s444356 | 2023-05-07 21:22 | 3.0.0 | neural network, trigram Learning-rate=1.0e-3 batch-size=2500 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=300 vocab-size=20000 neural-network trigram | 305.08 | 344.46 | |
97 | s444501 | 2023-05-07 20:59 | 3.0.0 | trigram neural 444501 batch_size=2000 embed_size=300 epochs=1 hidden_size=150 learning-rate=1.0e-4 top=200 vocab_size=25000 neural-network trigram | 242.33 | 272.46 | |
77 | s444501 | 2023-05-07 20:59 | 3.0.0 | trigram neural 444501 batch_size=2000 embed_size=300 epochs=1 hidden_size=150 learning-rate=1.0e-4 top=400 vocab_size=25000 neural-network trigram | 231.45 | 257.93 | |
67 | s444501 | 2023-05-07 20:59 | 3.0.0 | trigram neural 444501 batch_size=2000 embed_size=300 epochs=1 hidden_size=150 learning-rate=1.0e-4 top=600 vocab_size=25000 neural-network trigram | 226.60 | 252.52 | |
191 | s444356 | 2023-05-05 22:51 | 3.0.0 | neural network, trigram Learning-rate=2.0e-3 batch-size=1000 context-size=2 embed-size=200 epochs=1 hidden-size=100 top=300 vocab-size=20000 neural-network trigram | 311.38 | 353.70 | |
246 | [anonymized] | 2023-05-05 17:20 | 3.0.0 | Trigram nn | 382.63 | 411.25 | |
432 | s478840 | 2023-04-29 11:16 | 3.0.0 | s478840 | 741.81 | 882.51 | |
268 | Cezary | 2023-04-28 13:19 | 3.0.0 | Bigram nn neural-network bigram | 388.14 | 441.29 | |
541 | Wiktor Bombola | 2023-04-28 11:38 | 3.0.0 | simple nn model, task 7 | 4621.88 | 4462.41 | |
562 | Jakub Adamski | 2023-04-28 09:00 | 3.0.0 | Bigram regression | 10083.91 | 10022.76 | |
342 | Adam Wojdyła | 2023-04-28 07:59 | 3.0.0 | s444507 batch-size=5000 embed-size=150 vocab-size=15000 neural-network bigram | 532.26 | 557.26 | |
383 | s444354 | 2023-04-28 07:57 | 3.0.0 | s444354 bigrams neural-network bigram trigram | 640.26 | 691.03 | |
346 | s478873 | 2023-04-28 07:35 | 3.0.0 | s478873 batch-size=2000 embed-size=100 epochs=1 vocab-size=10000 neural-network bigram | 497.68 | 561.58 | |
449 | s444415 | 2023-04-28 07:33 | 3.0.0 | lab 7 nn bigram batch_size=4000 embed_size=250 epochs=1 learning-rate=1.0e-2 vocab_size=3000 bigram | 826.73 | 947.55 | |
561 | Jakub Adamski | 2023-04-28 07:33 | 3.0.0 | Bigram regression | 9956.13 | 10000.14 | |
411 | s478873 | 2023-04-28 07:02 | 3.0.0 | s478873 neural-network bigram | 691.01 | 800.61 | |
397 | [anonymized] | 2023-04-27 23:53 | 3.0.0 | s478841 - nn bigram neural-network bigram | 655.65 | 731.00 | |
436 | s444018 | 2023-04-27 23:44 | 3.0.0 | bigram neural network #2 | 850.31 | 892.10 | |
435 | s444018 | 2023-04-27 23:36 | 3.0.0 | bigram neural network | 850.69 | 890.82 | |
396 | [anonymized] | 2023-04-27 23:33 | 3.0.0 | First model again | 655.65 | 731.00 | |
402 | [anonymized] | 2023-04-27 23:24 | 3.0.0 | No mods but stopped training | 667.67 | 741.67 | |
291 | s478839 | 2023-04-27 22:56 | 3.0.0 | s478839 neural-network bigram | 410.34 | 471.21 | |
377 | s478855 | 2023-04-27 22:45 | 3.0.0 | s478855 batch-size=6000 embed-size=100 vocab-size=10000 neural-network bigram | 601.95 | 667.90 | |
335 | s478839 | 2023-04-27 22:25 | 3.0.0 | s478839 | 463.68 | 532.15 | |
501 | [anonymized] | 2023-04-27 22:13 | 3.0.0 | Just cleaned text | 1376.70 | 1404.97 | |
363 | s444018 | 2023-04-27 22:07 | 3.0.0 | kenLM | 583.71 | 606.41 | |
504 | [anonymized] | 2023-04-27 22:02 | 3.0.0 | Vocabulary from sentences only | 1525.59 | 1562.43 | |
360 | s478839 | 2023-04-27 21:52 | 3.0.0 | s478839 | 551.63 | 599.16 | |
345 | s478873 | 2023-04-27 21:46 | 3.0.0 | s478873 neural-network bigram | 497.68 | 561.58 | |
128 | s409771 | 2023-04-27 20:37 | 3.0.0 | neural network neural-network bigram | 263.66 | 294.94 | |
349 | s444386 | 2023-04-27 19:53 | 3.0.0 | Neural network bigram model bigram | 504.48 | 565.89 | |
369 | [anonymized] | 2023-04-27 19:36 | 3.0.0 | bigram, nn batch-size=2500 embed-size=100 epochs=1 vocab-size=10000 neural-network bigram | 550.51 | 618.84 | |
185 | s444356 | 2023-04-27 19:35 | 3.0.0 | neural network, bigram batch-size=5000 embed-size=1000 epochs=1 top=300 vocab-size=30000 neural-network bigram | 302.93 | 347.59 | |
326 | [anonymized] | 2023-04-27 19:00 | 3.0.0 | s478831 bigram | 466.22 | 515.08 | |
166 | s444517 | 2023-04-27 18:13 | 3.0.0 | Neural-network bigram more words in output neural-network bigram | 287.21 | 325.07 | |
240 | s443930 | 2023-04-27 18:00 | 3.0.0 | Class 7 - Task 1 epochs=1 learning-rate=1.0e-3 neural-network bigram adam | 378.31 | 404.44 | |
197 | s444356 | 2023-04-27 17:12 | 3.0.0 | neural network, bigram batch-size=5000 embed-size=100 epochs=1 vocab-size=20000 neural-network bigram | 316.27 | 356.11 | |
93 | Jakub Eichner | 2023-04-27 17:08 | 3.0.0 | RegLog simple solution challenging-america | 240.31 | 272.06 | |
164 | Kamil Guttmann | 2023-04-27 16:48 | 3.0.0 | nn, bigram, 20k vocab, 100 embed, 5k batch, top150 batch-size=5000 embedding-size=20000 epochs=1 topk=150 vocab-size=20000 neural-network bigram | 285.73 | 323.67 | |
395 | [anonymized] | 2023-04-27 13:50 | 3.0.0 | first nn test results | 655.65 | 731.00 | |
356 | s444501 | 2023-04-27 09:35 | 3.0.0 | s444501 neural bigram epochs=1 learning-rate=1.0e-3 neural-network bigram | 510.88 | 584.51 | |
232 | s444452 | 2023-04-27 07:14 | 3.0.0 | Neural bigrams bigram | 360.94 | 396.30 | |
256 | s444476 | 2023-04-26 19:01 | 3.0.0 | Neural Network Bigrams bigram | 379.65 | 422.76 | |
382 | s444354 | 2023-04-26 16:06 | 3.0.0 | My solution | 640.26 | 691.03 | |
261 | s478846 | 2023-04-26 16:04 | 3.0.0 | bigram neural solution neural-network bigram | 393.45 | 432.05 | |
241 | s444517 | 2023-04-26 14:23 | 3.0.0 | neural network with changed parameters, preprocessed neural-network bigram | 358.65 | 405.63 | |
339 | s444417 | 2023-04-26 13:09 | 3.0.0 | bigram neural neural-network bigram | 485.49 | 540.81 | |
457 | Jakub Eichner | 2023-04-26 06:19 | 3.0.0 | kenlm solution challenging-america | 930.37 | 1023.11 | |
364 | Adam Wojdyła | 2023-04-26 06:16 | 3.0.0 | kenlm trigram | 585.16 | 607.40 | |
368 | s409771 | 2023-04-25 20:38 | 3.0.0 | kenlm kenlm | 587.13 | 617.71 | |
458 | s444517 | 2023-04-25 17:26 | 3.0.0 | KenLM kenlm | 930.37 | 1023.41 | |
348 | s444386 | 2023-04-25 15:48 | 3.0.0 | neural bigram | 504.48 | 565.89 | |
658 | s444386 | 2023-04-25 15:28 | 3.0.0 | neural bigrams | N/A | N/A | |
289 | s444452 | 2023-04-24 21:03 | 3.0.0 | neural bigrams 1 | 413.49 | 468.11 | |
106 | s409771 | 2023-04-23 21:09 | 3.0.0 | trigram, left and right context trigram | 248.34 | 279.95 | |
206 | s409771 | 2023-04-23 19:41 | 3.0.0 | trigram solution trigram | 326.67 | 367.57 | |
262 | s444517 | 2023-04-23 12:11 | 3.0.0 | neural network some params adjusted neural-network | 383.65 | 433.05 | |
407 | s478855 | 2023-04-23 08:27 | 3.0.0 | s478855 tetragram | 747.31 | 770.87 | |
431 | s478855 | 2023-04-23 08:17 | 3.0.0 | s478855 | 747.31 | 871.40 | |
347 | Adam Wojdyła | 2023-04-23 08:00 | 3.0.0 | trigram left context interpolation smoothing | 521.69 | 565.13 | |
220 | Adam Wojdyła | 2023-04-23 07:47 | 3.0.0 | trigram l+r final | 335.02 | 383.93 | |
84 | s478839 | 2023-04-23 07:33 | 3.0.0 | s478839 | 235.74 | 265.93 | |
443 | Adam Wojdyła | 2023-04-23 07:32 | 3.0.0 | tetragram | 830.38 | 923.00 | |
314 | Martyna Druminska | 2023-04-23 07:28 | 3.0.0 | Upload files to 'test-A' | 448.72 | 499.11 | |
538 | Jakub Adamski | 2023-04-23 07:26 | 3.0.0 | Tetragram | 3974.10 | 4044.83 | |
549 | Jakub Adamski | 2023-04-23 07:12 | 3.0.0 | Trigram | 5250.87 | 5414.22 | |
332 | Martyna Druminska | 2023-04-23 05:27 | 3.0.0 | Upload files to 'dev-0' | 475.49 | 526.24 | |
657 | Adam Wojdyła | 2023-04-23 01:47 | 3.0.0 | trigram | 335.02 | N/A | |
341 | [anonymized] | 2023-04-22 23:56 | 3.0.0 | trigram | 486.66 | 551.61 | |
87 | s478873 | 2023-04-22 22:58 | 3.0.0 | s478873 tetragram | 237.91 | 268.80 | |
338 | Martyna Druminska | 2023-04-22 22:53 | 3.0.0 | Upload files to 'dev-0' | 482.58 | 537.87 | |
336 | Martyna Druminska | 2023-04-22 22:44 | 3.0.0 | Upload files to 'test-A' | 479.94 | 532.32 | |
656 | Martyna Druminska | 2023-04-22 22:33 | 3.0.0 | Upload files to 'test-A' | 731.99 | N/A | |
53 | [anonymized] | 2023-04-22 22:15 | 3.0.0 | s478831 | 216.45 | 239.65 | |
400 | Martyna Druminska | 2023-04-22 21:49 | 3.0.0 | dfdfv | 731.99 | 741.10 | |
386 | Adam Wojdyła | 2023-04-22 20:41 | 3.0.0 | tetragram left only | 612.09 | 692.91 | |
175 | Kamil Guttmann | 2023-04-22 20:09 | 3.0.0 | tetragram, left, tuned lambdas, top_150 tetragram | 294.77 | 336.28 | |
321 | Kamil Guttmann | 2023-04-22 19:33 | 3.0.0 | tetragram left context tetragram | 441.92 | 509.11 | |
61 | s444018 | 2023-04-22 19:27 | 3.0.0 | tetragram solution + '-\n' | 218.84 | 246.28 | |
320 | s444517 | 2023-04-22 17:44 | 3.0.0 | trigram left context trigram | 461.67 | 506.27 | |
283 | Adam Wojdyła | 2023-04-22 15:21 | 3.0.0 | trigram full context no smoothing | 373.03 | 456.30 | |
292 | Adam Wojdyła | 2023-04-22 14:24 | 3.0.0 | trigram left context final - fix | 405.40 | 471.89 | |
655 | Adam Wojdyła | 2023-04-22 14:13 | 3.0.0 | trigram left context final | 405.40 | N/A | |
519 | Adam Wojdyła | 2023-04-22 01:18 | 3.0.0 | trigram left only trigram | 1387.27 | 1746.24 | |
149 | s444356 | 2023-04-21 19:16 | 3.0.0 | tetragram, left and right context tetragram challenging-america | 281.24 | 312.55 | |
179 | s444356 | 2023-04-21 18:59 | 3.0.0 | tetragram, left and right context tetragram challenging-america | 296.65 | 340.24 | |
515 | Adam Wojdyła | 2023-04-20 20:34 | 3.0.0 | trigram | 1575.09 | 1630.19 | |
89 | s478846 | 2023-04-20 07:51 | 3.0.0 | trigram solution trigram | 241.65 | 269.49 | |
193 | s444417 | 2023-04-19 12:25 | 3.0.0 | tri-gram model n-grams trigram | 319.81 | 354.40 | |
44 | ked | 2023-04-15 00:55 | 3.0.0 | 449288 tetragram model | 192.56 | 217.10 | |
250 | s443930 | 2023-04-13 19:42 | 3.0.0 | vocab > 25k prob all left context only | 385.42 | 416.99 | |
272 | Mikołaj Pokrywka | 2023-04-13 19:35 | 3.0.0 | 4gram left context | 415.43 | 448.41 | |
331 | s443930 | 2023-04-13 07:42 | 3.0.0 | v>25k p<-25k left only | 478.92 | 526.24 | |
441 | Mikołaj Pokrywka | 2023-04-13 07:33 | 3.0.0 | tetragram | N/A | 917.41 | |
483 | s443930 | 2023-04-12 20:53 | 3.0.0 | v>10k p=50k left only | 1092.38 | 1141.87 | |
533 | s443930 | 2023-04-12 20:33 | 3.0.0 | 4gram left only | 2831.40 | 3030.81 | |
442 | Mikołaj Pokrywka | 2023-04-12 18:58 | 3.0.0 | 4gram | 884.31 | 920.36 | |
554 | s443930 | 2023-04-12 18:36 | 3.0.0 | 4gram left only | 6160.51 | 6499.18 | |
380 | 444498 | 2023-04-12 09:13 | 3.0.0 | bigram solution | 640.26 | 691.03 | |
381 | s444354 | 2023-04-11 12:16 | 3.0.0 | added test set evaluation | 640.26 | 691.03 | |
478 | Adam Wojdyła | 2023-04-10 16:53 | 3.0.0 | bigram final bigram | 1003.67 | 1072.06 | |
481 | Adam Wojdyła | 2023-04-10 16:47 | 3.0.0 | bigram fixed v5 | 1007.59 | 1100.94 | |
477 | Adam Wojdyła | 2023-04-10 16:41 | 3.0.0 | bigram fixed v4 | 1003.67 | 1072.06 | |
54 | s444501 | 2023-04-10 15:36 | 3.0.0 | s444501 tetragram | 210.92 | 239.79 | |
496 | s444465 | 2023-04-09 16:13 | 3.0.0 | Bigram model solution | 1186.80 | 1218.88 | |
413 | s478840 | 2023-04-08 09:44 | 3.0.0 | s478840 | 708.10 | 800.68 | |
448 | s478873 | 2023-04-05 16:13 | 3.0.0 | s478873 bigram | 5141.26 | 945.01 | |
654 | s478873 | 2023-04-05 16:07 | 3.0.0 | s478873 | 5141.26 | N/A | |
653 | Jakub | 2023-04-05 12:56 | 3.0.0 | Fix wrong test data challenging-america | 2071.99 | N/A | |
652 | Jakub | 2023-04-05 10:58 | 3.0.0 | New results challenging-america | N/A | N/A | |
577 | Martyna Druminska | 2023-04-05 10:56 | 3.0.0 | zxdcdxc | Infinity | Infinity | |
297 | [anonymized] | 2023-04-05 10:49 | 3.0.0 | dev0 bigram | 421.55 | 476.65 | |
394 | s444452 | 2023-04-05 10:37 | 3.0.0 | trigrams | 618.07 | 722.18 | |
305 | s444415 | 2023-04-05 10:35 | 3.0.0 | Bigram bigram | 435.03 | 483.09 | |
523 | s478839 | 2023-04-05 10:17 | 3.0.0 | s478839 | N/A | 1924.40 | |
333 | [anonymized] | 2023-04-05 09:31 | 3.0.0 | bigram smoothing data fix bigram | 469.28 | 526.85 | |
651 | s478873 | 2023-04-05 09:08 | 3.0.0 | s478873 | 5141.26 | N/A | |
650 | s478873 | 2023-04-05 08:57 | 3.0.0 | s478873 | 5141.26 | N/A | |
649 | s444354 | 2023-04-05 08:42 | 3.0.0 | transformer | 640.26 | N/A | |
527 | s444018 | 2023-04-05 08:28 | 3.0.0 | s444018 | 2014.97 | 2337.24 | |
484 | s443930 | 2023-04-05 08:25 | 3.0.0 | Add test-A/out | 1153.61 | 1144.88 | |
524 | Mikołaj Pokrywka | 2023-04-05 08:23 | 3.0.0 | second try | 2265.56 | 2117.10 | |
430 | s478855 | 2023-04-05 08:23 | 3.0.0 | s478855 bigram | N/A | 871.40 | |
576 | [anonymized] | 2023-04-05 08:22 | 3.0.0 | 478841 - double bigram bigram | Infinity | Infinity | |
518 | s478815 | 2023-04-05 08:21 | 3.0.0 | bigram | 1738.06 | 1712.76 | |
399 | Kamil Guttmann | 2023-04-05 08:20 | 3.0.0 | Bigrams v2 | 700.48 | 739.85 | |
517 | s478815 | 2023-04-04 20:57 | 3.0.0 | zad2p challenging-america | N/A | 1701.45 | |
525 | Jakub Adamski | 2023-04-04 20:44 | 3.0.0 | Bigram solution | 1937.68 | 2213.73 | |
542 | [anonymized] | 2023-04-04 20:41 | 3.0.0 | s444421 | 5060.94 | 5054.60 | |
325 | s444455 | 2023-04-04 20:41 | 3.0.0 | | 450.31 | 514.32 | |
575 | Martyna Druminska | 2023-04-04 20:32 | 3.0.0 | cxvz | Infinity | Infinity | |
362 | s478846 | 2023-04-04 20:31 | 3.0.0 | bigram solution | 5030.56 | 604.96 | |
300 | s444391 | 2023-04-04 20:12 | 3.0.0 | add output files | 417.80 | 477.62 | |
334 | s478846 | 2023-04-04 20:12 | 3.0.0 | bigram solution | 5030.56 | 529.43 | |
520 | Mikołaj Pokrywka | 2023-04-04 20:01 | 3.0.0 | first try bigram | 1777.89 | 1760.43 | |
350 | s478846 | 2023-04-04 19:58 | 3.0.0 | bigram solution | 5030.56 | 566.27 | |
574 | Martyna Druminska | 2023-04-04 19:50 | 3.0.0 | cxvz | Infinity | Infinity | |
422 | Kamil Guttmann | 2023-04-04 19:41 | 3.0.0 | Bigrams first try | 813.62 | 825.56 | |
278 | s444452 | 2023-04-04 19:13 | 3.0.0 | bigrams | 408.85 | 453.98 | |
648 | Martyna Druminska | 2023-04-04 18:34 | 3.0.0 | cxvz | N/A | N/A | |
522 | s444386 | 2023-04-04 18:23 | 3.0.0 | s444386 | 1715.41 | 1903.69 | |
387 | s444517 | 2023-04-04 18:00 | 3.0.0 | task 4_2 | 676.39 | 699.79 | |
526 | s444386 | 2023-04-04 16:26 | 3.0.0 | s444386 | 2054.05 | 2251.95 | |
385 | s444417 | 2023-04-04 16:01 | 3.0.0 | bigram bert bigram bert | N/A | 692.24 | |
303 | s444476 | 2023-04-04 14:50 | 3.0.0 | bigram solution bigram | 423.77 | 479.71 | |
452 | s444501 | 2023-04-03 16:48 | 3.0.0 | s444501 | 872.09 | 964.56 | |
296 | s444356 | 2023-03-31 19:19 | 3.0.0 | bigram challenging-america | 416.90 | 476.53 | |
302 | s444356 | 2023-03-31 19:03 | 3.0.0 | bigram challenging-america | 417.54 | 477.98 | |
301 | s444356 | 2023-03-31 18:19 | 3.0.0 | bigram challenging-america | 418.42 | 477.94 | |
647 | Martyna Druminska | 2023-03-29 17:37 | 3.0.0 | dfgedv | 3498.81 | N/A | |
646 | s478840 | 2023-03-29 11:56 | 3.0.0 | s478840 | 6492.60 | N/A | |
645 | s444455 | 2023-03-29 11:35 | 3.0.0 | . | 2791.21 | N/A | |
644 | s444455 | 2023-03-29 11:15 | 3.0.0 | z1v4 | 2829.22 | N/A | |
530 | s443930 | 2023-03-29 11:12 | 3.0.0 | Add .gitignore | 2810.79 | 2717.38 | |
529 | s444452 | 2023-03-29 11:09 | 3.0.0 | second solution | 2491.87 | 2544.09 | |
543 | s444465 | 2023-03-29 10:57 | 3.0.0 | 444465 first solution | 4880.78 | 5151.61 | |
548 | s444476 | 2023-03-29 10:49 | 3.0.0 | Solution | 5137.00 | 5392.48 | |
497 | 444498 | 2023-03-29 10:32 | 3.0.0 | solution word gap tylka | 1181.17 | 1221.55 | |
546 | [anonymized] | 2023-03-29 10:31 | 3.0.0 | s444421 | 5032.83 | 5353.27 | |
573 | s444452 | 2023-03-29 10:30 | 3.0.0 | trivial solution | Infinity | Infinity | |
643 | Wiktor Bombola | 2023-03-29 10:17 | 3.0.0 | simple prediction | 8659.81 | N/A | |
572 | [anonymized] | 2023-03-29 10:05 | 3.0.0 | test | Infinity | Infinity | |
642 | s444455 | 2023-03-29 10:04 | 3.0.0 | zad1vol3 | 2811.12 | N/A | |
641 | s478873 | 2023-03-29 10:01 | 3.0.0 | s478873 | 5141.26 | N/A | |
640 | s444455 | 2023-03-29 10:00 | 3.0.0 | zad1 vol2 | 2798.33 | N/A | |
639 | s444455 | 2023-03-29 09:57 | 3.0.0 | zad1 | 2826.23 | N/A | |
638 | s444356 | 2023-03-29 09:55 | 3.0.0 | done challenging-america | 5219.75 | N/A | |
555 | s478839 | 2023-03-29 09:54 | 3.0.0 | s478839 | 6851.67 | 7343.77 | |
637 | Martyna Druminska | 2023-03-29 09:53 | 3.0.0 | fgfdb | N/A | N/A | |
636 | s444356 | 2023-03-29 09:51 | 3.0.0 | done challenging-america | N/A | N/A | |
635 | s444356 | 2023-03-29 09:48 | 3.0.0 | done challenging-america | N/A | N/A | |
531 | s444018 | 2023-03-29 09:44 | 3.0.0 | s444018 | 2833.96 | 2960.56 | |
634 | s478815 | 2023-03-29 09:41 | 3.0.0 | s478815 challenging-america | 2421.96 | N/A | |
633 | s444354 | 2023-03-29 09:40 | 3.0.0 | eeeee | 5601.35 | N/A | |
632 | s478815 | 2023-03-29 09:39 | 3.0.0 | s478815 | N/A | N/A | |
551 | s444417 | 2023-03-29 09:38 | 3.0.0 | if United | 5377.48 | 5606.89 | |
547 | s444386 | 2023-03-29 09:37 | 3.0.0 | zad | 5137.00 | 5392.48 | |
528 | s409771 | 2023-03-29 09:36 | 3.0.0 | first submission | 2378.19 | 2355.35 | |
631 | Jakub | 2023-03-29 09:35 | 3.0.0 | Updated rand distribs | 4275.28 | N/A | |
540 | [anonymized] | 2023-03-29 09:33 | 3.0.0 | simple solution | 4199.53 | 4245.04 | |
505 | s444517 | 2023-03-29 09:33 | 3.0.0 | task 4_1 | 1553.08 | 1589.54 | |
479 | s444383 | 2023-03-29 09:31 | 3.0.0 | init3 | N/A | 1083.26 | |
630 | Jakub | 2023-03-29 09:31 | 3.0.0 | Rand distribs | 4343.27 | N/A | |
559 | s478839 | 2023-03-29 09:31 | 3.0.0 | s478839 | 8580.14 | 8509.62 | |
560 | s444383 | 2023-03-29 09:29 | 3.0.0 | init2 | N/A | 9100.12 | |
629 | Jakub Eichner | 2023-03-29 09:29 | 3.0.0 | Great estimation & predictive model | 2779.92 | N/A | |
534 | ked | 2023-03-29 09:26 | 3.0.0 | zad1 | 2924.54 | 3067.07 | |
628 | Jakub | 2023-03-29 09:25 | 3.0.0 | Overcome infinity | 4112.00 | N/A | |
627 | Jakub | 2023-03-29 09:22 | 3.0.0 | My brilliant solution | Infinity | N/A | |
429 | s444383 | 2023-03-29 09:22 | 3.0.0 | init | N/A | 870.81 | |
545 | [anonymized] | 2023-03-29 09:22 | 3.0.0 | random mod results | 4893.98 | 5250.39 | |
539 | Jakub Adamski | 2023-03-29 09:20 | 3.0.0 | Simple solution | 4199.53 | 4245.04 | |
558 | s444501 | 2023-03-29 09:18 | 3.0.0 | s444501 | 7976.73 | 7946.32 | |
557 | s444501 | 2023-03-29 09:14 | 3.0.0 | s444501 | 7976.73 | 7946.32 | |
556 | s444501 | 2023-03-29 09:06 | 3.0.0 | s444501 | N/A | 7946.32 | |
544 | Kamil Guttmann | 2023-03-29 09:04 | 3.0.0 | Simple rules | 4889.69 | 5244.34 | |
550 | s444501 | 2023-03-29 09:00 | 3.0.0 | s444501 | N/A | 5574.23 | |
532 | Mikołaj Pokrywka | 2023-03-29 08:58 | 3.0.0 | Test s444463 | 2853.34 | 2981.35 | |
427 | [anonymized] | 2022-06-29 11:29 | 3.0.0 | s434695 kenlm kenlm | 833.52 | 852.54 | |
353 | [anonymized] | 2022-06-28 11:41 | 3.0.0 | s434695 neural-network bigram | 506.72 | 570.80 | |
266 | [anonymized] | 2022-06-26 18:43 | 3.0.0 | 434695 smoothing plusalpha plusaplha | 391.43 | 436.81 | |
299 | [anonymized] | 2022-06-26 17:59 | 3.0.0 | 434695 n-gram n-grams | 419.16 | 477.08 | |
626 | [name not given] | 2022-06-20 08:30 | 3.0.0 | 434742 out-best gpt2 | N/A | N/A | |
625 | [name not given] | 2022-06-20 08:30 | 3.0.0 | 434742 out-old2 gpt2 | N/A | N/A | |
624 | [name not given] | 2022-06-20 08:30 | 3.0.0 | 434742 out-old3 gpt2 | N/A | N/A | |
623 | [name not given] | 2022-06-20 08:30 | 3.0.0 | 434742 out-old gpt2 | 435.21 | N/A | |
622 | [name not given] | 2022-06-20 08:24 | 3.0.0 | 434742 out-best gpt2 | N/A | N/A | |
621 | [name not given] | 2022-06-20 08:24 | 3.0.0 | 434742 out-old2 gpt2 | N/A | N/A | |
620 | [name not given] | 2022-06-20 08:24 | 3.0.0 | 434742 out-old gpt2 | 435.21 | N/A | |
619 | [name not given] | 2022-06-19 23:29 | 3.0.0 | 434742 out-old2 gpt2 | N/A | N/A | |
618 | [name not given] | 2022-06-19 23:29 | 3.0.0 | 434742 out-old gpt2 | 435.21 | N/A | |
420 | [name not given] | 2022-06-19 23:29 | 3.0.0 | 434742 gpt2 | N/A | 816.39 | |
617 | [name not given] | 2022-06-19 23:26 | 3.0.0 | 434742 out-old2 gpt2 | N/A | N/A | |
616 | [name not given] | 2022-06-19 23:26 | 3.0.0 | 434742 out-old gpt2 | 435.21 | N/A | |
419 | [name not given] | 2022-06-19 23:26 | 3.0.0 | 434742 gpt2 | N/A | 816.39 | |
615 | [name not given] | 2022-06-19 22:47 | 3.0.0 | 434742 out-old gpt2 fine-tuned | 435.21 | N/A | |
418 | [name not given] | 2022-06-19 22:47 | 3.0.0 | 434742 gpt2 fine-tuned | N/A | 816.39 | |
340 | [name not given] | 2022-06-19 20:19 | 3.0.0 | 434742 gpt2 | 435.21 | 541.78 | |
614 | [name not given] | 2022-06-19 20:15 | 3.0.0 | 434742 gpt2 | 435.21 | N/A | |
453 | Anna Nowak | 2022-06-19 11:34 | 3.0.0 | gpt test gpt2 | 846.38 | 967.01 | |
613 | Anna Nowak | 2022-06-10 21:21 | 3.0.0 | gpt 434760 | N/A | N/A | |
612 | Anna Nowak | 2022-06-10 21:18 | 3.0.0 | s434760 gpt gpt2 | N/A | N/A | |
611 | Anna Nowak | 2022-06-10 21:01 | 3.0.0 | 434650 GPT-2 final test gpt2 | N/A | N/A | |
610 | Anna Nowak | 2022-06-10 20:57 | 3.0.0 | 434650 GPT-2 gpt2 | N/A | N/A | |
571 | Anna Nowak | 2022-06-09 19:02 | 3.0.0 | 434760 gpt2 (last try with line breaks) gpt2 | 844.62 | Infinity | |
570 | Anna Nowak | 2022-06-09 18:47 | 3.0.0 | GPT-2 LF gpt2 | 844.62 | Infinity | |
569 | Anna Nowak | 2022-06-09 18:40 | 3.0.0 | 434760 GPT-2 (LF) gpt2 | 844.62 | Infinity | |
568 | Anna Nowak | 2022-06-09 17:35 | 3.0.0 | 434760 gpt-2 | 844.62 | Infinity | |
293 | [anonymized] | 2022-06-06 09:29 | 3.0.0 | 434780 lstm ensemble | 439.93 | 473.36 | |
498 | Anna Nowak | 2022-05-30 19:10 | 3.0.0 | 434760 lstm lstm | 1214.68 | 1289.99 | |
359 | zrostek | 2022-05-29 22:19 | 3.0.0 | 470619 lstm ensemble | 557.35 | 597.73 | |
609 | zrostek | 2022-05-29 21:59 | 3.0.0 | 470619 lstm ensemble | 557.35 | N/A | |
181 | Łukasz Jędyk | 2022-05-29 10:25 | 3.0.0 | 434708 lstm ensemble | 313.22 | 343.21 | |
257 | Łukasz Jędyk | 2022-05-28 13:38 | 3.0.0 | 434708 gru | 380.07 | 423.74 | |
280 | Piotr | 2022-05-09 19:39 | 3.0.0 | 440058 neural network neural-network bigram | 411.07 | 454.57 | |
426 | Piotr | 2022-05-09 19:17 | 3.0.0 | 440058 kenlm kenlm | 833.52 | 852.54 | |
313 | [name not given] | 2022-05-08 22:21 | 3.0.0 | 434742 neural-network bigram | 435.21 | 493.60 | |
317 | MaciejSobkowiak | 2022-05-08 21:53 | 3.0.0 | Bigram model neural-network bigram | 445.15 | 502.94 | |
445 | [anonymized] | 2022-05-08 21:50 | 3.0.0 | s430705 neural-network bigram | 882.94 | 923.80 | |
354 | s434788 | 2022-05-08 21:46 | 3.0.0 | 434788 nn bigram neural-network bigram | 507.89 | 571.79 | |
279 | [anonymized] | 2022-05-08 17:33 | 3.0.0 | 434732 neural-network bigram | 411.07 | 454.57 | |
444 | [name not given] | 2022-05-08 15:25 | 3.0.0 | s470611 neural-network bigram | 882.90 | 923.80 | |
271 | [anonymized] | 2022-05-07 14:40 | 3.0.0 | 426206 neural-network bigram | 399.61 | 447.01 | |
287 | [anonymized] | 2022-05-07 12:56 | 3.0.0 | s426206 neural-network bigram | 406.60 | 463.01 | |
274 | Przemek | 2022-05-07 12:20 | 3.0.0 | 434766 neural neural-network bigram | 402.11 | 452.53 | |
406 | Łukasz Jędyk | 2022-05-02 11:41 | 3.0.0 | 434708 neural-network trigram | 733.77 | 761.47 | |
440 | Wojciech Jarmosz | 2022-05-02 00:53 | 3.0.0 | neural bigrams v2 neural-network bigram | 803.71 | 912.78 | |
608 | Wojciech Jarmosz | 2022-05-02 00:39 | 3.0.0 | neural bigrams v2 neural-network bigram | 803.71 | N/A | |
450 | Piotr Kopycki | 2022-05-01 22:24 | 3.0.0 | 470629 neural-network trigram | 960.15 | 961.09 | |
471 | Wojciech Jarmosz | 2022-05-01 22:02 | 3.0.0 | neural bigrams neural-network bigram | 912.83 | 1024.80 | |
277 | Anna Nowak | 2022-05-01 09:22 | 3.0.0 | 434760, bigram neural-network final neural-network bigram | 398.85 | 453.65 | |
564 | Anna Nowak | 2022-04-30 10:07 | 3.0.0 | v3 neural-network bigram | 12469.19 | 16664.84 | |
535 | Anna Nowak | 2022-04-29 11:03 | 3.0.0 | 434760 - nn bigrams neural-network bigram | 2632.19 | 3239.03 | |
537 | Anna Nowak | 2022-04-29 07:37 | 3.0.0 | 434760 neural-network bigram | 2772.19 | 3382.41 | |
492 | [anonymized] | 2022-04-25 21:19 | 3.0.0 | 426206 kenlm | 995.83 | 997.07 | |
490 | [anonymized] | 2022-04-25 15:11 | 3.0.0 | 434732 kenlm | 801.28 | 825.75 | |
489 | [name not given] | 2022-04-25 13:50 | 3.0.0 | s470611 kenlm | 795.46 | 819.20 | |
491 | [anonymized] | 2022-04-24 23:18 | 3.0.0 | 440054 kenlm | 884.90 | 906.41 | |
493 | [anonymized] | 2022-04-24 21:58 | 3.0.0 | s434804 v3 kenlm | 1001.84 | 1003.73 | |
607 | [anonymized] | 2022-04-24 21:10 | 3.0.0 | s434804 v2 | N/A | N/A | |
494 | Anna Nowak | 2022-04-24 18:33 | 3.0.0 | 434760 v2 kenlm | 1015.60 | 1017.27 | |
488 | Anna Nowak | 2022-04-24 15:52 | 3.0.0 | 434760 | NaN | NaN | |
417 | MaciejSobkowiak | 2022-04-23 23:47 | 3.0.0 | s434784 kenlm | 793.79 | 816.16 | |
416 | [name not given] | 2022-04-23 22:31 | 3.0.0 | 434742 kenlm | 793.79 | 816.16 | |
330 | [name not given] | 2022-04-23 22:17 | 3.0.0 | 434742 kenlm | 485.93 | 524.59 | |
437 | Wojciech Jarmosz | 2022-04-23 21:29 | 3.0.0 | s434704 kenlm | 873.87 | 896.43 | |
371 | Jakub Pietrzak | 2022-04-23 21:15 | 3.0.0 | 470628 kenlm | 611.46 | 646.24 | |
468 | [anonymized] | 2022-04-23 19:38 | 3.0.0 | 434780 kenlm | 910.33 | 926.88 | |
415 | Jakub Pogodziński | 2022-04-23 17:20 | 3.0.0 | s437622 kenlm kenlm | 792.81 | 816.08 | |
238 | [anonymized] | 2022-04-23 15:45 | 3.0.0 | 434780 goodturing | 369.12 | 402.22 | |
425 | Przemek | 2022-04-23 15:32 | 3.0.0 | kenlm 434766 kenlm | 833.52 | 852.54 | |
424 | Łukasz Jędyk | 2022-04-22 16:25 | 3.0.0 | 434708 kenlm | 812.12 | 833.33 | |
469 | Łukasz Jędyk | 2022-04-21 20:30 | 3.0.0 | 434708 kenlm | 954.74 | 963.04 | |
329 | [name not given] | 2022-04-12 19:02 | 3.0.0 | s434742 n-grams plusaplha | 485.93 | 524.59 | |
464 | MaciejSobkowiak | 2022-04-12 18:18 | 3.0.0 | s434784 smoothing plusaplha | 400.10 | 441.86 | |
487 | Anna Nowak | 2022-04-12 09:55 | 3.0.0 | 434760 v3 | 1156.64 | 1178.05 | |
475 | Anna Nowak | 2022-04-12 08:43 | 3.0.0 | 434760 v2 interpolation | 1066.28 | 1065.77 | |
476 | Anna Nowak | 2022-04-12 08:33 | 3.0.0 | 434760 | 1067.53 | 1068.31 | |
466 | [anonymized] | 2022-04-11 18:28 | 3.0.0 | test n-grams | 27.76 | 519.82 | |
467 | [anonymized] | 2022-04-11 09:17 | 3.0.0 | 426206 n-grams goodturing | 588.51 | 642.02 | |
463 | Jakub Pogodziński | 2022-04-11 09:17 | 3.0.0 | 437622 alpha n-grams goodturing | 381.39 | 427.45 | |
465 | s434788 | 2022-04-11 08:15 | 3.0.0 | 434788 plusalpha v_2 plusaplha | 432.82 | 473.60 | |
462 | [name not given] | 2022-04-10 23:14 | 3.0.0 | 434742 n-grams | 341.57 | 379.56 | |
461 | MaciejSobkowiak | 2022-04-10 22:59 | 3.0.0 | s434784 n-grams | 341.53 | 379.52 | |
460 | s434788 | 2022-04-10 22:44 | 3.0.0 | 434788 plusalpha plusaplha | NaN | NaN | |
428 | zrostek | 2022-04-10 21:13 | 3.0.0 | 470619 backoff | 794.13 | 859.56 | |
392 | Piotr Kopycki | 2022-04-10 21:01 | 3.0.0 | 470629 plusaplha | 644.09 | 714.72 | |
389 | Piotr | 2022-04-10 20:55 | 3.0.0 | 440058 plusaplha | 694.55 | 710.19 | |
288 | Piotr | 2022-04-10 19:51 | 3.0.0 | 440058 n-grams | 426.60 | 467.30 | |
337 | [anonymized] | 2022-04-10 19:34 | 3.0.0 | s434804 n-grams plusaplha | 488.34 | 533.82 | |
370 | [anonymized] | 2022-04-10 19:19 | 3.0.0 | s430705 plusalpha n-grams goodturing | 557.09 | 628.51 | |
438 | [anonymized] | 2022-04-10 19:13 | 3.0.0 | 434732 plusaplha | 775.76 | 905.71 | |
182 | Przemek | 2022-04-10 18:49 | 3.0.0 | 434766 plusalpha plusaplha | 309.95 | 343.70 | |
265 | [name not given] | 2022-04-10 17:19 | 3.0.0 | s470611 n-grams backoff | 391.43 | 436.81 | |
423 | Wojciech Jarmosz | 2022-04-10 12:57 | 3.0.0 | s434704 n-grams plusaplha goodturing | 759.20 | 828.47 | |
248 | Jakub Pietrzak | 2022-04-09 20:59 | 3.0.0 | 470628 n-grams plusaplha | 364.70 | 412.64 | |
218 | Łukasz Jędyk | 2022-04-09 17:02 | 3.0.0 | 434708 plusalpha n-grams plusaplha | 345.11 | 383.72 | |
255 | Jakub Pietrzak | 2022-04-09 11:38 | 3.0.0 | 470628 n-grams | 373.67 | 422.47 | |
351 | [anonymized] | 2022-04-05 17:11 | 3.0.0 | 440054 n-grams | 510.57 | 568.10 | |
388 | [anonymized] | 2022-04-04 16:53 | 3.0.0 | 426206 n-grams | 579.65 | 702.13 | |
162 | [anonymized] | 2022-04-04 16:44 | 3.0.0 | 434749: n-gram model based on 3-grams (fill in the middle) + backoff to 2-gram + backoff to reversed 2-gram + alpha smoothing n-grams backoff | 287.28 | 322.06 | |
215 | [anonymized] | 2022-04-03 21:45 | 3.0.0 | 434780 n-grams | 341.53 | 379.52 | |
451 | Wojciech Jarmosz | 2022-04-03 19:44 | 3.0.0 | s434704 n-grams | 864.09 | 962.91 | |
290 | Przemek | 2022-04-03 18:44 | 3.0.0 | 434766 n-grams | 425.15 | 468.41 | |
439 | [anonymized] | 2022-04-03 18:16 | 3.0.0 | 434732 n-grams | 776.79 | 907.06 | |
264 | Jakub Pogodziński | 2022-04-03 18:14 | 3.0.0 | 437622 n-grams | 387.11 | 434.55 | |
298 | [name not given] | 2022-04-03 18:11 | 3.0.0 | s470611 n-grams | 419.16 | 477.08 | |
433 | [anonymized] | 2022-04-03 18:00 | 3.0.0 | s430705 n-grams | 770.18 | 887.49 | |
263 | [anonymized] | 2022-04-03 17:48 | 3.0.0 | init n-grams | 387.11 | 434.55 | |
434 | [anonymized] | 2022-04-03 17:22 | 3.0.0 | s434804 n-grams | 776.13 | 887.65 | |
379 | Łukasz Jędyk | 2022-04-03 12:06 | 3.0.0 | 434708 n-grams | 583.60 | 688.65 | |
390 | zrostek | 2022-04-01 13:45 | 3.0.0 | 470619 n-grams | 649.87 | 710.90 | |
3 | kubapok | 2021-12-15 21:44 | 3.0.0 | challam-roberta-without-date checkpoint 1325000 | 49.82 | 56.64 | |
2 | kubapok | 2021-12-15 21:29 | 3.0.0 | challam-roberta-with-date day_of_year checkpoint 1325000 | 48.72 | 53.76 | |
1 | kubapok | 2021-12-11 17:25 | 3.0.0 | roberta_large_no_ft | 45.33 | 52.58 | |
5 | kubapok | 2021-12-11 16:17 | 3.0.0 | challam-roberta-with-date day_of_year | 64.84 | 73.63 | |
6 | kubapok | 2021-12-11 15:57 | 3.0.0 | challam-roberta-with-date weekday | 64.90 | 73.91 | |
7 | kubapok | 2021-12-11 14:02 | 3.0.0 | challam-roberta-without-date | 67.36 | 77.31 | |
4 | kubapok | 2021-12-11 13:59 | 3.0.0 | roberta base noft | 62.75 | 72.10 | |
459 | kubapok | 2021-12-10 15:51 | 3.0.0 | same prob | 1024.00 | 1024.00 | |
606 | kubapok | 2021-07-11 18:35 | 2.0.1 | transformer word (kaczla) word-level transformer | 259.27 | 250.45 | |
605 | kubapok | 2021-07-11 18:32 | 2.0.1 | bilstm word (kaczla) word-level bilstm | 184.39 | 177.01 | |
604 | kubapok | 2021-07-11 18:31 | 2.0.1 | transformer bpe (kaczla) transformer bpe | 194.93 | 185.40 | |
603 | kubapok | 2021-07-11 18:28 | 2.0.1 | bilstm bpe (kaczla) bilstm bpe | 185.41 | 179.20 | |
602 | kubapok | 2021-07-11 18:08 | 2.0.1 | english roberta base no finetune bilstm bpe | 67.27 | 65.19 | |
601 | kubapok | 2021-07-11 17:35 | 2.0.1 | english roberta large finetune roberta | 46.93 | 45.74 | |
600 | kubapok | 2021-07-11 17:34 | 2.0.1 | english roberta base finetune roberta | 53.86 | 53.26 | |
599 | kubapok | 2021-07-11 17:33 | 2.0.1 | english roberta large no finetune roberta | 51.44 | 49.85 | |
598 | kubapok | 2021-07-11 17:30 | 2.0.1 | same probability roberta | 1024.00 | 1024.00 | |
597 | kaczla | 2021-06-07 15:59 | 1.0.1 | Add simple BPE Transformer lvl=bpe model=transformer transformer bpe | N/A | N/A | |
596 | kaczla | 2021-06-07 15:59 | 1.0.1 | Add simple BPE BiLSTM lvl=bpe model=bilstm bilstm bpe | N/A | N/A | |
595 | kaczla | 2021-06-07 15:58 | 1.0.1 | Add simple word Transformer lvl=word model=transformer word-level transformer | N/A | N/A | |
594 | kaczla | 2021-06-07 15:57 | 1.0.1 | Add simple word BiLSTM lvl=word model=bilstm word-level bilstm | N/A | N/A | |
593 | kaczla | 2021-06-07 15:35 | 1.0.0 | Add BiLSTM/Transformer (word and BPE) models lvl=word model=transformer word-level bilstm transformer bpe | N/A | N/A | |
592 | kaczla | 2021-06-07 15:35 | 1.0.0 | Add BiLSTM/Transformer (word and BPE) models lvl=bpe model=transformer word-level bilstm transformer bpe | N/A | N/A | |
591 | kaczla | 2021-06-07 15:35 | 1.0.0 | Add BiLSTM/Transformer (word and BPE) models lvl=word model=bilstm word-level bilstm transformer bpe | N/A | N/A | |
590 | kaczla | 2021-06-07 15:35 | 1.0.0 | Add BiLSTM/Transformer (word and BPE) models lvl=bpe model=bilstm word-level bilstm transformer bpe | N/A | N/A | |
589 | kaczla | 2021-06-06 18:51 | 1.0.0 | Simple models lvl=word model=transformer bilstm transformer | N/A | N/A | |
588 | kaczla | 2021-06-06 18:51 | 1.0.0 | Simple models lvl=bpe model=transformer bilstm transformer | N/A | N/A | |
587 | kaczla | 2021-06-06 18:51 | 1.0.0 | Simple models lvl=word model=bilstm bilstm transformer | N/A | N/A | |
586 | kaczla | 2021-06-06 18:51 | 1.0.0 | Simple models lvl=bpe model=bilstm bilstm transformer | N/A | N/A |