Challenging America word-gap prediction
Guess a word in a gap. [ver. 3.0.0]
Git repo URL: `git://gonito.net/challenging-america-word-gap-prediction` / Branch: `master`
Run `git clone --single-branch git://gonito.net/challenging-america-word-gap-prediction -b master` to get the challenge data (a loading sketch follows below).
Browse at https://gonito.net/gitlist/challenging-america-word-gap-prediction.git/master
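
Assuming the repository follows Gonito's usual challenge layout (train/, dev-0/ and test-A/ directories, each holding an xz-compressed in.tsv.xz input file and, where gold labels are published, an expected.tsv), a minimal Python loader might look like the sketch below. Treating the last two tab-separated input fields as the left and right context of the gap is an assumption; verify it against the checked-out repo.

```python
import lzma

def load_split(split_dir):
    """Load one split of the challenge.

    Assumes Gonito's standard layout: in.tsv.xz with tab-separated fields
    whose last two columns hold the left and right context of the gap
    (an assumption; check the repo), and expected.tsv with one gap word
    per line where the split provides gold labels.
    """
    contexts = []
    with lzma.open(f"{split_dir}/in.tsv.xz", "rt", encoding="utf-8") as f:
        for line in f:
            fields = line.rstrip("\n").split("\t")
            contexts.append((fields[-2], fields[-1]))  # (left, right)
    try:
        with open(f"{split_dir}/expected.tsv", encoding="utf-8") as f:
            gaps = [line.rstrip("\n") for line in f]
    except FileNotFoundError:  # the test split ships without gold labels
        gaps = None
    return contexts, gaps

# Example use after cloning the challenge repo:
train_contexts, train_gaps = load_split(
    "challenging-america-word-gap-prediction/train")
```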
Leaderboard
# | submitter | when | ver. | description | test-A PerplexityHashed | ×
---|---|---|---|---|---|---
1 | kubapok | 2021-12-11 17:25 | 3.0.0 | roberta_large_no_ft | 52.58 | 26
2 | Jakub | 2023-06-27 20:07 | 3.0.0 | Updated input truncation | 85.17 | 31
3 | s478846 | 2023-05-30 15:49 | 3.0.0 | neural solution neural-network n-grams bow | 105.79 | 42
4 | s444501 | 2023-05-30 11:50 | 3.0.0 | s444501 neural with bagging bagging_left_ctx=25 bagging_right_ctx=25 batch_size=4000 embed_size=300 epochs=1 hidden_size=150 learning-rate=1.0e-4 ngram_left_ctx=7 [...] neural-network | 126.69 | 17
5 | [anonymized] | 2023-06-11 17:15 | 3.0.0 | zad9 | 132.60 | 10
6 | s444391 | 2023-06-14 10:05 | 3.0.0 | Embeddings and bow solution embed_size=300 epochs=1 hidden_size=300 learning-rate=1.0e-4 vocab_size=5000 neural-network | 132.60 | 13
7 | s444356 | 2023-05-30 14:53 | 3.0.0 | neural network, bag of words batch-size=10000 context-size=25 dropout=0.3 embed-size=200 epochs=1 hidden-size=1000 hidden-size_2=500 learning-rate=1.0e-3 [...] neural-network bow | 134.70 | 73
8 | s444354 | 2023-06-13 21:05 | 3.0.0 | trigram model embed_size=300 epochs=1 hidden_size=750 learning-rate=1.0e-4 vocab_size=5000 | 138.03 | 24
9 | s444018 | 2023-05-30 22:47 | 3.0.0 | s444018 batch-size=10000 dropout=0.3 embed-size=300 hidden-size=1000 second-hidden-size=500 vocab-size=12500 neural-network bow | 139.63 | 27
10 | s409771 | 2023-06-07 19:14 | 3.0.0 | gpt2-large gpt2-large | 142.89 | 8
11 | s478815 | 2023-06-02 19:15 | 3.0.0 | neural network | 149.96 | 20
12 | ked | 2023-05-25 23:12 | 3.0.0 | zad9_v3 neural-network | 177.87 | 15
13 | s478873 | 2023-05-31 09:26 | 3.0.0 | s478873 batch-size=5000 embed-size=100 epochs=1 hidden_size=100 top=100 vocab-size=10000 neural-network bow | 180.53 | 27
14 | s444517 | 2023-06-07 14:41 | 3.0.0 | gpt-2, only left-context gpt2 transformer-decoder | 183.70 | 12
15 | [anonymized] | 2023-06-26 21:55 | 3.0.0 | gpt-2 left context out files left-to-right gpt2 transformer-decoder | 185.64 | 22
16 | Jakub Eichner | 2023-05-31 08:17 | 3.0.0 | Bag of words NN | 188.60 | 19
17 | [anonymized] | 2023-06-08 18:03 | 3.0.0 | s444421 top=10 gpt2 | 215.47 | 9
18 | Mikołaj Pokrywka | 2023-06-08 11:07 | 3.0.0 | all done | 239.28 | 15
19 | [anonymized] | 2023-04-22 22:15 | 3.0.0 | s478831 | 239.65 | 15
20 | 444498 | 2023-05-26 17:13 | 3.0.0 | trigram, tetragram | 239.79 | 7
21 | Cezary | 2023-05-10 07:51 | 3.0.0 | This is my trigram nn solution batch_size=15000 embed_size=200 epochs=1 hidden_size=190 learning-rate=1.0e-3 top=10 vocab_size=20000 neural-network trigram | 243.43 | 5
22 | s444386 | 2023-05-30 20:39 | 3.0.0 | nn with bag of words neural-network bow | 243.69 | 27
23 | s444452 | 2023-06-08 16:14 | 3.0.0 | lstm lstm | 257.14 | 10
24 | s478840 | 2023-05-10 20:25 | 3.0.0 | s478840 batch-size=1500 embed-size=250 hidden-size=100 top=600 vocab-size=20000 neural-network trigram | 263.30 | 21
25 | Kamil Guttmann | 2023-05-09 17:59 | 3.0.0 | nn, trigram, predict from next two tokens batch-size=5000 embed-size=100 embed_size=256 epochs=1 hidden_size=2048 topk=150 vocab-size=20000 neural-network trigram | 275.80 | 12
26 | s478855 | 2023-06-07 22:39 | 3.0.0 | s478855 decoder | 283.52 | 12
27 | s444415 | 2023-05-30 19:13 | 3.0.0 | Nn ngram model top=600 neural-network n-grams bow | 300.73 | 24
28 | Adam Wojdyła | 2023-06-28 15:51 | 3.0.0 | s444507 top=30 neural-network | 303.80 | 43
29 | Łukasz Jędyk | 2022-05-29 10:25 | 3.0.0 | 434708 lstm ensemble | 343.21 | 7
30 | Przemek | 2022-04-10 18:49 | 3.0.0 | 434766 plusalpha | 343.70 | 4
31 | s444476 | 2023-05-08 22:26 | 3.0.0 | trigram nn done HIDDENLAYER=1000 | 355.85 | 7
32 | [name not given] | 2022-04-10 23:14 | 3.0.0 | 434742 n-grams | 379.56 | 13
33 | Jakub Pietrzak | 2022-04-09 20:59 | 3.0.0 | 470628 n-grams plusalpha | 412.64 | 8
34 | s444383 | 2023-05-15 10:16 | 3.0.0 | bigram | 420.77 | 9
35 | Jakub Pogodziński | 2022-04-11 09:17 | 3.0.0 | 437622 alpha n-grams goodturing | 427.45 | 3
36 | [name not given] | 2022-04-10 17:19 | 3.0.0 | s470611 n-grams backoff | 436.81 | 11
37 | [anonymized] | 2022-06-26 18:43 | 3.0.0 | 434695 smoothing plusalpha | 436.81 | 6
38 | [anonymized] | 2022-05-07 14:40 | 3.0.0 | 426206 neural-network bigram | 447.01 | 5
39 | [anonymized] | 2022-05-08 17:33 | 3.0.0 | 434732 neural-network bigram | 454.57 | 5
40 | [anonymized] | 2022-04-10 19:34 | 3.0.0 | s434804 n-grams plusalpha | 533.82 | 5
41 | zrostek | 2022-05-29 22:19 | 3.0.0 | 470619 lstm ensemble | 597.73 | 6
42 | [anonymized] | 2022-04-10 19:19 | 3.0.0 | s430705 plusalpha n-grams goodturing | 628.51 | 7
43 | Piotr Kopycki | 2022-04-10 21:01 | 3.0.0 | 470629 plusalpha | 714.72 | 2
44 | Wojciech Jarmosz | 2022-04-10 12:57 | 3.0.0 | s434704 n-grams plusalpha goodturing | 828.47 | 8
45 | s434788 | 2022-04-10 22:44 | 3.0.0 | 434788 plusalpha | NaN | 3
46 | s444455 | 2023-06-12 23:24 | 3.0.0 | embeddings, bow | 130.68 | 14
47 | Jakub Adamski | 2023-06-14 15:53 | 3.0.0 | nn trigram neural-network trigram | 132.60 | 23
48 | s444417 | 2023-05-30 12:04 | 3.0.0 | nn, vocab_size: 1500, embed_size: 200, batch_size: 5000, epoch:1, hl, l_ctx, r_ctx = 10 batch_size=5000 embed_size=200 vocab_size=15000 neural-network | 143.54 | 11
49 | Marcin Kostrzewski | 2023-09-25 10:16 | 3.0.0 | GRU, both left and right context used, embedding_size=128, hidden_size=256, gru_layers=4, epochs=10 | 175.95 | 18
50 | s478839 | 2023-06-24 22:05 | 3.0.0 | gpt2-fine-tuned top=30 train_dataset=50000 gpt2 fine-tuned | 226.54 | 28
51 | s443930 | 2023-06-10 20:59 | 3.0.0 | zadanie12-1 | 231.20 | 26
52 | Martyna Druminska | 2023-05-30 22:37 | 3.0.0 | trigram model batch_size=800 embed_size=300 epochs=1 hidden_size=128 learning-rate=1.0e-4 vocab_size=38000 neural-network trigram | 232.67 | 24
53 | [anonymized] | 2022-04-04 16:44 | 3.0.0 | 434749: n-gram model based on 3-grams (fill in the middle) + backoff to 2-grams + backoff to reversed 2-grams + alpha smoothing n-grams backoff | 322.06 | 2
54 | MaciejSobkowiak | 2022-04-10 22:59 | 3.0.0 | s434784 n-grams | 379.52 | 4
55 | [anonymized] | 2022-04-03 21:45 | 3.0.0 | 434780 n-grams | 379.52 | 8
56 | Anna Nowak | 2022-05-01 09:22 | 3.0.0 | 434760, bigram neural-network final neural-network bigram | 453.65 | 21
57 | Piotr | 2022-05-09 19:39 | 3.0.0 | 440058 neural network neural-network bigram | 454.57 | 5
58 | [anonymized] | 2022-04-11 18:28 | 3.0.0 | test n-grams | 519.82 | 2
59 | [anonymized] | 2022-04-05 17:11 | 3.0.0 | 440054 n-grams | 568.10 | 5
60 | s444465 | 2023-06-27 22:17 | 3.0.0 | s444465 bigram neural model fixed v2 batch_size=2500 embed_size=200 epochs=5 learning-rate=1.0e-3 vocab_size=40000 neural-network bigram | 934.09 | 13
61 | Wiktor Bombola | 2023-06-02 12:28 | 3.0.0 | changed the output standardization method, wildcard | 1024.24 | 20
62 | [anonymized] | 2023-03-29 10:05 | 3.0.0 | test | Infinity | 1 |
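
Many of the entries above are tagged n-grams, trigram and plusalpha. As an illustration of that family of baselines, and not a reconstruction of any particular submission, here is a minimal add-alpha-smoothed trigram sketch that predicts the gap word from the last two words of the left context. The output line format (word:prob pairs, with the leftover probability mass assigned to the empty word) follows the usual convention of Gonito's hashed log-loss/perplexity metrics; verify it against the challenge README before submitting.

```python
from collections import Counter, defaultdict

class PlusAlphaTrigram:
    """Add-alpha-smoothed trigram model: a sketch of the 'plusalpha
    n-grams' baselines above, not any specific submission."""

    def __init__(self, alpha=0.01, top_k=10):
        self.alpha = alpha   # smoothing mass added to every count
        self.top_k = top_k   # number of candidate words to emit
        self.trigrams = defaultdict(Counter)
        self.vocab = set()

    def train(self, texts):
        for text in texts:
            tokens = text.split()
            self.vocab.update(tokens)
            for w1, w2, w3 in zip(tokens, tokens[1:], tokens[2:]):
                self.trigrams[(w1, w2)][w3] += 1

    def predict_line(self, left_context):
        """Return one distribution line for a gap."""
        history = tuple(left_context.split()[-2:])
        counts = self.trigrams.get(history, Counter())
        denom = sum(counts.values()) + self.alpha * max(len(self.vocab), 1)
        pairs = [(w, (c + self.alpha) / denom)
                 for w, c in counts.most_common(self.top_k)]
        rest = 1.0 - sum(p for _, p in pairs)
        # word:prob pairs; the empty word carries the leftover mass.
        return " ".join([f"{w}:{p:.6f}" for w, p in pairs] + [f":{rest:.6f}"])

model = PlusAlphaTrigram()
model.train(["the cat sat on the mat", "the cat lay on the rug"])
print(model.predict_line("yesterday the cat"))  # sat:... lay:... :...
```

The stronger classical runs above layer backoff to bigrams, Good-Turing smoothing, or the right context of the gap on top of this basic scheme, as their tags indicate.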