Diachronic equivalents

Given a Polish word as used in one year, produce its diachronic equivalent (a.k.a. temporal word analogy) for another given year. [ver. 1.0.0]
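The p/tlen submissions in the leaderboard below hint at two simple Word2Vec baselines, method=most-similar and method=year-analogy. Here is a minimal sketch of both, assuming a single embedding space trained on a corpus where every token is suffixed with its year of occurrence and the bare year tokens also appear in the corpus; the suffix convention, model path, and function names are illustrative assumptions, not part of the challenge.

```python
# Sketch of two Word2Vec baselines for diachronic equivalents.
# Assumption: one model trained on a corpus where each token carries a
# year suffix (e.g. "król_1850") and bare years ("1850") occur as tokens.
from gensim.models import KeyedVectors

model = KeyedVectors.load("embeddings-with-years.kv")  # hypothetical path

def most_similar(word: str, source_year: int, target_year: int, topn: int = 10):
    """Rank target-year tokens by cosine similarity to the source token."""
    suffix = f"_{target_year}"
    hits = model.most_similar(f"{word}_{source_year}", topn=1000)
    return [(w, s) for w, s in hits if w.endswith(suffix)][:topn]

def year_analogy(word: str, source_year: int, target_year: int, topn: int = 10):
    """Vector arithmetic word_y1 - y1 + y2, in the spirit of
    king - man + woman = queen, then keep only target-year tokens."""
    suffix = f"_{target_year}"
    hits = model.most_similar(
        positive=[f"{word}_{source_year}", str(target_year)],
        negative=[str(source_year)],
        topn=1000,
    )
    return [(w, s) for w, s in hits if w.endswith(suffix)][:topn]

print(year_analogy("komputer", 2010, 1950))
```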

| # | submitter | when | ver. | description | tags | dev-0 MAP | test-A MAP |
|---|-----------|------|------|-------------|------|-----------|------------|
| 7 | zrostek | 2023-05-29 08:58 | 1.0.0 | word2vec (with close matches for words with no embedding) | word2vec | 0.0467 | 0.0543 |
| 13 | zrostek | 2023-05-26 12:30 | 1.0.0 | word2vec | word2vec | 0.0419 | 0.0465 |
| 16 | zrostek | 2023-05-25 06:20 | 1.0.0 | nanoT5 (adafactor, legacy), 70000 out of 364210, epochs=20 | t5 | 0.3374 | 0.0320 |
| 15 | zrostek | 2023-05-25 06:15 | 1.0.0 | nanoT5 (adafactor, legacy), 70000 out of 364210, epochs=10 | t5 | 0.1181 | 0.0367 |
| 1 | zrostek | 2023-05-18 17:56 | 1.0.0 | gpt-3.5 | | 0.0000 | 0.0923 |
| 5 | zrostek | 2023-05-18 13:34 | 1.0.0 | allegro plT5-large with wikipedia + wiktionary data | t5 | 0.2646 | 0.0670 |
| 4 | zrostek | 2023-05-17 06:33 | 1.0.0 | allegro plT5-large with wikipedia data | t5 | 0.2188 | 0.0680 |
| 6 | zrostek | 2023-05-16 13:11 | 1.0.0 | allegro plT5-large with wikipedia data | t5 | 0.2118 | 0.0606 |
| 3 | zrostek | 2023-05-16 10:31 | 1.0.0 | allegro plT5-large | t5 | 0.2054 | 0.0680 |
| 18 | zrostek | 2023-05-16 08:47 | 1.0.0 | allegro plT5 | t5 | 0.0596 | 0.0310 |
| 2 | p/tlen | 2018-01-31 21:48 | 1.0.0 | with Word2Vec, method=most-similar, model=normalized-lemmatized-with-years | word2vec | 0.0678 | 0.0904 |
| 14 | p/tlen | 2018-01-30 20:35 | 1.0.0 | with Word2Vec, method=year-analogy, model=normalized-lemmatized-with-years | word2vec | 0.0281 | 0.0427 |
| 20 | p/tlen | 2018-01-26 19:04 | 1.0.0 | with Word2Vec, method=year-analogy, model=every-n-5 | word2vec | 0.0128 | 0.0204 |
| 9 | p/tlen | 2018-01-26 16:22 | 1.0.0 | with Word2Vec, method=most-similar, model=every-n-5 | | 0.0434 | 0.0517 |
| 8 | p/tlen | 2018-01-25 19:59 | 1.0.0 | with Word2Vec, method=most-similar, model=normalized-with-years | word2vec | 0.0488 | 0.0541 |
| 19 | p/tlen | 2018-01-25 17:16 | 1.0.0 | with Word2Vec, method=year-analogy, model=normalized-with-years | word2vec | 0.0295 | 0.0306 |
| 17 | p/tlen | 2018-01-25 16:12 | 1.0.0 | with Word2Vec, method=year-analogy, model=with-years | | 0.0254 | 0.0310 |
| 10 | p/tlen | 2018-01-25 15:17 | 1.0.0 | with Word2Vec, method=most-similar, model=with-years | word2vec | 0.0447 | 0.0516 |
| 11 | p/tlen | 2018-01-25 13:44 | 1.0.0 | with Word2Vec, method=most-similar, model=raw | word2vec | 0.0467 | 0.0499 |
| 12 | p/tlen | 2018-01-25 10:34 | 1.0.0 | with Word2Vec, method=year-analogy, model=raw | word2vec | 0.0432 | 0.0492 |
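Both score columns report MAP (mean average precision) on the dev-0 and test-A sets: each query gets a ranked list of candidate equivalents, average precision is computed per query against the gold answers, and the per-query values are averaged. The actual scoring is done by the challenge platform; the sketch below is only an illustration of the metric, assuming submissions are ranked candidate lists and each query may have several gold equivalents.

```python
# Mean average precision (MAP), the metric in both score columns.
# Minimal sketch; the real evaluation is performed by the platform.
def average_precision(ranked: list[str], gold: set[str]) -> float:
    """Average of precision@k over each rank k that holds a gold item."""
    hits, score = 0, 0.0
    for k, candidate in enumerate(ranked, start=1):
        if candidate in gold:
            hits += 1
            score += hits / k
    return score / len(gold) if gold else 0.0

def mean_average_precision(runs: list[tuple[list[str], set[str]]]) -> float:
    """Average the per-query AP values over all queries."""
    return sum(average_precision(r, g) for r, g in runs) / len(runs)

# Example: the single gold item is found at rank 2, so AP = 1/2.
print(average_precision(["kolej", "pociąg"], {"pociąg"}))
```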