659beed99370d3b246b2a1eb5f5870b930c6586d
Transformer encoder models (MLM)
tags: transformer, huggingface-transformers, transformer-encoder, mlm

challenge: GLUE-LM-GAP
submitter: kaczla
submitted: 2022-12-29 14:08:12.167942 UTC
rank model_name test-A PerplexityHashed (lower is better)
30 MiniLM-L12-H384-XLMR-Large 876508.944709
29 MiniLM-L6-H384-RoBERTa-large 875278.906409
28 MiniLM-L12-H384-RoBERTa-large 875114.462710
27 MiniLM-L6-H768-BERT-large-uncased 873852.303343
26 MiniLM-L6-H768-BERT-base-uncased 870332.354133
25 MiniLM-L6-H384-BERT-large-uncased 869819.109384
24 MiniLM-L6-H384-BERT-base-uncased 869300.217146
23 MiniLM-L6-H768-RoBERTa-large 869124.251622
22 MiniLM-L6-H384-XLMR-Large 867987.178408
21 XLM-en 428281.101166
20 XLM-17-lang 52129.248038
19 XLM-100-lang 15046.785077
18 ALBERT-base 1575.853755
17 BERT-base-multilingual-uncased 1477.047924
16 DistilBERT-base-uncased 1303.640415
15 ALBERT-large 933.385723
14 MobileBERT-uncased 913.525374
13 BERT-base-uncased 804.253307
12 ALBERT-xxlarge 746.990635
11 ALBERT-xlarge 730.841384
10 BERT-large-uncased 670.671724
9 BERT-base-multilingual-cased 635.838112
8 DistilBERT-base-cased 494.413642
7 XLM-RoBERTa-base 193.883054
6 DistilRoBERTa-base 179.690259
5 BERT-base-cased 172.101095
4 XLM-RoBERTa-large 138.830588
3 BERT-large-cased 125.733251
2 RoBERTa-base 91.907166
1 RoBERTa-large 69.393009
Parameter Value
method simple
token_length 1
top_k 15
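The parameters above describe how each model's predictions were produced and scored: a single masked token per gap (token_length 1), with the 15 most probable candidate tokens submitted per gap (method simple, top_k 15), the remaining probability mass going to a wildcard entry. Perplexity is then the exponent of the mean negative log-probability the model assigned to the gold tokens. As a minimal sketch of that scoring scheme (the function names and the exact wildcard handling of PerplexityHashed are assumptions, not the challenge's reference implementation):

```python
import math

def to_top_k_prediction(token_probs, top_k=15):
    """Keep the top_k most probable candidate tokens for one gap and
    lump the leftover probability mass into a wildcard remainder.
    Hypothetical helper mirroring method=simple, top_k=15."""
    ranked = sorted(token_probs.items(), key=lambda kv: kv[1], reverse=True)
    kept = ranked[:top_k]
    rest = max(0.0, 1.0 - sum(p for _, p in kept))
    return kept, rest

def perplexity(gold_probs):
    """Perplexity over a test set: exp of the mean negative
    log-probability assigned to each gap's gold token."""
    return math.exp(-sum(math.log(p) for p in gold_probs) / len(gold_probs))
```

For example, a model that assigns probability 0.5 to every gold token scores a perplexity of exactly 2.0, which puts the ~69 of RoBERTa-large (rank 1) and the ~870 000 of the MiniLM distillations at opposite ends of the table.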