2cf89cb184eebc5b84e5f9e3fe8e43cd8f9cd296
Transformer encoder-decoder models (seq2seq)

tags: transformer, huggingface-transformers, transformer-encoder-decoder, seq2seq
challenge: GLUE-LM-GAP
submitter: kaczla
submitted: 2023-02-24 12:04:25.539329 UTC
  #  model_name            dev-0 PerplexityHashed   test-A PerplexityHashed
153  LongT5-Local-base             1006327.841621            1008648.894465
132  FLAN-T5-small                  853010.711229             860545.059404
128  T5-large-v1.1                  616699.817328             622033.924463
126  LongT5-TGlobal-base            599174.609809             594610.012669
121  T5-base-v1.1                   462216.715180             456565.137625
114  T5-small-v1.1                  341731.446603             361959.431007
110  FLAN-T5-large                  291276.702614             309516.650005
104  FLAN-T5-base                   266041.436835             266466.400683
 99  T5-large                       170274.560487             182789.640921
 97  mT5-base                       163267.421887             158446.962205
 90  T5-base                         56915.639536              60987.821059
 77  mT5-small                       16713.044290              14703.538644
 74  T5-small                        10171.933229               9998.940853
Parameter  Value
depth      1
top_k      15
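For readers unfamiliar with the scoring, the sketch below shows how a perplexity over per-gap candidate distributions could be computed, and how a `top_k` cutoff (here 15, per the parameter table) trims each distribution before submission. This is a minimal illustration, not the official PerplexityHashed implementation: the real metric's hashing of out-of-vocabulary probability mass is not reproduced, and a simple floor probability is assumed instead.

```python
import math


def top_k(dist, k=15):
    """Keep the k most probable candidate words for one gap
    (cf. top_k = 15 in the parameter table above)."""
    return dict(sorted(dist.items(), key=lambda kv: -kv[1])[:k])


def perplexity(gap_dists, gold_words, floor=1e-6):
    """Perplexity of the gold gap words under the submitted distributions.

    gap_dists  -- list of {word: probability} dicts, one per gap
    gold_words -- the correct word for each gap
    floor      -- assumed fallback probability for words missing from a
                  distribution (the real PerplexityHashed metric instead
                  redistributes mass via hashing, not shown here)
    """
    log_sum = sum(math.log(dist.get(word, floor))
                  for dist, word in zip(gap_dists, gold_words))
    return math.exp(-log_sum / len(gold_words))
```

For example, a single gap predicted as `{"cat": 0.5, "dog": 0.3}` with gold word `cat` gives a perplexity of 2.0; lower is better, which is why T5-small's ~10k beats LongT5-Local-base's ~1M in the table above.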