commit 92fbfcf520159abe95203dc40d379149b3e90142

Transformer encoder-decoder models (seq2seq)

tags: transformer, huggingface-transformers, transformer-encoder-decoder, seq2seq


challenge: GLUE-LM-GAP
submitter: kaczla
submitted: 2023-02-24 12:21:39.668287 UTC
original repo: https://gitlab.com/kaczla/glue-lm-gap.git (branch encoder-decoder)
publicly available at: git://gonito.net/glue-lm-gap (branch submission-08005)
browsable at: https://gonito.net/gitlist/glue-lm-gap.git/submission-08005/
clone by: git clone --single-branch git://gonito.net/glue-lm-gap -b submission-08005
#    model_name           dev-0 PerplexityHashed   test-A PerplexityHashed
154  LongT5-Local-base    1006327.841621           1008648.894465
133  FLAN-T5-small         853010.711229            860545.059404
129  T5-large-v1_1         616699.817328            622033.924463
127  LongT5-TGlobal-base   599174.609809            594610.012669
122  T5-base-v1_1          462216.715180            456565.137625
115  T5-small-v1_1         341731.446603            361959.431007
111  FLAN-T5-large         291276.702614            309516.650005
105  FLAN-T5-base          266041.436835            266466.400683
100  T5-large              170274.560487            182789.640921
98   mT5-base              163267.421887            158446.962205
91   T5-base                56915.639536             60987.821059
78   mT5-small              16713.044290             14703.538644
75   T5-small               10171.933229              9998.940853
Parameter  Value
depth      1
top_k      15
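The parameters above suggest predictions are made at depth 1 (a single subword step) with the 15 most probable candidates kept per gap. A minimal sketch of how such top-k gap candidates could be obtained from a T5-style encoder-decoder with huggingface-transformers is shown below; the checkpoint name, the example sentence, and the use of the `<extra_id_0>` sentinel to mark the gap are assumptions for illustration, not the submission's actual pipeline.

```python
# Hedged sketch: top-k single-token gap candidates from a T5-style
# encoder-decoder via huggingface-transformers. Checkpoint and input
# text are illustrative assumptions; top_k = 15 mirrors the parameter
# listed above.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # any of the evaluated checkpoints would fit here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
model.eval()

# The gap is marked with T5's first sentinel token.
text = "The quick brown <extra_id_0> jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Decoder input: start token, then the sentinel; the next-step logits
    # then score the first token that would fill the gap (depth = 1).
    decoder_input_ids = torch.tensor([[
        model.config.decoder_start_token_id,
        tokenizer.convert_tokens_to_ids("<extra_id_0>"),
    ]])
    logits = model(
        input_ids=inputs.input_ids,
        attention_mask=inputs.attention_mask,
        decoder_input_ids=decoder_input_ids,
    ).logits

# Probability distribution over the vocabulary at the gap position.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=15)  # keep the 15 best candidates
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode([idx]).strip()}\t{p.item():.6f}")
```

Each printed line pairs a candidate subword with its probability, which is the `word probability` format a Gonito LM-GAP submission expects per gap.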