Multilingual pre-trained language models, such as mBERT and XLM-R, have shown impressive cross-lingual ability. Surprisingly, both of