The model mBERT, which stands for “multilingual BERT,” is effectively BERT pretrained on over 100 languages simultaneously. Naturally, this model is ...