Abstract. The multilingual BERT model is trained on 104 languages and is meant to serve as a universal language model and sentence encoder.