The multilingual BERT model is trained on 104 languages and is meant to serve as a universal language model and tool for encoding sentences. We ...