We present Latin BERT, a contextual language model for the Latin language, trained on 642.7 million words from a variety of sources spanning the ...