BERT-base, English, uncased, 12-layer, 768-hidden, 12-heads, 110M parameters: download ... We have trained BERT-base models for other languages and domains:
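The "110M parameters" figure can be recomputed from the architecture numbers in the listing (12 layers, 768 hidden units, 12 heads). This is a rough sketch, not the library's own accounting: the vocabulary size (30522), maximum positions (512), token-type count (2), and FFN width (4 × hidden) are assumptions taken from the standard BERT-base config rather than stated in the listing itself.

```python
# Sketch: recompute BERT-base's parameter count from its architecture.
# Assumed (not in the listing): vocab 30522, max positions 512,
# token types 2, FFN width 4 * hidden.
hidden, layers, vocab = 768, 12, 30522
max_pos, type_vocab, ffn = 512, 2, 4 * hidden

# Embedding tables plus the embedding LayerNorm (weight + bias).
embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden

# One Transformer layer: Q/K/V/output projections (weight + bias each),
# two LayerNorms, and the two feed-forward projections.
attention = 4 * (hidden * hidden + hidden)
layer_norms = 2 * 2 * hidden
feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
per_layer = attention + layer_norms + feed_forward

# Pooler head: one dense layer applied to the [CLS] token.
pooler = hidden * hidden + hidden

total = embeddings + layers * per_layer + pooler
print(f"{total:,} parameters (~{total / 1e6:.0f}M)")
# ~109.5M, rounded up to the advertised "110M".
```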