This work proposes a novel neural architecture, Transformer-XL, that enables learning dependencies beyond a fixed length without disrupting ...