Computing log_softmax directly is more numerically stable than computing softmax and then taking the logarithm, since the two-step version can underflow or overflow. For this reason PyTorch models often output log-probabilities via log_softmax, but that means you must pair them with the NLLLoss() function, which expects log-probabilities rather than raw logits.
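A minimal sketch of this pairing, assuming a hypothetical batch of random logits over five classes: log_softmax followed by NLLLoss gives the same result as CrossEntropyLoss applied to the raw logits, because CrossEntropyLoss performs the log_softmax step internally.

```python
import torch
import torch.nn.functional as F

# Hypothetical raw logits for a batch of 3 examples over 5 classes.
logits = torch.randn(3, 5)
targets = torch.tensor([1, 0, 4])

# log_softmax + NLLLoss: NLLLoss expects log-probabilities as input.
log_probs = F.log_softmax(logits, dim=1)
loss_nll = F.nll_loss(log_probs, targets)

# CrossEntropyLoss applies log_softmax internally, so it takes raw logits.
loss_ce = F.cross_entropy(logits, targets)

print(torch.allclose(loss_nll, loss_ce))  # True: the two formulations agree
```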