Backward propagation is kicked off when we call .backward() on the error tensor. ... Let's take a look at how autograd collects gradients.
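As a minimal sketch of this behavior (the tensors `a` and `b`, their values, and the expression for the error are illustrative assumptions, not taken from the text above), calling .backward() on a scalar error tensor makes autograd walk the graph backward and deposit a gradient in the .grad attribute of every leaf tensor created with requires_grad=True:

```python
import torch

# Two leaf tensors that autograd will track (illustrative values).
a = torch.tensor([2.0, 3.0], requires_grad=True)
b = torch.tensor([6.0, 4.0], requires_grad=True)

# Build an "error" tensor from a and b, then reduce it to a scalar
# so .backward() needs no explicit gradient argument.
Q = 3 * a**3 - b**2
error = Q.sum()

# Kick off backward propagation; autograd accumulates gradients
# into the .grad attribute of each leaf tensor.
error.backward()

print(a.grad)  # dQ/da = 9 * a**2 -> tensor([36., 81.])
print(b.grad)  # dQ/db = -2 * b   -> tensor([-12., -8.])
```

Note that gradients accumulate across calls, so in a training loop you would typically zero them (e.g. with `optimizer.zero_grad()`) before the next backward pass.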