Adaptive optimizers extend plain gradient descent for neural network training by adapting each parameter's step size from its gradient history. Covers (Nesterov) momentum, Adagrad, Adadelta, RMSprop, Adam, AdaMax, and Nadam.
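As a taste of how these methods combine a momentum term with per-parameter scaling, here is a minimal NumPy sketch of a single Adam step. The function name `adam_update` and the toy objective are illustrative choices, not from the original text; the hyperparameter defaults follow the values given in Kingma & Ba (2015).

```python
import numpy as np

def adam_update(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step; default hyperparameters follow Kingma & Ba (2015)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for the warm-up steps
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter scaled update
    return param, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5.0
x = np.array(5.0)
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 501):
    grad = 2 * x                              # gradient of x^2
    x, m, v = adam_update(x, grad, m, v, t, lr=0.1)
print(float(x))                               # settles near the minimum at 0
```

Note that the effective step size is roughly `lr` regardless of the raw gradient magnitude, since `m_hat / sqrt(v_hat)` is approximately unit-scale when gradients are consistent; this invariance to gradient rescaling is one reason Adam-style methods need less learning-rate tuning than plain SGD.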