This example uses the Adam optimizer for neural networks, configured with a learning rate of 0.05 and a decay of 5e-7. The optimizers compared with live results also include stochastic gradient descent (SGD).
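Below is a minimal sketch of what such a setup computes, assuming "decay" means a per-step learning-rate decay (as in the legacy Keras `decay` argument); the helper names `adam_step` and `sgd_step` and the toy objective are hypothetical, for illustration only.

```python
import numpy as np

def adam_step(params, grads, m, v, t,
              lr=0.05, decay=5e-7,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (Kingma & Ba, 2015).

    `decay` is assumed to shrink the learning rate each step,
    mirroring the legacy Keras `decay` argument.
    """
    lr_t = lr / (1.0 + decay * t)              # decayed learning rate
    m = beta1 * m + (1 - beta1) * grads        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grads ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)               # bias correction for warm-up steps
    v_hat = v / (1 - beta2 ** t)
    params = params - lr_t * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v

def sgd_step(params, grads, lr=0.05):
    """Plain stochastic gradient descent, for comparison."""
    return params - lr * grads

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 101):   # Adam's bias correction assumes t starts at 1
    g = 2 * w
    w, m, v = adam_step(w, g, m, v, t)
print(w)  # close to the minimum at [0, 0]
```

With a decay of 5e-7 the learning rate changes negligibly over a few hundred steps, so its effect only becomes visible over long training runs.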