In PyTorch, you can construct a ReLU layer by instantiating the nn.ReLU module, for example relu1 = nn.ReLU(inplace=False), where inplace=False (the default) returns a new output tensor rather than overwriting the input in place.
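A minimal sketch of how such a layer can be created and applied to a tensor (variable names here are illustrative, not taken from the original):

```python
import torch
import torch.nn as nn

# Construct a ReLU layer; inplace=False (the default) returns a new tensor
# instead of modifying the input in place.
relu1 = nn.ReLU(inplace=False)

# Apply it: negative entries become 0, positive entries pass through unchanged.
x = torch.tensor([-1.0, 0.5, 2.0])
print(relu1(x))  # tensor([0.0000, 0.5000, 2.0000])
```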