The tensor takes up so much memory because, by default, PyTorch stores its values as `torch.float32`, i.e. 4 bytes per element.
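A quick sketch of how to check this: `dtype` shows the default element type, and `element_size()` times `nelement()` gives the tensor's data size in bytes (the shape used here is just an illustration).

```python
import torch

# A freshly created float tensor defaults to torch.float32 (4 bytes/element).
t = torch.zeros(1000, 1000)
print(t.dtype)                           # torch.float32
print(t.element_size())                  # 4 (bytes per element)
print(t.element_size() * t.nelement())   # 4000000 (bytes of tensor data)

# Casting to a narrower dtype such as float16 halves the footprint.
t_half = t.to(torch.float16)
print(t_half.element_size() * t_half.nelement())  # 2000000
```

This is why choosing a smaller dtype (e.g. `torch.float16` or `torch.bfloat16`) is a common way to cut a tensor's memory usage in half.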