Central to all neural networks in PyTorch is the `autograd` package. It provides automatic differentiation for all operations on Tensors. Generally speaking, `torch.autograd` is an engine for computing vector-Jacobian products.
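A minimal sketch of the vector-Jacobian product: when the output `y` is not a scalar, `.backward(v)` takes a vector `v` and computes vᵀ·J, where J is the Jacobian of `y` with respect to the input (the tensor names here are illustrative):

```python
import torch

x = torch.randn(3, requires_grad=True)
y = x * 2                              # elementwise, so the Jacobian J = 2 * I
v = torch.tensor([0.1, 1.0, 0.0001])   # the vector in the vector-Jacobian product
y.backward(v)                          # computes v^T * J and stores it in x.grad
print(x.grad)                          # equals 2 * v, since J = 2 * I
```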
`torch.Tensor` is the central class of the package. If you set its `.requires_grad` attribute to `True`, it starts to track all operations on it.
When your computation is finished, you can call `.backward()` and have all the gradients computed automatically. The gradient for the tensor is accumulated into its `.grad` attribute.
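A small sketch of this workflow. Note that "accumulated" means a second `.backward()` call adds to `.grad` rather than overwriting it, which is why training loops typically zero the gradients each step:

```python
import torch

x = torch.ones(2, requires_grad=True)  # start tracking operations on x
y = (3 * x).sum()                      # scalar result of a tracked computation
y.backward()                           # d(3x)/dx = 3 for each element
print(x.grad)                          # tensor([3., 3.])

# A second backward pass ACCUMULATES into .grad instead of replacing it:
z = (3 * x).sum()
z.backward()
print(x.grad)                          # tensor([6., 6.])
```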
Wrap a code block in `with torch.no_grad():` to prevent tracking history (and using memory).
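A quick sketch: inside the `no_grad` block, results are detached from the history even when the inputs have `requires_grad=True`, which is the usual pattern for evaluation code:

```python
import torch

x = torch.ones(3, requires_grad=True)
with torch.no_grad():
    y = x * 2          # computed without recording history
print(y.requires_grad)  # False: y is not part of the autograd graph
print(x.requires_grad)  # True: x itself is unchanged
```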
`.backward()` computes the derivatives. Every tensor produced by an operation has a `.grad_fn` attribute that references the `Function` that created it; tensors created directly by the user have a `grad_fn` of `None`.
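A sketch of how `.grad_fn` distinguishes user-created tensors from operation results (the tensor names are illustrative):

```python
import torch

a = torch.ones(2, requires_grad=True)
b = a + 1

print(a.grad_fn)  # None: a was created directly by the user
print(b.grad_fn)  # an AddBackward Function: the operation that created b
```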
PyTorch Autograd_tutorial