# Automatic Differentiation

In [1]:
from mxnet import autograd, np, npx
npx.set_np()

x = np.arange(4)
x
Out[1]:
array([0., 1., 2., 3.])

Allocate space to store the gradient with respect to x.

In [2]:
x.attach_grad()

Next, record the computation inside an `autograd.record` scope so that MXNet builds the graph needed to compute gradients.

In [3]:
with autograd.record():
    y = 2 * np.dot(x, x)
The gradient of the function $y = 2\mathbf{x}^{\top}\mathbf{x}$ with respect to $\mathbf{x}$ should be $4\mathbf{x}$.