# Gradient

Updated: 2020-12-31

## Gradient vs Derivative

- The gradient is a multi-variable generalization of the derivative
- The gradient is a vector-valued function, whereas the derivative is scalar-valued.
- Like the derivative, the gradient represents the slope of the tangent to the graph of the function.
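
The vector-valued nature shows up directly in NumPy: `np.gradient` on a 2-D array returns one array per axis. A minimal sketch, sampling f(x, y) = x² + y², whose gradient is the vector (2x, 2y):

```python
import numpy as np

# Sample f(x, y) = x**2 + y**2 on a grid.
xs = np.arange(5.0)
ys = np.arange(5.0)
X, Y = np.meshgrid(xs, ys, indexing="ij")
f = X**2 + Y**2

# np.gradient returns one array per axis: the gradient is vector-valued.
dfdx, dfdy = np.gradient(f)
```

In the interior of the grid the central differences recover 2x and 2y exactly, because central differences are exact for quadratics.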

## Calculation of Gradient

- For a 1-D array, the gradient at each index is
  `(change in value) / (change in index)`.
  The index plays the role of the independent variable, so the spacing between adjacent values is 1.
- The gradient is computed using **central differences** in the interior and **first differences** at the boundaries.
- At the boundaries, a first difference is used: at each end of the array, the gradient is simply the difference between the two end values (divided by 1).
- Away from the boundaries, the gradient at a given index is the difference between the values on either side of it, divided by 2.
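
The rules above can be hand-rolled with array slicing; a minimal sketch for a 1-D array with unit spacing:

```python
import numpy as np

def gradient_1d(x):
    # Hand-rolled version of np.gradient for a 1-D array with unit spacing.
    g = np.empty_like(x, dtype=float)
    g[0] = x[1] - x[0]               # forward difference at the left boundary
    g[-1] = x[-1] - x[-2]            # backward difference at the right boundary
    g[1:-1] = (x[2:] - x[:-2]) / 2   # central differences in the interior
    return g

x = np.array([1, 2, 4, 7, 11, 16], dtype=float)
print(gradient_1d(x))  # matches np.gradient(x)
```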

## Example

Using NumPy:

```python
>>> import numpy as np
>>> x = np.array([1, 2, 4, 7, 11, 16], dtype=float)
>>> j = np.gradient(x)
>>> j
array([1. , 1.5, 2.5, 3.5, 4.5, 5. ])
```

(Note: `np.float` was deprecated and later removed; use the built-in `float` or `np.float64` as the dtype.)

So the gradient of `x` is calculated as follows:

```
g[0] = (x[1]-x[0])/1 = (2-1)/1 = 1
g[1] = (x[2]-x[0])/2 = (4-1)/2 = 1.5
g[2] = (x[3]-x[1])/2 = (7-2)/2 = 2.5
g[3] = (x[4]-x[2])/2 = (11-4)/2 = 3.5
g[4] = (x[5]-x[3])/2 = (16-7)/2 = 4.5
g[5] = (x[5]-x[4])/1 = (16-11)/1 = 5
```
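
`np.gradient` also accepts an `edge_order` argument. With `edge_order=2` the two boundary values are computed with second-order one-sided differences instead of the plain first differences shown above, while the interior central differences are unchanged:

```python
import numpy as np

x = np.array([1, 2, 4, 7, 11, 16], dtype=float)

# edge_order=2 uses second-order accurate one-sided differences at the
# boundaries, e.g. g[0] = (-3*x[0] + 4*x[1] - x[2]) / 2.
g = np.gradient(x, edge_order=2)
print(g)  # [0.5 1.5 2.5 3.5 4.5 5.5]
```

For this particular `x` (a quadratic in the index), the second-order boundary formulas give the exact derivative at the endpoints, 0.5 and 5.5.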

Using TensorFlow (1.x graph mode), symbolic gradients of a cost `C` with respect to several tensors can be requested in a single call:

`db, dW, dx = tf.gradients(C, [b, W, x])`

(In TensorFlow 2.x eager mode, the equivalent mechanism is `tf.GradientTape`.)