polynomials_on_simplices.calculus.finite_difference module

Functions used to compute derivatives using finite differences.

central_difference(f, x, h=1e-08)[source]

Compute the numerical gradient of the scalar-valued function f using central finite differences.

\[f : \mathbb{R}^n \to \mathbb{R},\]
\[\nabla f(x)_i = \frac{\partial f(x)}{\partial x^i},\]
\[\nabla f_{\delta}(x)_i = \frac{f(x + \frac{h}{2}e_i) - f(x - \frac{h}{2}e_i)}{h}.\]
Parameters:
  • f (Callable f(x)) – Scalar-valued function.
  • x (float or Iterable[float]) – Point where the gradient should be evaluated.
  • h (float) – Step size in the central difference method.
Returns:

Approximate gradient of f.

Return type:

float or length n NumPy array.
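The central difference formula above can be sketched in a few lines of NumPy. This is an illustration following the documented signature, not the package's actual implementation:

```python
import numpy as np

def central_difference(f, x, h=1e-8):
    """Approximate the gradient of a scalar-valued f with central differences."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    grad = np.empty(x.size)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = 1.0
        # (f(x + (h/2) e_i) - f(x - (h/2) e_i)) / h
        grad[i] = (f(x + 0.5 * h * e) - f(x - 0.5 * h * e)) / h
    return grad[0] if grad.size == 1 else grad

# Gradient of f(x) = x_1^2 + 3 x_2 at (1, 2) is (2, 3)
g = central_difference(lambda v: v[0]**2 + 3.0 * v[1], [1.0, 2.0])
```

Note that in this sketch f is always called with a length-1 array in the univariate case; the library may treat scalar input differently.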

central_difference_jacobian(f, n, x, h=1e-08)[source]

Compute the numerical Jacobian of the vector-valued function f using central finite differences.

\[f : \mathbb{R}^m \to \mathbb{R}^n,\]
\[J_f (x)^i_j = \frac{\partial f(x)^i}{\partial x^j},\]
\[J_{f, \delta}(x, h)^i_j = \frac{f(x + \frac{h}{2}e_j)^i - f(x - \frac{h}{2}e_j)^i}{h},\]

with \(i = 1, 2, \ldots, n\) and \(j = 1, 2, \ldots, m\).

Parameters:
  • f (Callable f(x)) – Vector-valued function.
  • n (int) – Dimension of the target (codomain) of f.
  • x (float or Iterable[float]) – Point where the Jacobian should be evaluated.
  • h (float) – Step size in the finite difference method.
Returns:

Approximate Jacobian of f.

Return type:

n by m NumPy matrix.
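Building on the central difference gradient, the Jacobian is assembled one column per input coordinate. A minimal sketch matching the documented signature (not the library's own code):

```python
import numpy as np

def central_difference_jacobian(f, n, x, h=1e-8):
    """Approximate the Jacobian of a vector-valued f with central differences."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    m = x.size
    jac = np.empty((n, m))
    for j in range(m):
        e = np.zeros(m)
        e[j] = 1.0
        # Column j holds (f(x + (h/2) e_j) - f(x - (h/2) e_j)) / h.
        jac[:, j] = (np.asarray(f(x + 0.5 * h * e))
                     - np.asarray(f(x - 0.5 * h * e))) / h
    return jac

# f(x) = (x_1 x_2, x_1 + x_2) at (2, 3) has Jacobian [[3, 2], [1, 1]]
J = central_difference_jacobian(lambda v: [v[0] * v[1], v[0] + v[1]], 2, [2.0, 3.0])
```

The target dimension n is passed explicitly so the output array can be allocated before the first evaluation of f.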

discrete_forward_difference(f0, x0, f1, x1)[source]

Compute the numerical derivative of a scalar-valued, univariate function f from two discrete point evaluations.

Parameters:
  • f0 (float) – Function value at x0.
  • x0 (float) – First point where the function has been evaluated.
  • f1 (float) – Function value at x1.
  • x1 (float) – Second point where the function has been evaluated.
Returns:

Numerical approximation of the derivative.

Return type:

float
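This is just the slope of the secant line through the two samples, so a sketch under the documented signature is a one-liner (again an illustration, not the package's implementation):

```python
def discrete_forward_difference(f0, x0, f1, x1):
    """Secant slope through (x0, f0) and (x1, f1)."""
    return (f1 - f0) / (x1 - x0)

# f(x) = x^2 sampled at 1.0 and 1.001: the exact secant slope is 2.001
d = discrete_forward_difference(1.0, 1.0, 1.001**2, 1.001)
```

Unlike the other functions in this module, no callable is needed: the function values have already been computed elsewhere.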

forward_difference(f, x, h=1e-08)[source]

Compute the numerical gradient of the scalar-valued function f using forward finite differences.

\[f : \mathbb{R}^n \to \mathbb{R},\]
\[\nabla f(x)_i = \frac{\partial f(x)}{\partial x^i},\]
\[\nabla f_{\Delta}(x)_i = \frac{f(x + he_i) - f(x)}{h}.\]
Parameters:
  • f (Callable f(x)) – Scalar-valued function.
  • x (float or Iterable[float]) – Point where the gradient should be evaluated.
  • h (float) – Step size in the finite difference method.
Returns:

Approximate gradient of f.

Return type:

float or length n NumPy array.
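A NumPy sketch of the forward scheme under the documented signature (illustrative only, not the package's implementation):

```python
import numpy as np

def forward_difference(f, x, h=1e-8):
    """Approximate the gradient of a scalar-valued f with forward differences."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    fx = f(x)  # the base value is reused for every component
    grad = np.empty(x.size)
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = 1.0
        # (f(x + h e_i) - f(x)) / h
        grad[i] = (f(x + h * e) - fx) / h
    return grad[0] if grad.size == 1 else grad

g = forward_difference(lambda v: v[0]**2 + 3.0 * v[1], [1.0, 2.0])
```

The forward scheme needs only n + 1 evaluations of f (versus 2n for the central scheme), at the cost of first-order instead of second-order accuracy in h.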

forward_difference_jacobian(f, n, x, h=1e-08)[source]

Compute the numerical Jacobian of the vector-valued function f using forward finite differences.

\[f : \mathbb{R}^m \to \mathbb{R}^n,\]
\[J_f (x)^i_j = \frac{\partial f(x)^i}{\partial x^j},\]
\[J_{f, \Delta}(x, h)^i_j = \frac{f(x + he_j)^i - f(x)^i}{h},\]

with \(i = 1, 2, \ldots, n\) and \(j = 1, 2, \ldots, m\).

Parameters:
  • f (Callable f(x)) – Vector-valued function.
  • n (int) – Dimension of the target (codomain) of f.
  • x (float or Iterable[float]) – Point where the Jacobian should be evaluated.
  • h (float) – Step size in the finite difference method.
Returns:

Approximate Jacobian of f.

Return type:

n by m NumPy matrix.
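As with the central variant, the Jacobian is filled one column at a time; here each column reuses the base evaluation f(x). A sketch under the documented signature (not the library's own code):

```python
import numpy as np

def forward_difference_jacobian(f, n, x, h=1e-8):
    """Approximate the Jacobian of a vector-valued f with forward differences."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    m = x.size
    fx = np.asarray(f(x))  # shared base value for all columns
    jac = np.empty((n, m))
    for j in range(m):
        e = np.zeros(m)
        e[j] = 1.0
        # Column j holds (f(x + h e_j) - f(x)) / h.
        jac[:, j] = (np.asarray(f(x + h * e)) - fx) / h
    return jac

# f(x) = (x_1 x_2, x_1 + x_2) at (2, 3) has Jacobian [[3, 2], [1, 1]]
J = forward_difference_jacobian(lambda v: [v[0] * v[1], v[0] + v[1]], 2, [2.0, 3.0])
```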

second_central_difference(f, x, h=2e-05)[source]

Compute the numerical Hessian of the scalar-valued function f using second order central finite differences.

\[f : \mathbb{R}^n \to \mathbb{R},\]
\[H_f(x)_{ij} = \frac{\partial^2 f(x)}{\partial x^i \partial x^j},\]
\[ \begin{align}\begin{aligned}H_{f, \delta}(x)_{ij} = \bigg[ &f(x + \frac{h}{2} (e_i + e_j)) - f(x + \frac{h}{2} (e_i - e_j))\\&- f(x + \frac{h}{2} (-e_i + e_j)) + f(x + \frac{h}{2} (-e_i - e_j)) \bigg] / h^2.\end{aligned}\end{align} \]
Parameters:
  • f (Callable f(x)) – Scalar-valued function.
  • x (float or Iterable[float]) – Point where the Hessian should be evaluated.
  • h (float) – Step size in the finite difference method.
Returns:

Hessian (full matrix, i.e., not utilizing the symmetry or any sparsity structure of the Hessian).

Return type:

float or n by n NumPy matrix.
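The four-point stencil above can be sketched directly in NumPy. As before, this follows the documented signature but is an illustration rather than the package's implementation:

```python
import numpy as np

def second_central_difference(f, x, h=2e-5):
    """Approximate the Hessian of a scalar-valued f with central differences."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    n = x.size
    eye = np.eye(n)
    hess = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei, ej = eye[i], eye[j]
            # Four-point stencil from the formula above, divided by h^2.
            hess[i, j] = (f(x + 0.5 * h * (ei + ej)) - f(x + 0.5 * h * (ei - ej))
                          - f(x - 0.5 * h * (ei - ej)) + f(x - 0.5 * h * (ei + ej))) / h**2
    return hess

# f(x) = x_1^2 x_2 at (1, 2): Hessian [[2 x_2, 2 x_1], [2 x_1, 0]] = [[4, 2], [2, 0]]
H = second_central_difference(lambda v: v[0]**2 * v[1], [1.0, 2.0])
```

Note the much larger default step size than for first derivatives: dividing by h² amplifies rounding error, so h must balance truncation against cancellation.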

second_forward_difference(f, x, h=1e-05)[source]

Compute the numerical Hessian of the scalar-valued function f using second order forward finite differences.

\[f : \mathbb{R}^n \to \mathbb{R},\]
\[H_f(x)_{ij} = \frac{\partial^2 f(x)}{\partial x^i \partial x^j},\]
\[H_{f, \Delta}(x)_{ij} = \frac{f(x + h (e_i + e_j)) - f(x + h e_i) - f(x + h e_j) + f(x)}{h^2}.\]
Parameters:
  • f (Callable f(x)) – Scalar-valued function.
  • x (float or Iterable[float]) – Point where the Hessian should be evaluated.
  • h (float) – Step size in the finite difference method.
Returns:

Hessian (full matrix, i.e., not utilizing the symmetry or any sparsity structure of the Hessian).

Return type:

float or n by n NumPy matrix.
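The forward stencil needs only evaluations at x and at points displaced in the positive coordinate directions. A sketch under the documented signature (illustrative, not the library's code):

```python
import numpy as np

def second_forward_difference(f, x, h=1e-5):
    """Approximate the Hessian of a scalar-valued f with forward differences."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    n = x.size
    eye = np.eye(n)
    fx = f(x)  # the base value appears in every entry of the stencil
    hess = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # (f(x + h(e_i + e_j)) - f(x + h e_i) - f(x + h e_j) + f(x)) / h^2
            hess[i, j] = (f(x + h * (eye[i] + eye[j])) - f(x + h * eye[i])
                          - f(x + h * eye[j]) + fx) / h**2
    return hess

# f(x) = x_1^2 x_2 at (1, 2): Hessian [[4, 2], [2, 0]]
H = second_forward_difference(lambda v: v[0]**2 * v[1], [1.0, 2.0])
```

One-sided evaluations are useful when f is only defined on one side of x (e.g. near the boundary of a domain), at the cost of lower accuracy than the central stencil.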