Jacobians
#math
As defined in the linear algebra section of the book Neural Network Primer:
The Jacobian is a way to represent the gradient of a function's output with respect to its input in the form of a matrix.
Suppose a function $f: \mathbb{R}^{d} \to \mathbb{R}^{o}$ with $\mathbf{y} = f(\mathbf{x})$.
Then the Jacobian is
$$\partial f(\mathbf{x}) = \begin{bmatrix} \frac{\partial y_{1}}{\partial x_{1}} & \dots & \frac{\partial y_{1}}{\partial x_{d}} \\ \vdots & \ddots & \vdots \\ \frac{\partial y_{o}}{\partial x_{1}} & \dots & \frac{\partial y_{o}}{\partial x_{d}} \end{bmatrix}$$
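The matrix above can be checked numerically. Here is a minimal sketch of a finite-difference Jacobian in NumPy; the helper name `jacobian_fd` and the example function are illustrative, not from the book.

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Approximate the Jacobian of f at x by central finite differences.

    f maps R^d -> R^o; the result has shape (o, d): row i holds the
    partial derivatives of y_i with respect to each x_j.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(f(x))
    J = np.zeros((y.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        # column j: how every output component moves when x_j moves
        J[:, j] = (f(x + step) - f(x - step)) / (2 * eps)
    return J

# Example: f maps R^2 -> R^3, so the Jacobian is 3 x 2
f = lambda x: np.array([x[0] * x[1], x[0] ** 2, np.sin(x[1])])
J = jacobian_fd(f, np.array([1.0, 2.0]))
print(J.shape)  # (3, 2)
```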
On the dimensionality of Jacobians
When we work with a linear function $\mathbf{y} = W\mathbf{x}$, where $W \in \mathbb{R}^{o \times d}$,
we can represent the Jacobian with respect to $\mathbf{x}$ as simply $W$ itself.
However, when $\mathbf{y}$ is looked upon as a function of the weights $W$, the input is a matrix, so the derivative would naively be a third-order tensor.
Then, we can always take the isomorphic map of $\mathbb{R}^{o \times d}$ to $\mathbb{R}^{od}$, flattening $W$ into a vector so the Jacobian becomes an ordinary $o \times od$ matrix.
This is like a squashing of the weight space into something more representable and digestible.
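The squashing can be made concrete in NumPy. Assuming a row-major flattening of $W$ (NumPy's `ravel` order), the Jacobian of $W\mathbf{x}$ with respect to the flattened weights works out to the Kronecker product $I_o \otimes \mathbf{x}^\top$; the snippet below is a sketch verifying both Jacobians by finite differences, not notation from the book.

```python
import numpy as np

rng = np.random.default_rng(0)
o, d = 3, 2
W = rng.standard_normal((o, d))
x = rng.standard_normal(d)
eps = 1e-6

# Jacobian of y = W x with respect to x: finite differences recover W itself.
J_x = np.column_stack([
    (W @ (x + eps * e) - W @ (x - eps * e)) / (2 * eps)
    for e in np.eye(d)
])
assert np.allclose(J_x, W, atol=1e-4)

# Jacobian with respect to the weights: flatten W row-major into R^(o*d),
# perturb one entry at a time, and stack the resulting output changes.
J_W_fd = np.column_stack([
    ((W.ravel() + eps * e).reshape(o, d) @ x
     - (W.ravel() - eps * e).reshape(o, d) @ x) / (2 * eps)
    for e in np.eye(o * d)
])

# The same matrix in closed form: I_o kron x^T, shape (o, o*d).
J_W = np.kron(np.eye(o), x.reshape(1, -1))
print(J_W.shape)  # (3, 6)
assert np.allclose(J_W, J_W_fd, atol=1e-4)
```

Note that the weight Jacobian is block-diagonal: output $y_i$ depends only on row $i$ of $W$, which is exactly the structure the Kronecker product encodes.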
In later readings, Jacobian notation may hide the true dimensions of the matrices involved.