Find the second derivative of the loss function
#### The second derivative of the loss function
*Writeup in progress. Implemented in `app-proto/src/engine.rs` and `engine-proto/gram-test/Engine.jl`.*
Recalling that
\[ -d\Delta = \mathcal{P}(dA^\top Q A + A^\top Q\,dA), \]
we can express the derivative of $\operatorname{grad}(f)$ as
\[
\begin{align*}
d\operatorname{grad}(f) & = -4 Q\,dA\,\mathcal{P}(\Delta) - 4 Q A\,\mathcal{P}(d\Delta) \\
& = -4 Q\big[dA\,\mathcal{P}(\Delta) - A\,\mathcal{P}(-d\Delta)\big].
\end{align*}
\]
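Since $\mathcal{P}$ is linear, the second equality just rewrites $\mathcal{P}(d\Delta)$ as $-\mathcal{P}(-d\Delta)$. As a worked example of this expression in a single coordinate direction, setting $dA$ to a standard basis matrix $E_{ij}$ and substituting the formula for $-d\Delta$ recalled above gives

\[
d\operatorname{grad}(f)\big|_{dA = E_{ij}} = -4 Q\big[E_{ij}\,\mathcal{P}(\Delta) - A\,\mathcal{P}\big(E_{ij}^\top Q A + A^\top Q\,E_{ij}\big)\big].
\]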
In the Rust and Julia implementations of the realization routine, we express $d\operatorname{grad}(f)$ as a matrix in the standard basis for $\operatorname{End}(\mathbb{R}^n)$. We evaluate the matrix-valued 1-form $d\operatorname{grad}(f)$ on each standard basis matrix $E_{ij}$ by setting the value of $dA$ to $E_{ij}$; each evaluation gives one column of the matrix.
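To illustrate how that column-by-column evaluation could be organized, here is a minimal Rust sketch using nalgebra. It is not the actual code in `app-proto/src/engine.rs`; the names (`second_deriv`, `proj_to_frozen`, `err_proj`, `frozen`) and the column-major flattening convention are assumptions made for the example.

```rust
use nalgebra::{DMatrix, DVector};

// Hypothetical projection onto the "frozen" entries where Gram-matrix
// constraints are imposed; `frozen` lists those (row, column) index pairs.
fn proj_to_frozen(mat: &DMatrix<f64>, frozen: &[(usize, usize)]) -> DMatrix<f64> {
    let mut result = DMatrix::zeros(mat.nrows(), mat.ncols());
    for &(i, j) in frozen {
        result[(i, j)] = mat[(i, j)];
    }
    result
}

// Express d grad(f) as a matrix: for each standard basis matrix E_ij of the
// configuration space, set dA = E_ij, evaluate
//   d grad(f) = -4 Q [ E_ij P(Δ) - A P(E_ij^T Q A + A^T Q E_ij) ],
// and store the result, flattened column-major, as one column of the output.
fn second_deriv(
    q: &DMatrix<f64>,          // the bilinear form Q
    config: &DMatrix<f64>,     // the configuration matrix A
    err_proj: &DMatrix<f64>,   // P(Δ)
    frozen: &[(usize, usize)], // entries that define the projection P
) -> DMatrix<f64> {
    let (element_dim, assembly_dim) = config.shape();
    let total_dim = element_dim * assembly_dim;
    let mut hess = DMatrix::zeros(total_dim, total_dim);
    for col in 0..total_dim {
        // the standard basis matrix E_ij corresponding to this column
        let mut basis_mat = DMatrix::zeros(element_dim, assembly_dim);
        basis_mat[(col % element_dim, col / element_dim)] = 1.0;

        // P(-dΔ) = P(E_ij^T Q A + A^T Q E_ij), from the formula for -dΔ
        let neg_d_err = basis_mat.transpose() * q * config;
        let neg_d_err_proj = proj_to_frozen(&(&neg_d_err + neg_d_err.transpose()), frozen);

        // d grad(f) evaluated at dA = E_ij
        let deriv_grad = -4.0 * (q * (&basis_mat * err_proj - config * &neg_d_err_proj));

        // flatten column-major into the corresponding column of the output
        hess.set_column(col, &DVector::from_column_slice(deriv_grad.as_slice()));
    }
    hess
}
```

Each pass through the loop evaluates $d\operatorname{grad}(f)$ in one coordinate direction, so the columns of the returned matrix are exactly the evaluations described above.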
#### Finding minima