From eeccf5fe55bd957dc4f0cbdfbfa27020c4563a38 Mon Sep 17 00:00:00 2001
From: Vectornaut
Date: Mon, 18 Nov 2024 20:26:52 +0000
Subject: [PATCH] Find the second derivative of the loss function

---
 Gram-matrix-parameterization.md | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/Gram-matrix-parameterization.md b/Gram-matrix-parameterization.md
index dad08d9..f38e8b5 100644
--- a/Gram-matrix-parameterization.md
+++ b/Gram-matrix-parameterization.md
@@ -95,7 +95,16 @@ This matrix is stored as `neg_grad` in the Rust and Julia implementations of the
 
 #### The second derivative of the loss function
 
-*Writeup in progress. Implemented in `app-proto/src/engine.rs` and `engine-proto/gram-test/Engine.jl`.*
+Recalling that
+\[ -d\Delta = \mathcal{P}(dA^\top Q A + A^\top Q\,dA), \]
+we can express the derivative of $\operatorname{grad}(f)$ as
+\[
+\begin{aligned}
+d\operatorname{grad}(f) & = -4 Q\,dA\,\mathcal{P}(\Delta) - 4 Q A\,\mathcal{P}(d\Delta) \\
+& = -4 Q\big[dA\,\mathcal{P}(\Delta) - A\,\mathcal{P}(-d\Delta)\big].
+\end{aligned}
+\]
+In the Rust and Julia implementations of the realization routine, we express $d\operatorname{grad}(f)$ as a matrix in the standard basis for the space of configuration matrices. We apply $d\operatorname{grad}(f)$ to each standard basis matrix $E_{ij}$ by setting the value of the matrix-valued 1-form $dA$ to $E_{ij}$.
 
 #### Finding minima
 
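To make the step described in the added paragraph concrete, here is a rough sketch in Rust of how the matrix of $d\operatorname{grad}(f)$ could be assembled, using the `nalgebra` crate. It is not a transcription of `app-proto/src/engine.rs`: the helper names (`proj`, `hessian`), the entry-list representation of the projection $\mathcal{P}$, and the column-major ordering of the basis matrices $E_{ij}$ are assumptions made for this illustration.

```rust
// Illustrative sketch only, not the project's actual engine code.
use nalgebra::{DMatrix, DVector};

// Stand-in for the projection P: keep the constrained entries of a symmetric
// matrix (assumed to be given as a list of index pairs) and zero everything else.
fn proj(m: &DMatrix<f64>, constrained: &[(usize, usize)]) -> DMatrix<f64> {
    let mut result = DMatrix::zeros(m.nrows(), m.ncols());
    for &(i, j) in constrained {
        result[(i, j)] = m[(i, j)];
        result[(j, i)] = m[(j, i)];
    }
    result
}

// Express d grad(f) as a matrix acting on configuration matrices whose entries
// are flattened in column-major order. Here `q` is Q, `a` is the configuration A,
// and `err_proj` is P(Δ).
fn hessian(
    q: &DMatrix<f64>,
    a: &DMatrix<f64>,
    err_proj: &DMatrix<f64>,
    constrained: &[(usize, usize)],
) -> DMatrix<f64> {
    let (element_dim, assembly_dim) = a.shape();
    let total_dim = element_dim * assembly_dim;
    let mut hess = DMatrix::zeros(total_dim, total_dim);
    for col in 0..assembly_dim {
        for row in 0..element_dim {
            // set the value of dA to the standard basis matrix E_{row, col}
            let mut basis_mat = DMatrix::zeros(element_dim, assembly_dim);
            basis_mat[(row, col)] = 1.0;

            // -dΔ = P(dA^⊤ Q A + A^⊤ Q dA)
            let neg_d_err = basis_mat.transpose() * q * a + a.transpose() * q * &basis_mat;
            let neg_d_err_proj = proj(&neg_d_err, constrained);

            // d grad(f) = -4 Q [ dA P(Δ) - A P(-dΔ) ] = 4 Q [ A P(-dΔ) - dA P(Δ) ]
            let deriv_grad = 4.0 * q * (a * &neg_d_err_proj - &basis_mat * err_proj);

            // this derivative, flattened column-major, is the column of the
            // Hessian indexed by E_{row, col}
            let index = col * element_dim + row;
            hess.set_column(index, &DVector::from_column_slice(deriv_grad.as_slice()));
        }
    }
    hess
}
```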