Mention other minimization methods

Vectornaut 2025-05-23 20:04:11 +00:00
parent b26923c966
commit 73b3be2a90

@@ -136,6 +136,16 @@ The minimization routine is implemented in [`engine.rs`](../src/branch/main/app-
4. Otherwise, multiply the step by the back-off parameter $\beta \in (0, 1)$.
   * This parameter is passed to `realize_gram` as the argument `backoff`. (A toy sketch of this back-off loop appears below.)
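
Step 4 amounts to a small shrink loop. Here is a toy sketch in Rust, illustrative only: the loss `f`, the guess `x`, and the trial `step` are made up, the acceptance test (plain decrease of the loss) is just an example, and this is not the actual `realize_gram` code.

```rust
// Toy illustration of the back-off rule in step 4, not the actual
// `realize_gram` implementation. We shrink a trial step by the
// back-off parameter β until the loss decreases.
fn f(x: f64) -> f64 {
    (x - 1.0).powi(2) // toy loss with minimum at x = 1
}

fn main() {
    let backoff = 0.9; // β ∈ (0, 1), passed to `realize_gram` as `backoff`
    let x = 3.0; // current guess
    let mut step = -2.0 * (x - 1.0); // trial step along -f'(x)
    while f(x + step) >= f(x) {
        step *= backoff; // step 4: multiply the step by β
    }
    println!("accepted step {step:.3}; loss {:.3} -> {:.3}", f(x), f(x + step));
}
```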
#### Other minimization methods
Ge, Chou, and Gao have written about using optimization methods to solve 2D geometric constraint problems:
- Jian-Xin Ge, Shang-Ching Chou, and Xiao-Shan Gao. ["Geometric Constraint Satisfaction Using Optimization Methods."](https://doi.org/10.1016/S0010-4485(99)00074-3) *Computer-Aided Design*, 1999.
Instead of Newton's method, they use the BFGS method, which they say is less sensitive to the initial guess and more likely to arrive at a solution close to it.
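
For reference, the standard BFGS update of the approximate Hessian $B_k$ (in generic notation, not the paper's) is
$$B_{k+1} = B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{y_k^\top s_k},$$
where $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$. Only gradient differences are needed, so BFGS avoids computing the exact Hessian that Newton's method requires.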
Julia's [Optim](https://julianlsolvers.github.io/Optim.jl) package uses an [unconventional variant](https://julianlsolvers.github.io/Optim.jl/v1.12/algo/newton/) of Newton's method whose rationale is sketched in this [issue discussion](https://github.com/JuliaNLSolvers/Optim.jl/issues/153#issuecomment-161268535). It's based on the Cholesky-like factorization implemented in the [PositiveFactorizations](https://github.com/timholy/PositiveFactorizations.jl) package.
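
To give the flavor of that approach: the factorization effectively works with a nearby positive definite matrix, so the resulting Newton step is always a descent direction. Below is a sketch in Rust of the eigenvalue-level intuition, flipping negative Hessian eigenvalues by hand on a 2×2 example. This is only an illustration of the idea, not the Cholesky-level algorithm that PositiveFactorizations actually uses.

```rust
// Sketch of a "modified Newton" step: replace each eigenvalue λ of the
// Hessian by max(|λ|, δ), so the modified Hessian is positive definite
// and the step opposes the gradient. Hand-rolled 2×2 symmetric
// eigendecomposition, for illustration only.
fn modified_newton_step(h: [[f64; 2]; 2], grad: [f64; 2], delta: f64) -> [f64; 2] {
    let (a, b, c) = (h[0][0], h[0][1], h[1][1]);
    let mean = 0.5 * (a + c);
    let disc = (0.5 * (a - c)).hypot(b);
    let lams = [mean + disc, mean - disc];
    // Unit eigenvectors; fall back to the axes when the matrix is diagonal.
    let vecs = if b.abs() > 1e-12 {
        lams.map(|lam| {
            let norm = b.hypot(lam - a);
            [b / norm, (lam - a) / norm]
        })
    } else if a >= c {
        [[1.0, 0.0], [0.0, 1.0]]
    } else {
        [[0.0, 1.0], [1.0, 0.0]]
    };
    // step = -Σᵢ (vᵢ·grad / max(|λᵢ|, δ)) vᵢ
    let mut step = [0.0; 2];
    for (lam, v) in lams.iter().zip(&vecs) {
        let coeff = -(v[0] * grad[0] + v[1] * grad[1]) / lam.abs().max(delta);
        step[0] += coeff * v[0];
        step[1] += coeff * v[1];
    }
    step
}

fn main() {
    // An indefinite Hessian with eigenvalues 3 and -1.
    let h = [[1.0, 2.0], [2.0, 1.0]];
    let grad = [1.0, 0.0];
    let step = modified_newton_step(h, grad, 1e-8);
    // The step is a descent direction: step·grad < 0.
    println!("step = {step:?}, step·grad = {}", step[0] * grad[0] + step[1] * grad[1]);
}
```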
### Reconstructing a rigid subassembly
Suppose we can find a set of vectors $\{a_k\}_{k \in K}$ whose Lorentz products are all known. Restricting the Gram matrix to $\mathbb{R}^K$ and projecting its output orthogonally onto $\mathbb{R}^K$ gives a submatrix $G_K \colon \mathbb{R}^K \to \mathbb{R}^K$ whose entries are all known. Suppose in addition that the set $\{a_k\}_{k \in K}$ spans $V$, implying that $G_K$ has rank five.
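
Concretely, writing $\langle\,\cdot\,, \cdot\,\rangle$ for the Lorentz product, the entries of this submatrix are
$$(G_K)_{jk} = \langle a_j, a_k \rangle, \qquad j, k \in K.$$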