Add a modified Cholesky section placeholder and some references

parent a89e219b0b
commit 4ee434c770
1 changed file with 11 additions and 2 deletions

@@ -57,12 +57,21 @@ F^{(1)}_p + F^{(2)}_p v & = 0
```

using the non-degeneracy of the inner product. When the bilinear form $f^{(2)}_p$ is positive-definite, the operator $F^{(2)}_p$ is positive-definite too, so we can solve this equation by taking the Cholesky decomposition of $F^{(2)}_p$.
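
For concreteness, here is a minimal sketch of that Cholesky-based solve. The names `F1` and `F2` are placeholders standing in for the matrices of $F^{(1)}_p$ and $F^{(2)}_p$ in an orthonormal basis; this is an illustration of the idea, not a definitive implementation.

```python
# Sketch: solve the Newton-step equation F2 @ v = -F1 by Cholesky
# factorization, assuming F2 is symmetric positive-definite.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def newton_step(F1: np.ndarray, F2: np.ndarray) -> np.ndarray:
    factor = cho_factor(F2)        # raises LinAlgError if F2 is not positive-definite
    return cho_solve(factor, -F1)  # triangular solves give v with F2 @ v = -F1

# Example: for a positive-definite quadratic x^T A x / 2 + b^T x, whose
# gradient at x is A x + b, a single Newton step lands on the minimizer.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
x = np.zeros(2)
v = newton_step(A @ x + b, A)
```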

-If $f$ is convex, its second derivative is positive-definite everywhere, so the Newton step is always well-defined. However, non-convex loss functions show up in many interesting problems, including ours. For these problems, we need to decide how to step at a point where the second derivative is indefinite. One approach is to *regularize* the equation that defines the Newton step by making some modification of the second derivative that turns it into a positive-definite bilinear form. We’ll discuss some regularization methods below.
+If $f$ is convex, its second derivative is positive-definite everywhere, so the Newton step is always well-defined. However, non-convex loss functions show up in many interesting problems, including ours. For these problems, we need to decide how to step at a point where the second derivative is indefinite. One approach is to *regularize* the equation that defines the Newton step by making some explicit or implicit modification of the second derivative that turns it into a positive-definite bilinear form. We’ll discuss some regularization methods below.

+- Jorge Nocedal and Stephen J. Wright. *Numerical Optimization,* second edition. Springer, 2006.
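
As a sketch of one common recipe of this kind (in the spirit of the “Cholesky with added multiple of the identity” idea in Nocedal and Wright; the names `F1`, `F2`, `beta`, and `growth` are placeholders, and the variants discussed below may differ): shift the second derivative by a growing multiple of the identity until its Cholesky factorization succeeds, then solve the shifted Newton equation.

```python
# Sketch: if F2 is indefinite, add tau * I with tau grown geometrically until
# the Cholesky factorization succeeds, then solve (F2 + tau I) v = -F1.
import numpy as np
from scipy.linalg import LinAlgError, cho_factor, cho_solve

def regularized_newton_step(F1, F2, beta=1e-3, growth=10.0):
    n = F2.shape[0]
    tau = 0.0                              # first try the unmodified Newton equation
    while True:
        try:
            factor = cho_factor(F2 + tau * np.eye(n))
            return cho_solve(factor, -F1)
        except LinAlgError:
            tau = max(growth * tau, beta)  # factorization failed: grow the shift and retry
```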

#### Uniform regularization

_To be added_

+- Kenji Ueda and Nobuo Yamashita. [“Convergence Properties of the Regularized Newton Method for the Unconstrained Nonconvex Optimization.”](https://doi.org/10.1007/s00245-009-9094-9) *Applied Mathematics and Optimization* 62, 2010.

#### Positive-definite truncation

_To be added_

+- Santiago Paternain, Aryan Mokhtari, and Alejandro Ribeiro. [“A Newton-Based Method for Nonconvex Optimization with Fast Evasion of Saddle Points.”](https://doi.org/10.1137/17M1150116) *SIAM Journal on Optimization* 29(1), 2019.

+#### Modified Cholesky decomposition