Properly implement Ueda and Yamashita's regularized Newton method #130
As of pull request #118, we carry out realization using a cheap imitation of Ueda and Yamashita's uniformly regularized Newton's method [UY]: we leave out the term of their regularization that involves the norm of the first derivative of the loss function. This has at least two downsides. One is practical: when the lowest eigenvalue of the Hessian is zero, our regularization vanishes, so the regularized Hessian is not guaranteed to be safely positive-definite. The other is conceptual: since we depart from Ueda and Yamashita's assumptions, we can't rely on their convergence results.
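For reference, my reading of [UY] is that the regularized step solves

$$\left(\nabla^2 f(x_k) + \mu_k I\right) d_k = -\nabla f(x_k), \qquad \mu_k = c_1 \Lambda_k + c_2 \lVert \nabla f(x_k) \rVert^{\delta},$$

where $\Lambda_k = \max\bigl(0, -\lambda_{\min}(\nabla^2 f(x_k))\bigr)$ and $c_1, c_2 > 0$. The $c_2$ term is the one we currently drop; it is what keeps $\mu_k$ strictly positive away from stationary points even when the Hessian is merely singular.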
I informally tested a few regularization methods and found that a proper implementation of Ueda and Yamashita's method gave the most consistent convergence and the nicest-looking realizations. I therefore recommend switching to that method. To make the regularization easier to understand, and perhaps better adapted to our problem, I also recommend taking the norm of the first derivative with respect to a meaningful metric on the configuration space; a sketch of what that could look like follows.
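Here is a rough sketch, using nalgebra as the engine does, of computing the full Ueda–Yamashita shift with the gradient norm taken in a caller-supplied metric. The function name, parameter names, and the idea of passing the metric explicitly are all just illustrative, not anything already in the engine.

```rust
use nalgebra::{DMatrix, DVector};

/// Sketch of the full Ueda–Yamashita regularization [UY]:
/// returns H + (c1 * max(0, -lambda_min(H)) + c2 * ||g||_M^delta) * I,
/// where ||g||_M = sqrt(g^T M g) is the gradient norm in the metric M.
/// All names and parameters here are illustrative placeholders.
fn uy_regularized_hessian(
    hessian: &DMatrix<f64>,   // Hessian of the loss at the current configuration
    gradient: &DVector<f64>,  // first derivative of the loss
    metric: &DMatrix<f64>,    // positive-definite metric on configuration space
    c1: f64,                  // weight on the eigenvalue shift, > 0
    c2: f64,                  // weight on the gradient-norm term, > 0
    delta: f64,               // exponent on the gradient norm
) -> DMatrix<f64> {
    // Lambda = max(0, -lambda_min(H)) lifts the spectrum to >= 0 ...
    let lambda = (-hessian.clone().symmetric_eigen().eigenvalues.min()).max(0.0);
    // ... and the gradient term keeps the shift strictly positive away from
    // stationary points, even when the smallest eigenvalue is exactly zero.
    let grad_norm = gradient.dot(&(metric * gradient)).sqrt();
    let mu = c1 * lambda + c2 * grad_norm.powf(delta);
    hessian + DMatrix::identity(hessian.nrows(), hessian.nrows()) * mu
}
```

With the identity metric this reduces to the plain Euclidean gradient norm, so we could land the corrected shift first and experiment with configuration-space metrics separately.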