Commit Graph

31 Commits

Author SHA1 Message Date
Aaron Fenyes
b185fd4b83 Switch to backtracking Newton's method in Optim
This performs much better than the trust region Newton's method for the
actual `circles-in-triangle` problem. (The trust region method performs
better for the simplified problem produced by the conversion bug.)
2024-07-15 15:52:38 -07:00
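The backtracking Newton's method mentioned above can be sketched as follows. This is an illustrative Python sketch, not the project's actual Julia code; the callables `loss`, `grad`, and `hess`, and the parameter names, are hypothetical stand-ins for the real routines.

```python
import numpy as np

def backtracking_newton(loss, grad, hess, x, tol=1e-12,
                        alpha=0.5, beta=0.5, max_iter=100):
    """Newton's method with Armijo backtracking (illustrative sketch).

    `loss`, `grad`, `hess` are hypothetical callables returning the
    objective, its gradient, and its Hessian at a point.
    """
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Newton step: solve H p = -g for the search direction p
        p = np.linalg.solve(hess(x), -g)
        # shrink the step until the Armijo sufficient-decrease
        # condition holds (g.dot(p) < 0 when H is positive definite)
        t = 1.0
        f0 = loss(x)
        while loss(x + t * p) > f0 + alpha * t * g.dot(p):
            t *= beta
        x = x + t * p
    return x
```

On a well-conditioned quadratic the full Newton step (`t = 1`) is accepted immediately, which is the advantage over plain gradient descent.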
Aaron Fenyes
94e0d321d5 Switch back to BigFloat precision in examples 2024-07-15 14:31:30 -07:00
Aaron Fenyes
53d8c38047 Preserve explicit zeros in Gram matrix conversion
In previous commits, the `circles-in-triangle` example converged much
more slowly in BigFloat precision than in Float64 precision. This
turned out to be a sign of a bug in the Float64 computation: converting
the Gram matrix using `Float64.()` dropped the explicit zeros, removing
many constraints and making the problem much easier to solve. This
commit corrects the Gram matrix conversion. The Float64 search now
solves the same problem as the BigFloat search, with comparable
performance.
2024-07-15 14:08:57 -07:00
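The pitfall described in this commit, an elementwise conversion silently discarding explicitly stored zeros from a sparse Gram matrix, can be reproduced in miniature with SciPy (the project's code is Julia; this Python sketch is only an analogy):

```python
import numpy as np
from scipy.sparse import csr_matrix

# A toy Gram matrix where the explicit zero at (0, 1) encodes a
# constraint (two vectors required to be orthogonal), not an
# absent "don't care" entry.
rows = np.array([0, 0, 1])
cols = np.array([0, 1, 1])
vals = np.array([1.0, 0.0, 1.0])
gram = csr_matrix((vals, (rows, cols)), shape=(2, 2))
assert gram.nnz == 3  # the explicit zero is a stored entry

# Pitfall: rebuilding the matrix from its dense form silently drops
# the stored zero, discarding the orthogonality constraint.
lossy = csr_matrix(gram.toarray())
assert lossy.nnz == 2

# Safe conversion: change the dtype of the stored values directly,
# which keeps the sparsity pattern (including explicit zeros) intact.
safe = gram.astype(np.float32)
assert safe.nnz == 3
```

Dropping the stored zeros removes constraints, so the "easier" problem the buggy conversion produced was not the intended one, which explains the misleading Float64-vs-BigFloat performance gap.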
Aaron Fenyes
7b3efbc385 Clean up backtracking gradient descent code
Drop experimental singularity handling strategies. Reduce the default
tolerance to within 64-bit floating point precision. Report success.
2024-07-15 13:15:15 -07:00
Aaron Fenyes
25b09ebf92 Sketch backtracking Newton's method
This code is a mess, but I'm committing it to record a working state
before I start trying to clean up.
2024-07-15 11:32:04 -07:00
Aaron Fenyes
3910b9f740 Use Newton's method for polishing 2024-07-11 13:43:52 -07:00
Aaron Fenyes
d538cbf716 Correct improvement threshold by using unit step
Our formula for the improvement threshold works when the step size is
an absolute distance. However, in commit `4d5ea06`, the step size was
measured relative to the current gradient instead. This commit scales
the base step to unit length, so now the step size really is an absolute
distance.
2024-07-10 23:31:44 -07:00
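The fix above, normalizing the base step so the step size measures absolute distance, can be illustrated like this (a hedged Python sketch, not the repository's Julia code; `loss` and the parameter names are hypothetical):

```python
import numpy as np

def backtrack(loss, grad_val, x, step=1.0, alpha=0.5, beta=0.5):
    """Backtracking line search where `step` is an absolute distance.

    Scaling the base step to unit length makes the improvement
    threshold alpha * step * |grad| dimensionally consistent:
    `step` now measures distance moved, not a multiple of the
    arbitrarily scaled gradient.
    """
    direction = -grad_val / np.linalg.norm(grad_val)  # unit-length base step
    f0 = loss(x)
    while loss(x + step * direction) > f0 - alpha * step * np.linalg.norm(grad_val):
        step *= beta
    return x + step * direction
```

If the step were measured relative to the raw gradient instead, the same nominal `step` would mean very different distances depending on the gradient's magnitude, which is exactly the inconsistency this commit removes.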
Aaron Fenyes
4d5ea062a3 Record gradient and last line search in history 2024-07-09 15:00:13 -07:00
Aaron Fenyes
5652719642 Require triangle sides to be planar 2024-07-09 14:10:23 -07:00
Aaron Fenyes
f84d475580 Visualize neighborhoods of global minima 2024-07-09 14:01:30 -07:00
Aaron Fenyes
77bc124170 Change loss function to match gradient 2024-07-09 14:00:24 -07:00
Aaron Fenyes
023759a267 Start "circles in triangle" from a very close guess 2024-07-08 14:21:10 -07:00
Aaron Fenyes
610fc451f0 Track slope in gradient descent history 2024-07-08 14:19:25 -07:00
Aaron Fenyes
93dd05c317 Add required package to "sphere in tetrahedron" example 2024-07-08 14:19:05 -07:00
Aaron Fenyes
9efa99e8be Test gradient descent for circles in triangle 2024-07-08 12:56:28 -07:00
Aaron Fenyes
828498b3de Add sphere and plane utilities to engine 2024-07-08 12:56:14 -07:00
Aaron Fenyes
736ac50b07 Test gradient descent for sphere in tetrahedron 2024-07-07 17:58:55 -07:00
Aaron Fenyes
ea354b6c2b Randomize guess in gradient descent test
Randomly perturb the pre-solved part of the guess, and randomly choose
the unsolved part.
2024-07-07 17:56:12 -07:00
Aaron Fenyes
d39244d308 Host Ganja.js locally 2024-07-06 21:35:09 -07:00
Aaron Fenyes
7e94fef19e Improve random vector generator 2024-07-06 21:32:43 -07:00
Aaron Fenyes
abc53b4705 Sketch random vector generator
This needs to be rewritten: it can fail at generating spacelike vectors.
2024-07-02 17:16:31 -07:00
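One standard way to guarantee spacelike output, which the sketch above lacked, is rejection sampling: redraw until the vector has positive Minkowski square. The following is an illustrative Python sketch assuming, for concreteness, a Minkowski form diag(-1, 1, 1, 1); the project's actual signature and generator may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def minkowski_sq(v):
    # illustrative Minkowski form diag(-1, 1, 1, 1);
    # the project's actual form may differ
    return -v[0] ** 2 + np.dot(v[1:], v[1:])

def random_spacelike(dim=4):
    """Rejection sampling: redraw until the vector is spacelike
    (positive Minkowski square), so generation cannot fail."""
    while True:
        v = rng.normal(size=dim)
        if minkowski_sq(v) > 0:
            return v
```

Since spacelike vectors fill a cone of nonzero measure, the loop terminates quickly in practice.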
Aaron Fenyes
17fefff61e Name gradient descent test more specifically 2024-07-02 17:16:19 -07:00
Aaron Fenyes
133519cacb Encapsulate gradient descent code
The completed gram matrix from this commit matches the one from commit
e7dde58 to six decimal places.
2024-07-02 15:02:59 -07:00
Aaron Fenyes
e7dde5800c Do gradient descent entirely in BigFloat
The previous version accidentally returned steps in Float64.
2024-07-02 12:35:12 -07:00
Aaron Fenyes
c933e07312 Switch to Ganja.js basis ordering 2024-06-26 11:39:34 -07:00
Aaron Fenyes
2b6c4f4720 Avoid naming conflict with identity transformation 2024-06-26 11:28:47 -07:00
Aaron Fenyes
4a28a47520 Update namespace of AbstractAlgebra.Rationals 2024-06-26 01:06:27 -07:00
Aaron Fenyes
58a5c38e62 Try numerical low-rank factorization
The best technique I've found so far is the homemade gradient descent
routine in `descent-test.jl`.
2024-05-30 00:36:03 -07:00
Aaron Fenyes
ef33b8ee10 Correct signature 2024-03-01 13:26:20 -05:00
Aaron Fenyes
717e5a6200 Extend Gram matrix automatically
The signature of the Minkowski form on the subspace spanned by the Gram
matrix should tell us what the big Gram matrix has to look like.
2024-02-21 03:00:06 -05:00
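The signature (inertia) of the form restricted to the spanned subspace can be read off from the eigenvalue signs of the given Gram matrix. A minimal Python sketch of such a helper, with hypothetical names and a tolerance chosen for illustration:

```python
import numpy as np

def signature(gram, tol=1e-9):
    """Return (positive, negative, zero) eigenvalue counts of a
    symmetric Gram matrix. This inertia constrains how the spanned
    subspace sits inside the ambient Minkowski space, and hence
    what the extended Gram matrix must look like (illustrative)."""
    w = np.linalg.eigvalsh(np.asarray(gram, dtype=float))
    return (int(np.sum(w > tol)),
            int(np.sum(w < -tol)),
            int(np.sum(np.abs(w) <= tol)))
```

For example, a diagonal Gram matrix with entries 1, -1, 0 has inertia (1, 1, 1): one spacelike direction, one timelike direction, and one degenerate direction to be filled in by the extension.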
Aaron Fenyes
16826cf07c Try out the Gram matrix approach 2024-02-20 22:35:24 -05:00