From c9167e0f6f51f5d0242c30884ad0a45e8f7d58f7 Mon Sep 17 00:00:00 2001
From: Vectornaut
Date: Fri, 17 Oct 2025 09:07:22 +0000
Subject: [PATCH] Use curly apostrophes

---
 Numerical-optimization.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/Numerical-optimization.md b/Numerical-optimization.md
index 04bf8c7..057e2da 100644
--- a/Numerical-optimization.md
+++ b/Numerical-optimization.md
@@ -22,11 +22,11 @@ We saw above that the vanishingly rare gradient descent paths that lead to saddl
 
 #### Affected optimization methods
 
-Uniform regularization can be seen as an interpolation between Newton's method and gradient descent, which kicks in when lowest eigenvalue of the Hessian drops below zero and brings the search direction closer to the gradient descent direction as the lowest eigenvalue gets more negative. Since the Hessian is indefinite near a saddle point, Newton's method with uniform regularization should act at least sort of like gradient descent near a saddle point. This suggests that it could get bogged down near saddle points in the same way.
+Uniform regularization can be seen as an interpolation between Newton’s method and gradient descent, which kicks in when lowest eigenvalue of the Hessian drops below zero and brings the search direction closer to the gradient descent direction as the lowest eigenvalue gets more negative. Since the Hessian is indefinite near a saddle point, Newton’s method with uniform regularization should act at least sort of like gradient descent near a saddle point. This suggests that it could get bogged down near saddle points in the same way.
 
 ## Methods
 
-### Newton's method
+### Newton’s method
 
 _To be added_
 
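For reference, here is a minimal sketch of the uniformly regularized Newton step that the patched paragraph describes, assuming NumPy; the function name `regularized_newton_step` and the example Hessian and gradient are illustrative assumptions, not taken from the wiki page.

```python
import numpy as np

def regularized_newton_step(grad, hess, damping):
    # Solve (H + damping * I) d = -g. With damping = 0 this is the pure
    # Newton step; as damping grows, d approaches -(1 / damping) * grad,
    # i.e. a scaled gradient descent direction.
    n = grad.shape[0]
    return np.linalg.solve(hess + damping * np.eye(n), -grad)

# An indefinite Hessian, as near a saddle point: eigenvalues 2 and -1.
hess = np.array([[2.0, 0.0],
                 [0.0, -1.0]])
grad = np.array([0.5, 0.01])

# The regularization only "kicks in" once the lowest eigenvalue drops
# below zero: the damping must exceed -lambda_min for H + damping * I
# to be positive definite.
lambda_min = np.linalg.eigvalsh(hess)[0]

for damping in [-lambda_min + 0.01, -lambda_min + 1.0, 100.0]:
    print(damping, regularized_newton_step(grad, hess, damping))
```

At the largest damping value the printed step is nearly parallel to `-grad` and very short, which is the sense in which a heavily regularized Newton step behaves like gradient descent and could get bogged down near a saddle point in the same way.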