diff --git a/Engine-prototype.md b/Engine-prototype.md
index 1466a4a..c8675c9 100644
--- a/Engine-prototype.md
+++ b/Engine-prototype.md
@@ -26,12 +26,13 @@ The size of the Gröbner basis depends a lot on the variable order. For the exam
 Some of these are obvious, but we may as well write them down. Note we may want to simultaneously run different solvers/heuristics and update the display as they each return new/useful information, killing ones that become redundant.
 
 * Keep the network of entities (or variables?) and constraints. Only ever work on one connected component of this at a time.
-* Conversely, track rigid subnetworks, and replace them with fewer working variables. [What do we do about/how do we display over-constrained subnetworks? Do we display the configuration "nearest" to satisfying all the constraints in some sense but show it some bright alarm state the violated constraints, perhaps graded by "how badly" the constraint fails?]
-* Track the current displayed witnesses of satisfying the constraints (after all, that's what we are showing). In lightly constrained networks, temporarily add constraints fixing all the entities except for the ones that are being moved, backing off on some temporary constraints by distance in the constraint network until we get a low-dimensional component that we are on, and move just along that component. We may be able to take advantage of the fact that fixing some of the entities will linearize the remaining constraints, so we can very rapidly check if there are any overall solutions with those points fixed, and even tell something about conditions on the values of those entities that do or don't allow the linearized system to be solved.
+* Conversely, track rigid subnetworks, and replace them with fewer working variables. [What do we do about/how do we display over-constrained subnetworks? Do we display the configuration "nearest" to satisfying all the constraints in some sense, but show the violated constraints in some bright alarm state, perhaps graded by "how badly" the constraint fails? If we can do this well -- and numeric near-solutions would suffice in over-determined cases; I don't think there's any point in trying to find exact algebraic near-solutions -- then dyna3 could actually be useful for working with over-determined systems; it would be doing some sort of optimization that could be telling us interesting things.]
+* Track the current displayed witnesses of satisfying the constraints (after all, we need those at least numerically to update the screen). Hopefully we can actually get exact algebraic current witnesses, at least when we're not in the midst of fast motion. If so, then in lightly constrained networks, we can temporarily add constraints fixing all the entities except for the ones that are being moved, backing off on some temporary constraints by distance in the constraint network until we get a low-dimensional component that we are on, and move just along that component (see the drag sketch after the diff). This will have consequences for the "feel" of the interface: it will make the system "conservative" in the sense of trying to move a small number of entities to keep constraints satisfied in a drag, leaving as much as possible of the construction fixed. One potential problem with that conservative approach is that in many cases it will break symmetry. For example, if M is defined as the midpoint of AB, and you drag M along line AB, you could fix at most one point, but it could be either A or B, so how do you choose? "Least-squares" motion will mean you simply drag the whole segment AB along its line -- is that the "intuitive" motion of the system when its midpoint is dragged? Or is some arbitrary symmetry-breaking OK to maximize the number of stationary entities?
+* We may be able to take good advantage of the fact that fixing some of the entities will linearize the remaining constraints, so we can very rapidly check whether there are any overall solutions with those points fixed (see the consistency-check sketch after the diff), and even tell something about conditions on the values of those entities that do or don't allow the linearized system to be solved. We might want to try to keep track of minimal sets of entities which, when fixed, linearize everything else (in their connected component, per the general simplification).
 
 ## Overall solver approaches
 
-* Find the Gröbner basis of the constraints. This definitely tells us a lot about the possible configurations. For example, the basis will be 1 if and only if there are no solutions (correct?). And we should be able to tell if the system is rigid, i.e., has only finitely many solutions (correct?). In this case, can we find all of the algebraic number solutions? And hopefully in the general case, we still stand to find some rational points. We're of course also fine with finding say quadratic or other low-degree algebraic numbers that satisfy the constraints. (I'm no algebraic geometer; can a 1-dimensional variety have no rational points? I assume for any fixed degree, it can have only finitely many algebraic points of that degree over Q?)
+* Find the Gröbner basis of the constraints (see the SymPy sketch after the diff). This definitely tells us a lot about the possible configurations. For example, the basis will be 1 if and only if there are no solutions (correct?). And we should be able to tell if the system is rigid, i.e., has only finitely many solutions (correct?). In this case, can we find all of the algebraic number solutions? (I am worried that this is in general a high-complexity problem that may become intractable in practice for realistic constructions.) And hopefully in the case of components of nonzero dimension, we will have ways to find some rational points if they exist; we could certainly hit a variety with no rational points, correct? We're of course also fine with finding, say, quadratic or other low-degree algebraic numbers that satisfy the constraints. I had assumed that for any fixed degree a variety could have only finitely many algebraic points of that degree over Q, but according to https://math.stackexchange.com/questions/21979/points-of-bounded-degree-on-varieties, if it has infinitely many points, then there is a least degree for which it has infinitely many points. Are we interested in that degree? Can we find it from the Gröbner basis? If we could generate points of a given degree well, then we would know what degree to go to in order to allow a "smooth drag" among exact solutions. E.g., for a circle/sphere of quadratic algebraic radius there are already infinitely many rational points, if I am not mistaken; see https://mathoverflow.net/questions/125224/rational-points-on-a-sphere-in-mathbbrd. Except for the occasional "holes" around particularly low-denominator solutions, this would allow for a smooth drag just among rational points.
 * Use Newton's method to minimize the sum of the squares of the constraints. If we do find an approximate zero this way, is there a way to find a nearby algebraic zero? Sadly, if we don't find a zero this way, we may have just gotten stuck at a local minimum, so that's no proof the constraints are inconsistent.
 * Use homotopy continuation (see https://www.juliahomotopycontinuation.org/) to get numerical information about the solution variety (and generate numerical approximations to points on it, correct?) If I understand correctly, an advantage of this method is that we get an almost-sure proof that the system is inconsistent (if it is), correct? And become almost sure whether it is rigid (has only isolated solutions), correct? As with Newton's method, we still have the question of whether having approximate numerical solutions helps us find exact algebraic ones. Also, if I understand correctly, this approach comes with a natural way to update solutions as one element is dragged, is that right?
 * Since generally speaking each constraint is quadratic, there _might_ be some useful information here: https://math.stackexchange.com/questions/2119007/how-do-you-solve-a-system-of-quadratic-equations
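
Below are a few rough code sketches (all Python) of the techniques named above. The constraint systems in them are toy stand-ins chosen for brevity, not dyna3's actual constraint encoding, and the function names are made up for illustration.

First, the "move just along the component we are on" idea from the witness-tracking bullet. One standard numerical way to realize it is to project the requested drag displacement onto the null space of the constraint Jacobian at the current witness and then pull the result back onto the solution set with a few Gauss-Newton steps. This is only that general idea, not the temporary-pinning scheme the bullet proposes, and it is purely numerical (it says nothing about exact algebraic witnesses).

```python
# Sketch: drag along the solution set by projecting the requested displacement
# onto the null space of the constraint Jacobian, then correcting back onto
# F = 0 with a few Gauss-Newton steps. Toy stand-in constraints:
# M = (mx, my) is the midpoint of A = (ax, ay), B = (bx, by), and |AB| = 2.
# Variable order: ax, ay, bx, by, mx, my.
import numpy as np

def F(v):
    ax, ay, bx, by, mx, my = v
    return np.array([
        2*mx - ax - bx,                      # midpoint, x
        2*my - ay - by,                      # midpoint, y
        (bx - ax)**2 + (by - ay)**2 - 4.0,   # |AB|^2 = 4
    ])

def J(v):
    ax, ay, bx, by, mx, my = v
    return np.array([
        [-1, 0, -1, 0, 2, 0],
        [0, -1, 0, -1, 0, 2],
        [-2*(bx - ax), -2*(by - ay), 2*(bx - ax), 2*(by - ay), 0, 0],
    ], dtype=float)

def drag_step(x, delta, corrector_iters=5):
    """x: current witness with F(x) ~ 0; delta: raw displacement from the drag."""
    Jx = J(x)
    # Component of delta lying in the row space of Jx (first-order violation).
    row_part, *_ = np.linalg.lstsq(Jx, Jx @ delta, rcond=None)
    # Keep only the tangent (null-space) part of the requested motion.
    x_new = x + (delta - row_part)
    # Pull the predicted point back onto F = 0 (minimum-norm Gauss-Newton steps).
    for _ in range(corrector_iters):
        corr, *_ = np.linalg.lstsq(J(x_new), -F(x_new), rcond=None)
        x_new = x_new + corr
    return x_new

# Drag the midpoint of the segment from (-1, 0) to (1, 0) upward by 0.3.
witness = np.array([-1.0, 0.0, 1.0, 0.0, 0.0, 0.0])
moved = drag_step(witness, np.array([0, 0, 0, 0, 0, 0.3]))
print(moved, "residual:", np.linalg.norm(F(moved)))
```

Note how the projection spreads the requested motion of M over A, B, and M together rather than pinning either endpoint; that is exactly the "least-squares" flavor of drag discussed in the bullet.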
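
For the linearization bullet: assuming, as the bullet asserts, that fixing the chosen entities really does leave a plain linear system A z = b in the remaining coordinates, existence of a solution with those entities frozen is just a rank test, which is very cheap compared with any of the nonlinear machinery below. The matrices here are hypothetical placeholders standing for "the linearized constraints with the frozen values substituted in".

```python
# Sketch: Rouche-Capelli consistency test for the linear system left over
# after freezing some entities. A and b are made-up placeholders.
import numpy as np

def linearized_system_is_consistent(A, b, tol=1e-9):
    """A z = b has at least one solution iff rank(A) == rank([A | b])."""
    return (np.linalg.matrix_rank(A, tol=tol)
            == np.linalg.matrix_rank(np.column_stack([A, b]), tol=tol))

A = np.array([[1.0, 2.0], [2.0, 4.0]])                        # rank-deficient on purpose
print(linearized_system_is_consistent(A, np.array([1.0, 2.0])))   # consistent
print(linearized_system_is_consistent(A, np.array([1.0, 3.0])))   # inconsistent
```

The same decomposition also exposes which right-hand sides (i.e., which frozen values) fall outside the column space, which is one way to get at the "conditions on the values of those entities" question in the bullet.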
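
For the Gröbner-basis bullet, a minimal SymPy sketch of the two checks it asks about: inconsistency (reduced basis equal to [1]) and rigidity (zero-dimensionality of the ideal, i.e. only finitely many complex solutions). The toy constraints are the same midpoint example; whether any of this stays tractable for realistic constructions is exactly the worry raised in the bullet.

```python
# Sketch: Groebner-basis checks on a toy constraint system with SymPy.
from sympy import symbols, groebner

ax, ay, bx, by, mx, my = symbols('ax ay bx by mx my')

constraints = [
    2*mx - ax - bx,                       # M is the midpoint of AB (x)
    2*my - ay - by,                       # M is the midpoint of AB (y)
    (bx - ax)**2 + (by - ay)**2 - 4,      # |AB|^2 = 4
]

G = groebner(constraints, ax, ay, bx, by, mx, my, order='grevlex')

# The reduced basis is [1] exactly when the constraints have no solutions.
print("inconsistent:", G.exprs == [1])

# Zero-dimensional ideal <=> finitely many complex solutions, the algebraic
# counterpart of "the construction is rigid".
print("rigid:", G.is_zero_dimensional)

# An obviously over-determined variant: also demand |AB|^2 = 9.
bad = groebner(constraints + [(bx - ax)**2 + (by - ay)**2 - 9],
               ax, ay, bx, by, mx, my, order='grevlex')
print("over-determined variant inconsistent:", bad.exprs == [1])
```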
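
For the Newton's-method bullet, a sketch that minimizes the sum of squared constraint residuals with SciPy's `least_squares` (a Gauss-Newton/Levenberg-Marquardt-style routine standing in for a hand-rolled Newton iteration on the sum of squares). The last three residuals temporarily pin some coordinates, in the spirit of "fix everything except what's being moved"; and, as the bullet says, a nonzero residual at convergence is not a proof of inconsistency.

```python
# Sketch: numerical witness search by least-squares minimization of the
# constraint residuals. The toy system reuses the midpoint example plus
# three temporary "pin" residuals so the system is square.
import numpy as np
from scipy.optimize import least_squares

def residuals(v):
    ax, ay, bx, by, mx, my = v
    return np.array([
        2*mx - ax - bx,                      # M is the midpoint of AB (x)
        2*my - ay - by,                      # M is the midpoint of AB (y)
        (bx - ax)**2 + (by - ay)**2 - 4.0,   # |AB|^2 = 4
        ax + 1.0,                            # temporary pins: A = (-1, 0) ...
        ay,
        by - 0.5,                            # ... and B's y-coordinate = 0.5
    ])

guess = np.array([-0.8, 0.1, 1.0, 0.4, 0.0, 0.2])
result = least_squares(residuals, guess)

if np.linalg.norm(result.fun) < 1e-9:
    print("approximate witness:", result.x)
else:
    # A nonzero residual only means we got stuck at a local minimum of the
    # squared error; it is NOT a proof that the constraints are inconsistent.
    print("stuck; residual norm:", np.linalg.norm(result.fun))
```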
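
For the homotopy-continuation bullet: in practice this would mean calling HomotopyContinuation.jl, but to make the mechanics concrete here is a toy straight-line tracker (Euler predictor plus Newton corrector) on a small square system unrelated to dyna3. Real trackers add adaptive step sizes, endgames, and certification; a parameter homotopy in the dragged coordinates is presumably the "natural way to update solutions during a drag" that the bullet asks about.

```python
# Sketch: toy straight-line homotopy tracking, H(z, t) = (1 - t)*gamma*G(z) + t*F(z),
# from the known roots of a total-degree start system G to the roots of F.
import numpy as np

def F(z):                    # target system: x^2 + y^2 - 4 = 0, x*y - 1 = 0
    x, y = z
    return np.array([x**2 + y**2 - 4, x*y - 1])

def JF(z):
    x, y = z
    return np.array([[2*x, 2*y], [y, x]])

def G(z):                    # start system with known roots (+-1, +-1)
    x, y = z
    return np.array([x**2 - 1, y**2 - 1])

def JG(z):
    x, y = z
    return np.array([[2*x, 0], [0, 2*y]])

def track(z, steps=200, newton_iters=3):
    gamma = np.exp(1j * 0.739)           # crude stand-in for the "gamma trick"
    H  = lambda z, t: (1 - t) * gamma * G(z) + t * F(z)
    JH = lambda z, t: (1 - t) * gamma * JG(z) + t * JF(z)
    dt = 1.0 / steps
    for k in range(steps):
        t = k * dt
        # Euler predictor: JH * z' = -dH/dt, with dH/dt = F(z) - gamma*G(z).
        zdot = np.linalg.solve(JH(z, t), -(F(z) - gamma * G(z)))
        z = z + zdot * dt
        # Newton corrector at t + dt.
        for _ in range(newton_iters):
            z = z - np.linalg.solve(JH(z, t + dt), H(z, t + dt))
    return z

starts = [np.array([sx, sy], dtype=complex) for sx in (1, -1) for sy in (1, -1)]
for s in starts:
    root = track(s)
    print(np.round(root, 6), "residual:", np.linalg.norm(F(root)))
```

With a generic gamma the four start paths typically end at the four solutions of the target system; the number of paths is the Bézout bound, which is also why the method can give an "almost sure" verdict on inconsistency and rigidity.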