https://github.com/dune3d/dune3d
>1. I ended up directly using solvespace's solver instead of the suggested wrapper code since it didn't expose all of the features I needed. I also had to patch the solver to make it sufficiently fast for the kinds of equations I was generating by symbolically solving equations where applicable.
(anyone looking for an easy-to-use, open-source 3D CAD program should consider it)
For folks not familiar with Solvespace, it's a lightweight 3D CAD program: https://solvespace.com/index.pl
Probably a bit more approachable is CAD Sketcher (https://www.cadsketcher.com/), which adds the Solvespace constraint solver to Blender.
Constraint solving like that has been a feature in CAD programs for years. Here's Autodesk Inventor's 2D sketch mode.[1] You get an error if you try to overconstrain something. There's a symbolic solver checking for over-constraint.
Under-constraint is harder to deal with. There's a count of the number of additional constraints needed to specify the drawing fully. You don't have to get that count down to 0, but if you aren't just sketching for illustration, it's expected that you reach the fully constrained and dimensioned state before cutting metal.
Scaling problems dominate as drawings get more complex. It's easy to do this for high school geometry problems. It's hard to do it for jet engines. User interface design for dense drawings is really hard. What do you do when stuff is on top of other stuff? The big, expensive CAD systems (Autodesk, Dassault, SolidWorks) have struggled through the scaling problem.
[1] https://www.youtube.com/watch?v=r3LB0f-keL8
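To make that "constraints still needed" count concrete: one common way to compute it is (number of unknowns) minus (rank of the constraint Jacobian). A toy sketch of the idea in Python, not taken from any real CAD package:

```python
import numpy as np

def remaining_dof(residuals, x, eps=1e-6):
    """Remaining degrees of freedom = unknowns minus the rank of the
    Jacobian of the constraint residuals (finite-difference estimate)."""
    r0 = residuals(x)
    J = np.empty((len(r0), len(x)))
    for j in range(len(x)):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (residuals(xp) - r0) / eps
    return len(x) - np.linalg.matrix_rank(J, tol=1e-4)

# A 2D line segment (x0, y0, x1, y1) has 4 degrees of freedom.
# Pin one endpoint at the origin and fix the length to 5: three
# constraints, so one DOF remains (rotation about the pinned end).
seg = np.array([0.0, 0.0, 3.0, 4.0])
constraints = lambda p: np.array([
    p[0],                                      # x0 = 0
    p[1],                                      # y0 = 0
    np.hypot(p[2] - p[0], p[3] - p[1]) - 5.0,  # length = 5
])
print(remaining_dof(constraints, seg))  # -> 1 more constraint needed
```

Real solvers are far more careful than this (a rank test only sees the local linearization), but it shows why "constraints still needed" is a computable number rather than a guess.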
One of the cited references is Guillermo Webster, and while the article lists uncmin as the desired artifact, the project that uncmin was developed for takes the concept beyond layout to back-solving for desired program outputs.
Carbide [0] is the project that encapsulated some of this in a workable demo. It’s really cool!
[0] https://alpha.trycarbide.com/
There is an important age-old piece of advice though: "never ask a man his salary, a woman her age, or Ivan when he's going to publicly release hest."
Do they release any software? It’s frustrating when Bret Victor tells us how modeling our protagonist’s path during development is a boon for the developer … and refuses to release even a demo. I can read (and hear) about things Ink & Switch have explored, but can we experience that, too?
I love the ideas these folks explore. I also want to explore, but I don’t have the time budget to learn and implement their concepts. I can’t speak for other software engineers, but I am much more likely to improve on such work if I can experience it first.
Our GitHub has a boatload of experiments: https://github.com/orgs/inkandswitch/repositories
This is a fun afternoon just waiting for you: http://feelingisreality.com
Muse started as a research project and turned into an app: https://museapp.com/
Some of our essays have embedded versions of our prototypes you can play with:
* Crosscut: https://www.inkandswitch.com/crosscut/
* Potluck: https://www.inkandswitch.com/potluck/
Alex Warth is currently working on a remake of Ivan Sutherland's Sketchpad: https://github.com/alexwarth/sutherland
There's probably a bunch more I'm forgetting.
And Ivan's is here: https://github.com/ivanreese
But "public release of hest" was a joking reference to the fact that he's stated quite clearly that the project is just a testbed for him and will never be released. (No matter that every once in a while it makes the rounds of sites like this and everyone goes a-twitter...)
Madly in love with how forward a frontier Ink & Switch cuts towards. Trying to make software less of a handcrafted artifact that each individual corporation builds on their own, as they see fit, and making computing a broader, more universal project, with more regular hooks for humans to get themselves in the loop.
A lot of "why are we still writing dead programs" vibes, in great ways. https://jackrusher.com/strange-loop-2022/ https://news.ycombinator.com/item?id=33270235 (181 points, Oct 2022, 61 comments)
A reminder for those who find TAOCP useful: Knuth released "Volume 4, Fascicle 7: Constraint Satisfaction" last month. It connected a lot of dots for me, but like most of TAOCP it may or may not be useful standalone, depending on the individual.
It helped unify what can be a complex intersection of fields with conflicting terms, in a way that let me guess the direction this post was going.
> In addition to Alex’s relaxation-based solver, we tried several gradient descent-based solvers, which resulted in better convergence. The first was the uncmin from numericjs. Later, we found a version of uncmin that was modified by Guillermo Webster for his g9 project, which gave us even better results.
I was curious about what optimisation algorithm uncmin actually was. The current code & docs do not cite any academic literature apart from Moré et al., 1981 ("Testing unconstrained optimization software").
But the commit messages and comments in older versions of uncmin [1] suggest that it is an implementation of BFGS.
In the scientific Python ecosystem, BFGS is used as the default nonlinear optimisation method by scipy.optimize.minimize [2] if your problem doesn't have any constraints or bounds. SciPy's optimize module cites Nocedal & Wright, Numerical Optimization (2nd ed., 2006) as a reference; BFGS is Algorithm 6.1 on page 140.
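As a minimal check of that default (scipy.optimize.rosen is SciPy's built-in Rosenbrock test function, so this is an unconstrained problem):

```python
from scipy.optimize import minimize, rosen, rosen_der

# No bounds or constraints are passed, so minimize() resolves its
# default method to BFGS, per the scipy.optimize.minimize docs.
res = minimize(rosen, x0=[1.3, 0.7, 0.8, 1.9, 1.2], jac=rosen_der)
print(res.success, res.x)  # converges to the minimum at [1, 1, 1, 1, 1]
```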
Nocedal & Wright say "the most popular quasi-Newton algorithm is the BFGS method, named for its discoverers Broyden, Fletcher, Goldfarb, and Shanno" after introducing quasi-Newton methods:
> Quasi-Newton methods, like steepest descent, require only the gradient of the objective function to be supplied at each iterate. By measuring the changes in gradients, they construct a model of the objective function that is good enough to produce superlinear convergence. The improvement over steepest descent is dramatic, especially on difficult problems. Moreover, since second derivatives are not required, quasi-Newton methods are sometimes more efficient than Newton’s method. [...]
> The development of automatic differentiation techniques has made it possible to use Newton’s method without requiring users to supply second derivatives; see Chapter 8. Still, automatic differentiation tools may not be applicable in many situations, and it may be much more costly to work with second derivatives in automatic differentiation software than with the gradient. For these reasons, quasi-Newton methods remain appealing.
Interestingly, uncmin originally had a much more complex line-search implementation that was removed and replaced with a simple one. uncmin's current line-search code agrees with "Algorithm 3.1 (Backtracking Line Search)" from Nocedal & Wright, with some implementation-defined choices: an initial step size of 1, the first Wolfe condition checked with a constant of c_1 = 0.1, and the step size halved until the first Wolfe condition is satisfied.
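Here is my reading of that scheme as runnable Python; a sketch of Algorithm 3.1 with those implementation choices, not uncmin's actual code:

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha=1.0, c1=0.1):
    """Backtrack from a unit step, halving alpha until the sufficient-
    decrease (Armijo / first Wolfe) condition holds:
        f(x + alpha * p) <= f(x) + c1 * alpha * grad(x) . p
    """
    fx = f(x)
    slope = c1 * np.dot(grad(x), p)  # negative for a descent direction p
    while f(x + alpha * p) > fx + alpha * slope:
        alpha *= 0.5
        if alpha < 1e-12:            # bail out on pathological directions
            break
    return alpha
```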
One part of the uncmin code that I can't immediately reconcile with equations from Nocedal & Wright is the code for updating "H1". The commit message that introduced it says "Sherman-Morrison for the BFGS update". The math seems to agree with the update for H_k, the approximate inverse Hessian, on the Wikipedia page for BFGS [3] (with a quirk where one H_k should arguably be H_k^T, but the approximation H_k is meant to be symmetric, provided BFGS doesn't break down, so that's probably fine).
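For comparison, the textbook form of that inverse-Hessian update (equation 6.17 in Nocedal & Wright, and the H_{k+1} formula on the Wikipedia page) looks like this in numpy; presumably the "H1" code is an algebraically expanded version of the same thing, though I haven't verified that line by line:

```python
import numpy as np

def bfgs_inverse_hessian_update(H, s, y):
    """BFGS update of the inverse-Hessian approximation:
        H+ = (I - rho * s y^T) H (I - rho * y s^T) + rho * s s^T
    with s = x_{k+1} - x_k, y = grad_{k+1} - grad_k, rho = 1 / (y^T s).
    """
    rho = 1.0 / np.dot(y, s)
    I = np.eye(len(s))
    V = I - rho * np.outer(y, s)   # note: V.T == I - rho * outer(s, y)
    return V.T @ H @ V + rho * np.outer(s, s)
```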
edit: Nocedal & Wright comment on some pitfalls with simple line-search procedures when implementing BFGS:
> The performance of the BFGS method can degrade if the line search is not based on the Wolfe conditions. For example, some software implements an Armijo backtracking line search (see Section 3.1): The unit step length alpha_k = 1 is tried first and is successively decreased until the sufficient decrease condition (3.6a) is satisfied. For this strategy, there is no guarantee that the curvature condition y_k^T s_k > 0 (6.7) will be satisfied by the chosen step, since a step length greater than 1 may be required to satisfy this condition. To cope with this shortcoming, some implementations simply skip the BFGS update by setting H_{k+1} = H_k when y_k^T s_k is negative or too close to zero. This approach is not recommended, because the updates may be skipped much too often to allow H_k to capture important curvature information for the objective function f . In Chapter 18 we discuss a damped BFGS update that is a more effective strategy for coping with the case where the curvature condition (6.7) is not satisfied.
"some software implements an Armijo backtracking line search (see Section 3.1)" -- this appears to match uncmin's simple line search code. I do not see any curvature condition check in the uncmin code, so this could lead to slow convergence or failure for some problems if the line search chooses a step size where y_k^T s_k <= 0 .
[1] https://github.com/sloisel/numeric/blame/656fa1254be540f4287...
[2] https://docs.scipy.org/doc/scipy/reference/generated/scipy.o...
[3] https://en.wikipedia.org/wiki/Broyden%E2%80%93Fletcher%E2%80...