It sounded of interest to me, but I read it and closed the tab within a page or so as it wandered off into tech arcana. Shame. There may be an interesting idea in here but it's phrased in terms I think few will be able to follow and understand.
I did not finish it but I saw no mention of the lambda calculus or of currying, both of which -- from my very meagre understanding -- seem directly relevant to what I understood to be the core point, which seems to be about anonymous functions.
I don't think you're missing much. Yeah, the main point seems to be that if your language has closures, you can suddenly express a lot of things that were out of reach before. Not a new insight. But there's another point, I think, hinted at on the topic of control abstractions. Or at least I'm reminded of the topic. It's discussed better, more succinctly, and more explicitly in an early chapter of the free book Patterns of Software: https://dreamsongs.com/Files/PatternsOfSoftware.pdf
The extra point might be that more languages should facilitate defining your own control abstractions, just as they support defining your own data abstractions. Functions are one way of making data abstractions, but languages often provide multiple ways. Closures are one way of doing a type of control abstraction (involving such things as delayed or multiple evaluation), but there are other ways too. For some reason we see value in, and a need for, defining our own data abstractions, but not so much for control abstractions, even though (according to the book) the two were once often co-designed, like Fortran's arrays and DO loop. And for some reason, even in the few languages that do support making your own control abstractions, like Lisp, you'll still find users who disapprove of doing so, claiming all you need are the standard existing methods like looping, map/reduce style functions, and some non-local exits.
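For instance (a sketch in Python, not an example from the book): delayed and multiple evaluation via closures are enough to define new control constructs as ordinary functions.

```python
# User-defined control abstractions built from closures.
# `unless` delays its body (single, conditional evaluation);
# `repeat_until` evaluates its body multiple times.

def unless(condition, body):
    """Run `body` only when `condition` is false; the body is
    delayed by wrapping it in a closure."""
    if not condition:
        return body()

def repeat_until(body, condition):
    """Multiple evaluation: run `body` until `condition()` holds."""
    while True:
        body()
        if condition():
            break

print(unless(2 > 3, lambda: "ran"))  # prints "ran"

counter = {"n": 0}
repeat_until(lambda: counter.__setitem__("n", counter["n"] + 1),
             lambda: counter["n"] >= 3)
print(counter["n"])  # prints 3
```

The limitation is visible, too: the bodies must be wrapped in lambdas by the caller, which is exactly the syntactic overhead that macros (or laziness) remove.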
Graham's On Lisp is a really interesting book

https://paulgraham.com/onlisptext.html

which is allegedly about programming with macros, but I'd say 80% of the time he implements something with closures and then makes a macro-based implementation that performs better. That 80% can be done in Python, and the other 20% you wouldn't want to do in Python because Python already has those features... And if you wanted to implement meta-objects in Python you would do it Pythonically.
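As a taste of that closure-based 80% (the generic pattern, not code from the book): a memoizing utility that returns a closure over its cache translates directly into Python.

```python
# Memoization as a closure-returning utility, the kind of thing
# On Lisp builds first with closures before reaching for macros.
def memoize(fn):
    cache = {}              # captured by the closure below
    def memoized(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return memoized

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # prints 832040, instantly thanks to the cache
```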
Graham unfortunately doesn't work through any examples that involve complex transformations on the expression trees, because those are hard, and if you want to work that hard you're better off looking at the Dragon book.
You can work almost all the examples in Norvig's Common Lisp book

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

in Python, and today Norvig would advocate that you do.

Allowing those in C would require a modicum of effort, while other languages make these kinds of syntax extension fairly easy.
Currying is done automatically in Haskell but not in Lisp. If you wanted currying in Lisp you could write it, but Lisp programmers don't depend on or talk about currying as much as Haskell programmers do.
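To make the contrast concrete, here is explicit currying written by hand in Python (a hypothetical `curry2` helper, not a stdlib function) -- this is roughly what a Lisp programmer would write if they wanted it, and what Haskell does for free:

```python
# Manual currying with nested closures: turn a two-argument function
# into a chain of one-argument functions.
def curry2(f):
    def outer(a):
        def inner(b):
            return f(a, b)
        return inner
    return outer

def add(a, b):
    return a + b

add5 = curry2(add)(5)   # partial application
print(add5(10))         # prints 15
```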
In languages like Java (or C) you can build S-expression like structures like so
Variable<Integer> x = newVariable();
Expression<Integer> e = add(x, literal(5));
x.set(15);
System.out.println(eval(e)); // prints "20"
and it is not that hard either to serialize these to code or to run them in a tree-walking interpreter, where quote() and eval() imply an extended language in which you can write functions that work on Expression<Expression<X>>. Type erasure causes some problems in Java: it sometimes makes you write a type you shouldn't have to, and you do have to unerase types in method names, which is a little ugly, but it works.
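A minimal sketch of the same idea in Python (all names here are illustrative, not the actual ferocity API): expressions are plain tree nodes, and evaluation is a tree walk.

```python
# Expression trees with variables, literals, addition, and a
# tree-walking evaluator. Illustrative names only.
class Var:
    def __init__(self):
        self.value = None
    def set(self, v):
        self.value = v

class Lit:
    def __init__(self, v):
        self.value = v

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right

def ev(expr):
    """Tree-walking interpreter over the nodes above."""
    if isinstance(expr, (Var, Lit)):
        return expr.value
    if isinstance(expr, Add):
        return ev(expr.left) + ev(expr.right)
    raise TypeError(expr)

x = Var()
e = Add(x, Lit(5))
x.set(15)
print(ev(e))  # prints 20
```

Because `e` is a data structure rather than compiled code, you can just as easily walk it to emit source text as to evaluate it.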
I did some experiments towards this to convince myself it would work

https://github.com/paulhoule/ferocity/blob/main/ferocity0/sr...
Had I really kept at it, I would have bootstrapped by developing a ferocity0 sufficient to write a code generator that could generate stubs for the Java stdlib plus a persistent collections library, then writing a ferocity1 in ferocity0, and if necessary a ferocity(N+1) in ferocityN, until it supported "all" of Java. "All" might have omitted some features like "var" that are both sugar and use type inference that ferocity would struggle with -- if you need sugar in this system, you implement it with metaprogramming.
The idea is that certain projects would benefit from balls-to-the-walls metaprogramming and the code compression you get would compensate for the code getting puffed up. My guess is a lot of people would see it as an unholy mating of the worst of Java and Common Lisp. However, I'm certain it would be good for writing code generators.
The solution, which I have often seen in practice, is to eventually write code generators, which is what Lisp macros are, after all. I've seen it in C and wrote a big piece about it that was posted here some time ago[1], about the extra tools, code generators, special formats and standards employed and needed to make up for C's deficiencies (with respect to meta-programming, at least).
Everywhere I see code generators, there's a feature lacking in the main language used for the project. Then you bring in other tools to make up for that deficiency. Only, usually, we don't call it a deficiency, since we are used to things being that way; it's called day-to-day business. I think that's what I've tried to convey in the article.

[1] https://news.ycombinator.com/item?id=41066544
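A toy illustration of the pattern (not taken from the article): a few lines of Python generating the repetitive C accessor declarations that C itself has no way to abstract over.

```python
# Toy code generator: emit repetitive C getter/setter declarations.
# The field list and struct name are made up for the example.
FIELDS = [("width", "int"), ("height", "int"), ("label", "const char *")]

def gen_accessors(struct, fields):
    lines = []
    for name, ctype in fields:
        lines.append(f"{ctype} {struct}_get_{name}(struct {struct} *s);")
        lines.append(f"void {struct}_set_{name}(struct {struct} *s, {ctype} v);")
    return "\n".join(lines)

print(gen_accessors("box", FIELDS))
```

In a language with macros, this generator would just be part of the program instead of a separate build step.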
I have done something exactly like this in production for a system that turned natural language into SQL. This was pre-LLM, so we had models that produced intent and keywords as structured output and we had to turn it into queries for several backends.
The project didn't work out for a variety of reasons, but technically it was beautiful: it produced query plans that in many cases were identical to those from the queries analysts wrote by hand.
So yeah, I accidentally wrote a compiler. Does it still count?
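The shape of that pipeline (every name here is hypothetical; the real system isn't public) is roughly: structured intent in, SQL text out.

```python
# Hypothetical sketch of intent -> SQL generation. Naive quoting, no
# injection safety -- a sketch of the data flow, not production code.
def intent_to_sql(intent):
    cols = ", ".join(intent.get("columns", ["*"]))
    sql = f"SELECT {cols} FROM {intent['table']}"
    clauses = [f"{k} = '{v}'" for k, v in intent.get("filters", {}).items()]
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql

q = intent_to_sql({"table": "sales",
                   "columns": ["region", "total"],
                   "filters": {"year": "2019"}})
print(q)  # SELECT region, total FROM sales WHERE year = '2019'
```

A real version targets each backend's dialect and builds the query as a tree rather than a string, which is where the "accidental compiler" comes in.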
https://www.jooq.org/

and

https://www.sqlalchemy.org/

JooQ isn't to everybody's taste, but I use it for my job and I think it's great, particularly in that you can reuse expressions and write generators for complex queries. We have a powerful search interface that combines full-text search with other kinds of queries ("is about topic T", "data was collected between S and E") that is beautiful. I think it's funny how JooQ has that lispy f(a,b) style (no accident it is like ferocity) while SQLAlchemy is really fluent and takes advantage of operator overloading.
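The f(a,b) style looks something like this toy set of combinators (nothing here is jOOQ's or SQLAlchemy's actual API, just the flavor of composing reusable expressions):

```python
# Toy f(a,b)-style query combinators. Expression fragments are values,
# so they can be reused and composed into bigger queries.
def eq(col, val):
    return f"{col} = '{val}'"

def between(col, lo, hi):
    return f"{col} BETWEEN '{lo}' AND '{hi}'"

def and_(*preds):
    return "(" + " AND ".join(preds) + ")"

def select(table, where):
    return f"SELECT * FROM {table} WHERE {where}"

# Reuse the same fragment across queries:
topic = eq("topic", "T")
print(select("docs", and_(topic, between("collected", "S", "E"))))
```

The real libraries build expression trees instead of strings, which is what makes reuse and generation safe; the compositional shape is the same.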
I find it is a real challenge to come up with a good title. On the one hand it should probably convey to the potential reader something about the contents of the article; on the other hand it should differentiate the article from the rest of those published on the same subject.
I like titles that read like a punch-line, that come across as something other than just a summary of the article. But those maybe work best for literature, prose, movies, etc.
https://news.ycombinator.com/item?id=28851992
https://news.ycombinator.com/item?id=44359454
No comments on any of them.