I’d just like to thank the author for giving the correct reason for the Winchester Mystery House instead of just blindly repeating the “she went crazy” story as truth.
Does anyone have numbers for churn vs. cumulative code?
Most of my commits (hand-written and AI) have delete counts that are 75–110% of the added line count.
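If anyone wants those numbers for their own repo, here's a minimal sketch that parses `git log --format=COMMIT --numstat` output and computes the per-commit deleted/added ratio (the `COMMIT` sentinel is just an arbitrary marker I picked; binary-file rows like `-	-	img.png` are skipped):

```python
import re

def churn_ratios(numstat_log: str) -> list[float]:
    """Parse `git log --format=COMMIT --numstat` output and return
    the deleted/added line ratio for each commit (skipping binary
    files and commits that add nothing)."""
    ratios: list[float] = []
    added = deleted = 0

    def flush():
        nonlocal added, deleted
        if added:
            ratios.append(deleted / added)
        added = deleted = 0

    for line in numstat_log.splitlines():
        if line == "COMMIT":
            flush()  # close out the previous commit
        else:
            # numstat rows look like "12\t8\tpath/to/file"
            m = re.match(r"(\d+)\t(\d+)\t", line)
            if m:
                added += int(m.group(1))
                deleted += int(m.group(2))
    flush()
    return ratios
```

Pipe `git log --format=COMMIT --numstat` into it and look at the distribution; a median near 1.0 would mean churn dominates net growth.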
The point that many developers will forget to tell the LLM to run cleanup/refactoring passes is probably true, though. (I’ve definitely found ghost-chasing bugfixes in all sorts of corners of LLM-generated code.)
Yeah, /simplify is your friend. That and constrained prompts: “refactor x for simplicity - resulting diff must remove n lines of code. Don't change tests.”
GNU didn't kick anything off. It was an attempt to document something that was already in full swing.
What was in full swing was Open Source, powered by scratch-your-own-itch. What was taking time was for the business world to learn the lessons by both carrot (Linux) and stick (Unix Wars, vendor lock-in, dozens of crappy competing standards). When Steve Ballmer winds up using your language, you moved the ball.
Many ideas from The Cathedral & The Bazaar made it into The Lean Startup. The Cathedral development model was more related to waterfall. YC was already chugging along, but you can bet your ass PG was already steeped in the tea.
Arguably Linux wouldn’t have happened absent GNU although a lot of people I know argue that BSD would have eventually evolved to someplace like where Linux is today in spite of various legal and community factors holding it back.
I used a similarly shaped argument with different nouns to highlight the ambiguity, and now you see why that's problematic. Don't just make blind assertions without linking them back to something concrete, or at least arguing that some mechanism was *dominant*.
Not really. From the essay: “I had been preaching the Unix gospel of small tools, rapid prototyping and evolutionary programming for years. But I also believed there was a certain critical complexity above which a more centralized, a priori approach was required. I believed that the most important software (operating systems and really large tools like the Emacs programming editor) needed to be built like cathedrals, carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta to be released before its time.”
So the Unix-philosophy small tools that constitute an important part of the GNU project are excluded. Rather, it’s about any programs of significant complexity, like Emacs (and likely GCC) and many commercial products. While the cathedral model doesn’t imply closed source, it implies building “in […] isolation”, rather than in the open. It may or may not remain proprietary and/or closed source.
Linux demonstrated to ESR that complex projects can also be built in the open with many collaborators, and don’t necessarily require the cathedral; which inspired the essay.
What I'm saying "not really" to is the claim that the "cathedral" does only refer to the GNU project and not to proprietary closed source. This is not the case. It refers to certain portions of GNU, as well as to certain segments of proprietary closed source. Neither GNU nor proprietary closed source is a criterion for the "cathedral". The criterion is the size and complexity of the software, independent of whether it is proprietary or not, or closed source or not.
GNU follows the Unix philosophy. ESR wrote The Art of Unix Programming [0] in which he writes extensively about it. GNU was envisioned to be a clone of Unix [1].
It wasn't one thing; GNU is a case of cathedrals. Corps are usually more cathedral-y than bazaar-y because of their hierarchical top-down structure, but YMMV: an Elon Musk or Steve Jobs company will be more cathedral than a conglomerate like Unilever, or a Google or Microsoft.
I will not sit here idly as you disparage an entire kingdom of diverse, beautiful, highly efficient, decentralized problem-solvers. Some of my best friends are slime molds.
So, I’ve explored AI coding, but my conclusion up to this point has been that it’s interesting, but the code is sometimes a mess, and sometimes it will completely crater the project to the point where you just have to throw it all away and start over. After reading this article, I keep wondering if we’re really being productive or just creating lots of crappy code at machine speeds now. It’s one thing to say that we are using a “security agent,” for example, to ensure the security of the code, but quite another to actually know (or at least strongly believe) that our code is really secure. With all the froth of generating thousands of lines of code, how are we sure? In some sense, my question is whether we’re building a Winchester Mystery House or a house of cards.
Software developers working on their own have built monstrosities before (not as quickly) but it seems likely that this is a skill issue and we will learn how to use these tools better. You can tell coding agents to work on cleaning up code, improving the architecture, and so on.
Maybe adopting some hard constraints on code complexity that agents have to work within would help?
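Even something as crude as a function-length ceiling enforced in CI would be one such constraint. A minimal sketch using only the stdlib `ast` module (the 40-line budget is a made-up number, tune per project):

```python
import ast

MAX_FUNC_LINES = 40  # hypothetical budget an agent must stay under

def oversized_functions(source: str) -> list[str]:
    """Return names of functions whose source span exceeds
    MAX_FUNC_LINES -- a crude complexity ceiling a CI gate
    could enforce on agent-generated diffs."""
    offenders = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            span = node.end_lineno - node.lineno + 1
            if span > MAX_FUNC_LINES:
                offenders.append(node.name)
    return offenders
```

Fail the build when the list is non-empty and the agent is forced to decompose instead of accrete; cyclomatic-complexity tools would do the same job with a sharper metric.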
Yep, surely humans write bad code, too. But not nearly as fast. This feels a lot like hiring oodles of hyper-productive junior developers. Are we going to get true productivity out of that or a scrambled mess? I don’t know the answer to that. Or maybe the models get so much better that it’s like hiring oodles of senior developers and architects and the payoff is real.
Humans just don't commit the same kinds of booboos as LLMs do. My team at work recently started using LLM agents for coding and I have since seen WTFs that I know no human would ever write.
It's not all bad! It's also enormously fun. I've been able to work on things I'd been putting off forever. When I can use LLM agents, I less often feel paralyzed by perfectionism, which is probably the biggest productivity boost I get. My own code has not decreased in quality, and I think that for the truly important things, neither has that of my colleagues.
But LLMs don't make junior dev mistakes. They make "my brain has worms in it" mistakes.
It used to be that most college graduates had little or no experience working on large-scale projects. Now they’ll get to speed-run the issues involved in maintaining a large project.
Seems about right and American: pervert a dead person's reputation and personality into a cartoonish mythological character to fabricate lore for a profitable tourist attraction. Add doors to nowhere and guided tours repeating misinformation, exit through the gift shop.
PS: While I grew up in San Jose, my parents unfortunately took me on that tour once. It looked extremely staged and all about $$$ then and I was a dumb kid. It occupied a plot of land in a very busy area across what is now Santana Row and beside the original hemispherical buildings of the first Century 21 theaters that originally had massive parking lots that extended all the way to Winchester Blvd back when people went to the movies. The parking lot was only eclipsed by the nearby Winchester Drive-In in Campbell. Where Santana Row is at the corner of Stevens Creek and Winchester was the car dealership Courtesy Chevrolet.
Julia Morgan, Winchester's contemporary, was the first woman to obtain an architecture license in California in 1904 and had a very prolific career throughout the state including her most famous - Hearst Castle - commissioned in 1919.
One of my guilty pleasures as a software engineer is that working on my Winchester House is way more fun than working on someone else's Cathedral, or Bazaar.
> Which is why maintainers feel like they’re drowning.
How about actually funding open-source project maintainers? We have non-profit orgs that eat billions of public funds. We spend billions on influencing hardly measurable metrics, with very nebulous benefits in the far distant future.
Direct sponsoring of critical projects would have far better and concrete benefits.
The problem is the cost is so wildly asymmetric. When everyone with a computer and a subscription can vibe code low quality features, when everyone can submit dubious security bug reports, no amount of funding will even that out. Producing submissions is essentially free while triaging and reviewing remains very expensive.
3 years ago the cost was asymmetric in the other direction. The cost of writing code was high. The cost of finding security bugs was extremely high. The cost of triaging and reviewing was basically the same as it is today.
Large corporations that are well funded are facing the exact same issues internally right now. With agent output so cheap, how do you deal with the deluge? It’s not practical or desirable to have your best engineers doing nothing but reviewing generated code, some of which is likely very low value.
This plus accountability is the way; and what I think I mean here is "accountability for those who choose to USE (maybe not create) the software in a way that may be harmful."
If you'd like to push that accountability to the developers, that can work, but they should be paid or otherwise compensated accordingly for the risk they take on.
Why spend more money just to trawl through BS contributions? Cutting off the nonsense would be both cheaper and have the same result.
More funding for more development of open source is a good thing, but more money to ease the burden imposed by an ever rising tide of slop is not a solution.
>"Sarah didn’t build her mansion to house ghosts, she built her mansion because she liked architecture."
That quote from the article directly contradicts what multiple tour guides at the Winchester Mystery House in California have told me over many decades. Specifically: Sarah Winchester built the house because she was told that the ghosts of all those killed by Winchester guns would haunt her unless her house was sufficiently labyrinthine and endlessly expanding, to confuse them.
Visit the house (the tour is rad) and see the architecture for yourself. There is no reasonable explanation for internal doors leading to sheer drops throughout the house, and other bizarre 'traps', apart from Sarah legitimately believing she had to confuse the ghosts.
This is more akin to a programmer consciously obfuscating and expanding a codebase to make it impossible for their angry users to ever finish auditing it, or to determine its author.
> That quote from the article directly-contradicts what multiple tour-guides at the Winchester Mystery House in California have told me over many decades.
The house is run by an organization that has a very vested interest in playing up the supernatural element of the house. Some tour guides have gone on record discussing their frustrations with having to repeat known falsehoods to guests.
> Visit the house (the tour is rad) and see for yourself the architecture. There is no reasonable explanation for internal doors leading to sheer-drops, throughout the house, and other bizarre 'traps', apart from Sarah legitimately believing she had to confuse the ghosts.
Parts of the house were damaged by the 1906 earthquake and were not rebuilt. A lot of the weird path-to-nowhere stuff is "the destination collapsed during the earthquake", nothing particularly mysterious there.
As others have noted, the guides are full of tall tales. I grew up in San Jose and remember when the property next to the Winchester Mystery House was a drive-in theater, and before the House was fire-damaged. The B.S. was well-known even then. My father, who moved to San Jose in the 1950s, even explained it to me as a child after some friends who were into ghost stories told me about it.
I don't know if it's still there, but my favorite part of the site was the detached museum showing some of the earliest pieces developed by the Winchester Repeating Arms Co. Easy to miss as it is not part of the house or the guided tour.
Does anyone know what “agent tea” is in the second graph? There is a paper about a protocol but it seems a bit obscure to be featured in this context and the other two points on the graph are models.
Yeah, I was curious about that, too. It’s one thing to vibe-code a one-off personal project. It’s another to create something that can run the distance.
I'm pretty sure that I could consistently spew 1000 lines a day/per commit if it was mostly cut-n-pasting of existing code, that I had complete access to, with some minor variations.
Before, when code was laboriously produced by hand, sharing it so other people could use it was seen as a gift. Today there is no such incentive. If I pick up some software and find a bug in it or a feature it's missing, I can (have Claude) fix that bug or add that feature, and keep it to myself. Why bother contributing that fix or feature to the world if it's just going to be met with complaints and accusations of it being vibe coded slop? Just as maintainers don't owe me anything, by the rules of the license, if I'm not distributing the binaries, I don't owe the world a public fork of the source code with my changes.
The Winchester mystery house is notable for becoming a public tourist attraction instead of a closed private piece of real estate. How do we evolve the Cathedral and the Bazaar to the modern era? I don't know. I know that my life on my computer is drastically improved by spending an afternoon a week building better tooling for myself, and I realize it's built on top of other's contributions to the world, but at the same time, I don't know how to contribute back under the new regime.
I can see now that you expanded your comment after I wrote my response. Please leave a marker ("later:" or something) when you do that.
Who are you performing for?
Legacy Internet is so cooked.
The statement you chose makes a carve-out for Unix, not GNU. It doesn't support "not really."
[0] http://www.catb.org/esr/writings/taoup/html/
[1] http://www.catb.org/esr/writings/taoup/html/apa.html
Slime Mold Identification & Appreciation (amazing photography)
https://www.facebook.com/groups/1510123272580859
https://skepticalinquirer.org/2024/08/the-truth-about-sallie...
Winchester Mystery Potemkin Village.