For some reason the app supports a separate standalone window mode as well [0]. It's not clear why the developer took the trouble to support two different modes when the menubar mode doesn't seem to add anything (like a live-updating icon for throughput).
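(To make that concrete: a live-updating menubar icon is the kind of thing that would justify a menubar mode. Here's a minimal Swift sketch of one — not taken from the linked app; the `sample` closure and the formatting helper are my own placeholders, not the app's API:)

    import AppKit

    // Minimal sketch (not from the linked app): a menu bar item whose
    // title updates once a second with a sampled throughput figure.
    final class ThroughputStatusItem {
        private let statusItem = NSStatusBar.system.statusItem(
            withLength: NSStatusItem.variableLength)
        private var timer: Timer?

        // `sample` stands in for however the app would measure
        // bits per second; it is an assumption, not the app's API.
        func start(sample: @escaping () -> Double) {
            timer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { [weak self] _ in
                self?.statusItem.button?.title = Self.format(sample())
            }
        }

        func stop() {
            timer?.invalidate()
            timer = nil
        }

        // Format bits per second as kbps below 1 Mbps, Mbps above.
        static func format(_ bps: Double) -> String {
            bps < 1_000_000
                ? String(format: "%.0f kbps", bps / 1_000)
                : String(format: "%.1f Mbps", bps / 1_000_000)
        }
    }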
Well, I can think of one reason why it wasn't that much more trouble. François Chollet had a nice tweet [1] on why removing human cognitive friction is resulting in needless software complexity.
> removing human cognitive friction is resulting in needless software complexity
This is kind of a hilarious statement just on the surface. Isn't removing burden from humans the whole purpose of software? How can you call the complexity "needless"?!
(the actual tweet seems to go into a bit more detail around being incentivized to find good abstractions)
I think you're conflating the burden of creation with the burden of relevance, suitability, usability and usefulness of the created artifact. The more the person in charge is disengaged, the sloppier the output is likely to be.
Making it trivial to generate software is making people turn their brains off. They don't think through the details and accept the "default" from an LLM which has no concern for the user experience.
It's "incline", the subtext is: "Reader, you might start thinking of a certain common stereotype at this point, but don't do that, because my argument is very different, and that stereotype is irrelevant or possibly untrue."
Compare to: "A pick-up truck is a useful thing to have, not because you are insecure about your genitalia, but because you can take home bigger products from IKEA."
The Gurkaniyan thing was true for Babur but I don’t think it was the case for later Mughals.
The poet Ghalib, who was the emperor Bahadur Shah Zafar’s contemporary, considered himself a descendant of the aristocracy and referred to himself as a “Mughal baccha” in a well-known quote (sourced from his letters, I believe).
Business logic is usually the most substantial part of legacy systems in my experience, so I imagine so.
Not to be too negative, but a lot of modern software complexity is a prison of our own making, one we had time to build because our programs are actually pretty boring CRUD apps with little complex business logic.
I can only assume there's a ton of domain knowledge accrued over those years and beyond baked into the legacy code, that an LLM can just scoop up in a minute.
There is non-stimulant medication as well for ADHD. If you're really struggling, it might be worthwhile to suspend judgement and actually try these out for a while. In the worst case you go back to how you were without medication. For many people the potential upside is worth the experiment.
Well, people who are not yet above a threshold of experience are not in a position to self-assess and course-correct if their long-term learning is being affected. And even less so if there is pressure to be hyper-productive with the help of AI.
Speculating here but I think even seniors who rely on AI all the time and enjoy the enhanced output are going to end up with impostor syndrome over the things they suspect they can no longer do without AI, and FOMO about all the projects they haven't yet attempted with AI despite working as hard as they can.
It’s particularly interesting that Anthropic came out yesterday and basically said, yeah, this stuff cannot be held right.
One can argue, convincingly perhaps, that Anthropic isn’t right and/or is marketing. What they’re saying could be complete BS, but the fact that there is doubt suggests that most people believe no one who can hold it right exists.
I’m quite pro AI, but given the radical asymmetry between the upside vs the downsides (the upside is at best maximum bliss for all existing humans, which has a finite limit, while the downside is the end of humanity which is essentially infinitely bad), our march forward in this area needs to be at least slightly more responsible than what we are doing now.
There are many other patterns in the text; re-arranging them makes it more obvious:
Why do we estimate stories? Because developer time is expensive and someone has to budget for it.
Why do we prioritise features in backlogs? Because we can’t build everything and we need to choose what’s worth the cost.
Why do we agonise over whether to refactor this module or write that debug interface? Because the time spent on one thing is time not spent on another.
We have compilers: either it compiles or it doesn’t.
We have test suites: either the tests pass or they don’t.
Planning. Estimating. Feature prioritisation. Code review. Architecture review. Sprint planning. All of it is downstream of the assumption that writing code is the expensive part.
... type systems, linters, static analysis. Software gives us verification tools that most other domains lack.
Towards the end this article contradicts itself so severely I don't think a human wrote this.
> But this isn’t really about AI enthusiasm or AI scepticism. It’s about industrialisation. It has happened over and over in every sector, and the pattern is always the same: the people who industrialise outcompete those who don’t. You can buy handmade pottery from Etsy, or you can buy it mass-produced from a store. Each proposition values different things. But if you’re running a business that depends on pottery, you’d better understand the economics.
So which is it?
Will an industrialised process always outcompete a pre-industrial process?
Or do they not compete at all, because they value different things?
Handmade pottery cannot compete on price with industrially made pottery, and therefore the majority of pottery is made industrially.
100% human-written code cannot compete on price with AI-assisted code, and therefore the majority of code will be written with the assistance of AI.
The aside about Etsy handmade pottery is there because handmade potters can't compete with industrially made pottery on price, so they were killed in mass-market pottery products and had to find a tiny niche. Before industrialization, handmade pottery was mass-market pottery. It was outcompeted in the mass market and had to move into a niche.
And that part doesn't even translate into code. People are not buying lines of code, so you're not going to be buying handmade code.
Handmade pottery can offer variety (designs) not available in mass-produced pottery. When you look at software, you can't tell whether it was 100% handwritten or written with the assistance of AI.
If the argument is about cost per unit of output, bringing in Etsy doesn't make sense at all, especially when they explicitly mention it's about valuing different things.
Handmade pottery can certainly be better quality than mass-produced pottery, just like handwritten code can be better quality than AI-assisted code. There is a spate of new macOS apps that are clearly AI-written, with memory leaks, high CPU usage, and UI that doesn't conform to macOS conventions (in one instance I'm aware of, the interface has changed completely between updates). Of course users can tell the difference.
If you're going to spend a lot of time making sure the AI-generated code is perfect, does the industrialisation analogy still hold? There's a spectrum here, of course, from vibe-coded to agentic to Copilot-level assistance to no AI assistance (which may be a little silly).
This is interesting because the cost of cloning code is zero. Human-written code could be cheaper than AI-written code, because distributing another copy costs nothing. The same does not apply to pottery: creating and distributing an extra bowl takes >0 resources.
My point (and the issue I have with the article) is that the quality of code (whatever that means) is not measured by the number of lines. Whether the code is generated by AI or by humans, the market is not going to care, just as it didn't care whether it was written by someone in Silicon Valley or in the middle of East Asia.
Quality indie software in a niche that Ikea is not addressing can make a decent income, unlike a lemonade stand.
And unlike at (this hypothetical) Ikea, you wouldn't have to maintain the impression of 20x AI-augmented output to avoid being fired. Well, you could still use AI as much as you want, but you wouldn't have to keep proving you're not underusing it.
[0] https://github.com/darrylmorley/whatcable/blob/main/Sources/...
[1] https://x.com/fchollet/status/2045929951539707957