The only skill set that exists in knowledge work is formal logic.
But yeah I'm sure increasing energy input into symbolic systems will not increase their entropy and thus the demand for workers capable of formal logic to keep it in check.
Because dumping representations of formal logic on an exponentially increasing repository of compressed data will surely start producing output with higher informational content than the input. Any time now. Just double the size of the model a few more times and we will start getting lower-entropy outcomes for free.
"Most of what I've learned from talking to people about their workflows is counterintuitive and subtle."
Seriously? Are we at the point of doing rain dances for these models and describing the moves as "counterintuitive and subtle"? This is magical-thinking-level self-delusion.
Downvote all you like, or ignore this. Agency is being taken away from us. No one gets to say we didn't see it coming, because we did, and we said something, and our peers treated us as ignorant and self-interested for pointing out the obvious.
> Are we at the point of doing rain dances for these models and describing the moves as "counterintuitive and subtle"
Yes, we are. LLM cultists can downvote as much as they like, but the reality is that with all the incantations and coaxing, we not only don't see a positive effect, we see a net negative effect on the quality of our software. On the ground level of OSS we are drowning in AI generated slop: https://www.theregister.com/2025/07/15/curl_creator_mulls_ni...
On the enterprise level we are creating more and worse code, faster. In the news, we are generating more and worse and factually wrong articles, faster. In our personal lives, guided by our sycophantic LLM assistants, we offload more and more cognitive chores, becoming dumber and dumber.
LLM-produced slop is like the plastic of text: cheap and easy to use and proliferate, and then 40 years later you realize it's everywhere in the environment and in your brain, and that it's toxic to all life.
Also in general I find developers usually overestimate the value of good code outside of mission critical systems. The true value return of spending way more time and effort to improve code quality often simply isn’t worth it from a holistic perspective.
> The true value return of spending way more time and effort to improve code quality often simply isn’t worth it from a holistic perspective.
Sure, if you want to do with 100 developers what 10 can do better, cheaper and faster, don't spend time on code quality. Just vibe it. More work for the experts to fix that mess later.
> And like plastic, a fantastic material for many jobs
Yes, by the same logic arsenic has many pros (and some cons, admittedly) in paint production, thorium is not a bad material for toothpaste (google it), and asbestos is just dandy for fire insulation (with some teeny tiny negatives for your lungs).
The fact that plastic is in the bloodstream of every living being and it's a strong endocrine disruptor and there is literally a garbage patch of it floating in our oceans is a small price to pay for the convenience and pleasure of drinking coca-cola from a lighter-than-glass bottle! Fantastic material indeed! What other fantastic materials are there? Lead for pipes? White phosphorus for matches?
There isn't a hint of bad faith in the parent comment; it's just a sarcastic, tongue-in-cheek way to say that it's not really cool to poison the fuck out of ourselves just because material x is kinda nice for job y.