I don’t totally disagree, insofar as a very low GPA is probably a countersignal of common sense and work ethic. The problem comes from converting these measures into targets, and then putting them on a permanent record.
Suddenly everyone is competing for limited slots, the minimum standard for hiring goes from a high GPA to a perfect GPA, and any misstep in your learning process, any teacher who didn’t like you, any elective that may have enriched you personally but that you weren’t particularly good at, etc. gets distilled down into a numerical value (like a credit score) that bureaucrats treat as some sort of objective truth. The ATS filters you out without your ever having had a shot, orgs optimize for low-risk-tolerance individuals, and organizations are starved of potential creative problem solvers and other change agents.
You might be missing that a lot of companies are giddy that management can just vibe-code stuff with no opportunity for engineers to be involved (except when it crashes?). I use AI tools and they are nice, but management is mostly not logical and needs someone to sort through their bullshit.
Yeah no. Almost all companies I've chatted with - from MSPs to the C-suites of F10s - expect and demand humans-in-the-loop. I'm also on a couple of boards, and we've aligned on the same expectation there as well.
Look, AI/ML and especially LLMs are powerful, but there remains a degree of instability and non-determinism that will require human intervention to remediate.
That said, a lot of dev work in companies is a cost center, and those are the portions that will start getting vibe-coded and deployed to production with little-to-no oversight (e.g. a support portal for SMBs at an enterprise). But the equivalent feature would have been an afterthought even without LLMs, and probably given to a couple of SWEs we'd have been fine re-orging away in a quarter anyhow.
> Yes just not driven or owned by engineers. That's what I'm seeing from my company and a few peers' companies
I mean, it depends on the feature/product and how critical it is to the health of the business.
Like I mentioned in my edited comment, a lot of dev work in companies is a cost center, and those are the portions that will start getting vibe-coded and deployed to production with little-to-no oversight (e.g. a support portal for SMBs at an enterprise). But the equivalent feature would have been an afterthought even without LLMs, and probably given to a couple of SWEs we'd have been fine re-orging away in a quarter anyhow, because we cannot justify spending $500K-750K a year (the backend cost to a company of 3 FT SWEs or contractors) on a customer form that nets $0 in revenue and is not directly tied to pipeline generation.
We are probably talking past each other but I am saying I see:
Leaders thinking they will basically prompt out new revenue-generating features with no human engineers to "figure it out". Not cost centers, low-hanging fruit, etc. No, these are not giant corps like Google or whatever, and they're likely run by morons, but it was easier when they did not think they were "empowered". There is no opportunity for engineers to "think in higher abstractions" or whatever in these cases.
> Leaders thinking they will basically prompt out new revenue generating features with no human engineers to "figure it out"
Yeah, and I'm telling you, as one of those leaders, that most of the leaders I am meeting with know this is unrealistic, even at non-tech enterprises.
I think the issue is that a lot of SWEs think their work actually matters to the bottom line (and PMs and execs will massage their egos - I'm guilty of doing this as well), but in reality it doesn't, because they are working on a cost-center product or feature.
Every SWE on HN should sit down and ask themselves:
1. Does the feature they are working on directly generate revenue for their employer?
2. If it does, does it generate revenue equal to at least 1% of overall revenue per year?
3. Is the cost of the team of SWEs + PMs putting the feature/product in the red (i.e. 3 Eng and 1 PM working on a product whose revenue is only $500K/yr)?
If the answer to all of those questions is no, your product/feature is at risk from LLMs, but it was already at risk of being offshored.
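The checklist above is a back-of-the-envelope calculation, and one loose reading of it (treating "all negative" as "fails the tests") can be sketched as follows. All figures and names are hypothetical illustrations, not real company data:

```python
# Hypothetical sketch of the cost-center checklist above.

def feature_at_risk(feature_revenue: float,
                    company_revenue: float,
                    team_cost: float) -> bool:
    """At risk if the feature fails the three tests above:
    no revenue, an immaterial share of revenue, or in the red."""
    generates_revenue = feature_revenue > 0                       # Q1
    meaningful_share = feature_revenue >= 0.01 * company_revenue  # Q2
    covers_its_team = feature_revenue >= team_cost                # Q3
    return not (generates_revenue and meaningful_share and covers_its_team)

# The thread's example: 3 Eng + 1 PM (~$750K/yr all-in) on a
# $500K/yr product at a company doing, say, $200M/yr -> at risk.
print(feature_at_risk(500_000, 200_000_000, 750_000))  # True
```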
I've unfortunately stopped reading articles before reading comments here as it's all mostly garbage now. I'm not sure what people are trying to accomplish with generating blogs aside from either clout farming or marketing for their companies.
I originally thought AI-assisted writing would help synthesize what felt like original ideas I had, that I just wanted to get out there without the laborious task of editing. I didn't expect the writing to end up feeling so incredibly tired and watered-down, but upon more reflection on how the models actually work, it's not all that surprising. Uniqueness in writing is both in the style/structure and the message, and all AI seems to do is find the local maximum of both. Lately I've found myself going back to writing things myself (not all the time, depends on the task), and wishing there was a way I could just completely eliminate the slop from certain things I look at. I worry about all our minds, and the garbage-in, garbage-out net effect of this.
Honestly, if you've bought (oh, sorry, "licensed") a movie, I'd have zero problem with torrenting what you've paid for versus dealing with these games. Companies just want forever subscriptions, not purchases, in any case.
You jest, but while working on an AI-backed feature I was flabbergasted that the fix was adding "The result you send back MUST be accurate." to an already pretty clear prompt.
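The anecdote above can be sketched like so; the base prompt text and all names are made up for illustration (only the quoted "MUST be accurate" line comes from the comment):

```python
# Hypothetical sketch: the only change that fixed the feature was
# appending an explicit accuracy instruction to an already clear prompt.

BASE_PROMPT = (
    "Extract the invoice total from the text below and return it "
    'as a JSON object: {"total": <number>}.\n'
)

# The one-line "fix":
ACCURACY_NUDGE = "The result you send back MUST be accurate.\n"

def build_prompt(document_text: str) -> str:
    """Assemble the full prompt sent to the model."""
    return BASE_PROMPT + ACCURACY_NUDGE + document_text
```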
When I read comments here, I'm beginning to realise many people get lost in the noise and can't seem to figure out what the true bottleneck is in producing great software that people (be it consumers or folks employed at firms) purchase and use.
Writing code faster alone doesn't change a great deal. Frankly, it'll just create a larger influx of noise. Focus is very difficult, and it'll become harder with the advent of LLMs.
LLMs change the dynamic so that an individual or small team can replicate the work of large companies, especially if they previously worked on that large application.