As much as I don't want it to be, this is a misrepresentation of what the Nvidia exec was likely saying. It's "Humans are cheaper than AI!" vs. "We have one engineer spending $2 million annually on compute."
You can think of that one engineer as a manager. The question is whether that $2 million spend is more or less than the spend on employees doing the work the AI has delivered under the direction of that one engineer.
If the same work would have been cheaper done by people, then yes, AI is more expensive.
I don't understand how a company can have copyright on code that is inherently uncopyrightable (in the unlikely event SCOTUS rules that way).
Worst case, Meta will sue the programmer who produced the infringing code.
I mean, if the code is not copyrightable, that doesn't mean anything; it's just public domain code, except that Meta will use good old security by obscurity to protect it. If a Meta programmer somehow vibe codes, say, VVVVVV, and Terry Cavanagh recognizes it on his Facebook feed, sues Meta, and wins, all that will happen is that Meta will take down the copy of VVVVVV, fire and sue the engineer who vibe coded it, and call it a day.
I spent 20 years in industry before moving to academia, and this resonates for me. I'm not naive enough to think that we'll do the right thing here, but I can dream.
I spent 8 years in academia (2004-2012) before moving to industry. But as I've aged, I've been thinking of going back. I did well enough in industry that I could jump back to academia without worrying about money... but I just hated the "publish or perish" mentality and writing papers. (I wonder what the state of that is now with LLMs... back in the day I was a reviewer for some journals, and most papers were pretty bad.)
> Those people are going to be the absolute most dangerous possible thing you can do to a company.
I hear you, but here's the thing: companies don't give a shit about software quality any further than it takes to keep you coming back as a customer. And it's actually been like this for a long time. They're going to hire people who can ship who-cares-how-buggy software as fast as possible. It's better for the bottom line.
And that pains my soul and pains me as a consumer (because we already had to put up with too much crap software before genAI started producing it in reams), but there's very limited money in the kind of quality you're talking about.
I hear stories from people interviewing now: the interviewers react negatively if you tell them you're working on keeping your programming skills fresh. They just want to know how many agents you can run at a time and how many lines of code you can generate per day.
Personally, I think someone skilled in software development working with genAI is going to be more productive than someone unskilled working with genAI, but I don't think that's even being selected for now.
Grim days.
The one thing that gives me hope is that every time we ask our graduates who are now in the field (and all work with AI) if we should drop classic CS education and only do AI, they all emphatically reply in the negative. Yes, we need some AI education in there, but they want the foundation, too.
It is rather backwards. I haven't seen things quite as bad as interviewers wanting to know how many agents you can run, but the attitude of "launch and fix later" is always present and kind of depressing.
Then I think of the companies (not necessarily software companies) that have had long-term success, whose products have been quite high quality, at least at some point in time. The count of genAI instances someone can keep in flight is certainly a weird metric, one that I think will hurt the companies that choose to ignore quality.
Unfortunately, it's a long process, since it's possible to get very far with great marketing and sales even with a poor-quality product. Then cash out before customers figure out that there's something better. I have no idea if this pattern will ever self-correct.
Off topic: I followed your guides for network programming years ago getting my tiny C server/client setup working. Thank you so much for writing them!
This isn't just another translation layer, though. It's squishy and stochastic. It's more like saying "managers think at a higher level of abstraction". Which is true, but it's not the same as compiled code.
GenAI is like a non-deterministic compiler. It's just like your manager's reports, except with less logical thinking skill. I'd argue this is still problematic.