Hacker News | swiftcoder's comments

> heavily regulated industries like healthcare, education, and transportation have seen basically no innovation in 50 years

Wut?


I'll grant you some of the more recent driver-attention monitoring features, but you'd be hard pressed to argue that blind-spot warnings during lane changes, cross-traffic warnings when reversing out of a parking space, and automatic emergency braking when the car in front of you brakes hard don't all save lives (and, perhaps more relevantly to the industry players, reduce collision insurance claims)

Do you have a source for the cameras and ADAS driving up the cost of the cars dramatically?

The €14k Dacia Sandero ships with camera-assisted emergency braking and lane assist. By the time you get up to a €24k MG 4, you get full Level 2 driver assistance. These don't seem like very high price thresholds


The big one for me is ballooning dependency trees in popular npm/cargo frameworks. I had to trade a perfectly good i9-based MacBook Pro up to an M2, just to get compile times under control at work.

The constant increases in website and electron app weight don't feel great either.


At some point it's hard not to care about the work you do every day. And if you care, then you are going to find yourself donating a Saturday here or there to solving big DevEx papercuts that you can't convince management to care about.

Should it be this way? No. Is it this way in practice? Unfortunately often.


Deal with really big log files, mostly.

If you work at a hyperscaler, service log volume borders on the insane, and while there is a whole pile of tooling around logs, often there's no real substitute for pulling a couple of terabytes locally and going to town on them.


> often there's no real substitute for pulling a couple of terabytes locally and going to town on them.

Fully agree. I already know the locations of the logs on disk, and ripgrep (or at worst, grep with LC_ALL=C) is much, much faster than any aggregation tool.

If I need to compare different machines, or do complex projections, then sure, external tooling is probably easier. But for the case of “I know roughly when a problem occurred / a text pattern to match,” reading the local file is faster.
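The local-grep approach above can be sketched as follows. The file name, timestamps, and patterns are made up for illustration (real hyperscaler logs run to terabytes, of course):

```shell
# Stand-in for a service log pulled down locally.
printf '%s\n' \
  '2024-05-01T12:30:01 ERROR upstream timeout' \
  '2024-05-01T12:30:02 INFO request ok' \
  '2024-05-01T12:31:07 ERROR upstream timeout' > service.log

# Forcing the C locale makes grep treat the input as plain bytes,
# skipping locale-aware (e.g. UTF-8) character handling, which is
# often a large speedup on big ASCII logs.
LC_ALL=C grep -c 'ERROR' service.log    # -> 2

# "I know roughly when the problem occurred": narrow by timestamp
# prefix first, then match the pattern within that window.
LC_ALL=C grep '2024-05-01T12:30' service.log | grep 'ERROR'
```

ripgrep (`rg 'ERROR' service.log`) gets comparable or better speed out of the box, since it searches in parallel and skips binary files by default.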


I mean... if you spent $7,000 on it, do you really want to hide it away under the desk?

Yes, because you’re buying a tool, not a conversation piece.

Why not both? ¯\_(ツ)_/¯

That’s why my Lian Li anniversary edition is next to my desk. Also because it is nearly as tall and wouldn’t fit under it.

Honest question: do you want them to? Most of us aren't running high-profile OSS projects, and drive-by PRs are a pretty widespread complaint about GitHub's model of open source

> Repo hosting is the kind of thing that ought to be distributed/federated.

Hence Tangled and ForgeFed (which I believe is being integrated into Forgejo)


I hadn't heard of either of these, but I'm interested.

I think at this point the bigger barrier to me with leaving GitHub (professionally, at least) is all the non-GitHub stuff that integrates nicely with it and badly or not at all with other solutions. And like, I don't blame tool providers for making a rational economic choice in that regard, but if leaving GitHub means leaving seamless Sentry, Depot, Linear, editor plugins, AI integrations, etc that makes it a tougher pill to swallow.

I worked for years at a shop that had in-house GitLab and we felt this pain first hand all the time. GitLab tries to be a one-stop shop and own the whole project management and testing/deployment workflow by building everything in house, but there were always gaps and it was hard not to be jealous of places that just did everything on GitHub and could use whatever best in class saas stuff they wanted.

GitLab has been tracking a federation feature since at least 2018 [1], and I expect Bitbucket, SourceHut, Gitea, and others would move quickly on something like this as well, but there needs to be a defined protocol and some kind of plan for handling spam/abuse.

[1]: https://gitlab.com/groups/gitlab-org/-/work_items/16514


> The only reason you remember is because it beat Kasparov

There is an additional fascinating aspect to these matches, in that Kasparov obviously knew he was facing a computer, and decided to play a number of sub-optimal openings because he hoped they might confound the computer's opening book.

It's not at all clear Deep Blue would have eked out the rematch victory had Kasparov respected it as an opponent, in the way he did various human grandmasters at the time.

