Hacker News | enether's comments

A very in-depth tool that acts as an easily browsable reference to many Apache Kafka internals: configuration options, error types, the wire format (by version), configuration advice, and version-upgrade diffs.

They make $8-9B a year (~90% profit margins) selling software for mainframes, which were deployed ages ago but still have to be maintained because critical COBOL business code was written on those systems - and migration is too risky/costly.

To give you an idea:

- of the risk in regulated industries like banking: a UK bank was once fined *$62 million* for botching a mainframe migration and causing downtime.

- of the difficulty and risk in non-tech industries: Australia once spent *$120 million* trying to migrate its social security system off mainframes... and failed.

Mainframes are not their only business, of course, but they're a major cash cow that's underappreciated. I, for one, didn't know that business keeps growing.

Coincidentally, I wrote about mainframes in relation to IBM's acquisition of Confluent here today: https://blog.2minutestreaming.com/p/ibm-confluent-acquisitio...


Exactly. I posted about the same thing today: https://x.com/twitter/status/2025949280251597291

It's an accelerant, both good and bad. How that plays out in companies where the majority of people are below average is a nuanced and concerning question.


I think it's worth it to build your own miniaturized versions of OpenClaw/claw-like agents. They're easy enough to build, and the confidence of having them in a language you're familiar with, with a surface area small enough to limit risk, etc., seems worth it imo.
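To make the "miniaturized agent" idea concrete, here's a minimal sketch of the core loop such a thing boils down to. All names here (`run_agent`, `stub_model`, the tool table) are hypothetical, and the model call is stubbed out; a real version would call an LLM API and parse its reply into either a tool call or a final answer.

```python
import os

# Hypothetical tool table: the tiny surface area you expose to the model.
TOOLS = {
    "list_dir": lambda path: sorted(os.listdir(path)),
    "read_file": lambda path: open(path).read(),
}

def stub_model(history):
    # Stand-in for a real LLM call: request one tool, then finish.
    if not any(m["role"] == "tool" for m in history):
        return {"tool": "list_dir", "args": {"path": "."}}
    return {"answer": "done"}

def run_agent(task, model=stub_model, max_steps=5):
    history = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        reply = model(history)
        if "answer" in reply:  # model decided it is finished
            return reply["answer"], history
        # Dispatch the requested tool and feed the result back in.
        result = TOOLS[reply["tool"]](**reply["args"])
        history.append({"role": "tool", "content": str(result)})
    return None, history

answer, history = run_agent("what files are here?")
```

The whole thing is a few dozen lines, which is the point: small enough to audit, and the tool table is the only place risk enters.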


In a way I think handing off digital tasks to an AI bot that does it behind the scenes IS reducing technology in your life.

But it has to be done carefully so as not to increase the scope of what gets done.


We will need some sort of payment-backed checkmark for social media use soon enough. This claw phenomenon is opening the floodgates of spam even wider than before.


Mm, not at all. The usual LLM doesn't have its own file system, browser, persistent memory of all its actions, etc. The usual LLM experience is opening chatgpt.com for a single chat session.


> I built this because running Kafka locally for development is painful — gigabytes of RAM, slow startup, ZooKeeper/KRaft configuration. I just wanted something that accepts produce requests and gets out of the way.

This is not true. Kafka's latest `-native` images are very fast to start up (~100ms) and use relatively little memory.

https://cwiki.apache.org/confluence/display/KAFKA/KIP-974%3A...
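For anyone who wants to check the startup claim themselves, the GraalVM-based native image from KIP-974 is published as `apache/kafka-native` on Docker Hub and runs a single-node KRaft broker with default config (the exact tag and defaults may vary by release):

```shell
# Pull and run the native single-node KRaft broker, exposing the default port
docker run -p 9092:9092 apache/kafka-native:latest
```

No ZooKeeper, no manual KRaft formatting; the container handles the default setup.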


Would be cool to release the (minuscule amount of) code for this, and maybe have people standardize on a library for it.


Has anyone actually built event-driven agents yet? Or any agents, outside some basic dev workflows? (PR review, etc)


I’d argue that many new AI startups are simply agents built on top of existing, established workflows. With today’s agent SDKs and AI coding tools, building them is incredibly easy.

But as people ship faster, often without understanding scalable system design, we're heading toward an era of slow, fragile, and unscalable SaaS. I believe that eventually the products built on solid infra early will outlast the wave of slop - like how Facebook outlived Friendster. That's why I built Calfkit.


