Isn't the main problem that the building blocks the modern web is based on are not a good fit for what we do with it?
CSS is a total mess. HTML is a mess. JS is okay, but it is not a high-quality language.
We would save so much time and money if we had a modern base to build on. Sadly this will probably never happen, because corporate interests would try to corrupt the process and thereby destroy it.
How are CSS and HTML a mess? Combined, they're an incredibly powerful layout engine that works almost the same across all environments and devices while also featuring easy accessibility.
Taking a bird's-eye view of CSS, it is hard to overlook that it is a mixture of different concepts that evolved over time, with a lot of inconsistencies. It is possible to make it work, but it's not pretty.
Same for HTML. If the web were reimagined today, there is very little chance we would create HTML as it is.
It's almost ironic that we still use the terminal - and many use it like in the eighties, with Bash - and seem to have forgotten that we should invent a better terminal and shell rather than keep piling up workarounds for the quirks of the current systems.
We did! Nushell and PowerShell are both so much better than bash that it's not even funny. There's zero reason to use a bash derivative in this day and age; they only persist through inertia, plus a minority of masochists who actually like bash.
You can install whatever shell you want on your system, and there are tons of better alternatives out there. Where it's actually sticky is in the lowest-common-denominator cases, when you need to throw together a quick series of commands to be run on a semi-arbitrary system. In that case there are VERY few options, because you have to use whatever is already on the system. Sometimes you're lucky enough that you can safely assume bash is present, but in many cases you have to assume only sh. Unfortunately, both are difficult, footgun-laden enough languages that it greatly helps to use them daily to get familiar, and that's where most people end up peaking. Additionally, you may need to jump into that semi-arbitrary system and do things manually. You're limited by the minimal tools already there, so you'd better be familiar with them already. This is the very reason I've learned some basic Vi commands as well, even though I would never consider using it otherwise.
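As a small illustration of that lowest-common-denominator style (the variable name and values here are made up): a bashism like `[[ $name == a* ]]` breaks under plain sh, while a POSIX `case` does the same pattern match everywhere.

```shell
#!/bin/sh
# Portable sketch: glob matching without the bash-only [[ ]] test.
# Works in any POSIX sh, so it survives on systems where bash is absent.
name="alice"
case "$name" in
    a*) match="yes" ;;  # glob patterns work natively in case
    *)  match="no" ;;
esac
echo "$match"   # prints "yes"
```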
In my experience PowerShell is actually a terrible replacement, because you have to learn not only every command, but also the structured return and available input formats of every command, in order to do anything. The design of almost all shells specifically avoids that by reducing everything to a common, universal input/output format. PowerShell could certainly be useful as an intermediate choice between a full-blown language and shell scripting, if you needed more power but not so much that it's worth the overhead of a real language - but that would also require it to already be ubiquitous (which it's not).
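The "common universal input/output format" point can be sketched with a classic text pipeline (the data here is made up): each stage only reads and writes lines of text, so unrelated tools compose without knowing anything about each other's types.

```shell
# Every stage speaks plain text: sort orders lines, awk grabs a field,
# head truncates -- no shared object model required.
printf 'carol 42\nalice 17\nbob 99\n' | sort | awk '{print $1}' | head -n 2
# prints:
# alice
# bob
```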
Make a better system, and we'll consider using it.
A Terminal + Bash/ZSH is soooo sticky because they are VERY good at what they do once you learn the basics and quirks. And now with LLMs, CLIs are even better because LLMs talk in text and CLIs talk in text.
Microsoft tried with PowerShell to design a better system; it "technically" is better, but not "better enough" to justify the cost of switching (on Linux). The same is true of nushell; it is "better", but not better enough to justify switching for most people.
I believe we're at "peak input method" until someone invents Brain<->Computer interfaces.
I use the terminal all the time and write my own CLI tools, but I'm feeling the limits of the current system more and more. Over the years I have used almost all available shells (EShell was even my default for some time). Right now my favorite shell is Nushell, but still, it feels dated compared to what is possible on modern computers.
> Make a better system, and we'll consider using it.
It's on my TODO list, but it will break with all conventions and tools (no TTY). My idea is to bring the chain-things-together idea into the 21st century with a keyboard-first GUI.
Powershell and nushell are both miles better than bash and its descendants. I switched to nushell and you couldn't pay me to go back to bash. I only wish that it was kosher to install it on servers at work, but alas it isn't so I have to suffer with bash when managing systems.
I've been doing property-based testing for frontend stuff for years. The hardest part is that there is so much between the test inputs and the application under test that 50% of the time I find problems in the frontend test frameworks/libs rather than in our code.
And sometimes you find errors in code that absolutely should never have errors: I found an (as of yet not-root-caused) error in sqlite (no crash or coredump, just returns the wrong data, and only when using sqlite in ram-only-mode). Had to move to postgres for that reason alone. This is part of the reason why I have a strong anti-library bias (and I sound like a lunatic to most colleagues because they "have never had a problem" with $favorite_library -- to which my response is: "how do you _know_?"[0], which often makes me sound like I'm being unreasonably difficult).
Sometimes, only thing you can do is let the plague spread, and hope that the people who survive start showering and washing their hands.
[0]: I once interviewed at a company that sold a kind of on-prem VM hosting and storage product. They were shipping a physical machine with Linux and a custom filesystem (so not ZFS), and they bragged about how their filesystem was very fast, much faster than ZFS or Btrfs on SSDs. I asked them if they were allowed to tell me how they achieved such consistent numbers. They listed a few things, one of which was: "we disabled block-level check-summing". I asked: "how do you prevent corruption?". They said: "we only enable check-summing during the nightly tests". So, a little unsettled, I asked: "you do not do _any_ check-summing at any point in production?". They replied: "Exactly. It's not necessary". So, throwing caution to the wind (at this point I did not care about getting the job), I asked: "And you've never had data corruption in production?". They said: "Never. None". To which I replied: "But how do you _know_?". My no-longer-future-coworker thought for a few seconds, and realization flashed across his face. This was a company with actual customers on 2 continents, pulling in at least millions per year. They were probably silently corrupting customer data while promising that they were the solution -- a hospital selling snake oil, while thinking it really is medicine.
> I found an (as of yet not-root-caused) error in sqlite (no crash or coredump, just returns the wrong data, and only when using sqlite in ram-only-mode).
You should report this to the SQLite developers - they are very smart and very interested in fixing SQLite correctness bugs!
Didn't get around to reporting it (huge backlog of tasks). Luckily I am working on a project that _has_ to support SQLite, so if I run into the bug again, I'll report it.
I don't believe that I can tell you the name of the company (they made me sign some NDAs, before the interview, and I have no clue how enforceable those are). Also, this was in 2019, so I would be shocked if they did not fix the problem by now -- especially after I interviewed there (plus I can't be the only one to have noticed this, since).
That said, you have a few data-points if you want to try to triangulate them yourself: physical vm-hosting and storage product, existed since at least 2019, used linux kernel as hypervisor, custom FS, international customers across 2 continents. All of those data-points are my recollection from 2019.
PBT allows us to test more combinations without writing hundreds of tests. Yes, it's about user flow inside a single module of our gigantic application.
I use quicktheories (Java) and generate a deterministic random test scenario, then I generate input values and run the tests. This way I can create tests that should fail or succeed, but differ in the steps executed and in their order, with "random" input.
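The deterministic-scenario idea can be sketched without any library (quicktheories' API is not shown here; the action names, seed, and class name are made up): a fixed seed makes the "random" step sequence reproducible, so a failing scenario can be replayed exactly.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class DeterministicScenario {
    // Hypothetical sketch: build a reproducible sequence of "user actions"
    // from a fixed seed. The same seed always yields the same scenario,
    // so a failure can be reported and replayed by its seed alone.
    static List<String> generateScenario(long seed, int steps) {
        String[] actions = {"open", "edit", "save", "undo", "close"};
        Random rng = new Random(seed); // seeded -> deterministic draws
        List<String> scenario = new ArrayList<>();
        for (int i = 0; i < steps; i++) {
            scenario.add(actions[rng.nextInt(actions.length)]);
        }
        return scenario;
    }

    public static void main(String[] args) {
        // Replaying with the same seed yields the identical scenario.
        System.out.println(
            generateScenario(42L, 5).equals(generateScenario(42L, 5))); // prints "true"
    }
}
```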
Maybe improve the living conditions for people under 60?
All this talk is just about the symptoms, but the cause is that young people are born into a deeply unfair world where losing is by design (so that the baby boomers can continue to profit).
If someone in their 20s can start a family without going financially broke, things will improve.
I get your point and think in a similar way. The difference between AI and the coconuts is that there is no way deaths by coconut will increase 10000000x, but for AI it's possible.
The reason we have not removed - and probably never will remove - obvious bad causes is that a small group of people has huge monetary incentives to keep the status quo.
It would be so easy to e.g. reduce the amount of sugar (without banning it), or to have a preventive instead of a reactive healthcare system.
I’m not so sure that’s true. There are many examples of OpenAI putting in aggressive guardrails after learning how their product had been misused.
But the problem you surface is real. Companies like porn-AI vendors don't care and are building the equivalent of sugar-laced products. I hadn't considered that and need to think more about it.
Wouldn't the trick be to let the AI code the app on the first request and then run that code, instead of having it generate everything every time? That should combine the best of both worlds.
I still remember feeling like a badass using Git-SVN on my SVN-based team, and being the only guy who could actually correct bad merges and restore lost data without having to plead to our SVN admin.
And now I'm feeling like a badass using jj on my Git-based team :)
Having used git submodules, I see a lack of them as a feature. I honestly think that a script that checks out a specific commit from an external repository, combined with adding the path to the .gitignore is strictly better than submodules.
sure, but there are projects that use them already. If jj wants to replace git, it needs to work with people's existing projects without significant changes (ideally none at all)
Changing git hosts happens less frequently than changing clouds, which is infrequent. Changing VCS tools is even less frequent than either of those
The development world is much more entrenched than it was when the move from svn -> git happened. Think of all the tooling, integrations, and automation we use these days. That was not happening 15+ years ago. I don't think svn as an analogy holds much water, tbh.
Sounds like I still need git for much of this, so jj and the things it does are an additional thing to manage. I've been told intermixing them is not a good idea.
At the same time, not everyone requires those features. All I mean to say is that the degree of support varies across the things mentioned; it's not just "no support for any of them."
The docs say "no" to support for all of those except tags, which is "partial" because jj can check them out but not create them - which hopefully makes it clear in my comment that "new tags" is still a "no".
I'm going to take the docs for what they say about support over an HN comment
You can trivially run `git tag` and create one, or create it in your forge and then pull it down. Creating one is not supported directly in jj's CLI, but if you create a tag, it's in the repo just fine.
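A hedged sketch of that workflow (the tag name is made up): in a colocated repo you just use git directly, and jj picks the tag up on its next import.

```shell
# Create an annotated tag with plain git inside the jj-colocated repo;
# jj sees it on its next snapshot, and pushing goes through git as usual.
git tag -a v1.0 -m "release v1.0"
# git push origin v1.0
```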
It’s not. It’s that there’s no inherent advantage to doing it natively when it works just fine via git. There are more important things to do first, like many of the things on your list.