I don't think that's the case. The kinetic energy of these super-energetic particles is often compared to that of a tennis ball, but that energy isn't deposited all at once: even if one did interact with you, the interaction creates a particle shower that carries most of the energy away. I don't think we could feel one of our atoms getting violently ripped apart.
indeed, but note that c^2 is just a factor to convert between units here, and its numerical value is completely arbitrary (or rather, c looks so large because our units are human-scale)
indeed, in the most natural systems of units in this area we set c = 1 so as to simplify the equations
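To make that concrete (these are the standard textbook relations, not something from the thread), the energy–momentum relation sheds the conversion factor entirely in natural units:

```latex
E^2 = (pc)^2 + (mc^2)^2
\quad\xrightarrow{\;c\,=\,1\;}\quad
E^2 = p^2 + m^2,
\qquad\text{and at rest, } E = mc^2 \;\to\; E = m.
```

Mass and energy become literally the same quantity; c^2 was only ever translating between metres-and-seconds bookkeeping.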
After switching to NixOS, I can confidently say that managing a system any other way (like with apt/brew + 20 handwritten bash scripts) really is neanderthal technology; nix is superior in every single way.
It's also great for the AI era; copilot is really good with that stuff.
My experience using NixOS on desktop is that it's 95% wonderful, 5% very painful.
If you run into friction with NixOS, you may need to have a wider/deeper understanding of what you're trying to do, compared to the more typical Linux OSs which can be beaten into shape.
Such a pity that the article didn't touch on running Rust nightly or the sometimes stateful nature of some programs' user configs. The 5% painful part was just around the corner.
Fair point, but I've stopped trying to declaratively manage stuff in nix that has its own idiosyncratic state management. At that point you're just using nix to run an installer.
Yeah, I've been using Unixey stuff for almost twenty years now (most of it Linux, and fell for the siren song of macOS for about four of them).
I liked Arch and Ubuntu and Mint and OpenSUSE well enough when I used them first, but once I actually tried NixOS it felt so obviously correct that it started to bother me that it's not the default for everything.
Being able to temporarily install things with nix-shell is game changing, and being able to trivially see what's actually installed on my computer by quickly looking at my configuration.nix is so nice. "Uninstalling" things boils down to "remove from configuration.nix and rebuild".
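As a rough sketch of what that looks like (the package names here are just examples, not a recommendation), the installed-software list really is a few lines in configuration.nix:

```nix
# configuration.nix (fragment) -- example package names, not a recommendation
environment.systemPackages = with pkgs; [
  firefox
  ripgrep
  htop  # "uninstall" = delete this line, then `sudo nixos-rebuild switch`
];
```

And `nix-shell -p htop` gives you a tool temporarily, in the current shell only, without touching the config at all.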
The automatic snapshots upon each build allow me to be a lot "braver" when playing with configurations than I was with Arch. I was always afraid to mess with video card or wifi drivers, because if I screwed something up and didn't know how to get back to where I was, I might be stuck reinstalling to get back to a happy state. This didn't happen that often, but often enough to have made me a bit wary about futzing with boot parameters or kernel modules. Because of the automatic snapshots with NixOS, it's much easier (and more fun) to poke at the lower-level stuff, because if I do break something in a way that I don't know how to fix, the worst case scenario is that I reboot and choose an older generation.
This is a bigger deal than it sounds. For example, with my current laptop, there was a weird quirk with my USB devices having to "wake up" after not being used for more than thirty seconds, meaning that I might start typing and the first three or four words wouldn't go through. After some digging, I found out that the solution is to add "usbcore.autosuspend=-1" to the kernel params. I did that and it worked.
If I had still been running Arch or Ubuntu, I probably would have just learned to put up with it, because I would have been afraid to edit kernel parameters because of the risk of breaking things in a way that I don't know how to fix.
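In NixOS terms that fix is a single declarative line (`boot.kernelParams` is a real NixOS option; this is just the relevant fragment, not a whole config):

```nix
# configuration.nix (fragment)
boot.kernelParams = [ "usbcore.autosuspend=-1" ];
```

If the machine misbehaves after `nixos-rebuild switch`, you reboot into the previous generation from the boot menu and the parameter is gone again.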
I love NixOS. I have no desire to leave, or at least I have no desire to abandon the model. I've considered changing to GNU Guix System since I like Lisp more than I like the Nix language, but those FSF-approved distros can be a real headache for people who actually have to use their computers.
> those FSF-approved distros can be a real headache for people who actually have to use their computers.
This is now a tired discussion; the use of nonguix (https://gitlab.com/nonguix/nonguix) is well documented, there is no practical difference after initial setup between using a regular distro and one of “those FSF-approved distros”
I am sure you don't do it on purpose, but could you stop spreading FUD? (especially given you acknowledge “I've considered changing to GNU Guix System”, so you're relaying third-party opinions, not your own)
> like with apt/brew + 20 handwritten bash scripts
I just use apt; I've been doing it for 20 years and it works great. I've never heard of or known anyone who wrote 20 (or any) bash script wrappers around apt. The one year I was painfully forced to use a Mac for work, brew worked similarly to apt: I just used it, no need to wrap it in shell scripts.
Comparing highly functional and capable systems like apt and brew to neanderthal technology sounds like hype.
> It's also great for the AI era
That also sounds like more hype, similar to the pro-nix other comments so far which tout AI and similar to the article which I did read, also sounds like hype.
I've been using apt for 20 years too and was never a fan of it: Canonical's repos are never up to date and managing PPAs is a pain. Yes, I'm very hyped about NixOS (and that's a rare thing for me), but it is just really, really good.
I understand that "just check it out" is not the best advice, because the setup cost of NixOS is really high and the learning curve is really steep, so it's not like you can give it a whirl for a few hours to experience the workflow. But believe me, once you are used to it, it's just so much more convenient. I'm currently managing my dev laptop, home PC, a WSL, and a Hetzner server all in the same repository (allowing for a lot of code reuse). Everything is super orderly and split into modules, everything is declarative, and I can roll back to a previous build of my system if I mess up installing nvidia drivers or iwd or bluetooth etc.
It's also not like installing software is harder than with apt (oftentimes it is easier: `programs.firefox.enable = true`), so after you've paid the setup cost there is just no downside. It's a bit like React vs jQuery, or Kubernetes vs hand-written deployment scripts.
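A multi-host setup like the one described above is typically a flake with one entry per machine; this is only a sketch, and the hostnames and file layout are hypothetical:

```nix
# flake.nix (sketch; hostnames and module paths are made up)
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";

  outputs = { self, nixpkgs }: {
    nixosConfigurations = {
      laptop = nixpkgs.lib.nixosSystem {
        system = "x86_64-linux";
        modules = [ ./hosts/laptop.nix ./modules/common.nix ];
      };
      server = nixpkgs.lib.nixosSystem {
        system = "x86_64-linux";
        modules = [ ./hosts/server.nix ./modules/common.nix ];
      };
    };
  };
}
```

Each machine then builds with `nixos-rebuild switch --flake .#laptop`, and the shared modules are where the code reuse comes from.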
This might be a me problem but I extensively manipulate the git history all the time which makes me loathe git hooks. A commit should take milliseconds, not a minute.
What one considers fast or slow may vary, but the general rule is something like the following.
- very fast? run it all the time (shell prompt drawing, if you want, like direnv)
- fast? run it in a pre-commit hook
- a bit slow? run it in a pre-push hook
- really slow? run it in CI, during code review, etc.
Fwiw: I also rewrite history often-ish but it's never that fast for me because I have commit signing turned on and that requires verifying my presence to a USB smartcard on each commit. For me, it's okay if a commit takes a second or two. As it creeps up beyond 3 or 4 seconds, I become increasingly annoyed. If a commit took a minute I would consider that broken, and if I were expected to tolerate that or it were forced on me, I'd be furious.
I generally expect an individual pre-commit hook to take under ~200ms (hopefully less), which seems reasonable to me. Some of the ones we have are kinda slow (more than 1s) and maybe should be moved to pre-push.
Since you seem especially sensitive to that latency, here's what I'd propose if we worked together:
If you own a repo, let's make all the hooks pre-push instead of pre-commit. On my repos, I like many hooks to run pre-commit. But since the hooks we use are managed by a system that honors local overrides via devenv.local.nix, let's make sure that's in .gitignore everywhere. When I'm iterating in your codebases and I want more automated feedback, I'll move more hooks to pre-commit, and when you're working in mine you can move all my hooks to pre-push (or just disable them while tidying up a branch).
I'd reckon cleaning up your branch before opening a pull request is good practice. I also rebase a lot, as well as git reset, and I use wip commits.
Slow hooks are also not a problem in projects I manage as I don't use them.
No, I would not and don't do that. It is better to leave the PR commits separate and atomic so reviewers can digest them more easily. You just squash on merge.
> Slow hooks are also not a problem in projects I manage as I don't use them.
You bypass the slow hooks you mentioned? Why even have hooks then?
> It is better to leave the PR commits separate and atomic so reviewers can digest them more easily.
So reviewers have to digest all of the twists and turns I took to get to the final result? Why oh why oh why?
Sure, if they've already seen some of it, then there should be an easy way for them to see the updates. (Either via separate commits or if you're fortunate enough to have a good review system, integrated interdiffs so you can choose what view to use.)
In a better world, it would be the code author's responsibility to construct a meaningful series of commits. Unless you do everything perfectly right the first time, that means updating commits or using fixup commits. This doesn't just benefit reviewers, it's also enormously valuable when something goes wrong and you can bisect it down to one small change rather than half a dozen not-even-compiling ones.
But then, you said "atomic", which suggests you're already trying to make clean commits. How do you do that without modifying past commits once you discover another piece that belongs with an earlier step?
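One common answer is fixup commits plus autosquash. A hypothetical sketch in a throwaway repo (commit messages and file names are made up for illustration):

```shell
# Sketch: fold a later fix into an earlier commit with --fixup + --autosquash.
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

echo a > a.txt && git add a.txt && git commit -qm "add feature"
echo b > b.txt && git add b.txt && git commit -qm "other work"

# Later you notice a piece that belongs with "add feature":
echo fix >> a.txt && git add a.txt
git commit -q --fixup=':/add feature'   # creates a "fixup! add feature" commit

# Replay history with the fixup folded into its target commit.
# GIT_SEQUENCE_EDITOR=: accepts the generated todo list unchanged.
GIT_SEQUENCE_EDITOR=: git rebase -i --autosquash --root
git log --oneline   # two commits remain: "other work", then "add feature"
```

Until then the fixup commits sit in the branch as ordinary commits, so reviewers can still see the twists and turns if they want to.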
> You just squash on merge.
I'd rather not. Or more specifically, optimal review granularity != optimal final granularity. Some things should be reviewed separately then squashed together (eg a refactoring + the change on top). Some things should stay separate (eg making a change to one scary area and then making it to another). And optimal authoring granularity can often be yet another thing.
But I'll admit, git + github tooling kind of forces a subpar workflow.
I do leave PR commits separate. In my teams I don't set up pre-commit hooks altogether, unless others feel strongly otherwise. In projects where they are forced upon me I frequently --no-verify hooks if they are slow, as the linter runs on save and I run tests during development. CI failing unintentionally is usually not a problem for me.
I was thinking that he's describing implementing an initial algebra for a functor (≈AST) and an F-Algebra for evaluation. But I guess those are different words for the same things.
According to the wayback machine, the change happened somewhere between Oct 7 and Oct 10. Interestingly there are no recorded snapshots on Oct 8 and Oct 9, perhaps the redesign caused a couple days of outage.
Haha hey lyc! I didn't forget, you guys were second family! You taught me a lot about maths and code, not sure where I'd be without you :) Learnt more while messing around with fractals and gfx than in all my time at uni.
Honestly, I try not to do much computer stuff in my free time because I'm doing so much day to day, but I'll stop by some time! I've been in the Chaotica discord for years but never said hello.
That's kind of you to say, and I'd love to meet up sometime just for the lolz, it's really far too close (with lots of other cool fractal ppl nearby) not to :)
Do you remember the email you sent me 12 years and 1 month ago, in which (among much other unhinged stuff) you called me a nazi because I mentioned I'm (part) German? I remembered your name; hello again! I see your project didn't go anywhere, that's too bad. I'm sure you'll understand if I decline to invite you to my Discord server :)
Did the switch to NixOS a few months ago on my Thinkpad and ChatGPT worked wonders. I'm not very experienced with Linux distros and have been an Ubuntu user for a long time. I don't think I'll be switching away from NixOS anytime soon, it's great.
The learning curve is still extremely steep but after the initial 10 hours of googling it just all falls into place.
At some point in the video he casually plays a groove on the piano to back the birds for a couple of seconds, then stops ("Wait, what am I doing") :)
You can also see his modular setup in the background.
I didn't know of him until today. Instantly, a new inspiration.
Except making employees do only easy things will make them stagnate. People who do nothing but simple CRUD apps over and over won't even be particularly good at making CRUD apps... whereas the guy who builds a Unicode font renderer in his free time always seems to write better code for some reason.
Getting better at your job is not just a "personal want" but very much something that the employer appreciates as well.
Of course reinventing the wheel isn't good in a corporate setting, because the reinvented wheel is buggier than the ready-made npm package, but employers should go out of their way to find hard problems that they can hand to their employees. It's called a growth opportunity.
You can’t convince an employer with that attitude. They’re gonna keep exploiting their employees and “encourage” them to do their “personal development” in their free time.
Unless you work in enterprise consulting, where employers appreciate replaceable cogs they can randomly drop into any project, and who fit nicely into the project budget regardless of delivery quality.
const res = await fetch(...); const data = await res.json();
"intercept" code that runs before every request?
const withAuth = (resource, options) => fetch(resource, { ...options /* do stuff here */ });
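Filled out a bit, that wrapper pattern might look like this — a sketch only; the name `withAuth`, the token constant, and the header choice are assumptions for illustration, not any particular library's API:

```javascript
// Sketch of a "pre-request intercept" by wrapping fetch.
// `TOKEN` is a made-up stand-in for wherever your credential comes from.
const TOKEN = "example-token";

const withAuth = (resource, options = {}) =>
  fetch(resource, {
    ...options,
    headers: {
      ...options.headers,
      Authorization: `Bearer ${TOKEN}`, // injected before every request
    },
  });
```

Call sites then use `withAuth(url)` exactly like `fetch(url)`, including the `res.json()` step from the earlier snippet.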