Virtualisation is cheap. Docker + Debian stable, with the official Docker apt repository, gives you a workable compromise: a stable (if out-of-date) base system, while still running fast-moving projects that need newer build tools than the distribution ships.
Docker to me is a proof that distro package management has failed. People would rather bundle a snapshot of an entire operating system with every application they use, instead of trying to install a compatible set of packages on the host OS.
Alpine Linux containers can be as small as 8MB. Virtualisation has a cost, but that cost decreases every year. I disagree - many people want an efficient system where the package manager handles all of this - Arch Linux, for example.
In my case, I started by running Arch Linux and installing the packages I wanted - however, aarch64 with proprietary Mesa drivers was a bad time. Too few AUR packages have aarch64 binaries, so I would have ended up compiling nearly everything. Eventually I snapped while trying to write my own PKGBUILD for printer drivers (one that worked on my Arch install and hardware) instead of just using the vendor's install script, which already supported Ubuntu on aarch64.
While a sleek, minimalist system ticking along smoothly is really nice, getting to that "happy place" on many pieces of hardware can be tricky. Constantly getting stuck, with side effects rippling across different programs on my system, led me to take the same approach I take when deploying to production: install a stock OS, get Docker, then run docker compose pointing at your persisted config/data.
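That setup can be sketched as a minimal compose file (the service name, image, and paths here are hypothetical, not from any particular deployment):

```yaml
# Minimal docker-compose sketch: stock host OS runs Docker; the app's
# config and data live on the host so container rebuilds don't lose state.
services:
  app:
    image: example/app:1.2.3    # hypothetical image; pin a tag for reproducibility
    restart: unless-stopped
    volumes:
      - ./config:/etc/app       # persisted config on the host
      - ./data:/var/lib/app     # persisted application data
```

Then `docker compose up -d` on any freshly installed host reproduces the same environment.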
Forcing everybody to use the latest versions is as anti-user as forcing them to use versions from 5 years ago.
What people want is to develop and run their applications with the least amount of friction possible.
That is the purpose of the operating system: make it easier to write and run applications. Anything that gets in the way of that is bad.
What containers and other forms of virtualization provide is the ability to deploy and run applications more cheaply than without them.
The Linux kernel and modern networking are incredibly sophisticated, and the Unix design of the past doesn't let people take full advantage of them in a reasonable way. This is why people switched to containers and virtual machines.
Because, frankly, running one Linux operating system per application actually makes more sense than what people were doing before.
Package management systems, no matter how sophisticated, can't solve these problems.
I'm not saying Docker doesn't work well. But whether you're using a slim OS in Docker or not, you're still choosing not to use your host OS, because your host OS is failing to provide an equally stable, reliable, and reproducible environment for your applications.
The irony is, if you stick to software of its time, you live in GNOME LTS land - it's pretty frictionless and stable. In my experience the bad times start rolling in as soon as you need to compile something - Sway, for example, needs a newer version of Meson than Ubuntu LTS ships, and it has a tight Wayland dependency too. It's the shiny new software (often the kind posted on HN...) that ends up needing the most up-to-date build dependencies.
I agree - it's a failing of package management and of developers, but also of the hardware vendors. If you rely on proprietary blobs, Docker is a way to liberate yourself somewhat from this - you don't need a well-maintained upstream.
It could just be that, as a technology community, we are still collectively getting our heads around what containerization is really about. Maybe as most of the technology herd groks it at a deeper level, it will become more obvious how tools that predate containerization can achieve the qualities we think we can only get from containers.
It may be that, before containers, we weren't conceptualizing our systems at a granular enough level. Containers serving our mental models of isolation may be the true benefit of the trend: even if the underlying technology dies or falls out of fashion, the ideas will surely survive.
Over my career, I have learned to appreciate GNU/Unix more by using containers.
Yes, well - Docker sits atop the lofty shoulders of cgroups. I don't think containerisation is the only way, and the coercion with Flatpak and Snap is frustrating. It's a means to an end - a means that has mass adoption and is proven, even if it's not ideal.
Slightly tangential, but in fact all flight reservations through a travel agency are initially made without payment. The entire process is quite complex, due to compatibility with processes and systems that were built in the 70s... The flow is to create the booking (and hence receive a record locator, or PNR), and then to attach a form of payment. If the payment matches the cost, it is then possible to request ticketing, at which point a ticket number will be generated, which is the "real" confirmation.
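The flow above can be sketched as a tiny state machine. This is purely illustrative - the class and method names are invented for the sketch, and real GDS APIs look nothing like this:

```python
# Illustrative model of the booking flow described above: the reservation is
# created first (yielding a PNR), payment is attached later, and ticketing
# succeeds only once the attached payment covers the fare.
import secrets
import string


def make_pnr() -> str:
    """Generate a 6-character record locator, the usual shape of a PNR."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(6))


class Booking:
    def __init__(self, fare: float):
        self.fare = fare
        self.pnr = make_pnr()       # assigned immediately, with no payment yet
        self.payment = 0.0
        self.ticket_number = None   # only set after successful ticketing

    def attach_payment(self, amount: float) -> None:
        self.payment += amount

    def request_ticketing(self) -> str:
        # Ticketing is refused until the payment matches the cost.
        if self.payment < self.fare:
            raise ValueError("payment does not cover the fare; not ticketed")
        self.ticket_number = f"TKT-{self.pnr}"   # the "real" confirmation
        return self.ticket_number


booking = Booking(fare=420.0)
print(len(booking.pnr))     # 6: a record locator exists before any payment
booking.attach_payment(420.0)
print(booking.request_ticketing().startswith("TKT-"))   # True
```

The key property it demonstrates is the one in the comment: the PNR exists as soon as the booking is created, while the ticket number only appears after payment and ticketing.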
I am a Turkish citizen, and we need a visa to visit EU member countries. You are not 100% sure you will get the visa, so you may want to make a flight and hotel reservation first, and then buy them if the visa comes through. So reserving the tickets is preferred.
Maybe, but other countries still have it. The Vietnamese have one of the worst passports on the planet. Their government won't let them leave without proof that they will come back, usually in the form of a job, cash in the bank, a spouse, or tour- or reservation-based trips.
Many airlines require you to have an onward flight out of a country before they will let you fly into it. This allows you to fulfill the technical requirement without having to pay for a plane ticket you aren't planning to use.
A reservation that is not paid is not ticketed. A reservation that is not ticketed is not 100% guaranteed.
But I agree that this is still a problem for airline inventory management, and an online travel agency should definitely not be opening this feature to the general public!
There are several use cases that require unpaid reservations. The most common one nowadays is corporate travel, where you need to integrate approval flows - the approval step sits between the reservation and the ticketing.
Like just about every industry, there is a world of complexity hidden underneath the surface :)
I don't think so. If it's not customer-facing and not team-focused (e.g., a data entry clerk), then I don't think it would affect one's performance, and thus would not be "valuable" to the job.
How many products need constant feature updates? Aside from security/compatibility, there are plenty of products that already meet everything required of them (for nearly everyone). How many new features does a file manager need? A PDF reader?
For Facebook - I don't think forced subscription would work. People would be forced to think about how much they value the service (a less ubiquitous one at that).
Not only feature updates - also everyday bug fixes, security updates, and operating costs.
1. Operating system
2. Programming IDE
3. Password Manager
4. Well, I pay subscriptions for Netflix and Spotify, but that's not really for the updates - partly for new content, partly for operating costs.
5. Facebook (I'd prefer to cover the operating costs by subscription, I really dislike ads)
6. Email - same as above
Also, I'd actually like all the services I use to provide feature updates now and then and to keep trying to innovate.
Edit: also, if you only do feature updates every now and then, like with Windows or Photoshop, you end up having to support a lot of parallel versions with security and bug fixes, which is wasted work, really.
This one is a total non-starter for subscription pricing. I'm not willing to use software that will potentially cut access to all my accounts if I stop paying them monthly.
Not to mention when the features just become massive bloat and ruin the core features of the product. Chat applications are pretty much the worst offenders for this kind of thing.
> How many products need constant feature updates? Aside from security/compatibility, there are plenty of products that already meet everything required of them (for nearly everyone). How many new features does a file manager need? A PDF reader?
Do you have any example of file managers or PDF readers with a subscription-based model?
True. How many new features do they need? What about word processors? Photo editors? Email clients? There are new features that could be useful, but for many users their needs are already fulfilled. Why should they have to continue paying for Photoshop as a service if they're happy with the current version? I apologise for my poor examples.
> Why should they have to continue paying for Photoshop as a service if they're happy with the current version?
Ultimately - because the person who owns the rights to the software wants to keep charging for it, and since it's their software, they decide what is charged for it. End of story.
If it's not a good deal for you, go somewhere else. There's no point arguing about prices beyond that. There's no 'should' in prices - only what people want to charge and what people want to pay.
Right, they pay more because the company wills it and because people (I think) are more receptive to subscriptions. This is what the author of the article is complaining about, and the reason consumer advocacy groups exist.
Do you have more information on how Russia's eastern regions are becoming more habitable? I know they have a development fund to incentivize people from other regions to move there and develop small businesses (especially in agriculture).
They're much better for agriculture than they used to be (though still pretty poor in Siberia). The change in productivity is wholly due to climate change. Warmer weather leads to a longer growing season, rain instead of snow, etc.
A lot of this land is nutrient packed as well. Siberia could be prime growing land if it gets warm enough. Similar to Ukraine today.
That said, there are a number of concerns about soil resilience in Siberia. If soil resilience doesn't improve, Siberia will remain very poor for agriculture.