It isn't so much that it is hard to build things; it is that it is hard to distribute software for Linux because there are so many targets to build for. Building binaries that work across more than one Linux distro is unreasonably difficult, so people are disinclined to even try.
You can, of course, build static binaries (and people really should do that more often) and distribute tarballs, but then you might run into all manner of licensing issues. (Not to mention the tedious discussion about upgrading shared libraries under the feet of applications and praying it doesn't introduce hard-to-detect breakages.)
One direction I'd love for distro designers to explore is abandoning the traditional splatterfest style of software installs, where an application scatters files across several filesystems and depends on system-wide libraries, and instead putting each application in a self-contained directory. Or rather, two: one for the immutable files (binaries, libraries, etc.) and one for variable state. That way you could choose between static and dynamic linking, because any shared libraries could live inside the immutable application directory.
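A minimal sketch of what such an install could look like. All the paths here (/apps/&lt;name&gt;-&lt;version&gt; for the immutable tree, /var/apps/&lt;name&gt; for state) are invented for illustration, not any distro's actual convention:

```python
import os
import shutil

def install(package_dir: str, name: str, version: str,
            apps_root: str = "/apps", state_root: str = "/var/apps") -> str:
    """Copy an unpacked package into a self-contained, versioned directory.

    The immutable tree (binaries, shared libraries, data files) lives under
    apps_root/<name>-<version>; mutable state gets its own directory under
    state_root/<name>. Hypothetical layout, for illustration only.
    """
    immutable = os.path.join(apps_root, f"{name}-{version}")
    state = os.path.join(state_root, name)

    shutil.copytree(package_dir, immutable)   # bin/, lib/, share/ all in one place
    os.makedirs(state, exist_ok=True)         # config, caches, logs go here

    # Uninstalling is now just: shutil.rmtree(immutable), and optionally the
    # state directory - no scattered files to chase across the filesystem.
    return immutable
```

The point is that install and uninstall become single directory operations, which is exactly what the splatterfest model makes impossible.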
Oh yes, it'll result in a lot more disk use, but disk drives are cheap, and libraries rarely make up much of your disk use anyway. And there is a fair chance you could address some of that with filesystem-level deduplication.
You could also have systems for distributing shared libraries from trusted sources, so applications do not necessarily have to bundle them; instead they could be fetched as part of the install process and managed locally with hardlinks or some form of custom filesystem trickery. This would have a few added benefits (opportunities for supply-chain security, smaller distributed packages, etc.).
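A rough sketch of the hardlink trick, assuming a hypothetical content-addressed store (the /var/lib/libstore path and SHA-256 keying are made up for this example): each application keeps its own lib/ directory, but identical libraries end up sharing a single inode on disk.

```python
import hashlib
import os
import shutil

def dedup_library(lib_path: str, store_root: str = "/var/lib/libstore") -> str:
    """Replace lib_path with a hardlink into a content-addressed store.

    Identical shared libraries across applications become hardlinks to one
    store file, so each extra copy costs a directory entry, not disk blocks.
    The store path and layout are invented for this sketch.
    """
    with open(lib_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    os.makedirs(store_root, exist_ok=True)
    stored = os.path.join(store_root, digest)

    if not os.path.exists(stored):
        shutil.copy2(lib_path, stored)   # first copy seeds the store

    os.unlink(lib_path)
    os.link(stored, lib_path)            # now shares an inode with every other copy
    return stored
```

Checking the digest against a signed manifest before linking is where the supply-chain benefit would come in: you only ever link against library contents a trusted source has vouched for.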
macOS does this only partially: application bundles are largely self-contained, but applications still depend on shared libraries, and you have the horror that is the Library directories, which are somewhat randomly structured, resulting in lots of leftovers when you try to remove applications. macOS has some interesting ideas, but the execution of them is... inconsistent at best.
The reason package managers don't work well is that they are trying to solve a lot of problems we shouldn't be having in the first place. And because nobody wants to simplify the problem they have to solve, distribution maintainers will keep writing package management systems that are a pain in the ass.