I actually think the opposite is true. Typically, customers choose macOS over Windows (when price is no object). The features in the Darwin/XNU systems are certainly very customer-focused.
Having said that, anecdotally, while developers may enjoy gaining new capabilities, more often than not when old APIs are deprecated and forcibly replaced with new ones, developers end up upset (see: Metal replacing OpenGL on macOS, or the deprecation of 32-bit applications on macOS).
> Typically customers choose MacOS over Windows (when price is no object).
But when they do so, I think they do it in spite of the lax attitude towards compatibility, due to other factors. Chief among them are the higher prestige of Apple products, the fact that many basic users only run a few programs that are all under heavy development (e.g. browsers), which lessens the sting, and the resignation many regular users feel towards being abused by their technology providers.
If you narrow the question down to one factor, "we're going to break your stuff" versus "we're going to do everything we can to keep your stuff working", I think most people would choose the latter.
I think backward compatibility is a feature, and the lack of it can be considered an anti-feature, as I mentioned in the article:
> As nice as backward compatibility is from a user convenience perspective when this feature comes as a result of a static kernel Application Binary Interface this trade-off is essentially indistinguishable from increasing time-preference (or in other words declining concern for the future in comparison to the present).
I would argue that users are not using the system due to the lack of backward compatibility; rather, my contention would be that this feature comes at a cost that outweighs the benefit (also from the article):
> This can be seen with a continuous layering of hacky fixes, sloppily 'bolted on' feature additions (built in such a way that new additions don't conflict with existing APIs), and unremovable remnants of abandoned code segments left in place purely to ensure that applications continue to run. These issues not only cause their own problems but, in aggregate, cause a huge creep in the number of lines of code in privileged memory. This does not grow the exploitability of a system linearly but rather it causes exploitability to grow exponentially due to the fact that by there being more code to exploit, malicious functionalities can be chained together and made more harmful.
AFAIK OpenGL hasn't been removed, and isn't planned to be in the near future; it's just been deprecated. Apple isn't adopting new versions of OpenGL or Vulkan, but OpenGL code that used to run on macOS still runs on macOS.
> What I am upset about is the removal of a standard that worked well
That is not correct on either count:
1) OpenGL is a terrible fit for modern GPUs so I don't agree that it "worked well"
2) OpenGL was not removed anyway
> OpenGL support has been abandoned
That is not correct. Code continues to be written and maintained to keep OpenGL working on newer GPUs, both OpenGL ES on iOS GPUs, as well as OpenGL on mac GPUs.
> The fact that it is now officially deprecated is just a warning from Apple to remove it soon without bad PR.
macOS/iOS continue to support /many/ deprecated APIs, some of which have been deprecated for over a decade. Contrary to popular opinion, things are not removed just for fun. Things are removed when there is a sound security/technical reason, or when there is a high ongoing cost either to end users or the development process.
The alternative is to be MS and never remove anything, where any change to the observable behavior of the system (or even moving internal struct fields around!) can cause breakage, and so is either not made or requires inserting hacks to preserve behavior. If you think that doesn't impact individual engineers' decision-making ... well, I don't know what to tell you. It must be soul-crushing to know that if you change an internal data structure field from uint16_t to uint32_t, some crappy app that depends on being able to poke around in it will break. Surely such policies encourage some developers to do even more such hacky things, knowing MS will take the blame and end up making sure they keep getting away with it.
> Code continues to be written and maintained to keep OpenGL working on newer GPUs, both OpenGL ES on iOS GPUs, as well as OpenGL on mac GPUs.
Yes, but from the outside this is not necessarily obvious. Just ask the VLC developers: they’re running into breakage constantly on macOS. There were a couple of builds back in the summer where OpenGL just wouldn’t initialize at all, meaning even system components like WebGL, iTunes visualizers, and certain screensavers wouldn’t work correctly (though I did notice that there’s a new one written in Metal that wasn’t affected…)
> The alternative is to be MS and never remove anything,
They do remove stuff all the time; the transitions just happen to be a bit gentler, with more years to prepare for them, although not always. E.g. 16-bit support, MS-DOS support, WinG, WinRT (the Win 8.x variant), WDDM, WCF, Remoting.
No, I am not moving goalposts. OpenGL has been useless for all intents and purposes in macOS for years already. The deprecation is the least of concerns.
> 1) OpenGL is a terrible fit for modern GPUs so I don't agree that it "worked well"
A terrible fit? What are you even talking about? It is as featureful as the latest D3D11, which is the API most games are based on. What is useless is the OpenGL version and drivers that Apple ships.
Please don't spread FUD. There is no security or technical reason, nor any significant cost, to maintaining a proper OpenGL.
32-bit macOS used a fragile ABI for Objective-C, meaning all ivars had to be in public headers and changing any of them changed the runtime layout of your class and _all_ subclasses. AKA adding a field was a breaking change. This imposed a huge maintenance burden up and down the stack. Often classes would include a void* pointer to a side-table where new fields could be added, or the public class was a wrapper around an internal implementation. Both of those have a performance cost (extra pointer chasing + extra mallocs, or double objc_msgSends for every method/property).
There are many other performance optimizations (e.g. non-pointer isa) that were impossible on 32-bit, meaning every app that refuses to move to 64-bit imposes even more performance costs. Shipping 32-bit versions of frameworks also bloats the size of the OS and forces the OS to load a duplicate dyld shared cache. It even makes system updates and installs take twice as long (two shared caches need rebuilding, after all!)
None of this accounts for future optimizations and improvements that would have been nearly impossible if the system still had to support 32-bit without forking half of the userspace frameworks... that kind of forking would be an absolute maintenance nightmare.
The deprecation of 32-bit has been obvious for over a decade. It is based on sound technical reasons. You may not agree with those reasons but they have nothing to do with not caring.
You might instead ask why a developer would choose to ship a 32-bit app/game anytime after 2010 when 64-bit was available and obviously the future? Should all Mac users continue to pay the cost in disk space, memory, and performance? How long should they pay that cost? Were you hoping macOS would support 32-bit applications in 2030? 2040? What about the undeveloped features and unfixed bugs due to the overhead of continuing to support 32-bit?
In an ideal world, 32-bit would continue to be supported for as long as people use 32-bit applications. If there is no good reason to port an application to 64-bit (other than the OS developers forcing it), why do it? Windows supported 16-bit up until Windows 10, and there had been absolutely no mainstream 16-bit desktop applications or even games for over a decade at that point. That seems like a good standard.
The move to 64-bit, on the other hand, is not all upside. There is overhead for pointer-heavy data structures, which kept some applications, mainly in the gaming arena, from preferring it for a long time. I also remember that browsers were very reluctant to move to 64-bit, until the security benefits from ASLR became overwhelming (and web apps started being so memory-hungry that 2-4 GB started looking small).
Also, the clear technical deficiency you cite with Objective-C on 32-bit (the ABI break from adding private fields) is an oft-defended feature of the most-used native programming language in the world, C++. There, it is kept for performance reasons: to avoid an indirection for each structure allocated on the stack.
Overall, my point is that the technical case for 64-bit is not nearly as clear-cut as you make it out to be for many applications. I do agree that the ecosystem case for moving to 64-bit is pretty strong, but I don't think it is enough to say that developers who didn't do it are idiots dragging everyone back.
And just to be clear, I'm not being defensive or bitter: the applications I work on moved to 64-bit as soon as we could, probably 7 years ago, with very good technical reasons to do so; they were very memory-hungry and benefited greatly from having access to more than 2 GB of RAM (on Windows).
What a load of BS. The 32-bit ABI is independent of the 64-bit one. Nobody is asking Apple to backport new changes or APIs to 32-bit because nobody cares about new apps in 32-bit. We do care about not breaking the existing ones, though.
The rest of your arguments about storage size, update time, "optimization", etc. are not just irrelevant; they were also solved more than a decade ago, without penalties, in other operating systems.
The funniest thing is the last one about 2010 devs. Go ahead, go back in time to 2010 and tell everybody to port their hundreds of millions of LOC and third-party dependencies (many without source), and to duplicate their testing cost, just so that Apple can't be blamed for dropping 32-bit support ten years later.
Let me tell you the reality: Apple is a hardware company, not a software company. Apple cares about selling iPhones, not about enterprise long-term support or non-casual gaming. On the other hand, Linux, Microsoft, and other companies care about users and customers, and they get paid for that because they are software shops. That's the difference.
There is a lot of weird stuff in Darwin that makes me wonder: why do they deprecate useful things at other layers and keep this clunky Mach thing? They could switch to FreeBSD or Linux and probably perform better. It is hard to take them seriously when they say they don't like maintaining creaky stuff while Mach is still there...
You might think I don't understand, but I do know about kernel development and yes, the IPC is pretty unique for instance.
But it can be recreated with Unix domain sockets and the like, in exchange for a kernel with more active work going on, not just by a single entity, more wood behind the arrows so to speak. And Mach is dated, from the microkernel-experimentation era of the '80s and '90s, with a lot of overhead of its own.
The fact remains that Linux outperforms macOS on the same hardware. Yes I have run comparisons.
> But it can be recreated with Unix domain sockets and what have you
Or, heck, they could port the IPC and anything else worth keeping. Even just building a private fork of FreeBSD with the Darwin personality grafted on top would be viable.
I think as core counts increase, in 5-10 years we'll see operating systems that run solely on a dedicated low-power core while the other processes run in tickless mode with a more topology-aware scheduler and almost no context switching or core migration.
This is already how game consoles and low latency systems work for the most part.
At that point microservices might become more palatable, since the context switching won't damage the other processes' performance as much. And Linux's performance advantage might dissipate as scheduling and cache pressure become less relevant.
Well, unless they want to play games or run any non-Mac software ... I don't know where you get the idea that customers prefer macOS; apart from some niche communities (designers, some subgroups of developers), this simply doesn't hold.
Only FOSS developers are upset with Metal; everyone else appreciates no longer having to use a clunky 3D API still based on C, and is already taking advantage of Metal in their 3D engines.
Plus, regarding the theme being discussed here, the use of Metal is transparent when using SceneKit, SpriteKit, Core Graphics, and so on.
It is very much not the case that only open source developers prefer cross-platform standards to Metal and all the other vendor-specific APIs.
I have to say that I don't know why you dismiss open source graphics stacks. Anyone who has worked with the open source Mesa knows it is a breath of fresh air compared to the proprietary Qualcomm drivers, Mali drivers, Apple OpenGL drivers, or (horrors) fglrx.
And I've spoken with the Unity engine developers, who say they still consider OpenGL ES 2 support essential. Sure, Unity uses Metal (and I use Metal) because Apple forces us to to get maximum performance, but does anyone really want to write both Vulkan and Metal? Valve wouldn't have acquired MoltenVK if they were really itching to write Metal!
Let's not forget that OpenGL on the Mac is only alive thanks to the NeXT acquisition and Apple's attempts to cater to UNIX devs during its survival years; OpenGL wasn't even on the radar for Copland, which favored QuickDraw 3D instead.
Which, incidentally, was probably the only OS that provided a usable OpenGL SDK, something neither the ARB nor Khronos ever delivered.