Calling these "philosophies" is probably marketing fluff, but the dichotomy is a real thing in interaction design: interacting with a computer as an agent (querying, dialogs, etc.) vs. interacting with a computer as a tool (physically manipulated interfaces). All of the companies cited build products in both categories, but Google (and now increasingly Facebook) definitely skews towards computer-as-agent applications, while Microsoft and Apple skew towards computer-as-tool.
At an organizational level I'm inclined to agree with you that the altruism ascribed by the article is far too optimistic, but I would imagine there are individuals in each company who do think along these lines and try to design accordingly. (I do, however, find Google "giving users back their time" a bit hilarious, as their business model is precisely to get users to spend as much time as possible looking at ads - maybe they free us from mundane tasks so we have more time to browse ads.)
I disagree that this particular dichotomy is a real thing. "Computer as an agent" vs. "computer as a tool" interactions are functions of the problem being solved. You'd think CLI interfaces are tools, but then tools like ls or mysql are clearly agents that you query.
IMO all these companies are on a spectrum of trying to remove the user's agency from the problem. That happens when you build your tools to perform ever more complex queries and actions while making the user unable to see all the relevant details and tweak them. That happens when you dumb down your tools. All of the companies mentioned are guilty of it. Apple software is powerful and mostly very well made, but it keeps getting dumbed down and keeps locking the user out. Microsoft software used to be a "bicycle for the mind", but it too gets dumbed down with every iteration and evolves towards pretty fluff. Google is simply much further along this spectrum; they're not in a separate category.
It's not about productivity. A big part of productivity is interoperability, and that's not something platforms want to give you. It's all about getting you to use their software, so they bait you with pretty interfaces and faux-productivity promises, in the hope that you pay for it (with money or data).
True. And Microsoft and Apple are starting to show worrying signs of drifting to the other side with their own garbage AI "assistants", among other things. Windows 10 supposedly lets you switch its assistant completely off, but the process is a dark pattern in itself: the setting is incredibly esoteric, and there's no telling whether a future strong-arm update might just turn it back on and dramatically expand its capabilities without telling you, except in some obscure never-read EULA footnote - giving it even more power to gobble up your data for dubious advantages. There's really no way to use Windows 10 satisfactorily without hacking at the registry with a machete.
I do think individual developers and designers are good-hearted and altruistic, but the corporate structure is devised to keep that under control and to serve shareholder value above all else.
There was a talk at the latest CCC that likened corporations to very slow, analog AIs that respond to stimuli and attempt to increase share value by any means necessary, over years and generations - and that this slow, silicon-free AI has already taken over our governments and captured almost all regulation. It was an interesting idea.