Part of the reason a company like Cellebrite is obsessive about not telling people specifics about their product capabilities outside of an NDA is that Apple is quite serious about trying to fix these things, and "we can crack every iPhone before the 14" probably tells Apple a fair bit about what might have a flaw.
Tools like that lose a lot of value if anyone paying enough attention can infer they exist, even indirectly: if all the TSA agents you know suddenly switch to Android phones, or some of them tell you not to bring iPhones through security and won't say why, or through a thousand other vectors for rumors to start.
All it takes is enough rumors for people to decide the device can't be trusted any more, and suddenly you've lost a lot of the value of a secret information source.
So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.
There is a difference between targeted software supply chain attacks and weakening encryption for everyone by introducing a master key. Apple would be required by US law to cooperate, and it might never become public either. But as I said, Apple doesn't have to know, or "know". This feature inherently compromises security. Unlike device encryption, OS update security depends on a single key held by Apple (or rather, by several DevOps people...), which could be stolen, leaked or shared.
Would you bet that the NSA can't sign iOS updates?
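To make the single point of failure concrete, here is a minimal sketch of what update acceptance reduces to when a device trusts exactly one vendor key. This is not Apple's actual update pipeline; the key names, the Ed25519 choice, and the Python `cryptography` library are assumptions for illustration only.

    # Minimal sketch, NOT the real iOS update pipeline: the device trusts a
    # single vendor public key, so anyone holding the matching private key --
    # the vendor, a thief, or a coerced party -- can produce an update the
    # device accepts.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )

    # Hypothetical vendor key pair; on a real device only the public half
    # would be baked into the boot chain.
    vendor_private_key = Ed25519PrivateKey.generate()
    vendor_public_key = vendor_private_key.public_key()


    def sign_update(private_key: Ed25519PrivateKey, image: bytes) -> bytes:
        """Produce a signature over an update image."""
        return private_key.sign(image)


    def device_accepts(public_key: Ed25519PublicKey, image: bytes,
                       signature: bytes) -> bool:
        """The device's only check: does the signature verify against the
        one trusted key?"""
        try:
            public_key.verify(signature, image)
            return True
        except InvalidSignature:
            return False


    update = b"hypothetical OS update payload"
    sig = sign_update(vendor_private_key, update)
    assert device_accepts(vendor_public_key, update, sig)               # accepted
    assert not device_accepts(vendor_public_key, update + b"x", sig)    # tampered, rejected

The only point of the sketch is that the trust decision collapses to possession of one private key; a stolen, leaked, or coerced copy of that key produces updates the device cannot distinguish from the vendor's own.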
> So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.
Of course. This is reserved for targeted attacks against journalists and other enemies of the state.
> All it takes is enough rumors for people to decide the device can't be trusted any more, and suddenly you've lost a lot of the value of a secret information source.
None of those articles are inconsistent with the claim that Apple cares about security, though?
"We can be legally compelled to give up data we have" and "we thought letting people have custom kernel modules was a bigger threat" are not particularly incompatible with "we design things so we don't have keys to your data we can be compelled to give up" and valuing people's security. (I am not a fan of the latter, to be clear, but there are reasonable reasons you could argue for it.)
But yes, I would probably, at the moment, bet that if the NSA can sign a custom iOS build for consumer hardware, Apple doesn't know how, both because that's a very hard secret to keep, and because you'd see a massive uptick in people avoiding Apple devices in governments that might be of interest to US intelligence if even a rumor of that got out.
> None of those articles are inconsistent with the claim that Apple cares about security, though?
You are moving the goalpost.
> "We can be legally compelled to give up data we have" and "we thought letting people have custom kernel modules was a bigger threat" are not particularly incompatible with "we design things so we don't have keys to your data we can be compelled to give up" and valuing people's security. (I am not a fan of the latter, to be clear, but there are reasonable reasons you could argue for it.)
They do hold the signing keys your iPhone will gladly accept, which could be used to circumvent its encryption; that is the argument.
I'm not the one moving the goalpost; my argument was that Apple's incentives are not in favor of permitting even the appearance that they might allow that kind of compromise. Your argument, with that wall of articles, appeared to be that Apple has a history of making decisions inconsistent with that, which I disputed. If that wasn't your intended argument, you might wish to be more explicit than a wall of links and "As if Apple users would care...".
> They do hold the signing keys your iPhone will gladly accept, which could be used to circumvent its encryption; that is the argument.
Yes, and my argument is that the plumbing for that, whether multiple release signing keys with one never seen in the wild, or some way of keeping a second "iOS 13.1.5" (or whatever) with different build information from showing up in various telemetry and leaking that this exists, would be very difficult to build without far too many people who would spread rumors about it finding out, and even that rumor would be a problem.
So the most plausible thing, to me, is that if such a capability exists, it's a "nuclear option" for whoever holds it, to be used only in circumstances so important that they don't mind potentially never being able to use it again, whether because it's an exploit chain that will be fixed or because it's been coerced out of the target company and they will probably be compelled to fix it if it gets out.