
The All-Seeing "i": Apple Just Declared War on Your Privacy

By Edward Snowden, Edward Snowden's Substack

26 August 2021


“Under His Eye,” she says. The right farewell. “Under His Eye,” I reply, and she gives a little nod.

By now you've probably heard that Apple plans to push a new and uniquely intrusive surveillance system out to many of the more than one billion iPhones it has sold, which all run the behemoth's proprietary, take-it-or-leave-it software. This new offensive is tentatively slated to begin with the launch of iOS 15⁠—almost certainly in mid-September⁠—with the devices of its US user-base designated as the initial targets. We’re told that other countries will be spared, but not for long.

You might have noticed that I haven’t mentioned which problem it is that Apple is purporting to solve. Why? Because it doesn’t matter.

Having read thousands upon thousands of remarks on this growing scandal, I've come to see that many understand it doesn't matter, but few if any have been willing to actually say it. Speaking candidly, if that’s still allowed, that’s the way it always goes when someone of institutional significance launches a campaign to defend an indefensible intrusion into our private spaces. They make a mad dash to the supposed high ground, from which they speak in low, solemn tones about their moral mission before fervently invoking the dread spectre of the Four Horsemen of the Infopocalypse, warning that only a dubious amulet—or suspicious software update—can save us from the most threatening members of our species.

Suddenly, everybody with a principled objection is forced to preface their concern with apologetic throat-clearing and the establishment of bona fides: I lost a friend when the towers came down, however... As a parent, I understand this is a real problem, but...

As a parent, I’m here to tell you that sometimes it doesn’t matter why the man in the handsome suit is doing something. What matters are the consequences.

Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.

How?

The task Apple intends its new surveillance system to perform—preventing their cloud systems from being used to store digital contraband, in this case unlawful images uploaded by their customers—is traditionally performed by searching their systems. While it’s still problematic for anybody to search through a billion people’s private files, the fact that they can only see the files you gave them is a crucial limitation.

Now, however, that’s all set to change. Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and—yada, yada, yada—if enough "forbidden content" is discovered, law-enforcement will be notified.

I intentionally wave away the technical and procedural details of Apple’s system here, some of which are quite clever, because they, like our man in the handsome suit, merely distract from the most pressing fact—the fact that, in just a few weeks, Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.

Why is this so important? Once the precedent has been set that it is fit and proper for even a "pro-privacy" company like Apple to make products that betray their users and owners, Apple itself will lose all control over how that precedent is applied. As soon as the public first came to learn of the “spyPhone” plan, experts began investigating its technical weaknesses, and the many ways it could be abused, primarily within the parameters of Apple’s design. Although these valiant vulnerability-research efforts have produced compelling evidence that the system is seriously flawed, they also seriously miss the point: Apple gets to decide whether or not their phones will monitor their owners’ infractions for the government, but it's the government that gets to decide what constitutes an infraction... and how to handle it.

For its part, Apple says their system, in its initial, v1.0 design, has a narrow focus: it only scrutinizes photos intended to be uploaded to iCloud (although for 85% of its customers, that means EVERY photo), and it does not scrutinize them beyond a simple comparison against a database of specific examples of previously-identified child sexual abuse material (CSAM).
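The comparison Apple describes can be pictured in miniature. The sketch below is a loose, hypothetical illustration only: Apple's actual design uses a perceptual hash ("NeuralHash") and a private set intersection protocol, not a plain cryptographic hash, and the function and database names here are invented for this example.

```python
import hashlib

# Illustrative stand-in values: a real deployment would hold perceptual
# fingerprints of previously identified material, supplied by NCMEC.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Hash a photo's raw bytes (a crude stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def flag_before_upload(photo: bytes) -> bool:
    """On-device check: does this photo match the database of known material?"""
    return fingerprint(photo) in KNOWN_BAD_HASHES
```

The point of the sketch is structural, not cryptographic: the matching happens on the device, against a database the owner neither chose nor can inspect, before the photo ever leaves the phone.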

If you’re an enterprising pedophile with a basement full of CSAM-tainted iPhones, Apple welcomes you to entirely exempt yourself from these scans by simply flipping the “Disable iCloud Photos” switch, a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand. As long as you keep that material off their servers, and so keep Apple out of the headlines, Apple doesn’t care.

So what happens when, in a few years at the latest, a politician points that out, and—in order to protect the children—bills are passed in the legislature to prohibit this "Disable" bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud? What happens when a party in India demands they start scanning for memes associated with a separatist movement? What happens when the UK demands they scan for a library of terrorist imagery? How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering “extremist” political material, or about your presence at a "civil disturbance"? Or simply about your iPhone's possession of a video clip that contains, or maybe-or-maybe-not contains, a blurry image of a passer-by who resembles, according to an algorithm, "a person of interest"?

If Apple demonstrates the capability and willingness to continuously, remotely search every phone for evidence of one particular type of crime, these are questions for which they will have no answer. And yet an answer will come—and it will come from the worst lawmakers of the worst governments.

This is not a slippery slope. It’s a cliff.

One particular frustration for me is that I know some people at Apple, and I even like some people at Apple—bright, principled people who should know better. Actually, who do know better. Every security expert in the world is screaming themselves hoarse now, imploring Apple to stop, even those experts who in more normal circumstances reliably argue in favor of censorship. Even some survivors of child exploitation are against it. And yet, as the OG designer Galileo once said, it moves.

Faced with a blistering torrent of global condemnation, Apple has responded not by addressing any concerns or making any changes, or, more sensibly, by just scrapping the plan altogether, but by deploying their man-in-the-handsome-suit software chief, who resembles the well-moisturized villain from a movie about Wall Street, to give quotes to, yes, the Wall Street Journal about how sorry the company is for the "confusion" it has caused, but how the public shouldn't worry: Apple “feel[s] very good about what they’re doing.”

Neither the message nor the messenger was a mistake. Apple dispatched its SVP-for-Software Ken doll to speak with the Journal not to protect the company's users, but to reassure the company's investors. His role was to create the false impression that this is not something that you, or anyone, should be upset about. And, collaterally, his role was to ensure this new "policy" would be associated with the face of an Apple executive other than CEO Tim Cook, just in case the roll-out, or the fall-out, results in a corporate beheading.

Why? Why is Apple risking so much for a CSAM-detection system that has been denounced as “dangerous” and "easily repurposed for surveillance and censorship" by the very computer scientists who've already put it to the test? What could be worth the decisive shattering of the foundational Apple idea that an iPhone belongs to the person who carries it, rather than to the company that made it?

Apple: "Designed in California, Assembled in China, Purchased by You, Owned by Us."

The one answer to these questions that the optimists keep coming back to is the likelihood that Apple is doing this as a prelude to finally switching over to “end-to-end” encryption for everything its customers store on iCloud—something Apple had previously intended to do before backtracking, in a dismaying display of cowardice, after the FBI secretly complained.

For the unfamiliar, what I’m describing here as end-to-end encryption is a somewhat complex concept, but briefly, it means that only the two endpoints sharing a file—say, two phones on opposite sides of the internet—are able to decrypt it. Even if the file were being stored and served from an iCloud server in Cupertino, as far as Apple (or any other middleman-in-a-handsome-suit) is concerned, that file is just an indecipherable blob of random garbage: the file only becomes a text message, a video, a photo, or whatever it is, when it is paired with a key that’s possessed only by you and by those with whom you choose to share it.
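The flow described above can be sketched in a few lines. This is a toy model, not a real cipher: actual end-to-end systems use authenticated encryption (such as AES-GCM) and a key-exchange protocol, whereas this example uses a one-time-pad XOR purely to show who holds what.

```python
import secrets

def encrypt(key: bytes, message: bytes) -> bytes:
    """One-time-pad XOR: toy illustration, NOT a production cipher."""
    assert len(key) >= len(message)
    return bytes(k ^ m for k, m in zip(key, message))

decrypt = encrypt  # XOR is its own inverse

shared_key = secrets.token_bytes(32)   # known only to the two endpoints
message = b"hello from one endpoint"

blob = encrypt(shared_key, message)    # what the server would store
assert blob != message                 # the middleman sees only noise
assert decrypt(shared_key, blob) == message  # key-holders recover it
```

The server in this picture stores and serves `blob` without ever holding `shared_key`, which is exactly the property that makes the stored file "an indecipherable blob of random garbage" to anyone in the middle.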

This is the goal of end-to-end encryption: drawing a new and ineradicable line in the digital sand dividing your data and their data. It allows you to trust a service provider to store your data without granting them any ability to understand it. This would mean that even Apple itself could no longer be expected to rummage through your iCloud account with its grabby little raccoon hands—and therefore could not be expected to hand it over to any government that can stamp a sheet of paper, which is precisely why the FBI (again: secretly) complained.

For Apple to realize this original vision would have represented a huge improvement in the privacy of our devices, effectively delivering the final word in a thirty year-long debate over establishing a new industry standard—and, by extension, the new global expectation that parties seeking access to data from a device must obtain it from that device, rather than turning the internet and its ecosystem into a spy machine.

Unfortunately, I am here to report that once again, the optimists are wrong: Apple’s proposal to make their phones inform on and betray their owners marks the dawn of a dark future, one to be written in the blood of the political opposition of a hundred countries that will exploit this system to the hilt. See, the day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used.

I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices—and I can’t think of a threat more dangerous to a product’s security than the mischief of its own maker. There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.

I would say there should be a law, but I fear it would only make things worse.

We are bearing witness to the construction of an all-seeing-i—an Eye of Improvidence—under whose aegis every iPhone will search itself for whatever Apple wants, or for whatever Apple is directed to want. They are inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner.

To put it bluntly, this is not an innovation but a tragedy, a disaster-in-the-making.

Or maybe I'm confused—or maybe I just think different.


Comments  

We are concerned about a recent drift towards vitriol in the RSN Reader comments section. There is a fine line between moderation and censorship. No one likes a harsh or confrontational forum atmosphere. At the same time everyone wants to be able to express themselves freely. We'll start by encouraging good judgment. If that doesn't work we'll have to ramp up the moderation.

General guidelines: Avoid personal attacks on other forum members; Avoid remarks that are ethnically derogatory; Do not advocate violence, or any illegal activity.

Remember that making the world better begins with responsible action.

- The RSN Team

 
+8# elizabethblock 2021-08-26 13:40
It seems to me that if Apple, or the government, or whoever, knows everything, they know nothing. TMI. But I may be wrong. I probably am.
And anyway, think about what they DO know, e.g. Proud Boys' plans in Portland, and don't do anything about it.
 
 
+8# ReconFire 2021-08-26 16:50
All Americans need to realize "big tech" companies are not our friends, they've almost all been bought in one way or another by our "intelligence" agencies, because they aren't restricted by the same laws that government agencies are. What they know about us will only be used against us, or to further the government's interest towards fascism or authoritarianism, not to right any "wrongs".
 
 
+9# ladymidath 2021-08-26 15:51
'This is not a slippery slope. It’s a cliff.' It is a cliff indeed, a terrifying cliff that hangs over an abyss. So many governments will see this as a gift to be able to use against their citizens in order to spy on them and arrest and/or harass them to silence dissent. In Australia, we have already had the disgusting and disturbing event of a state Premier sending his personal goon squad, an anti-terrorist unit, to arrest a YouTube producer because the YouTuber comedian and social commentator was exposing the Premier's corruption. Surveillance is out of control as it is, with governments spying on their populace. The 'won't someone think of the children' is a specious argument and one designed to stop debate. Apple seems to be poised to cross a Rubicon and we will all end up being worse off for it.
 
 
-4# barryg 2021-08-26 17:29
Wow, I've been an Apple fan for 30 years. Now I can never own another iPhone. This will probably happen to all phones.

And you think COVID vaccines are not part of this. The Israelis just announced that the vaccines do not work. But you'll have to get vaccinated anyway. I'm a scientist, so don't tell me there is any science supporting these gene therapies.

That may look off-topic. I'm just pointing out that there are so many avenues of repression manifesting.
 
 
-1# Steppen-Wolf 2021-08-27 17:08
////

I'm so sincerely sorry that the avoiders and deniers of truth are voting your nothing-but-facts comment into the red. I wish I could vote it into the green.

These RNA vaccines are extremely dangerous, and they are causing thousands (soon to be millions?) of people to have serious adverse reactions and/or effects after getting vaccinated with them.

Now, I just got word that a guy I was conversing with in my HUD apartment complex's laundry room five days ago, has contracted covid (presumably the so-called "delta variant"), so I may have been exposed to it. I can only hope that this isn't some ploy to attempt to, in violation of my human and civil rights, force me, an autoimmune and/or immune-compromised elderly-disabled man, to get vaccinated, a vaccination that would very likely kill me just as infection with the virus would (so I very likely wouldn't be a carrier long enough to infect anyone---I'm now self-isolating and staying inside my apartment until fourteen days have passed from when I may have gotten infected by that guy).

The eugenicists are pulling out all the stops, and seeking to force vaccination and "vaccine passports", like never before. Thus, we are truly in an extremely-evil, dystopian time.

(Continued below.)
 
 
-3# Steppen-Wolf 2021-08-27 17:09
////

And we have most "Americans" already completely brainwashed to bow down to and support all of this tyranny, people who will undoubtedly rat on the unvaccinated and seek to get them into trouble with the government. Talk about draconian!

All of us, much more than from the virus and the "vaccines" themselves, are extremely, and increasingly, unsafe at the hands of government; and, as I've long been saying, we ain't seen nothin' yet. Soon the government will be murdering people who refuse to get vaccinated, if they aren't doing so already.

We're so "frelled" and/or "fracked"!

So, people, in case I likely don't live much longer (won't some if not most of you be so glad if I don't make it---don't lie!), I take this opportunity to say goodbye, and to send the message that I truly wish ALL OF YOU, and "yours", NOTHING BUT THE VERY BEST.

////
 
 
0# Questioner 2021-08-26 19:54
If we constantly demand and expect that our "Big Brothers" keep us "safe", don't we have to cede to them the tools to do that? What do we have to give up to be "kept safe"?
 
 
-5# advokaat 2021-08-26 22:31
"Or maybe I'm confused—or maybe I just think different." Snowden writes this from his "sanctuary" in Russia. Does he not see the irony?
 
 
+2# economagic 2021-08-28 19:56
Do YOU not see the irony that this branch of the Evil Empire is pointing out that the Land of the Freebooters and Home of the Freeloaders is neither free nor brave but at war with its own citizens, ostensibly "for their own good"?

When a dying empire cannot trust its own citizens--or merely refuses to do so--the fuse has been lit.
https://readersupportednews.org/opinion2/277-75/71172-focus-the-all-seeing-qiq-apple-just-declared-war-on-your-privacy
