When a tragic Islamic terrorist attack struck San Bernardino, California, the Federal Bureau of Investigation (FBI), which failed to stop the attack, pointed the finger at Apple. One of the jihadists was a county worker and had a government-issued iPhone. The FBI and the National Security Agency were both unable to unlock the phone or access its memory. When Apple refused to build a backdoor vulnerability for the FBI to access the phone, the law enforcement agency obtained a court order compelling the company to help.

Apple refused to comply with the judge's order.

The FBI asked the judge to compel Apple to comply and even rolled out the red carpet, asserting that the technology company could take custody of the evidence, build the backdoor, let the FBI access it remotely, and then destroy the tool afterward. Part of this was an appeal to the public, too, a way of attempting to force Apple's hand.

Apple then made public what was private. The company, under CEO Tim Cook's leadership, had actually been consulting with the FBI on various methods for hacking the phone. In fact, the FBI had already botched one of the suggested techniques, and the agency wasn't willing to risk another gaffe.

Mysteriously, over a month later, the federal government dropped its case against Apple. It announced that it had found an outside firm to bypass Apple's security measures and gain access to the county-owned iPhone the terrorist had used.

Apple was mum.

What really happened? We'll never know.

See, when it was reported that an Israel-based cybersecurity firm had helped the FBI, the agency used The Washington Post, a favorite dumping ground for intelligence and federal law enforcement officials, to allege that it was in fact an Australian company that had successfully gained access to the iPhone and its iCloud backups. When leading media companies sued to confirm the story, a judge blocked their information requests.

Apple remained mum.

Was Apple successful in defending its hardware and customers' privacy rights—even after it had secretly been consulting with the FBI to train the agency to hack iPhones?

Several years later, another jihadist attacked. This time it was at the Naval Air Station in Pensacola, Florida. We'd later learn that the immigrant jihadist was radicalized the year of the San Bernardino terrorist attack, though there is no known or alleged connection beyond coincidence echoing through time.

The immigrant Islamic terrorist was another iPhone user. In fact, he owned two iPhones. He shot one of his phones and the other was damaged.

When investigators could not access the phones even after obtaining a court order, then-Attorney General Bill Barr blasted Apple, saying the company had not provided "substantive assistance" in the investigation.

The public was once again swept up in a pressure and public-opinion game, as if destiny itself were whittling away at a protection with a succinct narrative: to stop jihadist terrorist attacks, we need backdoors to iPhones and their online backups. Never mind the fact that the FBI didn't stop these immigrant jihadist attacks in the first place.

As in the San Bernardino saga, Apple went public.

It turns out, Apple had been helping the FBI all along—again. Cook's company had provided the government with "gigabytes of information" including "iCloud backups, account information and transactional data for multiple accounts."

Months later, the Department of Justice and the FBI announced that they had gained access to the iPhones by other means.

Each time Apple has been approached by the Department of Justice, it has cooperated by consulting with the government, but it warned that building a backdoor would fundamentally make all of its customers less safe and more vulnerable to attack.

In the Pensacola case, the company argued, "We have always maintained there is no such thing as a backdoor just for the good guys. Backdoors can also be exploited by those who threaten our national security and the data security of our customers."

This only echoed the position Apple had stated during the San Bernardino case: "The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."

Apple correctly asserts that hardware built to be hacked will be hacked.

Brett Max Kaufman, an attorney with the left-wing American Civil Liberties Union, characterizes this recurring public feud, in which the security state attempts to manufacture public opinion in order to get Apple to build it a universally applicable backdoor, this way: "Every time there’s a traumatic event requiring investigation into digital devices, the Justice Department loudly claims that it needs backdoors to encryption, and then quietly announces it actually found a way to access information without threatening the security and privacy of the entire world."

What no one knew was that sometime between the two tragedies, Apple had decided to scrap its plan to offer end-to-end encryption for iCloud backups. The catalyst for this decision? Complaints from none other than the FBI.

End-to-end encrypted backups would mean that Apple wouldn't have a copy of the key allowing them to access a particular user's data.
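
To make that concrete, here is a minimal sketch in Python of what "Apple doesn't have the key" means, using the third-party cryptography package purely for illustration. This is not Apple's protocol; a real system involves key derivation, escrow decisions, and recovery flows far beyond this. It is only a picture of who holds the key.

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# In an end-to-end scheme, the key is generated and kept on the user's device.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

# Only ciphertext ever leaves the phone.
backup_blob = cipher.encrypt(b"contacts, photos, messages...")

# The provider can store backup_blob, but without device_key it cannot read it,
# and it has nothing useful to hand over in response to a court order.
restored = Fernet(device_key).decrypt(backup_blob)
assert restored == b"contacts, photos, messages..."
```

By contrast, under the design Apple kept, the provider retains a copy of the key, or the data in a form it can decrypt, which is precisely what makes an iCloud backup responsive to a subpoena.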

It appears that the high-level Apple sources, speaking to Reuters two years after the fact, were attempting to help the company push back against the government's allegations during the Pensacola case.

Right on time, the pandemic arrived.

Apple has been a willing participant in everything COVID-19. It has aided governments around the world with contact tracing, identifying and alerting targeted individuals, encouraging lockdowns, and auto-installing phone applications.

There were no complaints from the government, no privacy or antitrust cases litigated. Apple and the world's governments were one.

It has paid off. Apple's profits have reportedly doubled during the pandemic.

It's almost difficult to pause and appreciate that, in the span of a single year, we've arrived in a world that is wildly more profitable for the giants and less profitable for the people, a world in which we are more surveilled while the spies, both government and corporate, face even less accountability. There appears to be a correlation.

What's unsurprising but still outrageous is that Apple has sprung a new proposal on its customers. Unlike the time Tim Cook was defending the San Bernardino terrorist's government-issued iPhone, this time he doesn't believe "[t]his moment calls for public discussion, and we want our customers and people around the country to understand what is at stake."

No, full steam ahead. Later this year, when you update your phone's operating system, the change will take effect—no calls for public discussion.

Apple is now ready to build a backdoor. In fact, it has already built two of them. The first backdoor will monitor any uploads from your iPhone to iCloud. Your images and videos will be scanned by client-side machine-learning software installed on your phone, checked against a third-party database, and, if the software believes those media files exploit children, authorities will be alerted. The second backdoor will monitor iMessages (previously end-to-end encrypted) sent to or from confirmed minors. If something explicit is sent or received, it can alert a parent.
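
To see mechanically what client-side scanning involves, here is a deliberately simplified Python sketch. It uses an exact cryptographic hash and a plain lookup table; Apple's announced design used a perceptual "NeuralHash," a blinded database, and cryptographic thresholding before anything was revealed, so every name, number, and step below is illustrative rather than a description of Apple's implementation.

```python
import hashlib
from pathlib import Path

# Stand-in for the third-party database of known abusive material.
# In the announced design this was a blinded perceptual-hash database,
# not a plain set of SHA-256 digests.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}
REPORT_THRESHOLD = 3  # hypothetical: only escalate after several matches


def scan_before_upload(photo_paths):
    """Hash each file queued for upload and collect any matches against the database."""
    matches = []
    for path in photo_paths:
        digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
        if digest in KNOWN_BAD_HASHES:
            matches.append(path)
    return matches


def maybe_flag(photo_paths):
    matches = scan_before_upload(photo_paths)
    if len(matches) >= REPORT_THRESHOLD:
        # In the real system a "safety voucher" would go to the server for review;
        # this sketch only prints, because the point is the architecture.
        print(f"{len(matches)} files matched the database; flagging for review")
    return matches
```

The detail that matters is where this runs: the matching logic, the database, and the reporting path all live on the device you own, which is why critics call it a backdoor. Whoever controls the database controls what gets flagged.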

It goes without saying but should be said so that we're clear: we should fight child abuse and protect children in their electronic interactions, particularly from the filth found online and from predatory adults.

Corporate actors could do more to protect children online and deliberately choose not to. Twitter and YouTube are filled with pornography and predators. These companies spend their resources on policing political content and building algorithms to suppress social conversations they don't want to trend, but they don't put the same resources behind removing content that harms children.

None of what Apple is doing is in good faith. They are wrapping our values around their agenda to make our devices weaker. These backdoors will be exploited by criminal hackers, foreign adversaries, and government spies.

It's not a coincidence that they're doing this during a pandemic lockdown.

I agree with the Electronic Frontier Foundation when they warn that this is "not a slippery slope; [it's] a fully built system just waiting for external pressure to make the slightest change."

We've come full circle. Apple will build universal backdoors into the devices it sells; it has ended plans to let us securely encrypt our data end to end; and it is surveilling us and making a lot of money doing it.

Cook rightly warned the public that building the technical ability to bypass the security of a closed system, even for a single purpose, invites inevitable social pressure that makes us all less safe. How does he justify today's backdoor with yesteryear's comments? Couldn't we better protect children by having Apple use its App Store monopoly to filter content served by applications? Has Apple considered not selling its devices to minors? What limiting principle prevents Apple from working with totalitarian governments? Will Apple publicly report how many minors it successfully protects?

What's Apple's real endgame for client-side media scanning?

I suspect Apple will end up using this backdoor to subvert end-to-end encrypted applications running on its devices. It will also use it for ad tech, and the government will never punish the company, because the government will want to use the same backdoor. The harm here is enormous.

This is only the beginning. Having the technical ability to scan our images and videos on the client side, check them against a third-party database, and send signals in and out gives anyone who gains access to that machinery the ability to do almost anything.

FBI sources eventually admitted to The Los Angeles Times that the San Bernardino jihadist's government-issued iPhone never held material relevant to the attack. As for the jihadist in Pensacola, no new information was ever released, or presumably discovered, that hadn't already been confirmed, without access to the iPhones, months prior. However, the federal government alleges that after it gained access to the phones, the US conducted an operation against an overseas al-Qaeda operative.

Our convenience is the end of our privacy and the beginning of their profit and the surveillance state. This is the beginning of the technocratic corporate security state.