Apple rejects ‘false claims’ by Justice Department regarding Islamic terrorist attack in Pensacola

U.S. Attorney General William Barr said on Monday that the FBI had cracked the iPhone encryption of the Royal Saudi Air Force trainee who killed three American sailors in a December attack at a U.S. naval base in Florida and found evidence linking him to the Islamic terrorist group al Qaeda.

Barr called on Congress to take action forcing Apple and other tech companies to help law enforcement agencies get through encryption during criminal investigations. “Apple’s decision has dangerous consequences,” Barr said. “Many of the technology companies that advocate most loudly for warrant-proof encryption … are at the same time willing to accommodate authoritarian regimes.”

Christopher Wray, the director of the FBI, said on Monday that this time no third party was able to provide a technique to unlock the phones. He did not detail the methods the FBI devised but said “the technique that we developed is not a fix for our broader Apple problem — it’s of pretty limited application”.

“We received effectively no help from Apple,” he added.

Apple issued a statement in response, which was shared on Twitter by Bloomberg News’ Mark Gurman:

The terrorist attack on members of the US armed services at the Naval Air Station in Pensacola, Florida was a devastating and heinous act. Apple responded to the FBI’s first requests for information just hours after the attack on December 6, 2019 and continued to support law enforcement during their investigation. We provided every piece of information available to us, including ‌iCloud‌ backups, account information and transactional data for multiple accounts, and we lent continuous and ongoing technical and investigative support to FBI offices in Jacksonville, Pensacola, and New York over the months since.

On this and many thousands of other cases, we continue to work around-the-clock with the FBI and other investigators who keep Americans safe and bring criminals to justice. As a proud American company, we consider supporting law enforcement’s important work our responsibility. The false claims made about our company are an excuse to weaken encryption and other security measures that protect millions of users and our national security.

It is because we take our responsibility to national security so seriously that we do not believe in the creation of a backdoor — one which will make every device vulnerable to bad actors who threaten our national security and the data security of our customers. There is no such thing as a backdoor just for the good guys, and the American people do not have to choose between weakening encryption and effective investigations.

Customers count on Apple to keep their information secure and one of the ways in which we do so is by using strong encryption across our devices and servers. We sell the same iPhone everywhere, we don’t store customers’ passcodes and we don’t have the capacity to unlock passcode-protected devices. In data centers, we deploy strong hardware and software security protections to keep information safe and to ensure there are no backdoors into our systems. All of these practices apply equally to our operations in every country in the world.

MacDailyNews Take: A backdoor into iOS products would weaken national security, not strengthen it. Encryption is easily accessible and widely available to anyone who wants to use it.
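The point that encryption is "easily accessible and widely available" is not rhetorical. As a purely illustrative sketch (not anything Apple ships), a few lines of Python's standard library are enough to implement a one-time pad, which is information-theoretically unbreakable when the key is truly random, as long as the message, and never reused:

```python
import secrets


def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """XOR the plaintext with a fresh random key of equal length."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key


def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))


message = b"meet at dawn"
ciphertext, key = otp_encrypt(message)
assert otp_decrypt(ciphertext, key) == message
```

Mandating a backdoor in one vendor's products cannot remove capability like this from the world; it only moves determined bad actors to tools the mandate doesn't reach.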

Encryption is binary; it’s either on or off. You cannot have both. You either have privacy via full encryption or you don’t by forcing back doors upon Apple or anybody else. It’s all or nothing. — MacDailyNews, March 8, 2017

There have been people that suggest that we should have a back door. But the reality is if you put a backdoor in, that back door’s for everybody, for good guys and bad guys. — Apple CEO Tim Cook, December 2015

This is not about this phone. This is about the future. And so I do see it as a precedent that should not be done in this country or in any country. This is about civil liberties and is about people’s abilities to protect themselves. If we take encryption away… the only people that would be affected are the good people, not the bad people. Apple doesn’t own encryption. Encryption is readily available in every country in the world, as a matter of fact, the U.S. government sponsors and funds encryption in many cases. And so, if we limit it in some way, the people that we’ll hurt are the good people, not the bad people; they will find it anyway. — Apple CEO Tim Cook, February 2016


    1. The FBI? No. It wasn’t the FBI who lied about this. It was the Trump crime syndicate’s consigliere, Bill Barr. No surprise; lying is his job description.

  1. Having anyone in the US government (past or present) complain about American companies being “willing to accommodate authoritarian regimes” while talking about Saudi Arabia is rather rich.

    Remember the big fuss the US put up about Jamal Khashoggi?

    Me neither.

  2. Apple needs to pose a very simple question in these situations: Do you want reliable, secure e-commerce, e-business, and e-collaboration or not?

    Having people intercept my personal emails is creepy and unacceptable, but giving others the ability to intercept and decrypt financial transactions and information? That’s what a “back-door” would allow.

    Apple needs to do a mini-infomercial as slick and engaging as their “SnowBrawl,” one which explains that the internet—as we know it and use it in the age of COVID—is completely and utterly dependent on ironclad encryption protocols. Start opening backdoors into devices and the internet, and the whole system collapses.

    Apple could do their own version of “think of the children” which would show their side of the story.

  3. Oh, it’s probably easier than we think. Apple should merely suggest that the only reason bad actors use the iPhone for their secrets is because it’s secure. And only the very naive can think the bad actors would continue to use an iPhone for their secrets were it to be made insecure. They’d simply store their secrets elsewhere. So making the iPhone insecure would accomplish precisely nothing. Not for Apple, not for Apple’s customers, and not for the FBI.

