
Apple Can Help the FBI Crack a Terrorist’s Phone, They Just Won’t Do It

When individual privacy rights and societal good collide

Does a dead terrorist's right to privacy trump our right to exist in a safe world? A lofty philosophical question indeed, but not quite as simple as it sounds. Let's put the moral debate aside for a moment and examine the technical challenge presented by today's news that Apple has been ordered by a federal judge to help investigators break into the San Bernardino terrorist's iPhone. The iPhone, used by Syed Rizwan Farook, who opened fire at an office party last year, killing 14 and injuring 22, is owned by Farook's employer, which is cooperating with the investigation. FBI investigators believe the phone may contain valuable information and details about the crime, including contacts. So far, investigators have been unable to bypass the phone's encryption, and Apple refuses to help. Numerous friends and clients have asked me whether my firm could break into the phone if given the chance. Here is the scoop.

iPhones manufactured after 2014 are equipped with encryption that locks the data on the phone, rendering it virtually impossible for anyone, including law enforcement and (in theory) even Apple's own engineers, to access the device's data. Apple says it made this encryption available to help consumers protect their private information from hackers and other risks. Law enforcement has cried foul, reminding Apple that a greater public good is trampled when creepy criminals like child molesters and terrorists can hide their misdeeds behind encryption technology. Who is right?

Consider this: the Manhattan district attorney's office has reported that since September of 2014, encryption technology has prevented it from executing 155 search warrants on devices running Apple's iOS 8 operating system, in cases involving homicide, attempted murder, sexual abuse of a child, sex trafficking, assault, and robbery.

Technically, any password can be cracked. We do it in our lab every day using "brute force" password-guessing software that rapidly attacks the phone with every mathematically conceivable combination of passwords until one logs in. It is a math problem: given a few days, we can guess most passwords. The trouble is that some of these newer phones will erase their data after the password has been entered incorrectly ten times. Ouch. Beyond that, the newest smartphones also force a brief delay between login attempts. This stretches the time it takes brute-force password-cracking software to hack into a phone from several days to several years.
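The arithmetic behind that "days to years" jump is simple to sketch. The attempt rates and delay below are assumed for illustration only, not Apple's actual figures:

```python
# Illustrative arithmetic only: the guessing rate and the per-attempt
# delay are assumed numbers, not Apple's real implementation details.

def brute_force_time(digits: int, seconds_per_attempt: float) -> float:
    """Worst-case seconds to try every numeric passcode of a given length."""
    combinations = 10 ** digits          # e.g. 1,000,000 six-digit codes
    return combinations * seconds_per_attempt

# A lab rig hammering a 6-digit passcode at an assumed 5,000 guesses/second:
fast = brute_force_time(6, 1 / 5000)     # roughly 200 seconds

# The same 6-digit space with an assumed forced 60-second delay per attempt:
slow = brute_force_time(6, 60.0)         # 60,000,000 seconds

print(f"Without delay: about {fast / 60:.0f} minutes")
print(f"With delay:    about {slow / 31536000:.1f} years")
```

The delay does not make any single guess harder; it simply multiplies the total search time by orders of magnitude, which is why combining it with the ten-strike erase feature is so effective.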

So yes, Apple does have the technical ability to create a back door that would enable law enforcement to crack this phone. But in doing so, Apple argues, the security-disabling software could easily fall into the wrong hands (national enemies and hackers who want to steal your confidential data). Apple says it will fight the court order.

I believe Apple CEO Tim Cook is quite sincere in the powerful open letter to Apple customers he published this week. He articulates the duty his firm has to protect the security software that keeps our most private information safe. He is right that any backdoor Apple creates may eventually be stolen and used by the bad guys to steal our confidential data, ultimately placing our national security at greater risk.

Now, back to the moral question: do you agree with Apple, or with law enforcement? Personally, I love Apple and its products; I bought my first Mac when it went to market in 1984. I believe Apple's scientists are smart enough to develop a workaround that assists law enforcement without compromising our data privacy should it fall into the wrong hands. I hope they will try. If they do not, our legislature may pass a law limiting encryption in the private marketplace, and the unintended consequences could significantly diminish data security for generations to come.


About Jeff Hartman

Jeff is a 30-year veteran of the corporate security, computer forensics, and eDiscovery community, and a co-founder and partner at 4Discovery. 4Discovery is a leading provider of computer incident response and computer forensics services to attorneys, corporate security executives, and the information protection community.