Written Testimony for the United States Senate Committee on the Judiciary on Smartphone Encryption and Public Safety


December 10, 2019

Good morning Chairman Graham, Ranking Member Feinstein, and members of the Senate Judiciary Committee. On behalf of my Office and our partners in state and local law enforcement, I thank the Committee for its work and attention to this vital issue of local, state, and national public safety.

The single most important criminal justice challenge of the last ten years is, in my opinion, the use of mobile devices by bad actors to plan, execute, and communicate about crimes. Just as ordinary citizens rely on digital communication, so do people involved in terrorism, cyber fraud, murder, rape, robbery, and child sexual assault.

For this reason, lawful, court-ordered access to these communications has become essential for us to prevent crime, to hold people accused of crimes accountable, and to exonerate the innocent.

Until the fall of 2014, Apple and Google routinely provided law enforcement with access to the contents of the mobile phones they manufactured when presented with a court-ordered search warrant. That changed when they rolled out their first mobile operating systems that, by design, often make the contents of smartphones completely inaccessible. In doing so, Apple and Google effectively upended centuries of American jurisprudence holding that nobody’s property is beyond the reach of a court-ordered search warrant.

In 2014, my Office stood in the vanguard of American law enforcement sounding the alarm about the dangers of default smartphone encryption.[1] In subsequent years, I have delivered this call in testimony to the U.S. House and Senate, including members of this committee in 2015,[2] and joined with law enforcement leaders in the U.S.[3] and Europe[4] in op-eds that explained the public safety import of this issue. My Office has also published five annual reports on Smartphone Encryption and Public Safety providing unique and valuable data and analysis on this topic.[5]

Apple and Google, meanwhile, have framed this issue as an either/or proposition: we can have user privacy or we can have lawful access, they say, but we can’t have both. And they’ve been successful in propagating this message, even though it’s not true.

My Office is not anti-encryption. Far from it. We routinely use encryption in the course of our daily work, whether in guarding our city’s critical infrastructure against cybersecurity threats or soliciting tips on crimes against immigrant New Yorkers, and we recognize its value in our society and across the world. That does not mean encrypted material should be beyond the law when a judge signs a search warrant – especially when we’re talking about evidence tied to a child sex abuse case or a potential terrorist attack.

Apple and Google have maintained their absolutist position that no form of lawful access can be reconciled with privacy concerns. Yet they have not demonstrated to law enforcement leaders what damaging effects, if any, their pre-2014 cooperation with law enforcement had on user privacy.[6] Further, they have decided for their own private business interests that the Fourth Amendment grants a right, not just to privacy, but to anonymity. This is wrong, and it upends the careful balance our Constitution strikes between privacy and public safety interests.

I. HOW SMARTPHONE ENCRYPTION AFFECTS PROSECUTORS AND VICTIMS OF CRIME

So how has default smartphone encryption affected law enforcement and crime victims? Let me answer that question with two brief examples from my own Office.

The first involves child sexual abuse. A babysitter at a local church in Manhattan was identified as having shared images of child sexual assault online. Pursuant to a search warrant, his encrypted mobile phone and other devices were seized. Over time, we opened the devices using technology from a paid consultant. We then discovered the suspect was not only sharing images of child sexual assault, but sexually abusing children himself and recording the abuse as well. Based on this evidence, we charged him and a jury convicted him of predatory sexual assault of children.[7] He was subsequently sentenced to 100 years to life in prison.[8]

In the second example, we were not so lucky. My Office was investigating a case of sex trafficking, and obtained an encrypted phone from a suspect who was incarcerated on a different case. In a recorded telephone call from prison, the suspect told an accomplice that he hoped his phone had the newest encrypted operating system.

The inmate said to his friend, “Apple and Google came out with these softwares that can no longer be [un]encrypted by the police … [i]f our phone[s are] running on iOS8 software, they can’t open my phone. That may be [a] gift from God.”

In fact, we were never able to view the contents of his phone because of this gift to sex traffickers that came, not from God, but from Apple. As a result, our investigation of sex trafficking was blocked by encryption.

II. A GROWING PROBLEM WITH RAMIFICATIONS FOR OUR PUBLIC SAFETY AND ENTIRE SYSTEM OF JUSTICE

Our most recent internal data from our fifth annual report on Smartphone Encryption and Public Safety[9] puts this growing problem into sharp relief:

First, my Office receives, in criminal investigations, on average 1,600 mobile devices each year, with almost half of those being Apple devices. The percentage of locked Apple devices has increased substantially over the past five years, from 60 percent in 2014 to more than 82 percent of those obtained over the past two years. That means, for Apple devices alone, we receive over 600 locked and encrypted devices each year.

Second, more than 50 percent of the mobile devices that we’ve received this year are connected to investigations into crimes of violence, such as homicides, sex crimes, and assaults.

Our statistics illustrate the alarming frequency with which smartphone encryption forces my Office to investigate and prosecute our city’s most serious criminal offenses without access to key evidence. To be clear, we are in some cases able to gain entry into these phones by using lawful hacking tools we’ve paid hundreds of thousands of dollars to private companies to obtain.

In one notable case, a forensic search of an armed robbery and kidnapping suspect’s phone made us aware of numerous text messages that had been exchanged between various unknown parties at or near the time of the kidnapping. These messages had been deleted and were not viewable by investigators until, after months of attempts, a third-party vendor helped us access the deleted texts, which had been exchanged before, during, and after the kidnapping. This new evidence helped us identify and charge three other culprits.

Such third-party workarounds are cost prohibitive, however, for all but a handful of local law enforcement agencies, like mine in Manhattan. They are simply out of reach for many of the smaller and rural communities that you represent. And the price we pay doesn’t guarantee access, since the process doesn’t work in roughly half the cases. The paid workarounds simply give us a better chance of getting into a phone using automated guesses, and Apple and Google have methods to slow down our rate of guessing. This cat-and-mouse game[10] can stretch across weeks, months, or even years, and that timeline is unacceptable for a criminal justice system that has strict statutes of limitations and speedy trial requirements.

This issue also matters in another important way that few people appreciate: in a number of significant cases, our ability to open and access phones has led to the exoneration of people wrongly suspected of or arrested for crimes.

In one such case, two defendants were identified by eyewitnesses as part of a gang assault in which a large group of people attacked three men and two women. Based on evidence successfully extracted from an encrypted phone, it was determined that the defendants were not present for the assault at all, and they were exonerated prior to trial.

I believe everyone on this committee and Americans generally want to avoid miscarriages of justice. So do I. Our ability to access devices enables us to fulfill our two-fold obligation – to hold the guilty responsible and to protect the innocent from injustice.

III. SMARTPHONE ENCRYPTION IS A LOCAL LAW ENFORCEMENT PROBLEM

The smartphone encryption debate is often framed as a national security issue. The F.B.I. reportedly paid $900,000 to have a private vendor unlock the San Bernardino shooter’s iPhone after Apple told authorities it could not access the device.[11] The mass shooters at Sutherland Springs, Texas[12] and Dayton, Ohio[13] also left behind locked phones that stymied the completion of investigations – investigations that might help communities and law enforcement stop the next mass shooter.

While these are obviously important national cases that demand significant attention and resources, I believe the smartphone encryption debate should center more on the threat it poses to local security in towns across our nation. The majority of the collateral damage caused by locked mobile devices occurs at the local and state levels, where it is estimated that up to 95 percent of American criminal cases are handled. Prosecutors in your home states are all now facing these intractable challenges.

The impact is felt across the country. For instance, it is my understanding that the Florida Department of Law Enforcement alone possessed 418 locked devices earlier this year. In addition, the Raleigh (N.C.) Police Department had 281, the Tennessee Bureau of Investigation had more than 100, and the Charleston County (S.C.) Sheriff’s Office had 70.

As I noted earlier, the workarounds by third-party vendors that sometimes succeed for our Office are not an option for most local prosecutors’ offices, due to the prohibitive costs involved. Thus, two versions of justice exist: one for major cities that can afford such workarounds, and a second for smaller agencies that lack the financial means.

Why should justice be made unattainable for victims in these localities for the sake of Apple and Google’s bottom line?

Their decisions to advertise privacy above all else make a loud statement that they’re not concerned about victims in cases where key evidence is inaccessible on locked devices. Earlier this year, no less an authority than Rene Mayrhofer, Google’s Director of Android Platform Security, belittled the locking out of law enforcement as an “unintended side effect”[14] of its latest security features.

Unintended or not, the reality remains that these tech titans are doing tremendous damage to our justice system, particularly justice at the local and state levels, by choosing to render themselves incapable of complying with a judge’s signed order.

IV. WHY THE CLOUD IS NOT A SUBSTITUTE FOR LAWFUL ACCESS

Law enforcement is often told that we do not need access to a mobile device to conduct a thorough investigation. Proponents of smartphone encryption say we are living in a “golden age of surveillance,” and we should therefore obtain evidence from alternative sources, such as data saved on “the cloud.”

My Office does, in fact, regularly obtain evidence from cloud providers pursuant to search warrants, in the form of emails, photographs or videos, and other data that has been backed up from a device.

However, the cloud is an imperfect and incomplete solution to the encryption problem, since the most critical evidence is often only available on a device itself.

This is true for three main reasons:

  1. More storage exists on devices than in the cloud. For instance, the iPhone 11 and iPhone 11 Pro come equipped with a minimum of 64 gigabytes of storage (and, in the case of the iPhone 11 Pro, a maximum of 512 gigabytes). Meanwhile, Apple provides only 5 gigabytes of free storage on iCloud by default.[15] Therefore, not all information can be backed up to iCloud unless a user purchases additional storage.
  2. Even if a user chooses to purchase more storage, the user has the option to choose which applications to back up to iCloud. A user can simply decide not to back up communications, videos, or photos that are incriminating or otherwise critical to an investigation. The user can also opt out of backing up data to iCloud entirely.
  3. Data is available through the cloud only when it has been saved to the cloud. Often a device that is in use during the commission of a street crime – such as a robbery or shooting – is recovered before the evidence is saved by the device to the cloud. The only way to access that data is through the device itself.

V. CHANGING WINDS, DISPELLING MYTHS

Ideally, Apple and Google would do their part to help create a balanced technical and legal solution to the problems caused by their encryption decisions. Absent this contribution, the changing winds of public sentiment around Big Tech, in the wake of Facebook’s Cambridge Analytica[16] and Google’s Project Dragonfly[17] scandals, have recently created a climate that will support a legislative solution.

Project Dragonfly, in particular, raised a host of questions about Google’s planned adherence to China’s strict internet censorship rules. Among those questions: if Google is willing to obey an authoritarian government’s censorship rules for search engines, why won’t it do what is necessary to comply with lawful court-ordered search warrants in the United States?

Similar questions on censorship surround Apple’s activities in China. Knowledgeable observers suggest Apple – a self-proclaimed champion of consumer privacy in America – does not abide by the same standard when it comes to protecting the privacy of protestors in Hong Kong, because it’s better for its bottom line to acquiesce to China’s wishes.[18]

To be clear, I, like prosecutors across America, am not asking Apple or Google for something extraordinary. We are not asking for a “backdoor” mechanism that would allow our offices to surreptitiously snoop on private citizens. Nor do we want “surveillance” of smartphone communications.[19] Instead, we are asking these companies to comply with warrants issued by impartial judges upon findings of probable cause: something I explained in letters to Apple CEO Tim Cook and Google CEO Larry Page in 2014.[20]

Some in the tech sector have sought to stoke fear that this type of lawful access will morph into a sweeping data collection apparatus that places consumer privacy at risk. I can assure anyone with such a concern that the search warrant process is subject to strict constitutional protections, which have been successfully overseen by impartial courts for over 200 years.

The same cannot be said for Facebook or Google – which harvest our private data, sell it to others for extraordinary profit, and, on occasion, lose millions of people’s private information due to hacks. Just last month, we learned that Google’s “Project Nightingale” gathers the personal health data of millions of Americans without informing patients.[21] Likewise, the 2018 security breach that exposed the accounts of 50 million Facebook users[22] demonstrates how the tech companies’ priorities are not about protecting privacy after all.

Finally, in March, Facebook CEO Mark Zuckerberg announced planned privacy changes involving end-to-end encryption for Facebook Messenger, WhatsApp, and Instagram.[23] In doing so, Zuckerberg conceded that, with billions of people using these services, there would be some who would use these newly encrypted services for “truly terrible things like child exploitation, terrorism, and extortion.” Law enforcement leaders from the U.S., the United Kingdom, and Australia have since signed an open letter publicly opposing these changes.[24]

In 2018 alone, Facebook was responsible for 16.8 million reports of child sexual exploitation and abuse to the U.S. National Center for Missing and Exploited Children.[25] The National Crime Agency estimates these reports resulted in more than 2,500 arrests, with 3,000 children brought to safety. Yet Zuckerberg’s announced changes would dramatically restrict the ability to generate these reports: again, because a private company has made a business decision to render its products inaccessible to itself or law enforcement. Simply put, Facebook’s planned end-to-end encryption will make it harder to detect – and stop – child abuse and similar crimes.[26]

It’s deeply troubling to think the overwhelming majority of these reports would cease if child sex predators were able to “go dark” because of Facebook’s business decision. My Office, which is one of the leading anti-trafficking agencies in America, frequently relies on Facebook messages obtained through appropriate judicial process to build cases against traffickers. A world in which children can be recruited and groomed on Facebook – with no hope of law enforcement intervention – is a world in which we, collectively, are failing our children.

VI. CONGRESSIONAL ACTION IS REQUIRED TO SOLVE THIS COMPANY-MADE PROBLEM

Five years after the smartphone encryption sea change, it is unconscionable that smartphone manufacturers, rather than working with government to address public safety concerns, have dug in their heels and mounted a campaign to convince their customers that government is wrong and that privacy is at risk. Because Apple and Google refuse to reconsider their approach, I believe the only answer is federal legislation ensuring lawful access. Tech goliaths have shown time and again that they have no business policing themselves.

Of course, as in any industry – especially one that touches public safety – federal regulation has played an important role in the communications industry for many decades.

For example, when telephone companies moved from copper wires to fiber optics and digital signals, law enforcement could no longer rely on its existing wiretap technology. Congress responded by passing the Communications Assistance for Law Enforcement Act (CALEA), mandating that telecom providers build into their systems mechanisms for law enforcement to install new forms of wiretaps when approved by a court. CALEA has worked. It has saved lives, and it has withstood constitutional challenge. It has not stifled innovation, as its opponents feared. And it has not caused American consumers to migrate to foreign competitors in search of greater privacy.

The same is true in the financial services industry. Beginning in the 1970s, as law enforcement learned more about how criminals were using banks to move money, Congress passed new laws to require financial institutions to adopt new technologies and procedures to detect money laundering; to better know their customers; to maintain customer data; and to make that data available to law enforcement pursuant to a court order. Over time, government and industry came together to develop protocols and procedures to effectively implement those new laws, and a broad consensus emerged. Banks and investment firms did not want to be conduits for crime and terror. 

My sincere hope is that, with appropriate congressional leadership and legislation, a similar result can be achieved with this industry, too.

I anticipate that today Apple will tell you it is impossible to maintain keys to open one of its devices without creating a hole for cybercriminals to gain access. I have two responses to this:

  • First, in 2016, Apple’s then-general counsel acknowledged that the company’s process for unlocking phones in response to warrants prior to 2014 had never led to a security breach.[27]
  • Second, this new criminal justice problem is the direct result of these private companies’ decisions to redesign their products. I’m not a technologist, but I’m confident the problem can be solved by a company redesign as well. As President Kennedy once said, “Our problems are man-made; therefore, they can be solved by man. No problem of human destiny is beyond human beings.”

A middle ground exists in the smartphone encryption debate. The right balance between privacy and public safety can be achieved by (1) requiring a court-ordered search warrant, and (2) limiting the information sought to data at rest (for example, the photos and messages that are already on your phone). In other words, I’m not talking about surveillance of live discussions or other communications while they are in progress. This middle ground on encryption is the position “most likely to enable fruitful debate among diverse communities-of-interest,” according to the Carnegie Endowment for International Peace.[28]

Given the current impasse, I am convinced this middle ground will only be achieved through federal legislative action, which is why I am here today. Allowing private companies in Silicon Valley to continue to assert themselves as the unregulated gatekeepers of critical evidence is dangerous, and warrants legislative intervention.

Thank you for inviting me to testify today.

[1] Vance Jr., Cyrus R. “Apple and Google threaten public safety with default smartphone encryption.” The Washington Post, 26 September 2014. https://www.washingtonpost.com/opinions/apple-and-google-threaten-public-safety-with-default-smartphone-encryption/2014/09/25/43af9bf0-44ab-11e4-b437-1a7368204804_story.html

[2] Written Testimony of the New York County District Attorney Cyrus R. Vance, Jr. Before the United States Senate Committee on the Judiciary. “Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy.” 8 July 2015. https://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Vance%20Testimony.pdf

[3] Vance Jr., Cyrus R., Jackie Lacey and Bonnie Dumanis. “Op-Ed: Congress can put iPhones back within reach of law enforcement.” Los Angeles Times, 11 May 2016. https://www.latimes.com/opinion/op-ed/la-oe-vance-congress-act-on-iphones-20160511-story.html

[4] Vance Jr., Cyrus R., François Molins, Adrian Leppard and Javier Zaragoza. “When Phone Encryption Blocks Justice.” The New York Times, 11 August 2015. https://www.nytimes.com/2015/08/12/opinion/apple-google-when-phone-encryption-blocks-justice.html

[5] Manhattan District Attorney’s Office. Report of the Manhattan District Attorney’s Office on Smartphone Encryption and Public Safety: An update to the November 2018 Report. October 2019. https://www.manhattanda.org/wp-content/uploads/2019/10/2019-Report-on-Smartphone-Encryption-and-Public-Safety.pdf. See also Manhattan District Attorney’s Office 2018 Report, https://www.manhattanda.org/wp-content/uploads/2018/11/2018-Report-of-the-Manhattan-District-Attorney27s-Office-on-Smartphone-En….pdf; 2017 Report, https://www.manhattanda.org/wp-content/themes/dany/files/2017%20Report%20of%20the%20Manhattan%20District%20Attorney%27s%20Office%20on%20Smartphone%20Encryption.pdf; 2016 Report, https://www.manhattanda.org/wp-content/themes/dany/files/Report%20on%20Smartphone%20Encryption%20and%20Public%20Safety:%20An%20Update.pdf; and 2015 Report, https://www.manhattanda.org/wp-content/themes/dany/files/11.18.15%20Report%20on%20Smartphone%20Encryption%20and%20Public%20Safety.pdf

[6] Bruce Sewell, Senior Vice President and General Counsel for Apple, Inc., Responses to Questions for the Record, “The Encryption Tightrope: Balancing Americans’ Security and Privacy,” U.S. House Committee on the Judiciary, 1 March 2016, at p. 2, Question 6(b)(1): “Was the technology you possessed to decrypt these phones ever compromised?” Answer: “The process Apple used to extract data from locked iPhones running iOS7 or earlier operating systems was not, to our knowledge, compromised.”

[7] Manhattan District Attorney’s Office. “DA Vance: Babysitter Convicted at Trial for Sexually Assaulting Two Children.” 28 November 2017. https://www.manhattanda.org/da-vance-babysitter-convicted-trial-sexually-assaulting-two-children/

[8] Siegel, Jefferson and Shayna Jacobs. “NYC babysitter gets 100 years to life for raping two kids, recording the assaults.” New York Daily News, 23 March 2018. https://www.nydailynews.com/new-york/nyc-crime/manhattan-babysitter-100-years-life-raping-2-kids-article-1.3893108

[9] See Report of the Manhattan District Attorney’s Office on Smartphone Encryption and Public Safety: An update to the November 2018 Report. https://www.manhattanda.org/wp-content/uploads/2019/10/2019-Report-on-Smartphone-Encryption-and-Public-Safety.pdf

[10] Ramey, Corinne. “Manhattan DA: Locked Phones Continue to Thwart Criminal Probes.” The Wall Street Journal, 31 October 2018. https://www.wsj.com/articles/manhattan-da-locked-phones-continue-to-thwart-criminal-probes-1541023682

[11] CNBC. “Senator reveals that the FBI paid $900,000 to hack into San Bernardino killer’s iPhone.” 5 May 2017. https://www.cnbc.com/2017/05/05/dianne-feinstein-reveals-fbi-paid-900000-to-hack-into-killers-iphone.html

[12] Reigstad, Leif. “Investigators Want Apple to Turn Over Data from the Sutherland Springs Shooter’s iPhone.” Texas Monthly, 20 November 2017. https://www.texasmonthly.com/the-daily-post/apple-iphone-shooting-sutherland-springs/

[13] Wong, Scott and Harper Neidig. “FBI tells lawmakers it can’t access Dayton gunman’s phone.” The Hill, 8 August 2019. https://thehill.com/homenews/administration/456742-fbi-tells-lawmakers-it-cant-access-phone-of-dayton-gunman

[14] Franceschi-Bicchierai, Lorenzo. “Head of Android Security Says Locking Out Law Enforcement Is an ‘Unintended Side Effect.’” Vice, 30 January 2019. https://www.vice.com/en_us/article/yw8vm7/android-security-locking-out-law-enforcement-unintended-side-effect

[15] https://support.apple.com/en-us/HT201238

[16] Granville, Kevin. “Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens.” The New York Times, 19 March 2018. https://www.nytimes.com/2018/03/19/technology/facebook-cambridge-analytica-explained.html

[17] Solon, Olivia. “Google’s ‘Project Dragonfly’ censored search engine triggers protests.” NBC News, 18 January 2019. https://www.nbcnews.com/tech/tech-news/google-s-project-dragonfly-censored-search-engine-triggers-protests-n960121

[18] Matsakis, Louise. “Apple’s Good Intentions Often Stop at China’s Borders.” Wired, 17 October 2019. https://www.wired.com/story/apple-china-censorship-apps-flag/

[19] Vance Jr., Cyrus R. “5 ways tech companies distort the encryption debate.” The Washington Post, 15 December 2015. https://www.washingtonpost.com/news/in-theory/wp/2015/12/15/5-things-tech-companies-dont-understand-about-encryption/

[20] See Attached Appendix B.

[21] Copeland, Rob. “Google’s ‘Project Nightingale’ Gathers Personal Health Data on Millions of Americans.” The Wall Street Journal, 11 November 2019. https://www.wsj.com/articles/google-s-secret-project-nightingale-gathers-personal-health-data-on-millions-of-americans-11573496790

[22] Isaac, Mike and Sheera Frenkel. “Facebook Security Breach Exposes Accounts of 50 Million Users.” The New York Times, 28 September 2018. https://www.nytimes.com/2018/09/28/technology/facebook-hack-data-breach.html

[23] Mark Zuckerberg. “A Privacy-Focused Vision for Social Networking.” 6 March 2019. https://www.facebook.com/notes/mark-zuckerberg/a-privacy-focused-vision-for-social-networking/10156700570096634/

[24] The United States Department of Justice. “Open Letter: Facebook’s ‘Privacy First’ Proposals.” 4 October 2019. https://www.justice.gov/opa/press-release/file/1207081/download

[25] Keller, Michael H. and Gabriel J.X. Dance. “The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong?” The New York Times, 25 October 2019. https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html

[26] Farid, Hany. “Facebook’s Encryption Makes it Harder to Detect Child Abuse.” Wired, 25 October 2019. https://www.wired.com/story/facebooks-encryption-makes-it-harder-to-detect-child-abuse/

[27] Bruce Sewell, Senior Vice President and General Counsel for Apple, Inc., Responses to Questions for the Record, “The Encryption Tightrope: Balancing Americans’ Security and Privacy,” at p. 2. Question 6(b)(1). U.S. House Committee on the Judiciary, 1 March 2016.

[28] Carnegie Endowment for International Peace, “Moving the Encryption Policy Conversation Forward,” September 2019. https://carnegieendowment.org/2019/09/10/moving-encryption-policy-conversation-forward-pub-79573