In one of the few cases reviewed in which Wickr was said to have responded to a search warrant, an FBI special agent testified in 2021 that Australian authorities had observed Michael Glenn Whitmore of Anchorage, Alaska, in several groups of Wickr users trading and distributing child abuse material.
In one group, users commented on images of a 12-year-old and described in detail how they would abuse the child, according to the complaint. The complaint said Whitmore was part of at least five other Wickr groups that authorities believed were devoted to child exploitation.
According to the complaint, he admitted to sharing child sexual abuse material with “slightly less than 100 different people” using Wickr, among other apps.
Whitmore has pleaded not guilty and is awaiting trial. A representative for him did not respond to a request for comment.
The complaint noted that a search warrant was served on Wickr for information about the account, which resulted in just the date of creation, the type of device used, the number of messages sent and received, and the profile picture of the account, which was described as “an anime image of three children wearing only diapers.”
In its “Legal Process Guidelines,” Wickr is explicit about the limited amount of information it’s willing to provide law enforcement. “Non-public information about Wickr users’ accounts will not be released to law enforcement except in response to appropriate legal process such as a subpoena, court order, or other valid legal process,” the page reads. “Requests for the contents of communications require a valid search warrant from an agency with proper jurisdiction over Wickr. However, our response to such a request will reflect that the content is not stored on our servers or that, in very limited instances where a message has not yet been retrieved by the recipient, the content is encrypted data which is indecipherable.”
Wickr says it prohibits illegal activities in its terms of service, but the company has in the past been staunchly opposed to law enforcement intervention on tech platforms at large. In 2016, the Wickr Foundation, the company’s nonprofit arm founded in 2015, filed a friend-of-the-court brief in support of Apple, arguing against giving law enforcement tools that would grant access to encrypted content.
“Deliberately compromised digital security would undermine human rights around the globe,” the brief reads. In the case, Apple was ordered to assist law enforcement to unlock an iPhone that belonged to a mass shooter in San Bernardino, California. The order was eventually vacated.
The debate marked a growing conflict between law enforcement and tech companies over encryption and access to evidence in encrypted environments. Wickr’s position at the time wasn’t new; it was largely representative of many companies looking to maintain the security of encrypted systems. But Wickr’s seeming inaction in developing alternative methods to prevent crime on its platform, short of a “backdoor” to get around encryption, stands apart from other tech companies such as Meta or Microsoft. Microsoft developed PhotoDNA, a technology that has been pivotal in identifying and fighting the spread of child sexual abuse material across the internet and is used to scan files in Microsoft’s OneDrive cloud.
Wickr’s origins
Wickr was founded in 2012 by a security-minded group of entrepreneurs including Nico Sell, an organizer of the hacker convention Defcon. The app brought encryption of the kind typically used by defense officials to personal messaging, stripping messages of identifiable metadata and giving users the option to sign up anonymously and have their messages self-delete.
By 2015, the company had raised $39 million in funding, seizing on a public just beginning to take an interest in data privacy. Sell, who did not respond to a request for comment, pitched the company as staunchly pro-privacy, claiming early on that she had refused to give the FBI a backdoor into the platform. That same year, news reports started to trickle in about how the app was being used to commit crimes.