Monday, May 6, 2019

Child-pornography cases are being dropped around the country as prosecutors balk at defense demands for details about software that scans file-sharing programs


Child-pornography charges have been dropped in more than a dozen recent instances around the country because defense attorneys raised questions about software tools used to investigate such cases.

Could the pending U.S. v. Scott J. Wells case in the Western District of Missouri, which we have covered extensively, produce such a result? That's hard to say, because charging documents indicate the Wells case began with a "cyber tip" from Facebook, rather than the use of investigative software. Also, the software at issue scans for child porn on peer-to-peer networks, and there is no evidence Wells used such a network -- strongly suggesting he did not knowingly receive or distribute child porn.

Our recent research on the Wells case reveals signs that the prosecution's case is frightfully weak due to a lack of probable cause that has nothing to do with technology. That is particularly alarming when you consider that Wells has spent more than two years in federal detention, mostly at Leavenworth, KS, because U.S. Magistrate David P. Rush has deemed him a "threat to society." Rush reached that conclusion even though Wells has been found guilty of nothing, is virtually blind in one eye, and must use a walker to get around because of a benign brain tumor he has had since childhood.

Our recent discoveries in the Wells criminal complaint and affidavit are shocking, and they come on top of weaknesses in the prosecution's case on which we've already reported. New details about the case are set for upcoming posts.

As for the recently dropped child-porn cases, those are outlined in a ProPublica article titled "Prosecutors Dropping Child Porn Charges After Software Tools Are Questioned," with the sub-hed "More than a dozen cases were dismissed after defense attorneys asked to examine, or raised doubts about, computer programs that track illegal images to internet addresses." From reporter Jack Gillum:

Using specialized software, investigators traced explicit child pornography to Todd Hartman’s internet address. A dozen police officers raided his Los Angeles-area apartment, seized his computer and arrested him for files including a video of a man ejaculating on a 7-year-old girl. But after his lawyer contended that the software tool inappropriately accessed Hartman’s private files, and asked to examine how it worked, prosecutors dismissed the case.

Near Phoenix, police with a similar detection program tracked underage porn photos, including a 4-year-old with her legs spread, to Tom Tolworthy’s home computer. He was indicted in state court on 10 counts of committing a “dangerous crime against children,” each of which carried a decade in prison if convicted. Yet when investigators checked Tolworthy’s hard drive, the images weren’t there. Even though investigators said different offensive files surfaced on another computer that he owned, the case was tossed.

At a time when at least half a million laptops, tablets, phones, and other devices are viewing or sharing child pornography on the internet every month, software that tracks images to specific internet connections has become a vital tool for prosecutors. Increasingly, though, it’s backfiring.

How does the software backfire? Gillum has the details:

Drawing upon thousands of pages of court filings as well as interviews with lawyers and experts, ProPublica found more than a dozen cases since 2011 that were dismissed either because of challenges to the software’s findings, or the refusal by the government or the maker to share the computer programs with defense attorneys, or both. Tami Loehrs, a forensics expert who often testifies in child pornography cases, said she is aware of more than 60 cases in which the defense strategy has focused on the software.

Defense attorneys have long complained that the government’s secrecy claims may hamstring suspects seeking to prove that the software wrongly identified them. But the growing success of their counterattack is also raising concerns that, by questioning the software used by investigators, some who trade in child pornography can avoid punishment.

“When protecting the defendant’s right to a fair trial requires the government to disclose its confidential techniques, prosecutors face a choice: Give up the prosecution or give up the secret. Each option has a cost,” said Orin Kerr, an expert in computer crime law and former Justice Department lawyer. “If prosecutors give up the prosecution, it may very well mean that a guilty person goes free. If prosecutors give up the secret, it may hurt their ability to catch other criminals. Prosecutors have to choose which of those outcomes is less bad in each particular case.”

Our coverage of U.S. v. Wells indicates courts tend to treat child-porn matters as cut-and-dried, with defendants often pressured into guilty pleas, whether they committed the offense or not. ProPublica's investigative work shows such cases can be more complicated than some courts want them to appear:

In several cases, like Tolworthy’s, court documents say that the software traced offensive images to an Internet Protocol address. But, for reasons that remain unclear, those images weren’t found on the defendant’s computer. In others, like Hartman’s, defense lawyers said the software discovered porn in areas of the computer it wasn’t supposed to enter, and they suggested the police conducted an overly broad search.

These problems are compounded by the insistence of both the government and the software manufacturers on protecting the secrecy of their computer code, so as not to imperil other prosecutions or make trade secrets public. Unwilling to take the risk that the sensitive programs could leak publicly, they have rejected revealing the software even under strict court secrecy.

Nevertheless, the software is facing renewed scrutiny: In another case where child pornography identified by the software wasn’t found on the suspect’s computer, a federal judge in February allowed a defense expert to examine it. And recently, the nonprofit Human Rights Watch asked the Justice Department to review, in part, whether one suite of software tools, the Child Protection System, had been independently tested.

The government often wants to have its prosecutorial cake and eat it, too, says one expert:

“The sharing of child-sex-abuse images is a serious crime, and law enforcement should be investigating it. But the government needs to understand how the tools work, if they could violate the law and if they are accurate,” said Sarah St.Vincent, a Human Rights Watch researcher who examined the practice.

“These defendants are not very popular, but a dangerous precedent is a dangerous precedent that affects everyone. And if the government drops cases or some charges to avoid scrutiny of the software, that could prevent victims from getting justice consistently,” she said. “The government is effectively asserting sweeping surveillance powers but is then hiding from the courts what the software did and how it worked.”

What about a big-picture view of child-porn cases? ProPublica's Gillum provides one:

The dismissals represent a small fraction of the hundreds of federal and state child pornography prosecutions since 2011. More often, defendants plead guilty in exchange for a reduced sentence. (Of 17 closed cases brought since 2017 by the U.S. attorney’s office in Los Angeles, all but two resulted in plea deals, ProPublica found.) Even after their charges were dropped, Tolworthy and Hartman are both facing new trials. Still, the dismissals are noteworthy because challenges to the software are spreading among the defense bar and gaining credence with judges.

Software developers and law enforcement officials say the detection software is an essential part of combating the proliferation of child pornography and exploitation on the internet.

“This is a horrendous crime, and as a society we’re obligated to protect victims this young,” said Brian Levine, a computer science professor at the University of Massachusetts at Amherst who helped develop one such tool, called Torrential Downpour. “There are a number of victims who are too young to speak, or can’t speak out of fear. This tool is available to law enforcement to rescue those children who are abused.”

Evidence is mounting, however, that the tool does not always work the way it's supposed to:

In cases where previously flagged porn isn’t turning up on a suspect’s computer, investigators have suggested the files have merely been erased before arrest, or that they’re stored in encrypted areas of a hard drive that the police can’t access. Defense attorneys counter that some software logs don’t show the files were ever downloaded in the first place, or that they may have been downloaded by mistake and immediately purged.

(Photo: Scott J. Wells)
Defense lawyers are given a bevy of reasons why porn-detection software can’t be handed over for review, even under a protective order that limits disclosure to attorneys and their experts. Law enforcement authorities often say that they’re prohibited from disclosing software by their contracts with the manufacturer, which considers it proprietary technology.

Prosecutors are also reluctant to disclose a coveted law enforcement tool just to convict one defendant. A Justice Department spokeswoman referred ProPublica to a government journal article, which argued peer-to-peer detection tools “are increasingly targeted by defendants through overbroad discovery requests.”

“While the Department of Justice supports full compliance with all discovery obligations imposed by law,” wrote lawyers for the Justice Department and the FBI, “those obligations generally do not require disclosure of sensitive information regarding law enforcement techniques which, if exposed, would threaten the viability of future investigations.”

Prosecutors, in essence, are telling the public to "trust us" with this sensitive technology. But as we've shown in U.S. v. Wells, prosecutors are not always deserving of trust:

“Courts and police are increasingly using software to make decisions in the criminal justice system about bail, sentencing, and probability-matching for DNA and other forensic tests,” said Jennifer Granick, a surveillance and cybersecurity lawyer with the American Civil Liberties Union’s Speech, Privacy and Technology Project who has studied the issue.

“If the defense isn’t able to examine these techniques, then we have to just take the government’s word for it — on these complicated, sensitive and non-black-and-white decisions. And that’s just too dangerous.”

The government apparently wants to keep the public in the dark about software used in child-porn cases. But ProPublica has shined a troubling light on the technology:

One common suite of software tools, the Child Protection System, is maintained by the Florida-based Child Rescue Coalition. Although the coalition says it’s a nonprofit, it has ties to for-profit data brokers and the data company TLO. (TransUnion, the major credit-reporting agency, has acquired TLO.) CRC has hosted some of its computer servers at TransUnion since 2016, according to a review of internet records collected by the firm Farsight Security.

A redacted user manual filed in a federal case, portions of which were un-redacted by Human Rights Watch and confirmed by ProPublica, indicates that the Child Protection System draws on unverified data gathered by these firms. It says TLO “has allowed law enforcement access to data collected on internet users from a variety of sources,” with enhanced information that includes “marketing data that has been linked to IP addresses and email accounts from corporate sources.”

“No logs are kept of any law enforcement query of corporate data,” the manual continued. It cautioned that subscriber data was unconfirmed, and that it should “be confirmed through other investigative means that are acceptable with your agency and prosecuting attorney.”

Software that relies on unconfirmed information from big data brokers, civil liberties advocates say, may not only point police to the wrong internet address owner, but it also enables them to gather a mountain of personal details about a suspect without a court order, sidestepping constitutional protections. . . .

Another widely used detection tool, Torrential Downpour, was developed by the University of Massachusetts a decade ago with U.S. government funding, court records show. Levine told ProPublica in an interview that the program is accurate enough to find probable cause for a search warrant, but that it can only be effective if police and the courts do their jobs. “The software is one part of an entire process,” Levine said, “followed by investigators and courts to produce reliable evidence and to follow a fair judicial process.”
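Neither ProPublica nor the court filings publish the internals of tools like Torrential Downpour or the Child Protection System. But the publicly described technique for flagging known illegal files is hash matching: computing a cryptographic digest of a file (or of pieces offered on a peer-to-peer network) and comparing it against a database of digests of previously identified material. A minimal illustrative sketch in Python -- every name and hash value below is a placeholder, not drawn from any actual law enforcement tool:

```python
import hashlib

# Hypothetical hash database. Real detection tools reportedly match
# against hash sets of previously identified illegal files; this
# single entry is the SHA-256 digest of the bytes b"foo", used here
# purely as a stand-in.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Flag content whose digest appears in the known-hash set."""
    return sha256_of(data) in KNOWN_HASHES
```

The sketch also illustrates the gap the dismissed cases turn on: a digest observed on a network connection establishes, at most, that matching bytes were associated with an IP address at some moment. It says nothing, by itself, about who controlled the connection, whether a file was knowingly downloaded, or whether it still resides on any particular machine -- which is why defense experts press to examine what the software actually recorded.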
