Torrential Downpour Defense Expert Report (USAvGonzales)

23 Posts
8 Users
0 Likes
4,513 Views
(@justlearningforensics)
Posts: 10
Active Member
Topic starter
 

here is an actual case file to chew on
this is a federal case
USA v Gonzales - a CP case out of the District of Arizona (9th Circuit)

here is an oversimplified version of events (as it relates to our purposes here)

the govt identified CP traffic from the IP address of the Gonzales brothers' household
the tool the govt was employing is a BitTorrent suite of applications - with the primary torrent client called "Torrential Downpour"

upon searching the home
one Gonzales brother's machine is found with contraband
on the other, no CP is found - though some limited LNK, MRU, and torrent file artifacts suggest CP was once present on the machine
(with limited access to discovery, the evidence of contraband is inferred from link files alone - no reference to other artifacts such as searches, payments, unallocated images, etc)

as part of the defense, the source code for Torrential Downpour is compelled
Tami Loehrs, a noteworthy cyber defense expert, outlines a series of tests to run on the Torrential Downpour tool and its suite of components. ultimately the judge decides that some tests are allowed, others not - the motion is granted in part for the one brother whose machine had no CP present in situ (for the other brother it is denied)

The tests were completed and the results published as part of the case
The defense has now moved to compel the source code ANEW - as the testing revealed limitations while also revealing error rates

So here are the critical questions for the experts here in this forum
Time to put on our trial hats - and go to trial and argue this case - from BOTH SIDES

understanding how to argue this expert report from both sides is important

recent cases are showing that judges are more willing to order TESTS of these blackbox govt tools - offering a middle ground short of surrendering the source code altogether. the tests are often heavily biased (or redacted) in the govt's favor, and the results are often cryptic and limited in their scope, or in how far they can be extrapolated to the exact case. nonetheless, this could be a fruitful discussion here on the pros/cons of the report, its applicability to this case (and other cases), and any concerns about the value created (or not) by this test.

where does this test hold merits?
In what scenarios does this test favor defense?
In what scenarios does this test fall apart?
How critical for testing is having access to the other components of the suite referred to in the defense argument?
Does the govt need 3rd-party independent testing of its scientific tools (thereby avoiding such defense scrutiny)?

Here is the case document in PACER
USA v Gonzales
Case 2:17-cr-01311-DGC
Document 99
Filed 05/01/20

(Not sure how to share the 17 page PDF in this forum)
Here is the document Test Results from the PACER - uploaded to scribd

https://www.scribd.com/document/460188341/217-Cr-01311-DGC-All-Defendants-USA-v-Gonzales?secret_password=0JTpIRbnM9Ejgy3EtlrS

 
Posted : 04/05/2020 6:19 pm
(@justlearningforensics)
Posts: 10
Active Member
Topic starter
 

https://ecf.azd.uscourts.gov/doc1/025022044502

 
Posted : 04/05/2020 6:21 pm
tracedf
(@tracedf)
Posts: 169
Estimable Member
 

Good third party testing, if it consistently showed the software to be accurate and reliable, could result in fewer defense attorneys asking to have their experts review the software. But, who would the third parties be? If I'm going to tell the defense attorney I'm working for that the software is reliable and that we shouldn't bother reviewing it, I would want to know that it has been thoroughly tested and vetted on multiple occasions by reliable, competent professionals, including ones who are independent of law enforcement and/or do defense work. If all the experts putting their stamp of approval on it are associated with law enforcement, a defense attorney isn't going to be happy relying on that and I won't ask them to. If, on the other hand, I could say that many of my colleagues who do defense work have already tested it and found it to be a reliable tool, even though I did not personally test it, the defense attorney will be more likely to trust that and not ask me to do another review.

The problem right now is that Torrential Downpour is a law enforcement-only tool and it appears to have some number of false positives. There have been cases where the program identified a particular IP address as sharing CSAM but when law enforcement searched they either found nothing, found other CSAM but not the identified files, or found partial downloads only. It's possible the identification was correct and they were unlucky and simply did not find everything. But, it's an indication that the program sometimes identifies the wrong IP address, that the user of the program is able to make a mistake as to the IP address, or that the program sometimes identifies a user who only shared part of a file. It would be a lot better if we understood the circumstances that can cause the tool or operator to make a mistake or had an established error rate.

Both testing and reviewing the source code would be valuable in determining the function and reliability of the program. Does testing favor the defense? Only if it reveals flaws in the program. How critical is testing? Well, if the case (or the validity of a search warrant) depends on the program being reliable, it's pretty critical.

Law enforcement tools and methods should be subject to scrutiny. One of the most common reasons for wrongful convictions is bad forensic science (e.g. bite mark analysis). Keeping law enforcement tools secret will only serve to perpetuate those problems. Ditto the circumstances and procedures for using those tools. Defense attorneys are supposed to challenge the prosecution's case. They are supposed to question everything. As long as these tools are secret, they are right to push back and ask for their experts to have access.

Think of this from the defense attorney's perspective for a minute. Their clients routinely swear they are innocent. They get a case like the one you referenced and the client insists the cops got the wrong guy. They have to decide whether to take the case to trial and fight it or to tell the guy to plead guilty. Would you want to tell your client, who swears he is innocent, to plead guilty when the tool that identified him might be wrong? That's not a great spot to be in. On the other hand, if your expert said the tool was reliable, or it had been repeatedly tested by others and found to be accurate, that's a much easier decision.

Referring to the opinion, I think that concerns about countermeasures are misguided. There are already things that an offender can do to avoid detection and, based on what we already know, there are ways that programmers could modify P2P clients to prevent their users from being identified by Torrential Downpour and similar programs.

 
Posted : 05/05/2020 1:18 am
(@angelo502)
Posts: 10
Active Member
 

Should the tool matter at all? Is it being made a big deal of when it doesn't have to be? The fact that the image flagged against a known hash doesn't matter that much when it comes to the case. If police are basing a warrant on just the findings from Torrential Downpour, then there could be an issue with the search. Is there any way that this could establish PC by itself? Can you make a CP case on nothing more than a tip that there is CP coming from a specific address?

If you get fingerprints or blood from a crime scene that comes back to a certain individual, you still need to confirm that those prints or that blood come back to that specific person.

It is possible that the known sample of blood or prints was gathered under a false name or stolen identity. So when you get the newest sample and it hits for, say, John Smith, you need to find John Smith and collect a known sample from him for confirmation. Is this a similar circumstance to the Torrential Downpour scenario?

Probable cause is a pretty easy burden to meet. And NIST requirements basically just look to repeatability and reproducibility as the standards that need to be met. Does Torrential Downpour meet those standards? How many times would the same image that was flagged be flagged again by the same program? How many times would it be flagged by a similar program?
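On the repeatability question: we don't know Torrential Downpour's internals (the source is sealed), but if it flags known files by cryptographic hash, the match itself is deterministic - a digest is a pure function of the file's bytes. A minimal sketch in Python (all names and the hash set here are hypothetical, purely for illustration) shows both the repeatability and the brittleness of that approach:

```python
import hashlib

def sha1_of(data: bytes) -> str:
    """Return the hex SHA-1 digest of a byte string (SHA-1 is also
    BitTorrent's piece-hash algorithm)."""
    return hashlib.sha1(data).hexdigest()

# Hypothetical hash set of known contraband, as such a tool might carry.
KNOWN_HASHES = {sha1_of(b"example-known-file")}

def flags(data: bytes) -> bool:
    """Flag a file if its digest appears in the known-hash set."""
    return sha1_of(data) in KNOWN_HASHES

# Hashing is deterministic: the same bytes flag every time, on any run.
assert all(flags(b"example-known-file") for _ in range(1000))

# But changing even one byte produces a different digest, so a re-encoded,
# resized, or partially downloaded copy of the same image will NOT flag.
assert not flags(b"example-known-filE")
```

So the hash comparison itself should satisfy repeatability trivially; the open questions in the case are upstream of it - whether the tool captured the right traffic from the right IP, and whether a complete file was ever actually transferred.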

Perhaps a similar situation as it relates to narcotics? If an informant goes into a residence to purchase cocaine and comes back and gives it to the cops, that is pretty good evidence. They go in and do it again, and now the cops are able to say "these drugs came from this house, so we need to go in, search, and arrest."

Does the CP image in a Torrential Downpour case represent the cocaine, with Torrential Downpour as the informant?

These are just some questions I've always brainstormed on search and seizure topics. For defense attorneys it's always better to argue how they got there as opposed to why they got there.

I’m excited to read this case and thank you for the posting.

 
Posted : 05/05/2020 8:43 am
(@tootypeg)
Posts: 173
Estimable Member
 

is it possible to share an open link to the case transcript or the full case citation to find the documentation? I would like to read this case.

 
Posted : 05/05/2020 9:39 pm
tracedf
(@tracedf)
Posts: 169
Estimable Member
 

is it possible to share an open link to the case transcript or the full case citation to find the documentation? I would like to read this case.

The case

https://casetext.com/case/united-states-v-gonzales-221

And some background reading

https://www.propublica.org/article/prosecutors-dropping-child-porn-charges-after-software-tools-are-questioned

 
Posted : 05/05/2020 10:51 pm
(@armresl)
Posts: 1011
Noble Member
 

This case is kind of similar to the old program ILook, where the defense wasn't allowed a copy to test how it produced its results.

Granted, ILook wasn't a torrent program, but it runs into the same old "LE only" issue.

Something I don't see mentioned is version numbers. If a specific version had an issue with something that has any bearing on the case, then you need to explore that. You don't have to upgrade a tool just because a new version exists, but known issues combined with detectives not upgrading create a problem which they have to answer for.

Lastly, I would say that most of the examiners I have run across in the past 20 years (myself included) would have no idea how to review code. We need to know the limits of what can be done, not only to defend but to prosecute a person, and be able to reach out to others who have this skillset in their memory banks.

There is no shame in bringing another set of hands onboard in order to defend or prosecute fairly and accurately.

 
Posted : 06/05/2020 6:53 am
(@angelo502)
Posts: 10
Active Member
 

Good point about the upgrades. Remember, the version used for the examination always ends up being old by the time the case goes to trial and by the time the defense gets it. For accuracy, the defense would have to go back and test that same version.

 
Posted : 06/05/2020 7:03 am
(@rich2005)
Posts: 535
Honorable Member
 

For me it ends up being more of a legal/process/resources issue than a digital forensics issue.

Even if the law enforcement tool was less than perfect in its identification of suspects, the evidence was still found on their computers.

I can't help thinking that if you were searching for an abducted child, and a sniffer dog indicated they might be in one of a few houses on a street, and upon searching the child was found with another adult in a basement, the fact that the dog didn't identify the exact house would be somewhat irrelevant (imo).

Of course if records from the Torrential Downpour form more of the case, ie records from it support the distribution charge, then scrutiny of it becomes more relevant/necessary, however, if not, then I see less of an issue.

The fact that the particular CA images weren't identified seems immaterial to me, and is very little indication that the Torrential Downpour program doesn't work. It's simply a reflection that the computers weren't examined at the exact moment the program was remotely connected to them. The material could easily have been deleted/overwritten in the intervening time.

In reality most cases of any type are never going to be 100% definitive (and never have been). Hence the term beyond reasonable doubt (over here). The problem with digital evidence is high weight can be placed on it and yet it's extremely easy to come up with endless spurious reasons for doubt.

In many ways I can't help wondering (without knowing everything about the case) if the issue is more about the mechanics of the distribution element.

The nature of how torrents work, and the sharing of files as they are being downloaded, is generally accepted as sharing/distribution in courts here and over there, and the user not being aware of their distribution doesn't seem to be an acceptable defence. I think you could argue that's a long-standing moral/legal issue and that the law isn't consistent on this across all types of case (I believe I'm right in saying that for some crimes knowledge/intent needs to be proved but not others - at least over here).
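On the mechanics of sharing-while-downloading: under the BitTorrent protocol (BEP 3), a client advertises each piece as soon as it completes and will serve it to peers on request, so distribution begins well before the download finishes. A minimal sketch (a hypothetical helper, not code from any real client) of the "bitfield" payload a mid-download peer would advertise:

```python
def bitfield(completed: set[int], total_pieces: int) -> bytes:
    """Build a BitTorrent 'bitfield' message payload: one bit per piece,
    with the high bit of the first byte representing piece 0 (per BEP 3)."""
    buf = bytearray((total_pieces + 7) // 8)  # spare trailing bits stay 0
    for idx in completed:
        buf[idx // 8] |= 0x80 >> (idx % 8)
    return bytes(buf)

# A client only 3 pieces into a 10-piece torrent already advertises -
# and will serve - those pieces to any peer that asks.
print(bitfield({0, 1, 5}, 10).hex())  # -> 'c400'
```

This is why an ordinary user's client is "distributing" by default: no setting needs to be enabled, and the protocol offers no way to download from the swarm without also announcing what you hold.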

If the defence argument is that the only records of distribution are from the Torrential Downpour program, and it therefore needs to be verified as being accurate, this seems reasonable. If it's simply seeking to suggest the program doesn't always find matching files on the computers of suspects it identifies, then I think the argument is unjustified and spurious.

 
Posted : 06/05/2020 9:39 am
(@justlearningforensics)
Posts: 10
Active Member
Topic starter
 

is it possible to share an open link to the case transcript or the full case citation to find the documentation? I would like to read this case.

The case

https://casetext.com/case/united-states-v-gonzales-221

And some background reading

https://www.propublica.org/article/prosecutors-dropping-child-porn-charges-after-software-tools-are-questioned

https://www.scribd.com/document/460188341/217-Cr-01311-DGC-All-Defendants-USA-v-Gonzales?secret_password=0JTpIRbnM9Ejgy3EtlrS
here is the PACER document (public) - the test results from the defense expert

 
Posted : 06/05/2020 4:54 pm