Christa: Hello and welcome to the Forensic Focus podcast. Every month we interview experts from the digital forensics and incident response community on a host of topics, ranging from technical aspects to career soft skills. I’m your host, Christa Miller.
With me today are three guests. I’m joined by Robert Peters, Senior Cyber and Economic Crime Attorney with the National White Collar Crime Center; Matthew Osteen, Cyber and Economic Crime Attorney also at the NW3C; and Joseph Remy, a New Jersey-based attorney who serves on the NW3C’s judicial and prosecutorial advisory board. Gentlemen, welcome and thank you for joining the show.
Guests: Thanks for having us.
Christa: I want to make sure our listeners know this episode is actually a tie-in to a new project Forensic Focus is working on in conjunction with the NW3C and also with SEARCH. We’re publishing a legal guide for digital forensics examiners exploring major case law and legislation globally that could affect digital evidence and investigations, but we’ll get to that in a little bit.
I also want to make sure that listeners know the views expressed by the participants are not intended to constitute legal advice. If you have any questions, please consult an attorney in your jurisdiction.
And with that, let’s jump into the Q&A. So I’m going to start with a round robin here, starting with Robert. Since you began in this industry, what do you see as the top legal trend or issue standing out to you?
Robert: Well, I think two things stand out, Christa. And the first of these would really be the trajectory of fourth amendment rights, or rather the trajectory of the jurisprudence surrounding application of the fourth amendment.
Historically, at least in the United States we have this third party doctrine, where if you convey some sort of information or data to a third party, you no longer have a reasonable expectation of privacy in that data. And certainly that has implications for things like getting search warrants and complying with those types of protocols.
So that is really no longer the case. And we see that through a lot of cases, including Riley v. California, where we see the judiciary really looking at how interconnected we are to technology, just in our daily lives. And we see that certainly with startling clarity in Carpenter v. United States, which extends these protections, and this understanding of how interrelated technology is to our daily routines, to things like cell site location information.
Even though you, the customer, have voluntarily provided that data to the service provider — which would be your wireless service provider — there’s still a reasonable expectation of privacy in it. And so that trajectory that we see, not just at the Supreme Court level but also throughout the circuit courts of appeals, is really significant, and really a recent development.
The other thing I would focus on is the explosion of child sexual abuse material cases. And we see that just through some of the statistics of the cases that have been reviewed by NCMEC. If you’re not familiar with NCMEC, it stands for the National Center for Missing and Exploited Children. And they’re empowered by statute to review child sexual abuse material in an effort to identify victims as well as coordinate with law enforcement.
And just to show you how stark this explosion is: in 2004, they reviewed 450,000 files depicting child sexual abuse material; in 2015 that jumps to 25 million. And then in 2018, tech companies reported 45 million photos and videos explicitly depicting child sexual abuse. So it’s a huge explosion, and it looks likely to continue, and there’s some implications there for encryption and other things that maybe we could pick up in a little bit.
Christa: Matt, how about you? What do you see as the top trends or issues?
Matthew: Yeah, I would echo Robert’s viewpoint that the fourth amendment doctrine has significantly changed over the last decade or so. That’s a trend that’s quickly expanding. You know, the Jones case was in 2012, so less than 10 years ago. And it really kind of shaped the way we look at fourth amendment doctrine, because previously courts had relied pretty heavily on the reasonable expectation of privacy stance for evaluating these cases. And then Jones comes along — that case being where police placed a GPS tracker on a vehicle and then traced its whereabouts — and they went back to a trespass justification for why that violated the fourth amendment. Something that nobody really saw coming, because the courts had largely abandoned trespass as a basis for evaluating the fourth amendment for 40-some-odd years previously.
And so that really, in my opinion, marked a change: as we start dealing with more technological capabilities, we’re going to start using different means to evaluate the fourth amendment. We’re going to start expanding fourth amendment protections in ways that the courts hadn’t previously.
And then you saw that again in Riley in 2014 where the courts held that a cell phone is a very special area to humans. There’s some very flamboyant language in there, that cell phones are kind of an extension of the human soul: that we keep so much data in the palm of our hands now that police need a warrant to get that. And so you started seeing the courts recognize that the fourth amendment protects technology in a way that it doesn’t protect the traditional areas that the fourth amendment has protected.
I would also say that anonymization software is up and coming pretty significantly. With Bitcoin and blockchain being able to hide where money is coming from and where money is going, the law hasn’t really caught up with how to deal with that right now. A lot of the cases are: How do you seize Bitcoin? How do you dispose of it as an asset, but not a lot in terms of the, how do you conduct an investigation there? What are the search implications of going in and tracing this money out?
Christa: And Joe, what is your take on the top legal trends or issues?
Joseph: Well, I agree with my colleagues, both Matt and Robert, but I see it as a larger issue now. The go-to for criminal investigators and prosecutors, both in the past and now, has been the Electronic Communications Privacy Act, under 18 USC 2701. That law was passed in 1986 and went into effect, and a lot has changed since 1986. We knew there was cloud technology, but it wasn’t really readily available to the general public, let alone to the general public in the palm of their hand.
We never envisioned that there would be forensic tools allowing us to search a lot of this data, both remotely as well as on the device; yet what we see now, as time has evolved, is that there is data on the cloud and obviously there’s data on our specific devices.
The issue that is arising, though, is both obtaining that data in a quick and expedient manner, and also the ability to obtain that evidence from the device itself. And I think that was most clear in a congressional hearing that occurred about two weeks ago on the dangers of encryption.
Now obviously encryption serves a purpose. I don’t want my medical information or my financial information going unencrypted across the web. But if law enforcement is armed with a warrant to obtain that information, whether it be a child exploitation case, a financial fraud case, a homicide case, there should be some ability to access that data in its unencrypted form, both from the device itself as well as from the cloud service provider, whether that be Yahoo or Google or what have you.
And yet what we’re seeing now, both from the cloud provider as well as on the device, is that the information is encrypted; the information that was once available to law enforcement to solve crimes is no longer available, or is deleted, or is encrypted and very difficult to access. And as time goes on, the real issue, I think, is that this is going to make the average — and I hate the term — average Joe unable to seek justice. Essentially the courtroom doors, whether it be a criminal courtroom or in some instances a civil court, are going to be closed to that individual.
And resolving that issue means balancing it with the privacy rights of you and me, and recognizing that the warrant, where it’s reviewed by a judge and probable cause has to be found, is the appropriate balance as time goes on.
I do also agree that anonymization definitely is a key concern, as are voice providers or voice over IP providers and proxies where people host websites. They’re useful tools, but there have to be some limits and some record keeping in order to ensure that if something goes wrong, that law enforcement has the tools in order to hold the person accountable for their actions.
Christa: So how do you see these issues evolving in the future with not just the current technology, but emergent technology, in terms of how the courts are handling what is coming their way now, versus what they’re likely to see into the future?
Robert: Joe touched on this a little bit. You know, the Stored Communications Act and ECPA, those were passed in the 80s. And certainly, with the speed at which technology has progressed, the text of the statutes hasn’t really kept up with the sophistication of that technology. That’s even more apparent in light of the Supreme Court’s analysis, particularly in Carpenter.
So let me break that down a little bit. Carpenter found, again, that cell site location information has fourth amendment protections. So that draws into question a lot of the Stored Communications Act provisions. For those who aren’t familiar with the Stored Communications Act, it divides data up into different categories: non-content categories; content categories; content that’s been stored more than 180 days; content that’s been stored less than 180 days. And if we’re to take the Supreme Court’s rationale in Carpenter seriously, then that changes a lot of what’s in the Stored Communications Act.
So in terms of what to expect in the near future, we’re going to continually see, as Matt pointed out, increasing challenges to traditional doctrines. And I think one of the things that complicates this is that the Stored Communications Act arguably is not really consistent with the court’s rationale, or even express holdings in Carpenter.
Matthew: Yeah, I definitely agree with all that, Robert. I would also add that the courts seem to be taking a go-slow approach with a lot of these emerging technologies, and even just technology generally. It’s important to remember that in Carpenter, the technology they were discussing… Carpenter was argued, I think, in late 2017, maybe early 2018, at the Supreme court level. And the case itself happened in 2010/2011, so the technology drastically changed from the facts of the case to when the Supreme court was actually deciding how to handle this technology.
And so you had the court trying to predict where the technology might go, along with the actual facts of the case. So when Carpenter’s facts happened, you could use CSLI data to track a phone within roughly a mile area: this phone was within a mile of this cell tower. Since then, the area that CSLI can narrow a phone down to has shrunk — not to pinpoint accuracy, but it has been reduced.
And so in oral arguments you had Sotomayor saying, well my phone can theoretically… officers would be able to see what side of the bed I sleep on based on where I put my phone. That’s just not the reality of where the technology is right now. So you see them trying to predict where it’s going to go, but also trying to give the technology some time to actually play out, so that they can see where the trajectory is going.
And along that vein, you’re seeing cases come out roughly every couple of years, not on a consistent basis. Like I said, Jones was 2012. Riley was 2014. And then Carpenter was just in 2018. We were supposed to get Microsoft: that case was argued, but was ultimately mooted by the CLOUD Act. So it’s hard to gauge when we’ll see another technology case. But my sense would be that the court’s probably going to wait until a good issue either comes in front of them, or until the technology plays out a little bit so that they can see how their precedent might apply in the future.
Robert: And just to interject really quick, Matt: we even see that rationale that Christa, you just alluded to — or rather, that approach — and that Matt touched on as well, even in the text of Carpenter. Chief Justice Roberts talks about how the court needs to tread carefully in areas of new technology so as to avoid embarrassment in the future.
So I think, as both of your points suggest, this is an area where the courts are not going to be particularly aggressive or proactive.
Matthew: Yeah, absolutely. And as both you and Joe mentioned, you know, the Stored Communications Act and ECPA were from the 80s. The CLOUD Act was passed in 2018, so that was the most recent update, but it didn’t significantly update any of the legislation there. It gave the Stored Communications Act a sort of extraterritorial application, but didn’t really modify many of the sections. It was more of an addition.
And it’s also important to remember that there are calls for Congress to legislate these issues, but if the courts find that there are additional protections through the fourth amendment, then legislation isn’t really going to be effective for providing a process there; at least, a process that’s lesser than getting a search warrant for a lot of this data.
Joseph: My impression generally is that we cannot fully address these issues purely with a court response. What we see is very different outcomes, even within the same state. Some of the issues brought up by my colleagues, along with myself, have to be dealt with in a legislative, as well as a judicial and a regulatory, framework. We can’t single out one branch and one area to deal with it.
I think when you’re dealing with Carpenter, one of the issues in Carpenter itself was that the user may not know that they’re being tracked, or that that information can be handed over to law enforcement. There’s a simple remedy there: it’s a regulatory thing. I mean, that’s something that a federal agency, certainly the FTC, could develop regulations to address directly, and cure some of that concern raised by the Supreme Court.
But I think the larger issue has to come with updating the ECPA; has to come from Congress; has to come from looking also at different responses by states in order to address privacy, but also at how law enforcement obtains data.
We see in California, obviously, the California Consumer Privacy Act coming into effect on the heels of the GDPR going into effect in Europe, affecting how certain data can be obtained and how certain data has to be handled. Congress right now obviously is hearing testimony to the effect that we can’t have California with one standard and another state with a different standard: it would lead to very different results depending on which state you’re in, or what type of business you’re transacting. So I think what we’re going to see is Congress hopefully coming up with a much more robust privacy act to cover all 50 states, and hopefully taking the good parts of the GDPR and making that a reality.
Hopefully we can see the ECPA be updated for the 21st century, along with a preemption clause within it to cure some of those concerns, all while balancing that with, obviously, the privacy concerns that society is presently dealing with.
And the issue with encryption on devices: the concept itself is not new. The technology has evolved, but there was a concern in the 90s, at the height of a lot of drug epidemics and substance abuse, that law enforcement would not be able to wiretap the phones of people who were running drugs in and out of this country. And what was the solution? The solution was the Digital Telephony Act, or CALEA, at the time. And obviously my hope is that as these issues continue to evolve, we look at maybe a CALEA 2.0 in that regard, and look at law enforcement and its ability to access certain data in the present age.
Are these easy solutions? Certainly not, especially when technology evolves not just year to year, but in some ways day to day. The challenge is developing a legal framework that can meet those challenges without having to run to a legislature, or to a court, in order to resolve those issues.
So the short answer to your question, Christa, is I do see Congress coming in and hopefully putting a lot of thought into looking at the privacy issues, along with the rightful access issues, or the going dark issues. I see regulatory responses, hopefully addressing and making consumers aware that certain data is being collected on a day to day basis when they choose to use a device.
And then also, of course, interpreting those laws as they come out, as well as addressing those concerns, hopefully with educated investigators, educated lawyers, and an educated judiciary; because at times, when you read some of these decisions, you see that maybe they don’t fully understand what encryption is as opposed to password protection, or how easy it is to get certain data from a third party such as Google, Facebook, or what have you.
Christa: So it seems like we’re talking at a very high level, and I think all three of you have alluded in one way or another to process. And so my next question is: How does all of this affect forensic examiners that you are encountering in their day to day work, where they may or may not have the guidance that they need either from laws or precedents set in courts?
Robert: You know, one thing that jumps out here, Christa, is certainly the efficacy of forensic tools and accessing cloud accounts, and how simple it is to access information that is in remote cloud storage from devices themselves, even if you don’t have forensic expertise. And I think this is where the legal concerns are particularly enhanced.
Here’s a scenario we discuss in a course that all three of us — Matt, Joe and myself — have presented: Digital Evidence Basics and the CLOUD Act, a two-day course that NW3C puts on for prosecutors. Say an investigator takes a mobile device, and let’s assume they have a search warrant to search the data on that mobile device, but there’s nothing in the search warrant that authorizes them to access cloud accounts. If that investigator then opens up the mail app and swipes down to download additional messages that are not currently on that device, but are in fact on a server elsewhere, and those messages download, then potentially we have a legal issue. And the same concern applies with forensic tools that can access those cloud accounts, which are pretty common.
Now that’s a helpful tool to have, particularly if the Stored Communications Act is worded in a way that enables it. But as the Stored Communications Act is currently written, there are only a few ways you can access data that’s in remote storage.
First, get a search warrant for it. Now, a search warrant for the device is not the same thing as a search warrant for that remote data, so our search warrants have to address both. That’s point one. Second, you can get consent from the accused, which we do occasionally have. And third, you can obtain it by sending legal process to the service provider, which really comes back to the first.
So in this situation — the hypothetical I just mentioned where we just have the search warrant for data on the device, but not for the cloud accounts — an investigator could be in a pretty bad scenario, given the potential penalties of the Stored Communications Act.
And where I think this is particularly problematic is just with the fact that many tools, specifically forensic preview tools, they’re very user friendly. You don’t have to have advanced certifications or advanced degrees to operate them. In many cases they might be operated by, say, first responders or probation officers who really don’t have the knowledge base to understand what the tool is doing and where it’s really drawing data from, whether it’s from the device itself or from a cloud account. So I think that’s primarily a training piece that we certainly need to emphasize with those individuals.
And then to Joe’s point, I really like his multipronged approach. I think there’s some easy policy solutions too, and perhaps even some Stored Communications Act tweaks that would still respect the intent of preserving privacy rights, but also reflect practical realities, and hopefully enhance law enforcement’s abilities to respond to those types of situations without unduly burdening them and adding unnecessary bureaucracy.
Matthew: Yeah, I agree. One area that I could potentially see is removing some of the criminal and civil penalties from the Stored Communications Act. And especially if you expand that to requiring a warrant to get a lot of this data, then you could see Section 1983 potentially being a better vehicle to vindicate somebody whose data was wrongly seized.
But then you have a lot of common law protections and immunities that apply, which would be very helpful to a forensic investigator who in good faith was doing the examination and drawing down some of this evidence. It would absolve them of that liability. And then on the search warrant side, you would potentially have the good faith exception if the law were to change after they did an extraction.
And for those who don’t know, Section 1983 is a civil statute, 42 U.S.C. § 1983, that allows individuals to sue police — really any agent acting under color of state law — for violating their rights. So if you violate the fourth amendment and seize somebody’s data, then that person can potentially sue you for the violation of their fourth amendment rights. There are certain protections that apply there.
And I think that would be a good framework to protect forensic examiners from liability by being [indecipherable] through the Stored Communications Act. And then the good faith exception could potentially save an extraction that happened because of a change of law.
Robert: Joe, what do you think about Matt’s 1983 thought?
Joseph: I think it’s an interesting idea. I think it probably obviously needs to be thought out. I can see positives and negatives on both sides of the fence as to it. We do know, for instance with the National Center for Missing and Exploited Children and the internet and electronic service providers that cooperate with it, that they are given a legal protection under the law so long as they act in good faith and they comply with it. I believe that’s found under 18 USC 2252 A, but don’t quote me on that directly. I do know that it is found somewhere in that general area.
Now obviously as to Christa’s question directly: when I first started, and it wasn’t that long ago, it was maybe 11-12 years ago when we first presented digital evidence, and I’m certainly not old enough… I do know people, and I’m sure we all know people, who started off in forensics in the eighties and nineties… very, very different.
But the challenges I first saw were largely a defense bar that didn’t understand what a forensic image was, or what certain terms within a forensic report meant. And now, particularly in the Northeast [indecipherable] and California, we’re seeing very well-crafted challenges by the defense bar, not just on what a forensic image is, but on how a practitioner may define that term. We used to define it simply as a bit-by-bit copy. In reality, it depends on the type of image you’re getting: whether it’s a raw image, whether it’s a forensic container. A raw image is only close to a bit-by-bit copy, because a lot of imaging software skips over bad sectors when it creates that image, and that’s a little caveat there. The other thing that the defense bar in particular is aware of, and brings up, is the lab-created hash collision, such as the collision Google researchers engineered.
Sometimes we forget when we’re on the stand, in many respects, that that was created in a lab. I could take a horse and put a spike on it, but that does not make it a unicorn in the natural occurring environment. Yet we see those challenges going on.
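To make the hash-verification point above concrete, here is a minimal Python sketch (an illustration only, not any vendor’s tool) of what a forensic tool does when it verifies an acquired image: hash the bytes, and compare digests. The tamper check shows why a matching digest is strong evidence of integrity; producing a *colliding* pair, as in the lab-engineered collisions Joseph mentions, requires deliberately crafted inputs, not a naturally occurring accident.

```python
import hashlib

def image_hash(data: bytes, algorithm: str = "sha256") -> str:
    """Hash acquired image data in chunks, as an imaging tool would."""
    h = hashlib.new(algorithm)
    view = memoryview(data)
    chunk = 1 << 20  # 1 MiB at a time, so huge images need not fit in memory
    for i in range(0, len(view), chunk):
        h.update(view[i:i + chunk])
    return h.hexdigest()

acquired = b"example disk image contents"

# Re-hashing the same bytes always yields the same digest: the image verifies.
assert image_hash(acquired) == image_hash(acquired)

# Flipping a single bit changes the digest entirely (the avalanche effect).
tampered = bytes([acquired[0] ^ 1]) + acquired[1:]
assert image_hash(acquired) != image_hash(tampered)
```

The same comparison underlies acquisition-versus-verification hashes in an examination report: identical digests mean the copy examined is, bit for bit, the copy acquired.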
The other thing that has particularly come up, especially with a lot of forensic labs, is… I think it was the Rosenbaum case, I believe out of Atlanta, Georgia, where digital evidence in a murder case was suppressed because there was a delay in the request for the examination, and then that examination took time. It’s a rarity to see a forensic lab that does not have a backlog, so how do we address that? How do we address, one, making that request in a timely manner; and two, making sure that examination happens as quickly as it can, while recognizing that resources aren’t unlimited, especially in the public sector, where you could use that money to buy a new tool or to hire a new examiner?
Where is your money best spent in order to make sure that evidence is obtained in a timely manner and then can be brought to trial in a timely manner, such that then the prosecution and defense can obviously address the challenges that I alluded to earlier? These obviously are legal challenges that I think that our practitioners either have seen or will be seeing in the future as it relates to digital evidence on the forensic side. And obviously, this assumes that things like encryption and going dark and that whole debate doesn’t get worse as time goes on. And I do see that becoming more and more of a problem, unless we see more and more action by Congress as well as by the judiciary.
Christa: And that actually leads into the next question, or next point that I was going to raise. We’re talking a lot about the technology as it currently stands. I wanted to find out about emerging technology, things like virtual reality, the blockchain, et cetera, that potentially could throw additional monkey wrenches into this mix that we’re talking about, which already sounds complicated as it is.
Robert: Absolutely. And I’ll deal with the virtual reality piece of that. My background is as a sex crime prosecutor, you know, the potential of virtual reality; and not just the potential, but also the application of it, certainly raises some significant concerns in a variety of ways.
So the purpose of virtual reality is essentially to live the experience. You’re looking at a 360 degree view. A lot of times virtual reality in some markets is paired with teledildonics. So essentially sex toys that can be controlled and participated in by remote partners. So that creates significant concerns when we apply that to producers, manufacturers and consumers of child sexual abuse material. And already there’s a decent amount of existing literature talking about that possibility because of the far more interactive experience.
And so researchers believe those platforms will soon be applied in various contexts and in fact, in some contexts they have already been applied in areas of prostitution, digital brothels, sex tourism, certainly voyeurism; the application of virtual reality connected to drones, that’s a frequent topic as well, and particularly in the voyeuristic context.
So it raised a lot of issues. And the concern I have, having worked with several excellent investigators is: child sexual abuse material itself is incredibly damaging. It raises concerns for mental health, certainly. And that’s with a two dimensional image. So when we apply virtual reality to that, I certainly have concerns for the effects of that on law enforcement, and prosecutors to a lesser extent, that have to interact with it.
Now the silver lining for that is with the 360 degree view, the likelihood is much greater that we’ll be able to have more corroborative material, potentially more identifying information, in the cases of unknown victims. [It’s] not all that difficult for a manufacturer of child sexual abuse material to take a photo and ensure that the background is free of obvious, at least identifying, indicators. Whereas if you have a 360 degree view, which you see not just on high end virtual reality platforms but also lower end, a lot of this is actually very affordable. The odds are decent that we’ll see more of these guys slipping up, and so that has some potential.
One more thing I would note in terms of emerging technology. Of course artificial intelligence is significant. And I think maybe there’s an analogy to be made with some of the jurisprudence involving cell tower dumps. Delving into that a little bit, some investigative agencies have been really addressed quite sternly by the courts when they’ve come before the court asking for various things, but not having a plan of how to manage and ultimately dispose of innocent subscriber data.
So in the context of cell towers for example, you’re really pulling the data of every individual whose technology interacted with that specific cell tower. And so, you know, you’d have a good chance of getting your target. You also have a certainty of that impacting quite a few other individuals. And so courts have in fact in some cases… in at least one case out of Texas where the judge says, look, we’re denying your application because you don’t have a plan for this. And you’re not really taking the privacy implications of this seriously.
So the connection to artificial intelligence, maybe it’s a tenuous one, but I think it’s at least potentially relevant: artificial intelligence is certainly capable of producing mass amounts of data. So for example, if we were to apply artificial intelligence more proactively, say to identify grooming behaviors online, I’m [indecipherable] to identify potentially predatory behavior online, which frankly I think we can do; I know of at least one entity that is doing it. And I think in general that has a lot of social pros in terms of identifying behaviors and learning more about these individuals who are seeking to exploit children. But if we’re generating a bunch of information and data on that, even if it’s from open source information, perhaps we should consider 28 CFR Part 23; and that’s, again, specific to the United States Code of Federal Regulations.
And that’s a guideline for law enforcement agencies that operate federally funded, multi-jurisdictional intelligence systems. And it shows those agencies: okay. How do you operate these criminal intelligence information systems effectively while also safeguarding privacy and civil liberties? So that might be a framework to consider as we start to deploy AI in these contexts.
Frankly, I think we should. Particularly with how limited our resources are in law enforcement, and particularly how so many of these individuals exploit children with impunity because, frankly, we don’t prioritize it enough. I think AI has great potential, but we need to apply it in a careful way.
Matthew: Yeah. And on the civil liberties front with AI in particular: in the limited public ways that AI has been deployed, particularly with AI in Google and some search engines, it’s been kind of unavoidable that the AI can develop a bias, whether that be racism or sexism.
A lot of these AI programs are picking up some negative viewpoints from the internet and from the statistics that they’re pulling. And so I can imagine that there will be a lot of civil liberty advocates who want to sort of maybe pump the brakes a little bit on AI until we can get some of those concerns ironed out.
I would also talk about some cryptocurrency and blockchain issues here. I think the biggest legal issue with a lot of emerging technologies is going to be explaining them to a jury and breaking down how the technology works, what the point of the technology is, and how it was used to facilitate a crime.
And on the blockchain… I mean, effectively the blockchain being a public ledger, and some cryptocurrencies, such as Ethereum, allowing you to execute smart contracts, I’m aware of at least a few businesses who are interested in virtualizing the business entirely and running it off of a blockchain, with Ethereum-based smart contracts that execute automatically once certain conditions are fulfilled, to do pretty much all of their business. Now that makes fraud more difficult, but not impossible.
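The "execute automatically once certain conditions are fulfilled" idea can be sketched in a few lines. This is a purely illustrative toy (the class and condition names are hypothetical, and real Ethereum contracts are written in a language like Solidity and run on-chain), but it shows the core mechanic: no intermediary decides when to pay out.

```python
class EscrowContract:
    """Toy escrow: funds release automatically once every condition is met.

    Hypothetical illustration only; real smart contracts execute on a
    blockchain virtual machine, not in ordinary application code.
    """

    def __init__(self, amount, conditions):
        self.amount = amount
        # Each named condition starts unfulfilled.
        self.conditions = {name: False for name in conditions}
        self.released = False

    def mark_fulfilled(self, name):
        """Record that a condition was satisfied, then check for release."""
        self.conditions[name] = True
        self._try_release()

    def _try_release(self):
        # Executes automatically: no human intermediary approves the payout.
        if all(self.conditions.values()):
            self.released = True


contract = EscrowContract(1000, ["goods_shipped", "goods_received"])
contract.mark_fulfilled("goods_shipped")
assert not contract.released   # payment held while any condition is open
contract.mark_fulfilled("goods_received")
assert contract.released       # all conditions satisfied, funds release
```

The fraud-resistance Matt mentions comes from exactly this property: the payout logic is fixed code, so a party cannot quietly renegotiate the terms after the fact.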
And so financial crimes that could potentially loop in blockchain and smart contracts add an additional layer of legal concerns, specifically with explaining all of that to a jury. Financial crimes are hard enough to break down for juries: explaining where the money was moving, how the money was moving, why the way the money was moving was criminal, and what was said. That all is a beast in and of itself. Then you add in trying to explain complex technology to a jury, and the confusion can be overwhelming. And confusion typically benefits defense counsel more than it benefits the prosecution. So if the jury is sufficiently confused by the nature of the crime and the technology, then it’s going to be a lot harder to prosecute these crimes.
Joseph: I think that that’s dependent obviously on our ability to educate investigators, lawyers, judges, and the general public. I think most people look at Bitcoin and, if they don’t really fully get it, they say, well, what’s the point of it? And I can tell you this personally, I was one of those people. I said, cryptocurrency, what’s the point? I’d rather have money in my pocket that I know is negotiable and isn’t viewed more as a commodity. But when you look at the blockchain, the technology underpinning it, it makes sense. When did blockchain really fully start? In around 2008. What was going on in 2008? The Great Recession. People didn’t trust the banks, because ultimately banks and major hedge funds were misrepresenting income and hiding certain losses. So what was the solution to it?
The solution to it was, why don’t we create a decentralized public ledger? And with that decentralized public ledger everybody can see, and you know, what if one node or one person who has access to that ledger tried to alter it, the rest of the nodes know. The rest of the servers holding the ledger say, whoa, whoa, whoa, there’s something wrong here, because we’re seeing one of our computers has a totally different reading than the other one. That transparency in some ways prevents fraud.
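Joseph's tamper-detection point can be sketched with a toy hash-chained ledger. This is a minimal illustration under simplifying assumptions (all function names are hypothetical, and real blockchains add networking, consensus rules, and proof-of-work on top of this), but it shows why "one computer with a totally different reading" is immediately visible to every other node.

```python
import hashlib
import json


def block_hash(index, data, prev_hash):
    """Hash a block's contents together with the previous block's hash."""
    payload = json.dumps(
        {"index": index, "data": data, "prev": prev_hash}, sort_keys=True
    )
    return hashlib.sha256(payload.encode()).hexdigest()


def build_ledger(entries):
    """Build a chain where each block commits to the block before it."""
    ledger, prev = [], "0" * 64
    for i, data in enumerate(entries):
        h = block_hash(i, data, prev)
        ledger.append({"index": i, "data": data, "prev": prev, "hash": h})
        prev = h
    return ledger


def is_valid(ledger):
    """Any node can recompute the chain; one altered block breaks it."""
    prev = "0" * 64
    for block in ledger:
        if block["prev"] != prev:
            return False
        if block_hash(block["index"], block["data"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True


ledger = build_ledger(["Alice pays Bob 5", "Bob pays Carol 2"])
assert is_valid(ledger)                    # honest copies all agree
ledger[0]["data"] = "Alice pays Bob 500"   # one node rewrites history
assert not is_valid(ledger)                # every other node's check flags it
```

Because each block's hash covers the previous block's hash, changing any historical entry changes every hash after it, which is exactly the "whoa, there's something wrong here" signal the other nodes see.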
Now, obviously it depends on whether we’re dealing with a public blockchain or crypto environment or a private one, and in what circumstances or areas we let that blockchain technology evolve. Do I have to provide some evidence of my identity? If I do, then that can at least clarify some of the fraud concerns.
So to me the blockchain in some ways makes sense, if the technology is used in such a way that somebody can be known on the ledger even by the people who are supposed to be operating it. I do agree that education is a concern overall. And I think that things like the dark web may interfere with the ability of artificial intelligence to do its job.
And the other issue presented, I think, by AI and also DNA and fingerprints being stored, not just on a device but also in servers and other things, is what do we do with that information then? And to me, that’s a much broader question than I think we can deal with in 30 minutes to 45 minutes or an hour. But I do think it’s a serious question that needs to be looked at very carefully.
But I do agree that we can use AI to our advantage. And I do think that the cell tower jump example is pretty apt in the scenario. So I do see some legal challenges, and I do see it being on an education level, and I think that we need to do a better job, obviously, educating the general public along with law enforcement, prosecutors, and the judiciary on things like the blockchain, because it is, at first blush, difficult to understand. But I think if we take the time — and obviously resources are certainly not unlimited — to develop a curriculum, so that we understand it, I see a great positive coming out of that, and essentially a challenge actually becoming a strength.
Christa: So tell me a little bit about some of the ways that NW3C and SEARCH and the other organizations out there are tackling those educational challenges. How are they helping to improve the landscape, at the very least for forensic examiners, and the legal experts who are helping to shape case law and legislation in the country?
Matthew: Well, for NW3C’s part, we have a significant catalog of live in-person courses as well as online courses. Most of those are geared toward investigators, law enforcement, forensic examiners. In the past year or so, we’ve branched off and started a prosecutorial and judicial wing of classes. We have two live in-person courses there, and four online courses geared towards prosecutors.
But on the forensic examiner side, our high tech crimes section has a large course catalog going through a lot of these issues as well. And I would say that while the prosecutor classes are geared toward attorneys, for forensic examiners they can be extremely helpful. And we’ve gotten very positive feedback from forensic examiners who have attended those courses to understand the legal landscape and how the attorneys and the judges are thinking about these issues.
And, you know, as Joe said, education can turn a lot of this into a win for prosecutors, and a lot of our courses do attempt to do that. I would also say that we have a number of [indecipherable], search warrant templates or Communications Act templates, electronic wiretap and trace applications and warrants. Those are gated to law enforcement. We also operate the Law Enforcement Cyber Center, which has some resources that are open to all.
Robert: One more thing I would highlight on the NW3C side of things is the assistance piece, and certainly a significant history of providing technical assistance to forensic examiners and investigators. Matt and I have partnered up to do… I believe it was 89 technical assists within the last year; actually, it took us considerably less than a year to get to that number. That was on a variety of issues, whether that be jury trial assistance or developing memos on niche issues. We’ve developed specific memoranda, even with varying state laws. So just off the top of my head: Georgia, New York, Idaho, Minnesota, Arizona. We were able to assist a variety of jurisdictions in a variety of ways, again, from more statutory issues to more specific technological issues, and again, even jury trials.
Joseph: I also, by the way, echo Matt’s comments in that I do see investigators attending the prosecutor courses in NW3C and I think that is phenomenal. What I don’t see is a lot of prosecutors attending the forensic examiner courses.
And I can tell you, when I started as a prosecutor in early 2009, there were no prosecutor courses available, but I took the NW3C courses because they were offered very locally in New York City, and I was the only prosecutor there initially. Some of my colleagues scoffed at me, saying, you’re not getting any kind of legal education hours for this; you’re just really wasting your time. But over time, in attending those courses, I built up better connections with my examiners, who I can tell you unequivocally, both in New York and the people I’ve worked with in New Jersey, are only after the truth. If the forensic evidence says this person either didn’t do it or there’s evidence to mitigate it, they have no problem admitting to that.
The other thing is, I began to understand their language. And in understanding their language I was a much more effective prosecutor, and I think there has to be greater cross-training in that area. I think NW3C is great because they come to you free of charge. They also offer a lot of great webinars on emerging topics, and obviously it’s impossible to cover every emergent topic.
Search.org is another great resource. Their ISP list, which goes beyond ISPs and covers a lot of different resources and how to subpoena certain entities, is phenomenal. They have some training that’s really good. The National Computer Forensics Institute has expanded its budget through a grant from Congress; you go down to Alabama, and they have very good programs on varying topics, and you come away with software and other good things.
And then the National Domestic Communications Assistance Center, NDCAC, has a great number of free tools along with white papers. Matt did mention the Law Enforcement Cyber Center, and I’m going to do a quick plug for that. That’s found at www.iacpcybercenter.org. Search.org, obviously, is found at search.org. NCFI is, I believe, ncfi.usss.gov. And then for NDCAC, if you type in National Domestic Communications Assistance Center, it will pop up. You can sign up for an account. These resources collectively do a great deal in educating law enforcement, and if anyone listening hasn’t either checked out these websites or looked into these resources, you need to start doing that, because it’s going to help you and it’s going to help your cases.
And you know, when you choose to leave law enforcement for whatever reason, I think it’s going to make you a more marketable and overall very well-rounded law enforcement professional. And I do expect everybody I work with to be a professional, to be as educated as they can be, to be the most intelligent attorney or detective in a courtroom. And that includes the judge, in some instances.
Our job is to promote justice and obviously to educate and to do the right thing. And I think these are organizations that meet that challenge. The issue that I see, though, is law enforcement and prosecutors saying, well, I have so much to do that I can’t attend them. And I just don’t consider that to be a good thing, because they think they’re doing more work, as opposed to working smarter, not harder. And I think these organizations and their training encourage people to work smarter and more efficiently and also fix some of the problems that we’re seeing in these evolving issues, which may get worse for us as time goes on.
Christa: So I want to come back to Forensic Focus’ new collaboration with NW3C and with Search.org, which is the quarterly legal guide that we’re now publishing that is available at forensicfocus.com and it’s also linked from nw3c.org and search.org. Robert and Matt, what do you most want our readers to understand about the laws and legal issues that we’ll be raising in that guide?
Robert: Yeah, thanks Christa. And first of all, it’s a great privilege to be involved in this project. I think it’s a tremendously important project to take on, for reasons that Joe just articulated. The ostrich defense can’t really apply to us on the side of investigators and prosecutors. We can’t have our head in the sand on these issues. We have to be engaged on them. And so we need to understand the fourth amendment jurisprudence in its trajectory. We need to understand how we can abide by, not only the jurisprudence, but also best practices on the investigation side of things. And so I think it’s a tremendously important guide. It’s exciting to be a part of that project. And I applaud Forensic Focus, and you Christa, for really pulling together an excellent team to put out these updates. I think it’s going to be tremendously useful and it certainly has been for me.
Matthew: I definitely echo Robert’s sentiments there. I’m very glad to be a part of it. I would just say that, for forensic examiners, the law is somewhat unstable in a lot of these areas. So issues that you might be familiar with, it’s still worth it to take a look at those sections. Just to familiarize yourself with some of the updates that have happened. Well settled law isn’t necessarily well settled, particularly when it comes to technology. So if you think you’re aware of a particular area of law, it doesn’t hurt to just still read into that and make sure that you are in fact up to date and that nothing has changed.
Joseph: And speaking as someone not involved directly in the process, I had a chance to review the legal quarterly that came out only a few days ago, and I’m in awe of it. It provides a concise but clear review of emerging topics, useful case law, and areas to look out for. So I couldn’t be happier with Forensic Focus. I’ve been a member of Forensic Focus for quite some time. It’s always proven useful for me, and I hope it’s useful for everybody who comes to that website.
Christa: Thank you. And thank you listeners for joining us on the Forensic Focus podcast. You can find more articles, information and forums at www.forensicfocus.com. If there are any topics you would like us to cover, or if you’d like to suggest someone for us to interview, please let us know.
About The Contributors
Robert J. Peters is the Senior Attorney of the Zero Abuse Project, where he develops and delivers state-of-the-art training and comprehensive technical assistance to prosecutors and child abuse multidisciplinary team members on crimes against children.
Previously, Mr. Peters worked as the Senior Cyber and Economic Crime Attorney & General Counsel with the National White Collar Crime Center (NW3C), where his efforts included providing subject matter expertise on topics and trainings related to child abuse, human trafficking, and Internet-facilitated child exploitation, and acting as lead instructor for NW3C’s Judges & Prosecutors courses.
Mr. Peters served as Assistant Prosecuting Attorney in Marion and Hampshire counties in West Virginia, as well as Special Prosecutor in multiple state-wide jurisdictions where he specialized in the prosecution of sexual offenses, civil child abuse and neglect cases, and juvenile crime. Mr. Peters started and ended his tenure in Marion County with the jury trial and conviction of sexual predators.
In addition to his prosecutorial experience, Mr. Peters authored several child protection-related articles in peer-reviewed publications, including the Florida Journal of International Law, Handbook on Interpersonal Violence Across the Lifespan, and Christian Ethics Today. While in law school, Mr. Peters clerked at the United States Attorney’s Office, Western District of Virginia.
Mr. Peters served on the faculty of the 34th International Symposium on Child Abuse, where he developed and co-presented training involving the investigation and prosecution of child abuse perpetrated in the context of faith communities. Prior to serving as prosecutor, Mr. Peters designed comprehensive child protection policies for numerous entities including educational institutions, churches, and non-profit organizations.
Mr. Peters is founder and Chairman of the SHIELD Task Force, a 501(c)(3) nonprofit which partners with Child Advocacy Centers and local stakeholders to encourage reporting of sexual abuse and online safety. This initiative has brought age-appropriate abuse prevention education to thousands of school-aged children and numerous civic and community groups. Mr. Peters also serves on the WV Child Advocacy Network (WVCAN) Board of Directors and WV Human Trafficking Task Force. In 2019, Mr. Peters received the WV State Police Center for Children’s Justice Extra Mile Award for demonstrated professional leadership and personal commitment in going the Extra Mile on behalf of children and families.
Matthew Osteen is the General Counsel for the National White Collar Crime Center (NW3C), a 501(c)(3) nonprofit based in Richmond, VA, dedicated to providing financial and high-tech crime training and technical assistance to law enforcement. He also serves as NW3C’s Cyber and Economic Crime Attorney. Mr. Osteen is a subject matter expert on topics related to digital evidence, financial crime, intellectual property theft, and third-party data.
He develops the curriculum for NW3C’s Judges and Prosecutors courses and provides research for numerous other courses. He presents on emerging topics, such as the CLOUD Act, cyberstalking, and preparing digital evidence for trial. He provides content for a series of webinars on various aspects of internet-facilitated crime.
Mr. Osteen currently leads NW3C’s prosecutorial technical assistance program, responding to questions from, writing briefs for, and providing resources to prosecutors nationwide. He frequently contributes legal content posted to the International Association of Chiefs of Police’s Law Enforcement Cyber Center.
Before joining NW3C, Mr. Osteen worked in the General Counsel Department of the West Virginia State Auditor’s Office. He graduated magna cum laude from Marshall University and obtained his Juris Doctor from West Virginia University.
Joseph D. Remy currently serves as an Assistant Prosecutor in the state of New Jersey as well as on the National White Collar Crime Center (NW3C)’s Judicial & Prosecutorial Advisory Board. As an Assistant Prosecutor, Mr. Remy drafts legislative proposals and investigates and prosecutes criminal offenses involving child exploitation (human trafficking, luring, and child pornography) and cybercrime (bank fraud, identity theft, network intrusions, and malware).
Mr. Remy previously served as a Deputy Attorney General from 2016 to 2019 within the Financial and Computer Crimes Bureau of the New Jersey Division of Criminal Justice. While a Deputy Attorney General, Mr. Remy spearheaded an effort to revise New Jersey’s child pornography law, as well as worked on proposals involving missing persons, cyberterrorism, and human trafficking. Mr. Remy also investigated and prosecuted a number of defendants for child exploitation and cybercrimes.
Prior to serving as a Deputy Attorney General, Mr. Remy served as an Assistant District Attorney in the New York County Manhattan District Attorney’s Office from 2009 to 2016 and as a criminal defense attorney at the Law Office of Edward T. McCormack, Esq. from 2008 to 2009. During his time as an Assistant District Attorney, Mr. Remy was assigned to Trial Bureau 50, where he investigated, prosecuted, and tried offenses ranging from simple assault to complex robbery cases. Mr. Remy also investigated and prosecuted computer source code theft, computer hacking, and identity theft rings.
Mr. Remy has received awards from the United States Department of Homeland Security and New Jersey Transit Police for his work. While working with various law enforcement agencies including Amtrak Police, NJ Transit Police, NYPD, the Port Authority Police Department, Homeland Security Investigations and the US Secret Service, Mr. Remy completed numerous training programs offered by the National Computer Forensics Institute, the NW3C, FBI Virtual Academy, and INTERPOL. He frequently lectures at international and national conferences such as the High Technology Criminal Investigation Association Conference, Dallas Crimes against Children Conference, and National Cyber Crimes Conference.
Mr. Remy holds a B.A. in History and American Studies, a minor in English Literature, and a secondary education teaching certification from the State University of New York College at Geneseo, and a Juris Doctor from Pace University School of Law. Mr. Remy is, among other things, a Certified Cyber Crime Examiner and Certified Blockchain Expert.