by Christa Miller
In cooperation with the National White Collar Crime Center (NW3C) and SEARCH, The National Consortium for Justice Information and Statistics, Forensic Focus is proud to offer a quarterly roundup of the latest legal developments around digital forensic evidence.
Comprising major legislation and case law from around the world, this guide is intended to help our readers better understand how different countries’ laws are adapting to the rapid pace of technological change.
In Part 1 of this inaugural guide, we cover the following:
- Data privacy laws come into effect in Kenya and the United States, and are tested in Europe.
- The key takeaways from the new bilateral information-sharing agreement covering cloud-based data between the U.S. and U.K.
- Can evidence collection violate ToS? WhatsApp v. NSO Group
- Fallout from the Coalfire pen testing arrests
Part 2 will cover:
- United States case law regarding technology — facial recognition, pole cams, geofencing, and third-party DNA databases — when it comes to search and seizure.
- The “reasonable suspicion” standard in the U.S. when it comes to border searches.
The material published here is for general public education. It is not legal advice for any specific situation. If the reader needs specific legal advice, the reader should consult a qualified lawyer.
More Data Privacy Laws Come Into Effect
In the nearly two years since the European Union (E.U.) General Data Protection Regulation (GDPR) went into effect, consumer data privacy has been in the news as companies work to comply — or are fined for noncompliance — and new laws come up for adoption. One of the most significant new laws is the California Consumer Privacy Act (CCPA), the strictest in the U.S.
On the international stage
As a quick recap, the GDPR recognizes a number of privacy rights for citizens of the European Union. According to the website GDPR.eu, these rights include:
- The right to be informed
- The right of access
- The right to rectification
- The right to erasure
- The right to restrict processing
- The right to data portability
- The right to object
- Rights in relation to automated decision making and profiling
Because the recognition of these rights gives citizens “more control over the data they loan to organizations,” those organizations targeting or collecting data on E.U. citizens — regardless of whether they themselves are located in the E.U. — are required to comply with GDPR regulations, including:
- Accountability for compliance
- Implementing appropriate technical and organizational security measures
- “By design and by default” data protection ingrained into everything an organization does
- The need to obtain and document data subjects’ consent — and to honor withdrawn consent
- When organizations are allowed to process data
- The need, in certain organizations, for data protection officers
When it comes to digital forensic investigations, DFIR practitioners need to ensure they have appropriate mandates in place, as well as appropriate policies and procedures, according to Jason Jordaan, Principal Partner at DFIR Labs, who presented on the topic at the SANS DFIR Summit Prague 2017.
For one, the GDPR mandates that data retention duration must be specified. A discussion forum thread on Forensic Focus suggested that European examiners generally retain data in both criminal and civil proceedings until the case is adjudicated and they have been advised they can delete the case data.
Of course, the GDPR contains provisions for criminal and other legal proceedings under Article 2 (material scope), Article 10 (processing of personal data relating to criminal convictions and offences), Article 23 (rights of the data subject), and Article 49 (transfers of personal data).
Also on the international stage, Kenya has passed a new GDPR-style data protection law. Enacted in part to facilitate information technology investment, the law is also expected to help curtail predatory lending practices, and it may prove useful as Kenya moves to digitize citizen identities.
Case C-507/17 Google v. Commission nationale de l’informatique et des libertés (CNIL)
In a case that may be of interest to investigators relying on open source intelligence (OSINT), as well as those working with victims of nonconsensual intimate image sharing or child abuse material, the Court of Justice of the European Union found in September that although Google “must remove links to sensitive personal data from its internet search results in Europe when required, it does not have to scrap them from searches elsewhere in the world.”
Even though the case was brought in 2016 under Europe’s 2014 “right to be forgotten” precedent, the GDPR now codifies that right in Article 17. While Google is notable for being the first major U.S. tech company to be fined for noncompliance with the GDPR, the Court’s decision this year balanced free speech and the public interest against the right to be forgotten.
In the United States: California Consumer Rights Under the CCPA
(Text © Ashton Meyers, NW3C. Used with permission.)
With the CCPA scheduled to take effect January 1, 2020, it is important to understand the rights Californians will gain. Many people fail to realize that each time we use the internet to do things like search the web, shop for merchandise, or conduct research, our privacy is at risk. The CCPA aims to help mitigate this risk by giving consumers more control over how their personal data is collected, handled, stored, and destroyed.
The CCPA will give Californians control over their data by establishing six fundamental rights. These rights include:
1) The right to destroy. California consumers will now have the right to request that their data be erased, either because it is incorrect or because the company no longer needs access to it.
2) The right to be informed when their personal information is being collected, and for what purpose that information is being used.
3) The right to opt out of personal data collection. The organization must inform consumers that it will be collecting personal data, and it must give them the ability to refuse that collection.
4) The right to access any personal information that an organization has collected about a consumer. Upon request, an organization must disclose and deliver that personal information free of charge.
5) The right to data portability. When consumers request access to their personal information per the fourth right, it must be provided to them in a portable and readily usable format.
6) Finally, the CCPA gives all consumers the right to equal services and prices. This is meant to prevent discrimination against consumers who choose to exercise their rights. For example, it prevents organizations from charging extra fees or withholding benefits from those who request access to their personal information or opt out of its collection altogether.
The impact of the CCPA will reach far beyond state borders. When the law takes effect, it will apply not only to California residents but also to any national or international organization that conducts business in the state. If it proves successful, other states may follow suit and enact similar privacy laws. California may also serve as the testing ground for a future comprehensive federal law.
The U.S./U.K. Bilateral Data Sharing Agreement
(Text © Matthew Osteen, Esq., NW3C. Used with permission.)
When the Clarifying Lawful Overseas Use of Data (CLOUD) Act was passed in early 2018, Congress had two goals in mind. The first was simple and prompted by exigency: Congress needed to give the Stored Communications Act (SCA) extraterritorial jurisdiction.
A case, United States v. Microsoft, had made its way up to the Supreme Court and threatened to limit law enforcement’s ability to obtain evidence stored on servers located abroad. Federal agents had obtained a warrant under the SCA for emails associated with an account whose owner was suspected of drug trafficking. Microsoft stored the customer’s emails in a data center located in Dublin, Ireland. Microsoft argued that the SCA did not apply to data stored abroad, as Congress had not affirmatively stated legislative intent for the SCA to have extraterritorial scope.
The CLOUD Act, passed before the Supreme Court could hear the case, provided language explicitly stating an intent for the SCA to apply extraterritorially, thus mooting the case.
The second goal of the CLOUD Act was to preempt and dispose of foreign domestic barriers to law enforcement access to data. For example, if, in the Microsoft case, Irish law had prevented the disclosure of stored data without a valid Irish warrant, then Microsoft would have had to violate either the U.S. law requiring production of the data or the Irish law prohibiting it.
To address the potential conflicts of law, the CLOUD Act authorizes the United States to enter into data sharing agreements with foreign countries. The CLOUD Act itself does not provide much guidance on the specifics of any potential data sharing agreements.
On October 7, 2019, the text of the first such data sharing agreement was released. Across 17 pages, the U.S./U.K. bilateral data sharing agreement provides a streamlined process for law enforcement officers from each country to obtain data from the other.
The Big Takeaways
- Conflicts of Law. One of the primary goals of the agreement is to avoid conflicts of laws. This means that U.S. law enforcement officers seeking data under the agreement do not need to worry about the GDPR. Likewise, U.K. law enforcement officers seeking data under the agreement do not need to worry about the Electronic Communications Privacy Act (ECPA).
- “Serious Crimes.” The agreement covers data relevant to investigations of “serious crimes.” The agreement defines “serious crimes” as “an offense that is punishable by a maximum term of imprisonment of at least three years.” It is unclear if crimes can be aggregated.
- Legal Process. Data sought under the agreement can be obtained with legal process issued by the requesting country pursuant to the domestic law of the requesting country. This means that the same process used for obtaining data before the agreement will be effective under the agreement.
- Where to Serve Process. Orders can be served on the service provider from whom data is sought. However, the order must be transmitted by the designated authority of the requesting country. For the U.S., the Attorney General selects the designated authority; for the U.K., the Secretary of State for the Home Department selects the designated authority. The designated authority is then responsible for reviewing the order for compliance with the agreement, collecting the data from the provider, and passing along the data to law enforcement.
- Judicial Review. Under the agreement, service providers can challenge requests for data. Previously, challenges were handled by the courts. Under the agreement, challenges must be made to the designated authority of the country requesting data. If the challenge is not resolved, then the service provider may raise the challenge with the designated authority of its home country. The designated authorities of each country may then confer to mutually resolve the challenge and render a decision.
Overall, the agreement emphasizes the need for international cooperation when law enforcement investigates serious crimes, including terrorism. While many applaud the effort to reduce reliance on Mutual Legal Assistance Treaties (MLATs), privacy advocates have voiced concerns that the agreement eliminates judicial oversight and removes privacy safeguards.
Can evidence collection violate ToS? WhatsApp v. NSO Group
(Text © Benjamin Wright, Esq. Used with permission.)
Facebook sued an Israeli company named NSO Group, claiming that NSO violated the terms of service of the messaging app WhatsApp (a Facebook property).
Allegedly, NSO had created spyware for government clients, such as police agencies, around the world. The spyware would allow those clients to infect the smartphones of investigative targets so that evidence could be collected. NSO was allegedly running a platform that facilitated this infection through WhatsApp.
NSO maintains that it is operating strictly within the bounds of applicable law.
This case is interesting in part because it illustrates that investigators, and those who support them, need to read the terms of service for the web pages and mobile apps they visit or access to gather evidence. Those terms may limit or forbid an investigator’s efforts, including the collection of evidence, and a violation could undermine even an official investigation.
Additional Analysis (Orin Kerr, Twitter):
“I just read the WhatsApp v. NSO complaint, and it makes an interesting and somewhat novel CFAA claim. NSO hacked the target computers of WhatsApp users. But is that unauthorized access of *WhatsApp’s* servers, enabling WhatsApp to bring suit? As far as I can tell, WhatsApp’s main unauthorized access argument of its computers is this: In routing the malware through WhatsApp’s servers, NSO had to disguise the malware as legit WhatsApp traffic. It’s an interesting legal question of whether that counts as an unauthorized access of WhatsApp’s computer, as compared to of the users ultimately hacked.”
Fallout From The Coalfire Pen Testing Arrests
(Text © Benjamin Wright, Esq. Used with permission.)
Much of the work we do in cybersecurity and digital forensics exists right on the edge between good and bad. A textbook example is the arrest of two employees of Coalfire Labs, a fair-sized cybersecurity firm.
Coalfire had a written contract with the state court administration of the state of Iowa. The contract provided that Coalfire would engage in penetration testing of court systems. This testing included efforts to physically get into county courthouses.
The sheriff of Dallas County, Iowa, arrested two Coalfire employees after they broke into the Dallas County Courthouse around midnight on September 11, 2019. The employees produced a document signed by three officials of the Iowa State Court Administration indicating the employees had authority to do what they did.
The sheriff identified numerous problems. The authorization document provided the cell phone numbers for three state court administration officials. When the sheriff’s deputies called the cell phones of two of the officials, the officials gave conflicting information about whether the Coalfire employees were authorized to break into this courthouse so late at night. The third official could not be reached because his cell phone number was listed incorrectly on the authorization document.
The county indicted the two Coalfire employees on charges of criminal trespass. See the Coalfire Investigation Report, October 9, 2019, Faegre Baker Daniels, commissioned by the Iowa Supreme Court.
As of late November 2019, the investigation is ongoing and the incident has not been resolved.
Lesson: Prudent investigators create evidence to show they are good guys in a difficult situation. Such evidence includes having strong documentation that clearly leads to the conclusion that the professionals are acting with authorization.
In documents like contracts for investigative services, failure to choose words accurately and carefully increases legal risk. Choosing words carefully and accurately requires intellectual rigor and professional skepticism. A professional skeptic asks hard questions about whether a contract actually authorizes aggressive steps such as lock-picking, a technique the Coalfire employees used even though the Iowa-Coalfire contract did not specifically mention it.
Another question raised about the Iowa-Coalfire contract, according to the Coalfire Investigation Report, was whether the State Court Administration even had the authority to authorize someone to break into a county courthouse, since the courthouse building is owned by the county rather than the state of Iowa.
A deeper analysis of this case was among the presentations offered at AwarenessCon 2019 in Adel, Iowa, on November 20, 2019.
The arrest and criminal indictment of Coalfire employees triggered anxiety and conversation within the penetration testing community. As a consequence, a pen test company named TrustedSec published an example clause that it added to its standard contract for physical penetration tests.
At AwarenessCon, David Kennedy of TrustedSec said the following language is published at the company’s GitHub account as open source contract language contributed to the community so that anyone may use it.
The clause provides that the client will pay $25,000 for each penetration tester who is criminally charged, in addition to all legal costs. A question for the community is whether this clause makes sense, since it increases the client’s risk when hiring penetration testers. Similar clauses might be appropriate in contracts for all kinds of digital investigators.
Attorneys from different parts of the world who would like to participate in this project are welcome to contact Forensic Focus’ content manager at [email protected].