Information Overload: Navigating Information Risk, Investigations & Privacy

Panelists:
Debra Farber, Privacy Consultant and former Sr. Director, Global Public Policy, Visa
Bennett Borden, Chief Data Scientist and Chair, Information Governance Group, Drinker Biddle & Reath LLP
Mark Antalik, Managing Director, BDO’s Forensic Technology Services practice


Join the forum discussion here.
View the webinar on YouTube here.
Read a full transcript of the webinar here.
Matt: Hello, everyone, and welcome to this webinar, sponsored by AccessData, to talk about managing information risk. I am your host, Matt Kelly, and I’m pleased that you’re here with us today. We have a great program lined up, so let me take a few moments to walk you through what we’re going to talk about today and how you can get involved. Then we can dive right into our discussion with our guests, who are waiting in the wings.

First, let me say this webinar is scheduled to go for one hour. I will go through the housekeeping details for a minute or two first, then I will introduce our two guests, and then we’ll get going with our discussion on information risk. We are going to talk among the three of us – me and our two guests – for probably about 40 to 45 minutes or so, and then we’ll open it up for Q&A after that, for the remainder of our time.

A little bit about myself – as I said, my name is Matt Kelly. I am the editor and CEO of Radical Compliance. That’s a blog where I write commentary and analysis about compliance and risk management issues, including information risk. I’ve been writing, talking, or thinking about governance, risk, and compliance for nearly 15 years, including 12 years as editor of Compliance Week. Some of you may know me from my time there.



So what are we doing here today? Well, this webinar is the first of three that we will be hosting over the next six weeks. The title for that entire three-part series is ‘Information Overload: Navigating Information Risk, Investigations & Privacy in Today’s Environment’. That is the formal title, but really, what we want to do here is explore the nuances of what information risk really is, and how corporate compliance, security, privacy, and governance executives at large enterprises can govern all of that risk successfully. We will be looking at data governance, e-discovery, data forensics, cybersecurity, privacy regulation, and all of that fun stuff, over the full six weeks.

So the title for our first session here today, as you can see, is ‘Information Risk and Compliance in Today’s World’. This is going to be more of the curtain-raiser for this whole series. Today we’re going to look at three specific subjects, which are, number one, defining information risk and the drivers of it; two, how to assign ownership of information risk across a large enterprise when different people probably need to help out in different ways with that; and then three, what are the technology changes that are making information risk more complicated and at the same time also, perhaps, more governable? Because we do have better technology, better software for this. So that’s what the agenda is here today.

To talk about that with me, we have two excellent guests. First is Bennett Borden. He is the Chief Data Scientist at the law firm Drinker Biddle and one of the only people I know who is both a lawyer and a data scientist at the same time. Bennett spends much of his time advising clients on how to take advantage of data and information, rather than just worrying about information risks from a defensive posture – he talks more about how to get opportunities out of it. He works extensively in e-discovery and forensics for litigation, and he advises clients on privacy compliance as well. So he is guest number one.

Guest number two is Debra Farber. She is a long-time privacy consultant based in San Francisco. Debra currently hosts the Data Privacy Pact podcast every week – she just started that. In the past, she was a senior policy consultant at Visa on privacy and security. She is also currently the CEO of Orinoco.io, which is a software startup looking to help automate some of the workflow for privacy assessments and all of the work that gets done there. And Debra has been active in quite a few professional privacy organizations as well. So between the two of them we’ve got a lot of good experience and insight there.

Debra and Bennett, welcome to you both. Thank you for being here today.

Bennett: Thanks very much, Matt. Glad to be here.

Debra: Thank you for having us.

Matt: So we are going to, I think, move to – first, as I said, there’s part one, about just defining what information risk is. And I’ll just pose the question to both of you in that very top-level way. How would you define information risk? Because it seems to me we have multiple constituencies in corporate America who define it in different ways. So I’ll put it first to Bennett and then to you, Debra. But just take a minute or two and tell us what information risk is in your minds. How would you define it?

Bennett: Thanks, Matt. Information risk is such an interesting topic, because this idea of information risk carries a great deal of facets with it. Some are certainly around the dangers – and that’s what risk means – the dangers that it brings to us, and that usually brings to mind things like security: if data gets breached, then what does that mean for the company, what liability, the idea of privacy risk… To me, though, one thing that we often miss is the lost value of information. And I think that’s one of the things we’ll talk about quite a bit today, this idea that information carries a great deal of meaning with it, and one of the challenges we face as Information Age companies is how do we get that value out? I think that is one of the biggest risks we face as well.

Matt: And Debra, what about you? How would you try and define information risk?

Debra: I think I would define it by describing that… risk is a scenario that describes a feared event, and then all the threats that make that event possible. It’s similar to the way we’ve been doing it for security for years: the severity of the risk times its likelihood gives you the quantity of the risk – how high it is: low, medium, or high. You want to also talk about risk in terms of impact. And I just want to make sure that we realize it’s not just about the risk to security, but it’s also about the risks of data loss and risks to privacy, which we measure in terms of impacts and harms. Typically, we think about the security element of it, but I just want to make sure that we set the stage for the rest of our conversation.

Privacy and security overlap, and you absolutely need security controls to achieve privacy. But they are in fact different in terms of impact to the business and impact to the user.
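The risk formulation Debra describes – a feared event, the threats that make it possible, and severity times likelihood bucketed into low/medium/high – can be sketched in a few lines of code. This is a minimal illustration only; the 1–5 scales, band thresholds, and example values are all invented for demonstration.

```python
# Hypothetical sketch of the risk model described above: a feared event,
# scored by severity times likelihood, then bucketed low/medium/high.
# The 1-5 scales and the band thresholds are illustrative assumptions.

def risk_score(severity: int, likelihood: int) -> int:
    """Quantity of risk = severity x likelihood (each rated 1-5)."""
    return severity * likelihood

def risk_band(score: int) -> str:
    """Map a raw score onto low/medium/high bands (thresholds assumed)."""
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# A feared event and the threats that make it possible
event = {
    "feared_event": "unauthorized disclosure of customer records",
    "threats": ["phishing", "misconfigured storage", "insider misuse"],
    "severity": 4,       # impact/harm to users and the business
    "likelihood": 3,
}

score = risk_score(event["severity"], event["likelihood"])
print(event["feared_event"], "->", score, risk_band(score))  # -> 12 medium
```

The same scoring shell works whether the impact being rated is a security breach, data loss, or a privacy harm – only the severity and likelihood inputs change.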

Matt: So another question – when you look at how information risk has changed, say, over the last ten or fifteen years, how much are people aware of what the risk is? I would think, normally, people would say, “Okay, it’s hackers stealing my data or it’s some sort of privacy breach – that’s my information risk.” And I’m going to guess it’s larger than that, but I’m trying to get a sense of how well do people out in corporate enterprises, how well do they understand what information risk really is? Whether it’s clients you’re talking to in the company or boards you might be consulting – what do they try and define or what are they struggling with? And again, Bennett, I’ll let you start, and then Debra, maybe follow up with you.

Bennett: Well, really, I think this is one of the beautiful progressions of information understanding in corporate America: we are seeing information for what it is, which is this really amazing asset. If you think about what’s happened over the last, really, 20 years, the amount of information we have, the granularity of information that we have, is really quite extraordinary.

As the Information Age came upon us, we started implementing computerized or electronic processes into our workflows, so more and more of what we say and do and think and decide and collaborate upon in our entities is increasingly recorded somewhere. And that gives us just incredible insight into what we are doing as human beings, both in the commercial space, the public and private space. And the nice thing is we’re starting to see senior executives who have grown up in that information-inundated era, and so this idea that information is something that is fluid and measurable and leverageable for all kinds of things is starting to get through to our senior executives.

So those of us who have been raised on kind of the risk side of information – that it’s very hard to control, and there are these fears and problems with it – that story only sells so far. Because people like to be told yes, not no. So being the Office of No for so long has not been fun. But we’re starting to see value come alongside the risk. So what’s interesting to me is that information risk is starting to be more wholly and fully understood, both in its negative aspects and on the missed-opportunities side.

And that’s what I see as how information risk is starting to be spread across the enterprise. So it’s becoming part of the business conversation as well as just the naggy, risk side that some of us have grown up in.

Matt: Well, let me follow up with you first, Bennett. I’ll push you later on in this conversation about maybe some specific examples of how companies can do that. But I’m kind of curious – do companies have… how concrete a sense do they have of what they could do? One way I’ve liked to talk about it before is that boards like to panic about information risk and cybersecurity, but they don’t know how to panic effectively. And I’m kind of wondering if there’s a corollary here, that they’d like to extract all this opportunity but they don’t know how to do it effectively. Or where are they in their understanding of what might be possible out there?

Bennett: It’s starting to develop nicely. And it depends on the industry that you’re in. The more your business is about pure information creation, use, and leveraging, the faster you tend to get there. So there’s Silicon Valley – the Facebooks, the Googles, the Microsofts – the companies that are actually into pure information as a product. And then also the highly regulated, highly litigious, but also highly leveraging companies – companies that have to dip into their data for insight – are starting to understand that it’s doable.

Everybody, every company figures out, “Oh gosh, yeah, we have too much information, we don’t know what’s in it. How do you do it?” But that’s kind of where they get stuck sometimes. But the nice thing is we’re developing a cadre of senior executives that are really starting to develop skills to analyze this problem and take steps to effectively deal with it.

So we are seeing some nice development across verticals, where companies are starting to get their hands around what I would call the general information governance problem.

Matt: Okay. Debra, talk to me a little bit about the conversations you have with people, your clients or other businesses you talk to, about how much the here and now of privacy and security regulation might kind of overtake them. There is a lot of good potential strategic advantage in what Bennett is saying, but they still are terrified of a breach disclosure that might happen or a data breach or some sort of privacy violation. What’s the tension like that you see with clients?

Debra: Yeah, there are a few, and I think we have to note that the data privacy industry is about ten or fifteen years behind information security. So the maturity of the industry is overall a little different here. And it’s still more about compliance than a business driver, but it is changing. And originally, you would use a breach to get a privacy budget, but these days, I’m seeing that breaches are really being used for security budgets. So you can kind of double-dip and say, “We need a budget for breaches, and it’s going to be a privacy budget and security,” and then you put the same amount of money at it.

So I’m seeing that the more breaches that occur, the more budgets go to security. The drivers around privacy are, like you said, the highly regulated companies, but more so regulation coming out of Europe, driving the need for companies to allow for new rights that they never had to allow for before, which is going to require process changes in the organization. Because privacy really comes down to effective, responsible governance of personal data within an organization. So that’s looking at business process changes, whether there are effective security controls over that personal information, and what privacy impacts occur if those are misaligned. It’s not always about breach.

So I think we should change the conversation from just a focus on how much money we can lose to a focus on how we can gain trust from users. How can they feel there’s enough transparency that they’re willing to give us personal information? How do we design our products and services so that there’s more trust from users? Because sometimes it feels really creepy – all the targeting of advertisements and the business practices, the insights that companies have derived about individuals based on information that they may or may not have directly collected about them. So a lot of this comes down to thinking about the consumer.

In a similar way to how product folks and product managers think about developing products, what I’m telling my clients is that privacy really needs more eyes and ears and participation on the product team as services and products are being developed, so that they can help build it right the first time – building for trust, building for security – so that it doesn’t become a matter of compliance, but a matter of building solid products that work for the ecosystem and for users.

Matt: Let me ask, I guess, maybe a pointed question there. Because it’s good advice to give a company, but how many of them are actually trying to do that? If you’re a business that has been around for many, many years, and now you’ve entered the hi-tech age, and now we have these privacy risks that are suddenly surging up around businesses – I can see that they might think, “That is a good idea, but I already have all of my processes set up.” It sounds really hard for them to make that shift. I’m just wondering, how much do they… I don’t want to say go into this kicking and screaming or they know that they have to but they don’t want to… but what’s their reaction when you tell them this is the sort of stuff that will need to happen?

Debra: I’ve had a lot of mixed reactions, and I think it really depends on the industry. So you have a highly regulated industry like healthcare, and you see there’s still a lot of immaturity there, a lot of hospitals don’t spend enough money on getting a Chief Privacy and Chief Security Officer. They’re just not spending enough time and effort on the effective maturity of privacy and security, and instead are still playing the compliance game.

And I think a lot of that has to do with the way the regulation of HIPAA has been written and implemented, as a very constricting regime that is too restrictive, because it’s prescriptive, and it’s the Office of No. And that’s what we’re trying to get away from – being able to say, “You know, I can help you in the business to do this or do that, and this would help drive more business.” But unfortunately, HIPAA says no. And there’s still so much of that in healthcare, and not enough focus on how do we gain trust from individuals. So yeah, I’d say there are certain industries that just are not focusing on the right things, even after many, many years of being in the space and being highly regulated.

Matt: So Bennett, let me circle back to you with a similar sort of question that – I do think that a lot of people would agree that good information governance is a good idea and it would be able to extract a lot of strategic value and advantage. Sounds great in the theory. But tell me a little bit more about how easy or difficult it is for companies to actually start doing that – making process changes or making governance changes. Are they eager to do this? Do they not necessarily know what to do? I don’t know if you want to give me some examples of what you’ve seen that’s worked well or not well. But what can you tell us? Sort of moving from the theory to the nitty-gritty level.

Bennett: This, I think, is one of the most hopeful pieces of this – companies really are taking on the larger information governance problem. We work with, literally, a couple of dozen companies right now who are putting in worldwide IG programs. Because there are so many different demands upon information: we want it for the business, we have all this regulatory risk, there’s litigation, and there’s all this clamouring for information. But when you strip all that down, it comes down to the fact that what we need to understand is what data we have, how long that data is useful, what we do with it while it’s useful, and what we do when it’s no longer useful.

The answers to those questions have all these different facets and characteristics to them. When we create information, we should understand something about what it is – that’s where the conversation about data classification comes in. Because once you understand what it is, you can act on it for all the purposes that you need.

Companies are realizing it’s a complex but, at its core, simple problem, right? Understanding what data I have and what it is. And once you get to that perspective, the hand-wringing starts to calm down a little bit, and it’s a problem that companies can actually address. The critical fact is you’ve got to have all the right voices at the table when making information-related decisions.

So we’ll talk a little bit about what structure works and who reports to whom and how companies are doing that. But whether formal or informal, the way that companies are moving forward is figuring out a way to elicit the rational viewpoints of the different information stakeholders, balancing their interests, and making a decision about a policy, a project, a business venture.

One example that I love to use often is the marketing stakeholder. This is an important information stakeholder, and they believe certain things about information: information is useful, it needs to be current and accurate, and it needs to be spread as far and wide as possible. Right? A perfectly rational view for marketing folks. Then look at the security folks. Some things are the same – information should be current, it should be accessible, right? But they care more about limiting the dissemination of information. Again, a perfectly rational view of how information should be handled.

The problem is you’ve now got one stakeholder who wants to spread information and one who wants to prevent the spread of information. Who determines the right balance? That is where most companies fall down. They don’t have a way to elicit those perfectly rational views but then, most importantly, referee them. Where is the proper balance for our company in our risk profile and business objectives? And a successful IG program is one that has a way of eliciting those voices, balancing them appropriately for your company, and then making project-based decisions based on that referee.

Matt: So on that note, let’s move into that second section then, about how you do assign ownership of various pieces of information risk, and to your point, Bennett, information governance and the like. I’ll continue with you then, for a minute here, Bennett. So walk us through either who should be in the room as we decide these things or who should be chairing that conversation or how does a company figure out what works for its… I’ll get into one industry versus the next later on. But so how do they do this?

Bennett: It’s been really interesting to work with so many of these worldwide companies in very different verticals, with different litigation and risk profiles, different regulatory profiles, and see what works for them. It really falls into a couple of general categories. One is where you have more of a formal structure, where there is a committee, which we usually call the information governance steering committee, that is chaired by someone – we like that person to be a Chief Information Governance Officer. But they take lots of names – Director of IG, Senior VP for Business and Data Analytics, whatever it is, it doesn’t matter.

But then, in this committee, you have representatives from the major stakeholders, and that’s going to vary company to company. Certainly the business – because they’re the ones who generate and leverage the data. But then you have to have the privacy, the risk, security, IT, legal, e-discovery, all those people at the table. And then we can’t forget things like Finance and HR. So your key stakeholders who are really the drivers behind the creation, use, and disposition of data.

And there’s some way that the business comes before this committee. Like decisions to go to the cloud – Office 365 or the Google Suite or something – BYOD policies, whether we’re going to issue devices or not, whether we’re going to claim ownership over devices or not. Things like security monitoring, compliance monitoring. All of these decisions are ones that will benefit from the input of these various stakeholders. Often, they will act on an issue through projects – because sometimes we forget, you can’t act on an issue. Someone says, “Oh, this is an issue.” Well, you can’t act on an issue, you can only act on a project.

So defining how we’re going to address an issue through a project, who owns that project, how it’s funded – those are often things that a committee like this decides, and then hands it off to a project.

The other flavor of this is when your company either is too dispersed or not yet quite mature enough to have a committee like that, they very often undertake what we call the project-based approach to information governance. And this is where, project by project, initiative by initiative, the company sets up a way to, again, elicit the voices, get the opinions of all these folks, and then work those opinions into the project.

Because how many of us have been involved in something like this: a company decides they’re going to go to Office 365. It’s a business decision, IT has gotten on board… and then three days before the migration is supposed to start, someone goes, “You know what, we should talk to legal,” or “We should talk to privacy.” And all of a sudden there’s this mad scramble to try to change the parameters of what we’re going to do, and you can see the conflict and the inefficiency that arises from that kind of thing.

So the more you can proactively incorporate these stakeholders’ opinions into your project – whether it’s a project-based approach or a committee approach – that’s where we’ve seen the most success in this phase.

Matt: Debra, let me ask a similar question to you, more specifically around privacy and security regimes or compliance duties or projects that companies want to bring in. I know that a lot of privacy and security risk today is more about good practices for handling data. And really, this is about humans and policies and how they conduct themselves. So I’m wondering if there’s a similar counterpart for privacy risk. We actually have to get everybody in and talk about, “Here are our data practices that you all have to think about.” Because privacy isn’t a phase you go through, where the privacy team does it and then it’s done and we have no more risk – I know that’s not how it works. But should companies be bringing together something like a data privacy and business practice council or committee? How would you see it working out?

Debra: Yeah, handling risk is similar no matter what the focus is. With privacy, I’ve seen it be part of the information governance committee, where you deal with your privacy issues. I have also seen completely separate committees, where data privacy is separate from security issues – because while the information that you garner from these projects or discovery efforts overlaps in interest, they do kick off different projects and processes.

So I want to do a brief overview of some of those – like a privacy impact assessment and how you would use it in the organization. But first, I want to say that I thought that was a fabulous review that Bennett just gave of the process of setting up the committee. It’s one of the most effective things you can do for what is effectively a change management process – you’re introducing changes to your organization: “Here’s a new technology we’re going to roll out. What are the privacy and security risks and implications and impacts, and how do you account for that?” It could be internal, or it could be that we’re about to buy a new company and roll it into the fold – how is that going to impact the privacy of our communications and our product? How do we change their internal data-handling processes so that they’re compatible with ours, or raise our privacy and security practices and controls to the level of the company we’re purchasing? There are so many potential impacts.

So on the privacy side, that’s where you conduct a privacy impact assessment. However, before you can complete the assessment, you need to know where the data is stored and how it’s used – you need to assess risk, basically. And so many of the efforts in privacy right now are in the pre-risk-assessment phase, where you’re mapping out the life cycle of data: where does it live and how does it get there? Is it through a customer service agent? Is it direct input from the user?
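The data-mapping exercise Debra describes – knowing what the data is, where it lives, how it arrives, and how long it stays – can be pictured as a simple inventory. This sketch is purely illustrative: every field name, system, and retention period is a made-up example, not a prescribed schema.

```python
# Illustrative data-inventory records of the kind a privacy impact
# assessment depends on. All field names and values are hypothetical.

data_map = [
    {
        "data_element": "customer email address",
        "category": "personal data",
        "system": "CRM database",
        "collection_path": "direct input from the user",
        "uses": ["account login", "marketing opt-in"],
        "retention": "24 months after account closure",
    },
    {
        "data_element": "support call transcript",
        "category": "personal data",
        "system": "ticketing system",
        "collection_path": "customer service agent",
        "uses": ["quality assurance"],
        "retention": "12 months",
    },
]

# The assessment can then start from simple questions the inventory
# answers, e.g. which systems hold personal data at all:
systems_with_pd = sorted(
    {r["system"] for r in data_map if r["category"] == "personal data"}
)
print(systems_with_pd)  # ['CRM database', 'ticketing system']
```

Once an inventory like this exists, the risk-assessment step has concrete inputs: each record names a system, a collection path, and a retention rule that can be checked against policy.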

So much of the effort right now is companies preparing for compliance with the EU’s General Data Protection Regulation, which is coming down the pipe – in a year, it’ll officially be here. Companies are thinking about how to change their practices to comport with these new rules, but at the same time, it’s a wonderful opportunity to think about honing those governance practices.

So for instance, take a RACI chart. If you’re not familiar with RACI – it’s R-A-C-I, and it stands for Responsible, Accountable, Consulted, and Informed. Who in your organization is ultimately accountable for all things privacy – for particular initiatives, or a product or service line, or a business unit? It could be within HR, it could be within Finance or a particular product line. Who is responsible for making changes, or making sure there are certain controls in place, or privacy protections, or user education, or what not? Who is to be consulted? They need to be able to opine on whether or not there are additional risks that you might not have realized as part of that process, or impacts within the organization that need to be addressed as part of this whole change management process.

And then who needs to be informed? They just need to know – maybe it’s somebody who needs to educate the salespeople or the user base. And all of those people are important to the success of whether or not you have effectively handled risk within the organization.

So weighing out those responsibilities and accountability and all those roles as part of these committees is super-effective in being able to then ultimately do the discovery work and then assess the risk to the product and the organization and the users and all the stakeholders in the process.
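A RACI chart like the one Debra describes is, in the end, just a mapping from an initiative to the people holding each role. The sketch below is a hypothetical example – the initiative name and every role assignment are invented for illustration, not drawn from any real program.

```python
# Hypothetical RACI chart for a single privacy initiative.
# R = Responsible, A = Accountable, C = Consulted, I = Informed.

raci = {
    "roll out new consent banner": {
        "A": ["Chief Privacy Officer"],        # ultimately accountable
        "R": ["Web engineering lead"],         # does the work
        "C": ["Legal", "Security"],            # opine on risks and impacts
        "I": ["Sales enablement", "Support"],  # just need to know
    },
}

def people_for(initiative: str, role: str) -> list:
    """Return who holds a given RACI role for an initiative."""
    return raci[initiative].get(role, [])

print(people_for("roll out new consent banner", "C"))  # ['Legal', 'Security']
```

Keeping the chart in a structured form like this makes the discovery and risk-assessment work easier to hand off: for any initiative you can immediately list who must sign off, who executes, who reviews, and who just gets the memo.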

Matt: So what about that – along the lines of our third bullet point on our screen here – shifting towards not just an awareness of what we need to do and who should be involved, but at some point … the compliance and risk executives, you’re probably on our webinar today, they’re thinking, “Well, I need to come up with some metrics, I need to pull together a plan. I need to get a budget that I can put in front of the CFO and the board to say, ‘This is what we’re going to do.’”

Debra, I’ll start with you – maybe around privacy compliance issues. How can they, I guess, maybe, derive a substantive plan for others, to say, “We’re going to do this on this timetable, it’s going to cost this amount of money, and these are the metrics,” how do they develop that? And Bennett, on the far side of that, I’m going to ask you the same question for information governance projects generally.

But just on privacy compliance alone, Debra, what can you tell us about getting to concrete specifics that the board gets to review?

Debra: I would really think about this in terms of how you’re framing it. Instead of describing the costs and the specific risks you’re looking to mitigate, I would reframe it in terms of how you’re supporting business drivers. The goal is not to be seen as a cost center, and instead to be seen as a business enabler. So instead of saying, “We need X amount of money so that we can fix these specific privacy impacts – we need access controls, we need…” you know, line items.

I would frame it more effectively: we need money so that we can enable the sales team to bring in more revenue by closing deals quicker. For instance, if we don’t have appropriate privacy practices that are clear, and a public privacy statement that explains to vendors and partners and potential ad sponsors or users what the practices are – or if it feels creepy, or if there aren’t policies in place that are even required for compliance – then it might actually slow down the sales process. You might lose billions of dollars of contracts if you’re trying to deal with large healthcare carriers or insurance companies or banks.

So that impact really resonates, if you’re messing with the revenue of the company. That’s one of the best ways I’ve been able to effectively get more money for changing business processes – because it’s not as easy as some security project. We’re rolling out end-user security, we’re going to encrypt this particular database, or we’re rolling out a data masking process – those are concrete, specific security projects. With privacy, it very often has to do with changing business processes. It’s a little less concrete, because you’re talking about data flows across many systems as part of the business process. So it’s more effective to talk about the impact that the money you’re asking for will have on the organization, as opposed to “Hi, we’ve mitigated five risks,” keeping track of it almost like the ticketing system you see in security – “Here’s how many bugs we fixed, here’s how many security compliance items we’ve ticked off.”

It’s a little different for privacy that way, and I would focus on the value to the business.

Matt: So Bennett – same question to you. And I hear a lot of parallels there, to probably what you might talk about, in enabling business. But at some point, a cynical CFO might actually just try and pin you down, like, “How much money do you want to do this? What do I get out of it?” How do people frame that?

Bennett: And critically important, right? And I completely agree with Debra, that you’ve got… a company succeeds when it makes money and minimizes its costs and risks, right? You’ve got three pretty big buckets of metrics. First of all, metrics are incredibly important – because one, you can’t manage what you can’t measure.

One of my favorite stories about Mayor Bloomberg in New York is he spent the whole first year of his first administration forcing every agency within the city of New York to come up with a measurement for its success, for what it did. And then, just turned it into a giant data science project. It was a data scientist’s dream. And that’s what we really, really have to do.

So every project that comes into the company – we’re going to do this business initiative, we’re going to buy this company, we’re going to issue this device, we’re going to put in this program, whatever it is – has to be metriced out in those three major areas. One is the business value, whether measured in revenue or market share or number of customers, whatever you want to measure it as; what is the quantification of risk; and what is the quantification of cost.
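To make those three buckets concrete, here is a minimal, hypothetical sketch of how a project might be “metriced out.” The project, figures, and field names are invented for illustration, and risk is quantified as the conventional probability-times-impact expected loss, which is an assumption – the panelists don’t prescribe a specific formula:

```python
from dataclasses import dataclass

@dataclass
class ProjectMetrics:
    """Hypothetical scorecard covering the three buckets of metrics."""
    business_value: float    # e.g. projected annual revenue lift, in dollars
    risk_probability: float  # chance of the adverse event, 0 to 1
    risk_impact: float       # cost if the adverse event occurs, in dollars
    cost: float              # implementation cost, in dollars

    @property
    def expected_risk(self) -> float:
        # Classic expected-loss quantification: probability times impact
        return self.risk_probability * self.risk_impact

    @property
    def net_value(self) -> float:
        # One agreed-upon number the executives can argue about
        return self.business_value - self.expected_risk - self.cost

# A hypothetical BYOD rollout: $500k value, 10% chance of a $1M incident,
# $150k to implement.
byod = ProjectMetrics(business_value=500_000, risk_probability=0.10,
                      risk_impact=1_000_000, cost=150_000)
print(byod.net_value)  # 250000.0
```

The point is less the arithmetic than the agreement: once the senior executives settle on the measures, the argument shifts from soft, amorphous claims to a number everyone computed the same way.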

Because really then, you stop arguing over the soft, amorphous stuff that nobody can get their hands around. So part of it is agreeing among the senior executives what measures we’re going to use. But I’m a huge proponent of coming up with some measure. And also, you’ve got to be smart with what measures you choose. Choose ones that we have data on and can actually create, and choose ones that favor your position, right? [chuckles] One of my favorite sayings out there is that if you torture the data long enough, it’ll confess. It’s a famous expression from Ronald Coase. And it’s true. So you can be clever with your metrics.

The other thing is figure out in advance what success looks like, figure out how you’re going to measure that, and periodically measure that and measure your progress. All the left brain folks and the compliance and security and finance folks are going to love that, as will the businesspeople. Because they can watch their little pie chart get bigger and bigger every quarter or whatever it is.

The other part of this is how do you get money? Now, sometimes, with the project-based approach, it’s obvious where the money should come from. It’s predominantly one line of business, or it’s IT infrastructure, it’s a capital expenditure, whatever it is. But some of these things, especially the programmatic stuff, may be a little soft… just squishy, and everybody just hates it. So one thing that a lot of the companies we work with do is put in what we call an information governance tax. Basically, they take some percentage – one percent or less – of every department’s budget, and it goes into a central fund that pays for these more enterprise-wide projects or programs or policies. Then people fight a little less over it, and you’ve got this pile of money, which we have found really successful.
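As a rough sketch of the information governance tax arithmetic Bennett describes – the department names, budget figures, and the one-percent rate below are all hypothetical:

```python
# Each department contributes a small flat percentage of its budget to a
# central fund that pays for enterprise-wide information governance work.
def ig_central_fund(budgets: dict[str, float], tax_rate: float = 0.01) -> float:
    """Sum a flat tax_rate slice of every department's budget."""
    return sum(budget * tax_rate for budget in budgets.values())

# Hypothetical departmental budgets, in dollars
budgets = {"sales": 4_000_000, "it": 2_500_000, "hr": 800_000}

fund = ig_central_fund(budgets)  # 1% of each department's budget
print(f"${fund:,.0f}")           # $73,000
```

Because the rate is flat and small, no single department feels singled out, which is presumably why the fights over funding shrink.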

Matt: Okay.

Debra: I love that. [laughs] I’m going to use that.

Matt: If you’ve ever heard of a privacy tax, it sounds like a nifty thing that compliance officers are going to make note of.

In the interests of time, I’m going to not necessarily discuss much about the post-incident phase. For anybody listening who’s curious about that, we’re going to talk a lot about data breaches and investigations in our next webinar in two weeks. But I did want to talk a bit more about technology and its consequences.

And Debra, I’ll ask you first, because I suspect you have some more vivid examples and insights about this. But my question, first, is: how much has haphazard adoption of technology affected our information risk? Let’s say over the last 15 years or so, what have we blundered into, perhaps without necessarily knowing all the ways we could have a cybersecurity or privacy risk come back to haunt us? What have you seen?

Debra: Well, one of the areas I’ve seen is where there’s no centralized process for evaluating third-party vendors, and you have different areas of the business going, “Oh, here’s a good collaboration tool for our team!” Really innocently wanting to help the business, but bringing vendors on and signing contracts on behalf of the organization. And there are some hidden risks there, with these cloud products, that I think a lot of people don’t realize.

So there’s this idea of “Oh, well, this is free. We’re not doing a paid version. We’re just…” Maybe we’re testing it out or we’re just doing the free version. But then they’re putting in maybe company data, some sensitive data, or even some personally identifiable information about, potentially, clients – who knows? Depending on what you’re using the cloud service for. But that’s the whole point – you don’t necessarily know what the information risks are if you’re not vetting it with your security, privacy, and compliance risk teams. So the important thing I’d like to point out is that it’s important to have a vendor management process and to hold your business leaders accountable to certain requirements when looking at cloud products especially, but really any new technology that you’re rolling out within the organization – or potentially outside the organization’s networks.

Again, you can’t monitor what you can’t measure, but you can’t measure what you don’t know exists, right? So just on the network, you can’t have your IT team be the only ones looking at that. I think that’s a particular area where you need to have a process in place, and then hold people accountable.

Because what happens is it’s not so much that it’s a cloud service – it’s that if you read the terms of service, there’s usually no [indecipherable] for use of the product. A lot of the security controls that are put in place around the product aren’t offered in any of the free versions. And that’s the whole reason you’re paying more money for a larger user base and a more enterprise-grade product. So especially with free products or the smallest, non-enterprise tier – if you’re using a consumer-grade service in a business – that causes more risk to the business than was originally intended.

Matt: Bennett, what have you seen as the big technology adoption risks or bad usage habits that have come back in the last ten or fifteen years to invite some headache we didn’t anticipate? What have you seen?

Bennett: You know, it’s very interesting. I think that the haphazard adoption – that’s the biggest problem. I am a huge technophile, I think data is awesome, and I think all the insights we get out of it are really brilliant, and very much encourage companies to move in that space. The problem I have is that companies don’t do this on purpose very well. Every company – I don’t care what you make – you are fundamentally an information-based company. But we don’t create and use and get rid of information on purpose – we kind of stumble into it.

Here’s the next proposal. Here’s the next idea. Hey, I really need Yammer. Hey, I really need G Suite. Hey, I really have to have…

And we don’t think it through all the way. And that’s what has to stop. The issue before us as Information Age companies is that we have to decide what information we are creating, do it on purpose, protect it while it’s useful, and get rid of it when it’s not.

So what the most successful companies I have seen do now is what we call an IG impact assessment on any decision, really. So we’re going into this market, we’re buying this data, we’re putting out BYOD policies, whatever it is. There is a series of steps that they go through to evaluate it from all the different perspectives: what’s the business case, what’s this gonna get us, what’s it gonna cost us, what risks do we face. And then there’s a purposeful decision made – if they move forward – to contour it to fit both their risk and business objectives. And that on-purpose approach is really what successful companies are doing in the Information Age.

Matt: That sounds to me like it’s really more a challenge of focus, good policy management, and good training for employees, to explain to them, “Actually, Google Docs is a cloud service, and you’re not supposed to use that until you’ve read our cloud services policy about what we want you to do.” It’s more about the adoption than the technology, when we look at this. And that seems like a very difficult message to pull together in a large enterprise and then to push out to so many different people, who might just not get it. They don’t necessarily know what they’re doing and the risks that they’re bringing about. Is that fair to say?

Bennett: That’s why we stop talking about technology and start talking about information. Because once you start getting into the weeds of any of it, people glaze over. What we’ve gotten very good at doing is helping companies explain, “Look, everything that we do comes down …”

Companies don’t do stuff, people do stuff. And people do stuff through their actions or their decisions. And those actions and decisions are effectuated and evidenced through electronic means. Period.

So that becomes one of our most precious assets. All of your thought processes and decisions and things you do, the orders you take and make, everything you do is being granulated down into this wonderful pile of information, and that’s so precious to us. So we’re going to take care of that as if it’s one of our most important assets. So we want you to think about “I am creating this information, and we’re going to think about how we’re going to store it and protect it and access it and share it.”

Once you start getting into that, then everything flows down from there more easily than you’d imagine, especially when so much of our workforce has grown up in the Information Age, where they’re inundated with information and see it as a fluid asset that you can leverage and get insight and value out of. That message is becoming much easier to get through to your constituencies now.

Matt: Okay. So we have about ten more minutes here, so I’m just going to move to a few questions we already have from the audience. And if anyone listening does want to submit a question still, feel free, please do.

But the first question, actually, I will pose to Debra. Somebody here is saying: I’m still struggling with whether privacy is a legal and compliance issue that belongs somehow with the legal department or compliance officer, or is it more of a security and data classification thing – should I house privacy with my IT security team? What am I supposed to do? And I’ll pose that question generally: how do you triage where privacy should reside in a large organization?

Debra: Sure. And I have really strong opinions on this topic.

Matt: Alright, go for it.

Debra: [laughs] Yeah, so I personally believe that privacy counsel is a completely different job and role from privacy officer, the same way that security counsel would be different from security officer.

So I understand why privacy grew up out of legal – because it requires notice, choice, consent, and a lot of that is part of the contract process and looking at legal disclaimers. But when you have privacy, and the entire program, overseen by counsel, I think it’s a big disservice. Because what you’re missing is all of the data use aspect, which really – it’s change management, it’s project management, it’s governance, it’s looking at the entire lifecycle of personal data and its impact on consumers, or the risks to the data that the company is controlling on behalf of individuals.

So yeah, it starts out as a data governance process, where I think privacy and security aims overlap. And then once you go through that governance structure of classifying the data and the data owners and who’s in charge of it, that’s where the processes diverge completely. That’s where security takes over, assigning the appropriate, effective security controls and deciding what you put in place for the confidentiality, integrity, and availability of that data.

But privacy, you’re looking at how that data is shared, whether you have a legal right to process that data in the first place, how it’s used within the organization, is it in line with the reasons for which it was collected? So you’re really looking at the whole lifecycle of that data and its impacts on the individual user, which is, like I said before, an analysis across many different systems.

Matt: Okay.

[crosstalk]

Debra: We’re not only looking at this within one organization – that data might even be part of business processes that span multiple companies. So you have to understand those impacts on other organizations, you need to understand what features are being built into the products you’re selling… so it’s very different. I would definitely put that under a business operations role.

So the CPO should be at the same level as the Chief Security Officer. In more immature companies, I see both of those – the privacy officer and the security officer – reporting to a CIO. But it should be at the same level as the security officer, in my opinion, in order to have the appropriate authority within the organization. If you just make it a compliance issue, you’re going to completely miss the business drivers aspect of it, and maintaining trust and transparency within the organization. So it’s way more than compliance.

Matt: The next question is for you, Bennett. Somebody is saying: I grasp the concept of trying to find more advantage in extracting opportunity from data. Just curious – do some industries lend themselves to this idea more than others? Or do some business functions, regardless of industry, lend themselves to this more than others? What would you say?

Bennett: The insight that you can get out of your data depends on what data you have. And if you’ve got more data about your customers, then you’re going to be able to get more insight about your customers. If you have more data about your production, like how you do business, that’s where you’re going to get your insights. Even if it’s on your employees … one of the really interesting pieces that every company has the opportunity to do is understand your own people.

One of the projects we’re involved in recently is doing sentiment analysis across our own employee base. Sentiment analysis, basically, is a way to figure out from people’s texts and emails how they feel about something. It’s often used for product reviews and brand names and things like that. But turning it on inside, it’s really amazing. You can watch people’s happiness levels go up and down. And that has great value for employee retention and for management structures. There’s collaboration insight you can get from analyzing your employees.
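As an illustration of the technique Bennett describes, here is a minimal, purely hypothetical lexicon-based sentiment scorer. Real sentiment analysis typically uses trained models (and, turned on an employee base, raises exactly the privacy questions this panel discusses); the word lists and sample sentences below are invented for the example:

```python
# Toy sentiment scorer: count hits against small hand-made word lists.
# These lexicons are hypothetical stand-ins for a real sentiment model.
POSITIVE = {"great", "happy", "love", "excellent", "enjoy"}
NEGATIVE = {"frustrated", "unhappy", "hate", "awful", "overworked"}

def sentiment(text: str) -> float:
    """Return a score in [-1, 1]: +1 if all hits are positive, -1 if all negative."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment("I love this team, great quarter!"))  # 1.0
print(sentiment("Frustrated and overworked again."))  # -1.0
```

Applied over time to a stream of internal messages, even a crude score like this is the kind of signal that lets you “watch people’s happiness levels go up and down,” which is where the retention and management value comes from.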

So I would argue that you can get value out of just about any set of data you have. The data you have really determines where you get the value.

Of course, there are certain companies that are simply better situated because they already have a data analysis or insight function. So what’s fun is to turn that function somewhere else. There was one company that had a big data analytics group that looked at customers. We pointed it at their manufacturing process, and pointed it at their internal human resources issues.

So the really creative thing here – the skill set for the Information Age is understanding that data reveals human conduct and that if you can get insight out of that data, you get insight out of the human beings who created that data, wherever those human beings reside.

Sometimes, it’s the business managers that need the most convincing. So it depends on the line of business that you’re in… if you’re in a more information-related company, that’s an easier sell sometimes. But especially, you can elevate the conversation to “Hey, let’s figure out how our people are doing,” or “figure out how our processes are working,” or “figure out how our customers are behaving.” That’s where you usually get that conversation started.

Matt: Okay. So we’ve got about one more minute left for one last question I want to sneak in. And I like asking this for almost all the webinars we do here. It’s just – the manpower or skills shortage or requirement to meet all of these ideas in the future … just maybe very quickly, I’ll put it to both of you. Do we have enough compliance and privacy and data governance officers to fill these needs that are out there? Do we have enough business unit leaders who …

Debra: There aren’t enough people to do the job, and there are many opportunities out there to do it. Part of it is also… especially with privacy officers, some of them still don’t even have one person on their staff, or if they do, it’s maybe just one person, and that’s not enough. So I would say definitely understaffed, sure.

Matt: And Bennett, what’s your take on manpower and workforce needs here in the future?

Bennett: We need a lot of folks. And it’s really the cross-disciplinary people that we need. We need business managers who understand privacy and analytics, we need analytics people who understand privacy and business. So it’s the cross-functional person that really will rise to the leadership of these Information Age companies.

Matt: Okay. Well, that is all the time we have for this hour. But Debra Farber and Bennett Borden, you guys covered plenty of ground, you gave us a lot to think about, so thank you very much for your time. I really appreciate it.

Debra: Thank you for having us.

Matt: Alright, and again, everybody listening, that was Debra Farber and Bennett Borden. They were talking to us today about managing information risk in this first webinar of our three-part series sponsored by AccessData, where we’re going to be looking at information risk and all of its compliance challenges.

Some quick wrap-up details here – yes, you can get a copy of the slides and the discussion you just heard today. We’re going to send out a link to a recording of the webinar, which will be emailed to all the registrants after we are done here. You can also find some highlights of the discussion on Twitter under the hashtag #datasecurity.

If you found this webinar useful – and I’m sure you did – I certainly encourage you to register for our next two sessions as well. The next one will be on July 11, where we will look more at data forensics and e-discovery, and how to sift through information after some incident has happened at your business. And then, on July 25, our final webinar will look at the EU’s General Data Protection Regulation and some other emerging privacy regimes: what they ask companies to do, how they’re going to challenge your compliance and data governance programs, all that and more.

All of these webinars are free to attend. All of them are at 11 AM Eastern time. They are all sponsored by AccessData. You can register for them by clicking on the link at the bottom of the screen right now.

And that is all for today. I’m your host Matt Kelly, editor of Radical Compliance. Thank you all for listening, and I hope you’ll join us again next time.

End of Transcript
