GDPR: Navigating the Complexities of Customer Data

Marcy Darsey, Director of Corporate Counsel, Lifesize
Date: February 15, 2018

Hey, everybody, and welcome back to Lifesize Live!, the live web show produced entirely through the Lifesize platform.

I'm your host, Julian Fields, and with me today, we've got Marcy Darsey, legal counsel here at Lifesize. We're talking GDPR.

Right. Everybody's favorite topic these days.

Yeah, and, I mean, we know what GDPR is. But just so everybody's on the same page, can you give us a little bit of background — what it means, what it actually stands for, etc.?

Well, Julian, first of all, thank you so much for having me down here at Lifesize Live! It's really exciting to see the studio and to be a part of this webcast that Lifesize is producing now. And before we get started, or too deep into GDPR, because I am a lawyer, I want to put out a disclaimer before I get started and say that nothing that I say in the next 10 minutes actually constitutes legal advice.

So if somebody is really curious about this topic or has specific questions about GDPR or any regulations or law, they should consult with a qualified legal expert on those questions. But now getting to your question: what is the GDPR, and what does it stand for? The GDPR stands for the General Data Protection Regulation, and it is a new regulation adopted by the European Union. It relates to data security and privacy for all residents of the European Union.

Okay. I know that, working in marketing, we've always had some element of this, like not being able to email people who haven't opted in.

Right. Well, there have actually been data security and privacy laws in the EU for a really long time, for over 20 years actually. But the GDPR itself has been in place for two years now. We've been in a period of implementation, and May 25th is when regulators will begin enforcing this law. They'll begin responding to issues of noncompliance, and if they find that companies are not in compliance with GDPR, then those companies could be subject to fines. And the fines are really significant.

Okay, so not a slap on the wrist.

Right, it’s not a slap on the wrist. The fines can be up to 20 million Euros, which is meaningful.

The conversion rate on that—let's see, that's like more than 20 million dollars.

It's a lot. Companies can be fined the greater of 20 million euros or four percent of their annual global turnover, which includes their revenue from their worldwide operations, not just their operations in Europe.

Okay, so the bigger you are, the bigger the fine.

Right, the fines are meaningful, and that’s because the European Union recognizes that all individuals have a fundamental right to privacy. And when you think about the world that we live in nowadays with all the technological advances that have happened over the last several decades, people's identities and their personal data are shared across a lot of platforms, and a lot of businesses profit from gathering and collecting data from individuals.

Yeah, it seems like just about every application I own has at least my name, some element of a password, phone numbers, address, things like that.

Exactly. I mean, think about all the businesses that you interact with online and how much information those businesses have about you personally.

A credit card is just one thing, but they also have my actual personal details.

Yeah, your identity, your data, who you are, your name — that's private information. And if you share that information with a company, then that company is obligated to ensure that they're being transparent with you about how they're collecting that data and how they're using it, and they are also obligated to implement appropriate technical and operational safeguards to ensure that your personal data is secure so that it doesn't get hacked. And if companies aren't compliant with those laws, then the penalties are severe because the government in the EU wants to ensure that businesses take their obligations seriously.

Gotcha. So what are the key elements of the GDPR?

Well, if you print the GDPR itself in English, it's 88 pages. So it's a lengthy law, and it covers a lot of topics. There are actually 99 different articles in the GDPR.

Well, that's perfect, because today there are 99 days left before we have to be compliant. An article a day — we can do this.

Anybody can do that. So 99 is the magic number today because we're 99 days out from the GDPR becoming enforceable. It covers a lot of things, but one of the key topics is one I already mentioned: transparency. Individuals have a right to understand what data is being collected about them, what companies do with that data, and whether or not they pass it on to any other third-party vendors or processors. Transparency also involves knowing how and where in the world your data is stored.

So things like the pop-ups that you see on websites that say, "This site uses cookies," or that sort of thing.

Yeah, it could be banners — that's part of being transparent with customers. Having appropriate disclosures in your privacy policy or your privacy notice is really important. Most businesses are updating their privacy notices for customers so that they're meeting those transparency obligations. One other principle in the GDPR is accountability, which is also really important. What accountability means is that businesses have to be able to demonstrate to their customers that they are complying with the regulations, and in order to demonstrate compliance, businesses have to have documented policies and procedures in place. They also have to document their data processing activities. Businesses have to keep a paper trail of what they're doing.

So it's not good enough to just say, "Yes, we're GDPR compliant."

It's not good enough to say, “Take our word for it. We comply.” You really have to have some documentation in place demonstrating your compliance with the regulations. And then a third major principle of the GDPR is what's referred to as a privacy-by-design concept, and what that means is that if companies are developing products that involve the collection or processing or storage of individuals’ personal data—

Like those apps that I was talking about.

Right, exactly. So if you're a company and you're developing apps that collect personal data, you need to apply a privacy-by-design approach to product development.

So not an afterthought.

Right, that means that as you're developing the architecture for your solution or your service, you are also taking privacy and security into consideration every step of the way. You’re developing it, thinking about privacy from the ground up, and thinking about how it's baked into your product architecture. It's not something that you do and slap it on at the end as an afterthought. Just like you said.

“We'll encrypt it at the end”— that's not the way to do it.

That doesn't quite get there; you need to be thinking about privacy throughout the product development life cycle. And again, with accountability, you need to be documenting those efforts.

Gotcha. Well, if those are the keys, maybe boil it down to what companies should really be thinking about. What's the first step that they need to work on?

Well, the first step for companies is doing a data assessment and understanding what data they have. Companies really need to look at what customer data they possess and understand that data and how it flows. Sometimes this is referred to as doing a data map or data inventory. Then they look at each step, from collection through processing, storage, and deletion or retention, and consider whether they have obligations they need to meet to ensure that they're in compliance. So the first step is a data assessment or data map. And then the second step would be developing appropriate technical and operational measures to ensure that personal data is secure throughout that data map.

Okay. I just saw that we did get a question in, if you don't mind answering this one. It has to do with the kinds of requirements around the data protection officer. Is that something that companies have to have?

It depends. It's not required for every company; it depends on what type of data you're processing. If you're processing a lot of sensitive data, you probably should have a data protection officer. And if you're doing some kind of ongoing, constant monitoring, like operating closed-circuit TV cameras that are constantly monitoring people, you are required to appoint a data protection officer. That's a typical legal answer, but it depends. And that really is a good takeaway about this law: it's not a one-size-fits-all law. Every company should develop a custom program for their own compliance, and that depends on the type of data and their data processing activities. So there's not just a simple solution that works for every company. It all needs to be customized and carefully considered.

Right, right. Well, thank you so much, Marcy. I think we've run out of time for this one, but I know we've got some more questions that have come in. Maybe we could follow up with you and build out a blog post talking a little bit more about it?

Definitely. There's a lot to say on this topic, and I just appreciate everybody for tuning in and joining us today. And thank you for having me.

Yeah, of course. We'll have to have you back again soon. Well, thanks everyone for tuning in. We'll see you next week. Bye.

Austin, Texas, USA
+1 512 397 9300
Toll Free US +1 877 543 3749

EMEA Regional Office
Munich, Germany
+49 89 20 70 76 0
Toll Free Europe
+00 8000 999 09 799

APAC Regional Office
+65 6631 2831

© 2020 Lifesize, Inc. All rights reserved. Information contained in this document is subject to change without notice. Lifesize and the Lifesize logo are registered trademarks of Lifesize, Inc. All other trademarks are the property of their respective owners.

Need more help?
Contact one of our local sales representatives.