Interview with Melanie Haratunian—Executive Vice President, General Counsel & Corporate Secretary at Akamai
Conducted by Chrisella Herzog, on June 1, 2015. This transcript has been lightly edited for readability.
[WhiteHat]: What are some trends that you see in data collection for marketing or banking purposes? For example, Note to Self recently had a story about a program called “Crystal Knows” which looks at public information about people and places them into categories for their personalities. So what are some of the trends that you’re seeing there, from a policy perspective?
[Melanie Haratunian]: I think on a very high level—I used to be a government regulator—and I think one of the hardest things is technology moves at an infinitely faster pace than government can ever keep up with. As a lawyer, sometimes it’s hard to work in that world where the laws were written several years after the technology, and so it’s very hard to keep up.
Some of the things that good privacy laws will do is notify people and users about how the information is going to be used and also to allow users to have some say in that use. It’s a very fine balance; from a policy perspective what you’re trying to do is give folks and users the choice, but also not inhibit the capabilities that the technology can make possible.
[WH]: One thing that I have seen—an article that just came out about this in The Nation—is that publicly available data, like who you’re connected to on Facebook, is being mined by credit agencies to supplement your credit score. Do you see that as something where it could be used to help economically disadvantaged or unbanked consumers, or will that go in the direction of taking advantage of them?
[MH]: It’s hard to say, with any of these big data technologies. As innovators it’s always exciting to focus on what’s possible. It remains to be seen what the technology can do, but what it should do is really less in the hands of the people who developed it, and more in the hands of the people who then leverage it.
I’m hopeful that there are good uses that come out of it. You see this with crowdsourcing and fundraising efforts, where people have mined some of these technologies in a way that really does help people. But technology is a reflection of humanity, and I think there are also people who will exploit it for things that are not as desirable.
[WH]: On the issue of global data rights—the European Union seems to take an approach to data rights where the protection of personal data and privacy is a fundamental right; meanwhile the U.S. focuses more on harm reduction. Which approach do you think will become more common globally as more countries begin to implement digital rights protections?
[MH]: It’s hard to say—I think that’s the next battleground. I think it’s a fundamental difference in perspective. Some of it is based on the history of different cultures; Europe’s experience with World War II, and some of the abuses that happened when people took personal information and then gave it to the Nazis, is very much at the forefront of Europeans’ minds. It’s hard to argue with that if you lived through it.
But it remains to be seen—these battles over security policies, over privacy policies, over technical standards, are really about who’s going to be the leader on these issues, and the underlying philosophies are fundamentally different. That’s where you’re going to see a battle between very different world views.
So it’s hard to predict. I think the United States has been the first mover in technology, and I think one of the things that Europe is trying to do with the digital single market initiative that they announced a couple of weeks ago is to be the first mover on some of these policy issues. They’re hoping that they’ll get a head start and they’ll be able to have a say, but really it remains to be seen who’s going to ultimately prevail.
[WH]: Do you see a right to be forgotten coming to the U.S. any time soon?
[MH]: I don’t know. From a technology perspective, it’s just so difficult to see how. Intellectually, I understand the concept. As someone who works in a technology company, you want something as open as the internet ecosystem, and it’s hard to imagine technically how you could pull it back in. I think it’s an interesting trend, and at least conceptually the idea of giving users more of a say in information about them is laudable. It’s just hard for me to get my head around from a technology perspective—how do you actually make that happen?
[WH]: What role do you see international negotiations—like the TTIP, for example—or institutions like the ITU playing in this movement and this battle over digital rights protections?
[MH]: It’s fascinating watching some of these negotiations and seeing these forces that are working against each other. And some alliances are forming too, in terms of countries that tend to band together on these sorts of issues. Hard to say, though—I think we’re ready to enter a new battleground. The United States had a head start in a lot of these areas, but there are a lot of countries who are quickly coming up and demanding more of a say in how these things are decided. So again, it remains to be seen.
[WH]: A report released by the U.N.’s Office of the High Commissioner for Human Rights on May 28th states: “Encryption and anonymity enable individuals to exercise their rights to freedom of opinion and expression in the digital age and, as such, deserve strong protections.” Do you agree that encryption technology is essential to basic human rights?
[MH]: I think it offers some potential for greater control, which is one of the underlying components in both privacy and security. So I do think that it’s important, and there is a greater demand for more encryption. There’s a lot of potential there, from a human rights perspective as well, in terms of being able to allow users to have a greater say in what is released into the internet ecosystem and what they choose to keep private.
[WH]: So maybe instead of a right to be forgotten it’s going to be a right to keep it private in the first place. The U.S. Government has had some trouble with this online anonymity issue and has argued that it should be allowed to have back doors into these encrypted technologies for national security purposes. What’s your opinion on this and how do you see this clash of personal rights versus national security playing out policy-wise in the next few years?
[MH]: From a technical perspective it’s hard to imagine that a back door can be created that only the ‘good guys’ will be able to access—however you define the ‘good guys’—without that back door also being a vulnerability that can be exploited by the ‘bad guys’, however you define them. Boy, it would be a killer app if someone could figure that out!
I’m watching that battle with interest because they’re diametrically opposed.
[WH]: I wanted to touch on the net neutrality debate. One concern there is that if the internet is not regulated like a utility or if ISPs are allowed to charge access to different areas of information, economically disadvantaged users will be cut off from accessing some of the information out there. In 2011 the UN declared that internet access is a human right; so what do you think about that and how do you think that’s going to play out within the U.S. as well as globally?
[MH]: I think that in the way the FCC approached the problem, their hands were tied in terms of the statutory foundations that they could use. But on a very high level, the concept that they were shooting for was an interesting balance, which is to focus primarily on trying to avoid abuses while allowing some kind of differentiation. There’s a clause in the Communications Act that talks about unreasonable discrimination—so it’s OK to treat people differently if they’re bringing different things or have different uses. I think it’s fair, from a business perspective, to treat different people differently—like a taxi cab company: transporting one person is different than transporting 50 people with luggage. Some kind of differentiation is probably not a bad thing, but when it is used to disadvantage folks, you have to be careful.
And so I think what the FCC did—although by far not perfect—was to try to focus on market power, and on the entities that control the last mile and some of the scarcer resources that make it up. The fear was that if there’s mischief there, it’s going to have the most ramifications for the internet ecosystem.
So who knows, ultimately, what will happen; in the United States it’s subject to court appeal. Then there’s the extent to which other countries are watching it or doing their own thing with net neutrality—I know that the EU has something similar, and there are issues with it in some other parts of the world as well. There’s a balance there between continuing to make the internet an open vehicle for everyone, but also, from a business perspective, allowing some kind of differentiation based on reasonable grounds for distinguishing different types of users.
[WH]: What do you think about something like an Internet Bill of Rights—should that be codified into law once we get all these sorts of things hammered out, or is it something that, as technology changes, we’re going to have to leave more flexible?
[MH]: You know, that’s interesting; I’m not quite sure what that would look like. Are there certain things that are fundamental rights? Are there ground rules that we should ask everybody to abide by? Conceptually there’s some aspects of that that might be interesting; the devil’s in the details, though—how they are defined, what the ramifications are, how fundamental these rights really are.
All of these questions need to go through the filter not only of technology, but also of the policies and goals of individual governments—and governments don’t speak with one voice, so they’re going to look at it through very different filters. It’s hard to conceptualize things that resonate around the world when viewed through such different filters.
There are some fundamental things in privacy and security that are increasingly coming into sharper focus that I think people would agree are important, and then trying to find some commonly accepted but meaningful standards from that, I think, would be helpful. It’s just hard to imagine who’s going to decide that, how they’re going to decide that, and whether when they’re done everyone is going to agree that these are things that are important.
People have very different definitions of what privacy and security should be—what constitutes privacy, what rights people should have, or how you exercise them through the technology; and in security, what is acceptable, what’s not, and how you protect it. Security is always a cat and mouse game—it’s always evolving.