Mark Zuckerberg tells CNN he is 'happy to' testify before Congress

"This was a major breach of trust. I'm really sorry this happened," said Zuckerberg.



By Laurie Segall, CNN

Published: Thu 22 Mar 2018, 9:00 AM

Last updated: Thu 22 Mar 2018, 5:39 PM

The world's largest social media network is facing growing government scrutiny in Europe and the United States over a whistleblower's allegations that London-based political consultancy Cambridge Analytica improperly accessed user information to build profiles on American voters, which were later used to help elect US President Donald Trump in 2016.
"This was a major breach of trust. I'm really sorry this happened. We have a basic responsibility to protect people's data," Zuckerberg said in an interview with CNN, breaking his public silence since the scandal erupted at the weekend.
Zuckerberg said in a post on Facebook the company "made mistakes, there's more to do, and we need to step up and do it."

Full interview


LAURIE SEGALL, CNN SENIOR TECHNOLOGY CORRESPONDENT:  I want to start with just a basic question, Mark.  What happened?  What went wrong? 
MARK ZUCKERBERG, FACEBOOK CEO:  So this was a major breach of trust, and I'm really sorry that this happened.  You know, we have a basic responsibility to protect people's data.  And if we can't do that, then we don't deserve to have the opportunity to serve people. 
So, our responsibility now is to make sure that this doesn't happen again.  And there are a few basic things that I think we need to do to ensure that.

One is making sure that developers like Aleksandr Kogan, who got access to a lot of information and then improperly used it, just don't get access to as much information going forward.  So, we are doing a set of things to restrict the amount of access that developers can get going forward. 
But the other is we need to make sure there aren't any other Cambridge Analyticas out there, right, or folks who have improperly accessed data.  So, we're going to go now and investigate every app that has access to a large amount of information from before we locked down our platform.  And if we detect any suspicious activity, we're going to do a full forensic audit. 
SEGALL:  Facebook has asked us to share our data, to share our lives on its platform and wanted us to be transparent.  And people don't feel like they've received that same amount of transparency.  They're wondering what's happening to their data.  Can they trust Facebook? 
ZUCKERBERG:  Yes.  So one of the most important things that I think we need to do here is make sure that we tell everyone whose data was affected by one of these rogue apps, right?  And we're going to do that.  We're going to build a tool where anyone can go and see if their data was a part of this.  But-
SEGALL:  So the 50 million people that were impacted, they will be able to tell if they were impacted by this? 
ZUCKERBERG:  Yes.  We're going to be even conservative on that.  So, you know, we may not have all of the data in our system today.  So, anyone whose data might have been affected by this, we're going to make sure that we tell. 
And going forward when we identify apps that are similarly doing sketchy things, we're going to make sure that we tell people then too, right?  That's definitely something that looking back on this, you know, I regret that we didn't do at the time.  And I think we got that wrong.  And we're committed to getting that right going forward. 
SEGALL:  I want to ask about that because when this came to light, you guys knew this a long time ago, that this data was out there.  Why didn't you tell users?  Don't you think users have the right to know that their data is being used for different purposes? 
ZUCKERBERG:  So yes.  And let me tell you what actions we took. 
So, in 2015, some journalists from "The Guardian" told us that they had seen or had some evidence that data-that this app developer, Aleksandr Kogan, who built this personality quiz app and a bunch of people used it and shared data with it, had sold that data to Cambridge Analytica and a few other firms.  And when we heard that, and that's against the policies, right?  You can't share data in a way that people don't know or don't consent to. 
We immediately banned Kogan's app.  And, further, we made it so that Kogan and Cambridge Analytica and the other folks with whom we shared the data-we asked for a formal certification that they had none of the data from anyone in the Facebook community, that they deleted it if they had it, and that they weren't using it.  And they all provided that certification. 
So, as far as we understood around the time of that episode, there was no data out there. 
SEGALL:  So, why didn't Facebook follow up?  You know, you say you certified it.  I think why wasn't there more of a follow-up?  Why wasn't there an audit then?  Why does it take a big media report to get that proactive approach? 
ZUCKERBERG:  Well, I mean, I don't know about you, but I was-I'm used to when people legally certify that they're going to do something, that they do it.  But I think that this was clearly a mistake in retrospect. 
SEGALL:  Was it putting too much trust in developers?
ZUCKERBERG:  I-I mean, I think it did.  That's why we need to make sure we don't make that mistake ever again, which is why one of the things that I announced today is that we're going to do a full investigation into every app that had access to a large amount of data from around this time, before we locked down the platform. 
And we're now not just going to take people's word for it and-when they give us a legal certification, but if we see anything suspicious, which I think there probably were signs in this case that we could have looked into, we're going to do a full forensic audit. 
SEGALL:  How do you know there aren't hundreds more companies like Cambridge Analytica that are also keeping data that violates Facebook's policies? 
ZUCKERBERG:  Well, I think the question here is do-are our app developers, who people have given access to their data, are they doing something that people don't want?  Are they selling the data in a way that people don't want, or are they giving it to someone that they don't have the authorization to give it to? 
And this is something that I think we now need to go figure out, right?  So for all these apps-
SEGALL:  That's got to be-I got to say, that's got to be a really challenging ordeal.  How do you actually do that?  Because you talk about it being years ago, and then you guys have made it a bit stricter for that kind of information to be shared.  But backtracking on it, it's got to be difficult to find out where that data has gone and what other companies have shady access. 
ZUCKERBERG:  Yes.  I mean, as you say, I mean, the good news here is we already changed the platform policies in 2014.  Before that, we know what the apps were that had access to data.  We know how much-how many people were using those services, and we can look at the patterns of their data requests. 
And based on that, we think we'll have a pretty clear sense of whether anyone was doing anything abnormal, and we'll be able to do a full audit of anyone who is questionable. 
SEGALL:  Do you expect-do you have any scale or any scope of what you expect to find, anything in the scope of what happened with Cambridge Analytica where you had 50 million users? 
ZUCKERBERG:  Well, it's hard to know what we'll find, but we're going to review thousands of apps.  So, this is going to be an intensive process, but this is important.  I mean, this is something that in retrospect we clearly should have done up front with Cambridge Analytica.  We should not have trusted the certification that they gave us, and we're not going to make that mistake again. 
I mean, it's our responsibility to our community to make sure that we secure the data that they -- 
ZUCKERBERG:  If you told me in 2004 when I was getting started with Facebook that a big part of my responsibility today would be to help protect the integrity of elections against interference by other governments, you know, I wouldn't have really believed that was going to be something that I would have to work on 14 years later. 
SEGALL:  I'm going to challenge you-
ZUCKERBERG:  But we're here now.  And we're going to make sure that we do a good job at it. 
SEGALL:  Have you done a good enough job yet? 
ZUCKERBERG:  Well, I think we will see.  But, you know, I think what's clear is that in 2016, we were not as on top of a number of issues as we should have been, whether it was Russian interference or fake news. 
But what we have seen since then is, you know, a number of months later, there was a major French election, and there we deployed some A.I. tools that did a much better job of identifying Russian bots and basically Russian potential interference and weeding that out of the platform ahead of the election.  And we were much happier with how that went.
In 2017, last year, during a special election for a Senate seat in Alabama, we deployed some new A.I. tools that we built to detect fake accounts that were trying to spread false news, and we found a lot of different accounts coming from Macedonia. 
So, you know, I think the reality here is that this isn't rocket science, right?  I mean, there's a lot of hard work that we need to do to make it harder for nation states like Russia to do election interference, to make it so that trolls and other folks can't spread fake news. 
But we can get in front of this, and we have a responsibility to do this not only for the 2018 midterms in the US, which are going to be a huge deal this year, and that's a huge focus for us.  But there's a big election in India this year.  There's a big election in Brazil.  There are big elections around the world, and you can bet that we are really committed to doing everything that we need to, to make sure that the integrity of those elections on Facebook is secured. 

SEGALL:  I can hear the commitment.  But since I got you here, do you think that bad actors are using Facebook at this moment to meddle with the U.S. midterm elections? 
ZUCKERBERG:  I'm sure someone's trying, right?  I'm sure there's, you know, V2 of all-a version two of whatever the Russian effort was in 2016.  I'm sure they're working on that, and there are going to be some new tactics that we need to make sure that we observe and get in front of. 
SEGALL:  Speaking of getting in front of them, do you know what they are?  Do you have any idea? 
ZUCKERBERG:  Yes.  And I think we have some sense of the different things that we need to get in front of.
SEGALL:  Are you specifically seeing bad actors trying to meddle with the U.S. election now? 
ZUCKERBERG:  I'm not 100 percent sure what that means because it's not-I think the candidates aren't all-
SEGALL:  Are you seeing anything new or interesting? 
ZUCKERBERG:  Well, what we see-what we see are a lot of folks trying to sow division, right?  So, that was a major tactic that we saw Russia try to use in the 2016 election.  Actually most of what they did was not directly, as far as we can tell from the data that we've seen.  It was not directly about the election but was more about just dividing people. 
And, you know, so they run a group on, you know, for pro-immigration reform, and then they'd run another group against immigration reform and just try to pit people against each other.  And a lot of this was done with fake accounts that we can do a better job of tracing and using A.I. tools to be able to scan and observe a lot of what is going on.  And I'm confident that we're going to do a much better job. 

SEGALL:  -- to their future and what a kinder Facebook looks like? 
ZUCKERBERG:  Well, I think having kids changes a lot.  And-

SEGALL:  Like what? 
ZUCKERBERG:  Well, you know, I used to think that the most important thing to me by far was, you know, my having the greatest positive impact across the world that I can.  And now, you know, I really just care about building something that my girls are going to grow up and be proud of me for.  And I mean, that's what is kind of my guiding philosophy at this point is, you know, when I come and work on a lot of hard things during the day and I go home and just ask, will my girls be proud of what I did today.

