
Zuckerberg admits Russian meddling in 2016 election

Before Congress, Facebook CEO admits many faults but promises change

by Guest Post on
Apr 10, 2018

Even before we released the Cambridge Analytica series, which explains how data was used to manipulate the Trump vote in the 2016 presidential election, we suggested that internet advertising needed more regulation. It is ironic that in today's hearing Senator Lindsey Graham (R-SC) accused Facebook of being a monopoly; companies like Facebook and Google need more regulation to prevent exactly this. Before Congress today, Facebook CEO Mark Zuckerberg highlighted many positives the world's largest social network has been behind, like the #MeToo movement and March for Our Lives, but do not ignore the bigger picture. The company's inability to monitor advertisers and its violations of user privacy have given us four years of the worst attacks from the White House against working people in this nation's history. As we applaud this billionaire for saying sorry: it's too little, too late, Mark. -Kris LaGrange

Below is Zuckerberg's opening statement as well as video from the hearing.

I. INTRODUCTION

Chairman Walden, Ranking Member Pallone, and Members of the Committee,
 
We face a number of important issues around privacy, safety, and democracy, and you will rightfully have some hard questions for me to answer. Before I talk about the steps we’re taking to address them, I want to talk about how we got here.
 
Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring. As Facebook has grown, people everywhere have gotten a powerful new tool to stay connected to the people they love, make their voices heard, and build communities and businesses. Just recently, we’ve seen the #metoo movement and the March for Our Lives, organized, at least in part, on Facebook. After Hurricane Harvey, people raised more than $20 million for relief. And more than 70 million small businesses now use Facebook to grow and create jobs.
 
But it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.
 
So now we have to go through every part of our relationship with people and make sure we’re taking a broad enough view of our responsibility.
 
It’s not enough to just connect people, we have to make sure those connections are positive. It’s not enough to just give people a voice, we have to make sure people aren’t using it to hurt people or spread misinformation. It’s not enough to give people control of their information, we have to make sure the developers they’ve given it to are protecting it too. Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good.
 
It will take some time to work through all of the changes we need to make, but I’m committed to getting it right.  
 
That includes improving the way we protect people’s information and safeguard elections around the world. Here are a few key things we’re doing:

II. CAMBRIDGE ANALYTICA

Over the past few weeks, we’ve been working to understand exactly what happened with Cambridge Analytica and taking steps to make sure this doesn’t happen again. We took important actions four years ago to prevent this from happening again today, but we also made mistakes, there’s more to do, and we need to step up and do it.

A. What Happened
 In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, and your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.
 
In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who agreed to share some of their Facebook information as well as some information from their friends whose privacy settings allowed it. Given the way our platform worked at the time, this meant Kogan was able to access some information about tens of millions of their friends.
 
In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the Facebook information apps could access. Most importantly, apps like Kogan’s could no longer ask for information about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from Facebook before they could request any data beyond a user’s public profile, friend list, and email address. These actions would prevent any app like Kogan’s from being able to access as much Facebook data today.
 
In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and other entities he gave the data to, including Cambridge Analytica, formally certify that they had deleted all improperly acquired data — which they ultimately did.  
 
Last month, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services. Cambridge Analytica claims it has already deleted the data and has agreed to a forensic audit by a firm we hired to investigate this. We’re also working with the U.K. Information Commissioner’s Office, which has jurisdiction over Cambridge Analytica, as it completes its investigation into what happened.

 B. What We Are Doing
We have a responsibility to make sure what happened with Kogan and Cambridge Analytica doesn’t happen again. Here are some of the steps we’re taking:

  • Safeguarding our platform. We need to make sure that developers like Kogan who got access to a lot of information in the past can’t get access to as much information going forward.

      o We made some big changes to the Facebook platform in 2014 to dramatically restrict the amount of data that developers can access and to proactively review the apps on our platform. This makes it so a developer today can’t do what Kogan did years ago.
     
  • But there’s more we can do here to limit the information developers can access and put more safeguards in place to prevent abuse.  
     
  • We’re removing developers’ access to your data if you haven’t used their app in three months. 
     
  • We’re reducing the data you give an app when you approve it to only your name, profile photo, and email address. That’s a lot less than apps can get on any other major app platform.
     
  • We’re requiring developers to not only get approval but also to sign a contract that imposes strict requirements in order to ask anyone for access to their posts or other private data.
     
  • We’re restricting more APIs like groups and events. You should be able to sign into apps and share your public information easily, but anything that might also share other people’s information — like other posts in groups you’re in or other people going to events you’re going to — will be much more restricted. 
     
  • Two weeks ago, we found out that a feature that lets you look someone up by their phone number and email was abused. This feature is useful in cases where people have the same name, but it was abused to link people’s public Facebook information to a phone number they already had. When we found out about the abuse, we shut this feature down.

  • Investigating other apps. We’re in the process of investigating every app that had access to a large amount of information before we locked down our platform in 2014. If we detect suspicious activity, we’ll do a full forensic audit. And if we find that someone is improperly using data, we’ll ban them and tell everyone affected.
     
  • Building better controls. Finally, we’re making it easier to understand which apps you’ve allowed to access your data. This week we started showing everyone a list of the apps you’ve used and an easy way to revoke their permissions to your data. You can already do this in your privacy settings, but we’re going to put it at the top of News Feed to make sure everyone sees it. And we also told everyone whose Facebook information may have been shared with Cambridge Analytica.

Click here to read the rest of Zuckerberg's Opening Statement.

 

 
