PEOPLE ARE CALLING FOR ZUCKERBERG’S RESIGNATION. HERE ARE JUST FIVE OF THE REASONS WHY

Facebook has been beset by scandals over the last year, and many believe that nothing will change until its founder and CEO is gone.

A petition has been launched with one simple objective: to force Mark Zuckerberg to resign as CEO of Facebook.

The campaign group behind it, Fight for the Future, says that although there’s no “silver bullet” to “fix” Facebook, the company cannot address its underlying problems while Zuckerberg remains in charge.

The petition is highly unlikely to succeed, of course. It’s hard to imagine Zuckerberg stepping down voluntarily. And there’s not much Facebook’s board can do either, even if they wanted to. Zuckerberg controls about 60% of all voting shares in Facebook. He’s pretty much untouchable, both as CEO and as board chairman. Despite near-weekly scandals, the company is still growing, and it’s one of the most profitable business ventures in human history.

(Another potential solution, as described in a piece in the New York Times written by one of Facebook’s cofounders, is to break the company up and implement new data privacy regulations in the US.)

Need a reminder as to why everyone is so angry with Facebook and Mark Zuckerberg anyway? Here’s a handy cut-out-and-keep list of just some of the most significant scandals involving the tech giant over the last year or so. (Not to mention all the wider problems of fake news or echo chambers or the decimation of the media. Or dodgy PR practices.)

The high-impact one

Back in March 2018, a whistleblower revealed that political consultancy Cambridge Analytica had collected private information from more than 87 million Facebook profiles without the users’ consent. Facebook let third parties scrape data from applications: in Cambridge Analytica’s case, a personality quiz developed by a Cambridge University academic, Aleksandr Kogan. Mark Zuckerberg responded by admitting “we made mistakes” and promising to restrict data sharing with third-party apps in the future.

What made it particularly explosive were claims that the data-mining operations might have affected Trump’s election and the Brexit vote.

The many data mishaps

In September 2018, Facebook admitted that 50 million users had had their personal information exposed by a hack on its systems. The number was later revised down to 30 million, which still makes it the biggest breach in Facebook’s history.

In March 2019, it turned out Facebook had been storing up to 600 million users’ passwords insecurely since 2012. Just days later, we learned that half a billion Facebook records had been left exposed on the public internet.

The discriminatory advertising practices

Facebook’s ad-serving algorithm automatically discriminates by gender and race, even when no one tells it to. Advertisers can also explicitly discriminate against certain areas when showing housing ads on Facebook, even though it’s illegal. Facebook has known about this problem since 2016. It still hasn’t fixed it.

The dodgy data deals

Facebook gave over 150 companies more intrusive access to users’ data than previously revealed, via special partnerships. We learned a bit more about this, and other dodgy data practices, in a cache of documents seized by the UK Parliament in November 2018. Facebook expects to be fined up to $5 billion for this and other instances of malpractice.

The vehicle for hate speech

The Christchurch, New Zealand, shooter used Facebook to live-stream his murder of 50 people. The broadcast was up for 20 minutes before any action was taken. We’re still waiting to hear what, if anything, Facebook will do about this issue (for example, it could choose to end its “Facebook Live” feature). It’s well established now that Facebook can help to fuel violence in the real world. But any response from Facebook has been piecemeal. It’s also a reminder of just how much power we’ve given Facebook (and its low-paid moderators) to decide what is and isn’t acceptable.