“We will continue to face scrutiny — some of it fair and some of it unfair,” he said in the memo. “But we should also continue to hold our heads up high.”
Here is Mr. Clegg’s memo in full:
OUR POSITION ON POLARIZATION AND ELECTIONS
You will have seen the series of articles about us published in the Wall Street Journal in recent days, and the public interest they have provoked. This Sunday night, the ex-employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the piece is likely to assert that we contribute to polarization in the United States, and to suggest that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of January 6th in the Capitol.
I know some of you – especially those of you in the US – are going to get questions from friends and family about these things, so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these crucial areas.
Facebook and Polarization
People are understandably anxious about the divisions in society and are looking for answers and ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn’t supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.
The rise of polarization has been the subject of swathes of serious academic research in recent years. In truth, there isn’t a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.
The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.
Specifically, we expect the reporting to suggest that a change to Facebook’s News Feed ranking algorithm was responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI) – so that you would see more content from friends, family and groups you are part of in your News Feed. This change was heavily driven by internal and external research showing that meaningful engagement with friends and family on our platform was better for people’s wellbeing, and we have further refined and improved it over time, as we do with all ranking metrics. Of course, everyone has a rogue uncle or an old school classmate who holds strong or extreme views we disagree with – that’s life – and the change meant you are more likely to come across their posts too. Even so, we have developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.
But the simple fact remains that changes to algorithmic ranking systems on one social media platform cannot explain wider societal polarization. Indeed, polarizing content and misinformation are also present on platforms that have no algorithmic ranking whatsoever, including private messaging apps like iMessage and WhatsApp.
Elections and Democracy
There is perhaps no subject on which we have been more vocal as a company than our work to dramatically change the way we approach elections. Starting in 2017, we began building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.
Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts — identifying almost all of them before anyone flagged them to us. And, from March to Election Day, we removed more than 265,000 pieces of Facebook and Instagram content in the US for violating our voter interference policies.
Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented so-called “break glass” measures – and spoke publicly about them – before and after Election Day to respond to specific and unusual signals we were seeing on our platform and to keep potentially violating content from spreading before our content reviewers could assess it against our policies.
These measures were not without trade-offs – they are blunt instruments designed to deal with specific crisis scenarios. It is like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we know we impacted significant amounts of content that did not violate our rules, in order to prioritize people’s safety during a period of extreme uncertainty. For example, we restricted the distribution of live videos that our systems predicted might relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also impacted a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We would not take this kind of crude, catch-all measure in normal circumstances, but these were not normal circumstances.
We only rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions. We left some of them in place for a longer period, through February this year, and others, like not recommending civic, political or new Groups, we have decided to retain permanently.
Fighting Hate Groups and other Dangerous Organizations
I want to be absolutely clear: we work to limit, not expand, hate speech, and we have clear policies prohibiting content that incites violence. We do not profit from polarization; in fact, quite the opposite. We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms. And we remove content that praises or supports hate groups, terrorist organizations and criminal groups.
We have been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the election. But our work to crack down on these hate groups was years in the making. We took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original #StopTheSteal Group, and removed references to Stop the Steal in the run-up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content violating our policies on terrorism and more than 19 million pieces of content violating our policies on organized hate. We designated the Proud Boys as a hate group in 2018, and we continue to remove praise, support, and representation of them. Between August last year and January 12 this year, we identified nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of Pages, groups, events, Facebook profiles and Instagram accounts associated with these groups.
This work will never be finished. There will always be new threats and new problems to address, in the US and around the world. That is why we remain vigilant and alert – and always will have to be.
That is also why the suggestion sometimes made that the violent insurrection on January 6 would not have occurred but for social media is so misleading. To be clear, the responsibility for those events rests squarely with the perpetrators of the violence, and with those in politics and elsewhere who actively encouraged them. Mature democracies in which social media use is widespread hold elections all the time – for instance, Germany’s election last week – without the disfiguring presence of violence. We actively share with law enforcement any material we can find on our services related to these traumatic events. But reducing the complex causes of polarization in America – or the insurrection specifically – to a technological explanation is woefully simplistic.
We will continue to face scrutiny – some of it fair and some of it unfair. We will continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That is what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We are not perfect and we do not have all the answers. That is why we do the sort of research that has been the subject of these stories in the first place. And we will keep looking for ways to respond to the feedback we hear from our users, including testing ways to make sure political content doesn’t take over their News Feeds.
But we should also continue to hold our heads up high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.