WASHINGTON, D.C. – U.S. Senator Tammy Baldwin is urging Facebook CEO Mark Zuckerberg to improve enforcement of Facebook’s community standards and address how its algorithms are amplifying and spreading divisive messages, including hate speech, misinformation, and violent extremism on Facebook.
In a hearing of the Senate Commerce Committee on October 28, Baldwin asked Zuckerberg about his company’s failure to promptly remove the “Kenosha Guard” Facebook event encouraging armed vigilantism, despite receiving at least 455 user complaints that the event violated the company’s policy of banning militia groups and groups that seek to incite violence. That evening, an armed seventeen-year-old traveled to Kenosha, killed two people, and seriously injured a third person. Zuckerberg called the failure to take down the Kenosha Guard’s event an “operational mistake” but has failed to identify substantive changes he has made at his company as a result of that mistake.
“But beyond organizing by violent groups on Facebook, I am deeply troubled that there continues to be widespread misinformation on your platform, and that it is structured to facilitate, rather than combat, its spread,” wrote Senator Baldwin in her letter. “Reporting and analysis by the New York Times has found that misinformation is more popular on Facebook now than in 2016, with falsehoods about election interference swirling online at an alarming rate and election officials reporting they fear voter harassment and intimidation on Election Day as a result. I am concerned that your algorithms are designed to elevate this type of content for your users – because it is controversial, attracts attention, and keeps them on Facebook for longer periods of time. Your own employees have raised concerns about Facebook’s algorithms, calling them the single scariest feature of your platform.”
Baldwin continued, “Facebook has become a tool used by millions of Americans, and billions of people worldwide, to communicate, engage and learn. You have a responsibility to those users, and I request that you reexamine Facebook’s algorithms to ensure that they are not amplifying hate speech, misinformation, and violent extremist propaganda – particularly as our nation approaches a presidential election.”
The full letter is available here and included below.
Dear Mr. Zuckerberg,
I write today to follow up on our discussion of Facebook’s response to violent organizations on your platform during the Senate Commerce, Science, and Transportation Committee’s hearing earlier this week. I appreciate that, in response to my question about the use of Facebook by violent extremist groups, you stated that the issue was a “big area of concern” and that you are worried about “making sure that those groups can’t organize on Facebook.”
But with the election only days away, your answer did not reassure me that your company is taking sufficient action to address how Facebook facilitates violent organizations and spreads misinformation and hatred that can lead to real-world violence. From the recent plot to kidnap Michigan Governor Gretchen Whitmer, to ongoing efforts to intimidate voters with the presence of armed individuals at voting locations, the potential impact on our democracy is very real. I urge you to improve enforcement of your community standards and address how your algorithms are amplifying and spreading divisive messages, including hate speech, misinformation, and violent extremism on Facebook. I also request that you provide more information about how Facebook is combatting the spread of violent extremism on its platform.
As I noted in my question, in late August, Facebook failed to remove the “Kenosha Guard’s” event encouraging armed vigilantism, despite receiving at least 455 user complaints that the event violated your company’s policy of banning militia groups and groups that seek to incite violence, according to a report by BuzzFeed News. Kyle Rittenhouse has been charged with killing two people in Kenosha, Wisconsin, and you called the failure to take down the Kenosha Guard’s event an “operational mistake.” Your response to my question raised concern that you have not implemented adequate reforms to prevent a similar mistake from happening again. Therefore, please provide detailed information about how Facebook is screening for posts, groups, and events on your platform by militias and other violent organizations and working promptly to remove them.
But beyond organizing by violent groups on Facebook, I am deeply troubled that there continues to be widespread misinformation on your platform, and that it is structured to facilitate, rather than combat, its spread. Reporting and analysis by the New York Times has found that misinformation is more popular on Facebook now than in 2016, with falsehoods about election interference swirling online at an alarming rate and election officials reporting they fear voter harassment and intimidation on Election Day as a result. I am concerned that your algorithms are designed to elevate this type of content for your users – because it is controversial, attracts attention, and keeps them on Facebook for longer periods of time. Your own employees have raised concerns about Facebook’s algorithms, calling them the single scariest feature of your platform.
Facebook has become a tool used by millions of Americans, and billions of people worldwide, to communicate, engage and learn. You have a responsibility to those users, and I request that you reexamine Facebook’s algorithms to ensure that they are not amplifying hate speech, misinformation, and violent extremist propaganda – particularly as our nation approaches a presidential election.
If this is, in fact, a “big area of concern” for Facebook, as you told me on Wednesday, you must take additional steps to ensure that the platform is not a tool for furthering violence or undermining our democracy. Your leadership and corporate responsibility are desperately needed to meet this moment.
An online version of this release is available here.