A man charged with making a bomb threat at the Library of Congress broadcast his standoff with police live on Facebook.
His posts remained online for several hours before Facebook deactivated his account and pulled the videos, though clips of his anti-government tirade continued to circulate on social media. The incident reignited criticism of how social media platforms respond to users who advocate violence or broadcast crimes in real time.
U.S. Sen. Gary Peters, D-Bloomfield Township, referenced the bomb threat live stream in a letter asking the CEOs of Facebook, YouTube and Twitter to provide information about how they moderate content that advocates violence. Peters said each platform has been used to disseminate manifestos or to post videos of violent acts.
The request is part of an ongoing investigation into domestic terrorism, sparked by the Jan. 6 riot at the U.S. Capitol, that Peters is leading as chair of the Senate Homeland Security and Governmental Affairs Committee. In letters to the big tech CEOs, Peters expressed concern about “violent extremists” using social media platforms to fundraise, radicalize others and organize attacks.
Peters also requested information about how targeted advertising tools allow advertisers to curate messages to certain groups based on keywords like “white supremacists” and “neo-Nazi.” Microtargeted ads and algorithms designed to drive engagement suggest tech companies are profiting from content that amplifies political violence, Peters said.
“There is a financial incentive for social media platforms like Facebook to keep users engaged on their platforms and viewing content, including extremist content,” Peters wrote in his letter.
The Michigan senator acknowledged the private companies are protected from legal liability and have the right to decide what is allowed on their websites. He also recognized efforts by Facebook, YouTube and Twitter to remove content that breaks their terms of service, but said questions remain about how platforms are being used to incite violence. Peters said more transparency is needed to ensure white supremacist and anti-government propaganda isn’t easily accessible with a few clicks.
Specific examples of incidents involving each of the three tech companies were outlined in Peters’ letters to Facebook CEO Mark Zuckerberg, YouTube CEO Susan Wojcicki and Twitter CEO Jack Dorsey.
All three platforms were used by a network of groups and users to organize “stop the steal” rallies in Washington, D.C. The rallies were promoted by focusing on the false narrative that former President Donald Trump won the 2020 election.
The rallies were followed by a violent riot at the U.S. Capitol where participants live-streamed themselves breaching the building and clashing with police. Rioters used YouTube and Facebook to livestream their actions inside the Capitol during the attack, and some used the streams to ask for donations. Mentions of “civil war” and calls for violence spiked online during the riot.
A group of men charged with planning to kidnap Gov. Gretchen Whitmer also used private Facebook groups to share recordings of training exercises, discuss their plans and recruit new members, according to federal prosecutors.
Facebook was also used to organize a violent neo-Nazi rally in Charlottesville, Virginia, in 2017. Twitter users promoted wild claims that a child sex ring was being operated out of a pizzeria in Washington, D.C., which led to an armed man breaking into the restaurant in 2016.
A man who killed 51 worshipers at mosques in New Zealand in 2019 said racist YouTube videos were a significant source of inspiration for the attack.
Peters cited a report that found YouTube allowed advertisers to use racial epithets and phrases associated with domestic extremist groups as keywords to find videos and channels for targeted ads.
Peters also expressed concern about Facebook’s advertising tools. The senator cited a Buzzfeed report that found Facebook users who posted about election misinformation or the Jan. 6 riot received targeted ads for armored vests, weapons attachments and other military-style gear.
“It is unclear the extent to which Facebook policies and practices continue to result in similar targeted advertisements to individuals associated with domestic extremist groups and how they may generate revenue from their placement,” Peters wrote in his letter to Facebook.