Facebook ran ads on searches related to white supremacist groups, despite banning such content on the platform, according to a report by the Tech Transparency Project.
The report, which was first covered by The Washington Post, identified 119 Facebook pages and 20 Facebook groups affiliated with white supremacist organizations on the platform. Researchers searched Facebook for 226 designated hate groups or dangerous organizations compiled from sources including the Southern Poverty Law Center, the Anti-Defamation League and Facebook itself, and found that more than a third were present on the platform.
The study found that despite Facebook’s insistence that the company does not profit from hateful content, ads appeared alongside 40% of the searches for those groups.
The white supremacist pages identified by the report include two dozen that were automatically generated by Facebook. The platform automatically creates pages when users list interests, workplaces or businesses that have no existing page. The issue of auto-generated white supremacist pages was previously raised in a 2020 analysis, also by the Tech Transparency Project. Among the auto-generated pages identified by the 2022 report is “Pen1 Death Squad,” shorthand for a white supremacist gang.
Meta spokesperson Dani Lever said 270 groups the company designates as white supremacist organizations are banned from Facebook, and that the company is investing in technology, staff and research to keep users safe on its platforms.
“We immediately fixed an issue where ads appeared in searches for terms related to banned organizations and we are also working to fix an auto-generating issue, which incorrectly affected a small number of pages,” said Lever. “We will continue to work with outside experts and organizations with the goal of staying ahead of violent, hateful and terrorism-related content and removing such content from our platforms.”
In 2020, more than 1,000 advertisers boycotted Facebook over the platform’s handling of hate speech and misinformation. That same year, civil rights auditors released a report finding that the company’s decisions had resulted in “serious setbacks” for civil rights. Following the audit, Meta created a civil rights team in 2021, which has since published updates on the status of the auditors’ recommendations.