When they first got off the ground, social media sites like Facebook and Instagram existed merely as places for people to congregate and share their thoughts online. It was a simpler time, and there wasn’t much in the way of controversy. There certainly weren’t vocal political factions demanding that the platforms do a better job of regulating content.

Times have changed, though, and they’re more turbulent than ever in 2019. In the weeks following a horrific event in Christchurch, New Zealand—in which terrorists attacked two mosques and aired a live stream of the massacres on Facebook Live—the tech industry’s powers that be are beginning to take serious action. Facebook has announced that it will ban white nationalist and white supremacist content.

Wired reports that this ban will apply to posts on both Facebook and Instagram. The move has been a long time coming: a Motherboard report almost a year ago opened this conversation by detailing the company’s failure to combat racism. It was generally agreed that expressing white nationalism violated Facebook’s guidelines, but it was difficult to nail down exactly how to address the issue.

“We believe there’s a lot of content generated from white nationalist groups generally that would violate [Facebook guidelines],” Muslim Advocates special counsel Madihha Ahussain told Wired. “It takes a lot on the part of advocacy groups to see some action.”

Policing white nationalist content has always been a challenge for tech companies like Facebook. With some extremist ideologies, such as support for ISIS or Al-Qaeda, objectionable content is overt and easy for moderators to spot. White nationalism, by contrast, is often vague and coded into seemingly ordinary phrases—known as “dog whistles”—that carry an entirely different meaning for white supremacists, making it difficult to pin down. This problem was the focus of a 2018 Senate hearing, in which lawmakers and executives from YouTube, Facebook, and Twitter agreed that taking down ISIS posts was straightforward but that white supremacy was a thornier issue.

It remains to be seen whether Facebook’s new policy will actually be effective. Facebook has indicated that it’s likely to start slowly, going after only white nationalist content that is explicit and overt. Data & Society researcher Becca Lewis told Wired that this represents a good start but falls far short of addressing the whole problem.

“It’s always tricky to implement these [policies] in a meaningful way,” Lewis said. “I’m cautiously optimistic about the impact that it can have.”

Photo by TY Lim / Shutterstock.com