(CNN)Over the weekend, President Donald Trump unleashed a Twitter storm against Facebook for banishing seven high-profile figures from its platform, including conservative conspiracy theorist Alex Jones, right-wing commentator Milo Yiannopoulos, and Nation of Islam leader Louis Farrakhan, who has frequently made anti-Semitic statements.
Trump seems to be forgetting — or willfully ignoring — the fact that constitutional guarantees of free speech protect US citizens, not private companies, from the government. It’s critical for Facebook to stand its ground against Trump and to keep going after other hatemongers on its platform. However, Facebook and other social networks should start with time-outs before banning users outright.
First, Facebook’s actions weren’t politically motivated. As a spokesperson for Facebook told CNN, “We’ve always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology.” The people who have used Facebook to try to drum up hatred against their fellow human beings are to blame for the shuttering of their accounts — not the company.
We’ve already seen an example of the disastrous outcomes that can result from Facebook choosing not to stand up to unjustified allegations. As Jill Abramson explains in her new book “Merchants of Truth: The Business of News and the Fight for Facts,” when it was revealed in the spring of 2016 that Facebook used human editors to identify and summarize trending stories, the company was falsely accused of bias against conservatives. It responded by eliminating the human editors, a move that helped fake news stories spread more virulently on the platform.
It has long been obvious that social networks need to do more to control the accounts of people who spread hate on their platforms. One recent example of the devastating consequences of unchecked accounts comes from Germany, where researchers found that, in towns where more people use Facebook, there have been more attacks on refugees. In Myanmar, human rights activists say Facebook has been used to fuel a genocide against the Rohingya people, a Muslim minority group. The government of Sri Lanka blocked Facebook last year, saying it was being used to foment violence, and shuttered the platform again after terrorist attacks last month. The man who massacred worshippers in two New Zealand mosques in March used Facebook to document his actions. And many in India have blamed numerous lynchings on hoaxes spread on the Facebook-owned platform WhatsApp. So Facebook is clearly going to need to crack down on many more accounts to do its part to reduce these acts of hate.
But shutting down divisive users’ profiles without first giving them an opportunity to change their behavior might be too extreme a solution. In the case of Jones, for example, Facebook started by issuing him a warning and then suspended his account for 30 days in July. That was the right place to start. While it’s unclear whether a similar warning was given to all others whose profiles were shut down, it should have been.
Of course, social platforms have financial incentives to suspend rather than permanently banish users, since keeping more accounts can bring in more ad revenue. But this warning approach is also good for society. If people who promulgated hateful views were immediately banned permanently and could never again communicate on mainstream platforms, they wouldn’t have an incentive to change their behavior. Indeed, if they could only then find a voice on extreme platforms, such as Breitbart news, they would be incentivized to become more radical rather than more enlightened.
People should instead be encouraged to reform their speech and ideas. Time-outs could push abusers to rethink their views rather than become more extreme over the long term — especially if they don’t want to lose their own coveted followers.
In fact, Twitter’s chief executive, Jack Dorsey, told CNN in an interview that the company has evidence that temporary bans are effective. “I’m not naïve enough to believe it’s going to change it for everyone,” Dorsey admitted, “but it’s worth a shot.” He’s right.
Unfortunately, Jones, for one, is an inveterate hatemonger, and he didn’t heed Facebook’s warnings. Facebook was therefore right to evict him for good. But that doesn’t mean the approach will always fail. In fact, the examples of Jones and others who have recently been banned will show future users who are given time-outs that if they don’t change, they truly are in danger of being permanently ejected.
It’s a sad state of affairs that we fear acts of violence — even fatalities — if we do not monitor the speech of social media users. But it is the reality that we are faced with. If Facebook wants to be part of the solution, rather than the problem, it must continue putting teeth into its policies by monitoring and managing its platform to end abuse, regardless of how much the President complains.