By Kalev Leetaru
Not a week goes by without yet another story about the evils of social media, from enabling genocide to aiding terrorists to facilitating human trafficking. A story earlier this month about Facebook's use in advertising child brides, and the company's response that it was entirely unaware of the practice, reminds us how little these companies know about how their products are being used and what a monumental task policing the online world has become. Perhaps most of all, these stories remind us that social media platforms have little incentive to care, since they profit monetarily from evil.
It is impossible to design technology that can be used only for good. All scientific advances can be used either to improve the world or to harm it. Technology creators can create incentive structures and design principles that encourage certain uses of their tools, but in the end, bad actors will always find ways of repurposing even the most benign and beneficial advances for evil.
Earlier this month The Daily Beast reported on the use of Facebook by certain communities to advertise child brides for sale, capturing yet another way in which Facebook's platform supports human trafficking. While the company did not respond to a request for comment for this article, a spokesperson told The Daily Beast that it was unaware its platform was being used in that way.
Facebook's ignorance of the myriad ways its platform is being misused reflects a simple fact: there are no incentives for it to do better. Across the world, the company bears almost no legal responsibility for the misuse of its platform.
In fact, Facebook actually earns a monetary profit from the sale of child brides on its platform through all of the advertisements that are consumed in the process.
Yet when asked, again and again, whether it would consider refunding the revenue it earns from human trafficking, terrorism, hate speech and genocide, the company has met each request with silence or a flat "no comment."
Facebook is increasingly acting as a marketplace for illegal and unethical transactions across the world, earning a monetary commission on those activities through its ad revenue. When a child bride is sold on its platform, all of the users viewing, commenting and engaging with that sale are shown ads that earn Facebook money.
In short, like any marketplace, Facebook profits monetarily from the human trafficking it facilitates.
The fact that Facebook has so far refused to commit either to refunding the revenue it earns from such activity or to donating those funds to organizations that combat it reminds us that social platforms have no reason to deter such use of their tools, since they benefit from it financially.
This failure of moral leadership is one of the reasons that countries across the world have stepped forward so aggressively of late to explore new legislation that would incentivize social platforms to finally take action against the misuse of their platforms that they profit from.
For example, recently proposed legislation in the United Kingdom would make social media executives personally liable for certain misuse, providing a strong incentive for them to finally take action. In reality, however, it is unlikely that such strong legislation will survive industry lobbying. Unsurprisingly, Facebook did not respond to a request for comment on the legislation.
Putting this all together, it is clear that the current system of self-regulation, in which social media platforms are totally immunized from any responsibility for or consequence of the misuse of their platforms, simply isn't working. Beyond half-hearted public relations efforts, the companies have taken few real steps toward combating misuse, and their refusal to commit to refunding the money they earn from it serves as a stark reminder of the very real financial incentives working against reform.
In the end, until companies take responsibility for their platforms and invest seriously in combating misuse, governments will have no choice but to step in with new legislation that finally forces them to act.