One of last week's newsletters focused on the issue of free speech (Strategic CSR – Free speech). The article in the url below expands on this argument, demonstrating some of the challenges (and collateral damage) of entrusting for-profit firms with defining the parameters of the content that is allowed to be disseminated via their platforms:
"What do a small business that sell socks packaged by homeless youth and a start-up that makes bracelets from life vests once worn by refugees have to do with the spread of misinformation during the presidential election season? Nothing, thought the entrepreneurs who started them, until Facebook notified them that their ads had been pulled because they fell into a category of 'social issues, elections or politics' that were being blocked by the site."
The challenge of separating right from wrong is particularly stark when the effort to do so is systematic – in other words, when it is algorithmic, because an algorithm necessarily generalizes about topics that, in reality, are extremely nuanced:
"The social media giant announced last week that it was extending a ban imposed on certain ads during the election to prevent the dissemination of false information. The prohibition has ensnared a number of socially driven businesses with no direct connection to partisan politics. Companies connected to issues like hunger, the environment and immigration, many of which rely heavily on social media to draw customers to their websites, have seen their access abruptly cut off."
The trouble is that, in writing algorithms (as with any attempt to use data to predict an outcome), the prediction is only as good as the data available. In other words, the resulting algorithm can only account for what has happened before when predicting what might happen in the future. This conservative approach necessarily entails generalizations that lead to unintended consequences:
"In the run-up to the 2016 election, deceptive and distorted information spread by Russian automated accounts and others on social media platforms like Facebook, Instagram and YouTube was designed to influence voters. Some of the accounts generated posts about social issues, such as civil and women's rights, that proved divisive."
While the lost ad revenues are merely a drop in the ocean for Facebook, the ability to advertise in the run-up to the holiday season is a matter of life and death for many small businesses:
"Advertising on Facebook is a lifeline for Epimonia, a Minneapolis-based company that makes and sells bracelets and other items made from the discarded life jackets worn by refugees fleeing on flimsy boats to Europe. The company spends several thousand dollars a year to advertise on Facebook, which targets users who have a favorable view of refugees based on the interests listed on their profiles. … 'Not being able to run ads before the holidays could put us out of business,' said [founder and CEO, Mohamed] Malim, who employs a handful of refugees to make bracelets, beanies and T-shirts."
The lasting impression is that this is a fascinating topic that is incredibly difficult to navigate – whether for CEOs trying to run a company or for social media platforms trying to determine what content should be allowed on their sites. This difficulty is magnified when the affected stakeholders really care about a particular issue and are willing to push back if their concerns are not acknowledged or accommodated. But, I think these cases (where stakeholders care deeply) also fall into the 'exception' category I tried to demarcate in my post last week. The trouble is that not everything can be an exception, or else no work ever gets done. But, whenever a bright line is drawn, unintended consequences mean there are inevitably innocent victims caught up in the blunt solution that has been imposed on such a complicated issue. In short, the key questions are 'Where should we draw the line?' and 'Who gets to draw it?' So far, it is not at all clear that, when considering society as a whole, the best answer to either question is for-profit firms.
Take care
David
David Chandler
© Sage Publications, 2020
Instructor Teaching and Student Study Site: https://study.sagepub.com/chandler5e
Strategic CSR Simulation: http://www.strategiccsrsim.com/
The library of CSR Newsletters are archived at: https://strategiccsr-sage.blogspot.com/
Why Did Facebook Mute Philanthropic Businesses?
By Miriam Jordan
November 18, 2020
The New York Times
Late Edition – Final
A14