With more and more of our news access being filtered (willingly or not) through social media reach, it’s about time readers started thinking critically about how those same social media sites might influence what we read and know.
Companies like Facebook have supplanted our traditional means of distribution, meaning many news outlets have no oversight of, or insight into, how their content is disseminated and received by readers. And now that Facebook has fired its human editors in favor of the almighty algorithm, there's even less insight and, as the Megyn Kelly trending example shows, less control over what content gets distributed and how.
“I also worry about the opaqueness of Facebook and its mysterious algorithms. My team and I try to figure out why some posts seem to ‘hit’ and are shared thousands of times while reaching millions of people, while others fare much more modestly,” said Dan Rather in a recent post (on Facebook). “On balance, I feel that all this change is a tremendous force for good. As this article states, I believe Facebook never set out to become the primary means of journalistic communication. We have to figure out how to make that work best for all concerned.”
But as we wade into discussing what allegiance and assistance social media companies owe us in the fight for modern journalism, let’s talk about things that matter. And, on trend, things that are real.
For instance, the answer to “Did Facebook Commit Libel Against Megyn Kelly?” is a resounding no. Libel, the legal term for defamation in written form, is committed by the people who write articles, not the people (or robots, or companies) that allow that content to be shared. What’s more, under the safe-harbor provisions of the Digital Millennium Copyright Act and Section 230 of the Communications Decency Act, internet service providers and their intermediaries are broadly shielded from liability for content their users post, so long as infringing material is removed when it comes to their attention.
“It’s difficult to know who to blame for Facebook’s mistake,” wrote The Atlantic (which ultimately acknowledged that the law would not see Facebook as at fault). “On its face, the company’s decision to switch from human to algorithmic editors seems like a shirking of authority. The new Trending algorithm appears to work by promoting the most-discussed news topics to a place of prominence, no matter their global or editorial importance. It also caters to the kinds of stories that users appear to want to read.”
Which, if Facebook is solely a technology company and not a media company, as it has always claimed, it has every right to do. Algorithms mess up. Just ask anybody who’s ever received a DMCA takedown notice because a video contained a 30-second snippet of a song in the background that YouTube’s software flagged as a violation. As a technology company, Facebook is not necessarily responsible for verifying what its users share. That’s how bullshit gossip and hashtags trend anywhere.
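To see why a purely volume-driven trending system surfaces hoaxes as readily as real news, consider a minimal sketch of the behavior The Atlantic describes: topics ranked by how much they are discussed, with no check on whether the underlying story is true. Everything here (the topic names, the counts, the `trending` function) is a hypothetical illustration, not Facebook's actual system.

```python
# Toy "trending" ranker: promotes whatever is most discussed,
# with no editorial review and no verification step.
from collections import Counter

def trending(posts, top_n=3):
    """Rank topics purely by mention count across posts."""
    counts = Counter(topic for post in posts for topic in post["topics"])
    return [topic for topic, _ in counts.most_common(top_n)]

posts = [
    {"topics": ["false-gossip-story"]},    # widely shared hoax
    {"topics": ["false-gossip-story"]},
    {"topics": ["false-gossip-story"]},
    {"topics": ["important-world-news"]},  # accurate, but less shared
    {"topics": ["local-election"]},
]

# Volume is the only signal, so the hoax tops the list.
print(trending(posts))
```

Because the ranker has no notion of accuracy or editorial importance, the most-shared hoax outranks the accurate story every time; adding human review or a verification signal is exactly the kind of "management" the algorithm switch removed.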
It’s worth asking if, in the future, there will be a new category of law that social media companies find themselves beholden to, with addendums for what they can and cannot allow on their pages. We seem to be wading into that debate already with questions over Twitter’s and Facebook’s politics and their willingness to step in around harassment. But in the meantime, these social media sites are not legally treated as media companies. And that’s the way it was.