Time to scrutinize Facebook's algorithms — and what social media do to us

Facebook logos on a computer screen are seen in this illustration photo. (CNS/Reuters/Valentin Flauraud)

by Michael Sean Winters

When you think of Facebook, do you think of a grandfather posting pictures of his grandchildren? Of long-lost classmates reconnecting after years and sharing their life stories with an old friend? Of recipe swapping and neighborhood watch and finding Mass times?

Think again. Yes, Facebook is all those things, a benign means of connecting virtually with people, but it is also, as the leading social media platform, a means for conspiring to overthrow the government, a vehicle for bullying and worse, and a corporate monopoly to boot!

One of the mantras common among those with the thankless task of offering a moral defense of modern capitalism is that the market is intrinsically amoral, that it is merely a tool, that it can be used for good or for ill. The defenders of social media platforms have often made a similar argument. They function in the modern world the way a town meeting did in former times. Or they allow people the right to make their voices heard without traveling to Hyde Park's Speakers' Corner to mount the soapbox. If you never objected to the "Letters to the Editor" section of the newspaper, why would you object to Facebook?

I never bought the argument that markets are amoral, and I do not buy it here either. It seems obvious from even the most cursory examination of our species that we humans are shaped by the tools we use. And it is more than obvious that not having to be in the same room as your interlocutor invites much mischief, from posturing to belligerence, compared with speaking to a real person face-to-face.

The testimony from whistleblower Frances Haugen earlier this month brought into the open some of the principal concerns about Facebook. As NPR summed up the case:

Haugen told Congress that Facebook consistently chose to maximize its growth rather than implement safeguards on its platforms, just as it hid from the public and government officials internal research that illuminated the harms of Facebook products.

"The result has been more division, more harm, more lies, more threats and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people," Haugen testified.

Facebook is not the only company to choose maximization of growth and profits over public safeguards. The theory that companies have a fiduciary responsibility to their shareholders to maximize profit has been at the heart of neoliberal economics for the past 40 years. What was different in this case is that social media exert such exceptional control over children. It is that concern that united Democrats and Republicans in calling for reforms at the social media giant. Haugen's testimony alleged Facebook knew exactly what it was doing.

Of course, I wouldn't expect Republicans to extend this same concern about corporations putting profits ahead of safety, even the safety of children, when it comes to gun manufacturers anytime soon. The Second Amendment remains more sacrosanct than any concern about child safety to them.

I am surprised to see some on the left show so little regard for the First Amendment. On his MSNBC show Tuesday night, Oct. 26, Chris Hayes interviewed The Washington Post's Silicon Valley correspondent Elizabeth Dwoskin, who said: "Now, increasingly, in recent years, as Mark Zuckerberg has become more isolated, he's also taken on more hardline positions about free speech."

Anything wrong with being concerned about free speech? Hayes said not a word. Over in Arlington National Cemetery, Supreme Court Justice Oliver Wendell Holmes Jr. was turning over in his grave.

To be sure, the problem at Facebook is not merely one of free speech. Even those of us who are computer Luddites have been made to learn the power of internet algorithms. When you check the price of a plane ticket and then get bombarded with emails about hotels and restaurants in that location, it is because you triggered an algorithm that sends people trying to sell you something your way. This week we learned that a Facebook algorithm awarded a post five points for an angry-emoji reaction, while a mere "like" earned it only one point.

This had the effect of making anger-provoking content more visible and prevalent on Facebook than calmer content. When you think about the spread of misinformation about the election, and the anger that erupted on Jan. 6, the full reprehensibility of the practice is evident.
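The mechanism is simple enough to sketch in a few lines of code. What follows is a hypothetical illustration of engagement-weighted ranking, not Facebook's actual code: the five-to-one weighting of angry reactions over likes comes from the reporting above, while the function names, sample posts and reaction counts are invented for the example.

```python
# Minimal sketch of engagement-weighted feed ranking.
# The 5-to-1 weighting of angry reactions versus likes comes from the
# reporting cited above; all names and sample data are hypothetical.

REACTION_WEIGHTS = {
    "like": 1,    # a plain "like" contributes one point
    "angry": 5,   # an angry-emoji reaction contributes five points
}

def engagement_score(post):
    """Sum weighted reaction counts to produce a ranking score."""
    return sum(
        REACTION_WEIGHTS.get(reaction, 0) * count
        for reaction, count in post["reactions"].items()
    )

# Two hypothetical posts that drew the same total number of reactions.
posts = [
    {"id": "calm-post",  "reactions": {"like": 100, "angry": 0}},
    {"id": "angry-post", "reactions": {"like": 20, "angry": 80}},
]

# Rank the feed by score: the post that provokes anger floats to the top
# even though both posts received the same amount of engagement overall.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["id"], engagement_score(post))
# angry-post 420
# calm-post 100
```

Under such a weighting, a post that provokes a modest number of angry reactions outranks one that draws far more ordinary likes, which is exactly the dynamic the reporting describes.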

This summer, the Federal Trade Commission filed a complaint alleging unfair monopolistic activity on the part of Facebook, a "buy-or-bury" scheme that eliminated competition. Antitrust enforcement was largely dormant during much of the past 40 years of neoliberal economics. You have to ask yourself: If the government had pursued such action earlier, would a rival have emerged that promised to place child safety ahead of profits and growth?

Social media tend to strip information of its context, to compress, concentrate and accelerate it. Just so, social media distort normal human relations: the people on social media can easily become caricatures of their true selves, with the result that the ever-changing relationships that characterize their lives are seen only partially and in attenuated form.

History does a similar thing: We look back across the years of, say, the early federal period, and we piece together the documentary evidence we have, a series of letters, newspapers that were published, pamphlets advocating political points of view, records of congressional debates. The evidence is fragmentary. There are days, maybe even weeks, about which we know next to nothing because no documentary evidence has come down to us. Even on the days about which we know quite a lot, we know, too, that there were unspoken motives and ideas that played their part and about which we know nothing. A good historian develops an allergy to sweeping claims because he or she knows that, however full the picture may be, the actual events had other elements in play that left no discernible trace for the historian to pursue.

A nicely done work of history slows down our understanding rather than accelerating it. It invites humility where social media invite hubris. Social media perpetrate their distortions in the twinkling of an eye. They invite a lack of caution. They rush an already fast world to ever quicker, and consequently more reckless, conclusions. Social media offer no place to have a discussion or a debate if you are interested in getting to the heart of the matter, but they are excellent at whipping up emotions. I do not know whether Facebook is the worst among the various social media platforms, but it has been the most successful.

The prospect of government regulation of speech is alarming, to be sure. What may appear benign today, such as the thought of Attorney General Merrick Garland supervising reforms at Facebook, could set a precedent that permits a Trump-appointed attorney general to wreak havoc. Here is my guidance: The more full-throated, comprehensive and uncomplicated a proposal, the less likely it will be to balance the interests that need to be balanced.

Social media aren't going away, just as cars are not going away, but the country has an interest in making both as safe as possible, even if that means social media become a little more boring. How to balance the algorithms so that safety gets the same consideration as speed is something the engineers need to determine. What we know now is that the government has to check their work.
