
The Facebook Papers, Congress, and Society

Commentary: Facebook is in the news again. We have to pay attention.

This post is in response to "Why Emotional Intelligence Is in Decline."
Source: Gerd Altmann/Pixabay

Frances Haugen, the Facebook whistleblower, made public a treasure trove of insider documents that confirm and expand on the difficulties posed by Facebook and social media. Internally, Facebook has known about these challenges for years, but it appears that business concerns—about revenue, growth, and potential blowback from high-profile users and their followers—have regularly displaced and overruled concerns about mental health, social good, democracy, human rights, amplification of violence and hatred in the U.S. and abroad, and intractable and perhaps unsolvable problems of the platform itself. Time and again, Facebook’s leadership has issued platitudes of supposed concern. Certainly, they have devoted some resources to these issues, occasionally significant. But they have not gotten ahead of the problems that the platform has created, catalyzed, and revealed.

It seems clear to this observer that Facebook and its current leaders cannot be trusted to navigate these issues without outside oversight and intervention. It is possible that Facebook (now Meta), under its current organizational structure, cannot be a trusted partner in addressing all the concerns that have come to light.

As individuals and as a society, we must ask ourselves how social media is enhancing, complicating, and impeding our journeys of identity, belonging, and wellness. We must make conscious choices, or we will be caught up in the algorithms and narratives of forces that are not in our best interest.

As The Wall Street Journal noted in its comprehensive series “The Facebook Files”:

“Facebook’s own research lays out in detail how its rules favor elites; its platforms have negative effects on teen mental health; its algorithm fosters discord; and that drug cartels and human traffickers use its services openly.

The documents show that Facebook has often made minimal or ineffectual efforts to address the issues and plays them down in public.”

Moreover, Facebook’s platform itself has foiled even the best of Mark Zuckerberg’s intentions, most notably the effort to spread accurate COVID-19 vaccine information. A relatively small number of anti-vax activists, combined with anti-vax commenters, created a deluge of anti-vax sentiment earlier this year. Facebook played catch-up for months: “demot(ing) the news feed ranking of content it described as sensationalist, alarmist or even indirectly discouraging vaccines”; slashing the number of comments users could make on posts from authoritative health sources; adding tools to share facts about the vaccines; and allowing users to turn off comments. Guy Rosen, Facebook’s Vice President of Integrity, cited a survey stating that “vaccine acceptance by Facebook users in the U.S. had risen 10 to 15 percentage points since January, and said it had removed or reduced the visibility of more than 185 million pieces of debunked or false COVID content.”

This might be something of a success story, but it also demonstrates the kinds of kludges and patches Facebook has to invent on the fly to deal with its inherent problems:

1. Sensationalistic information that plays to user fears propagates virally and overwhelmingly.

This has caused and continues to cause significant, even overwhelming, real-world impacts. Individuals and societies writ large cannot function without substantively addressing fear. Social media seems incapable of adequately addressing the fears it propagates, and thus appears to be a useful tool for those who wish to weaponize fear to produce division, suffering, and societal challenge, up to and including the destruction of society itself.

2. Mental health concerns have been minimized.

Social media is not good or bad, but good and bad. Dealing with the mental health downsides of social media (including excessive or compulsive use, social comparison, worsened relationships, damage to body image, decreasing subjective well-being, and increasing depression and anxiety) has largely been left to individuals, some of whom are much more vulnerable than others. Researchers internal and external to Facebook have pointed out these problems for years, yet we meander on in the shadow of C-suite denials and distractions.

3. The drive for growth and engagement online is problematic at best.

Certainly, social media was a lifeline for many during the pandemic, easing some stings of isolation. But should there be an “Instagram Kids”? Personally, the prospect of treating children as a market for social media and its affiliated advertisers puts a pit in my stomach. The violence Facebook has unleashed in Myanmar, Sri Lanka, Ethiopia, India, the U.S., and elsewhere is an even greater and more unconscionable harm.

Facebook might be viewed as a kind of colonial power, caring little about its downside in the developing world. In my view, this should weigh on us and force change, up to and including the possibility of withdrawing Facebook entirely from the world stage. That a company could introduce itself into a country without taking full account of potential problems is highly disturbing.

Congress is currently considering proposals to check some of these problems.

  • The Justice Against Malicious Algorithms Act targets Section 230 of the Communications Decency Act of 1996, which “shields a broad range of digital services—from giants like YouTube and Instagram to smaller sites like Etsy and Nextdoor—from lawsuits for hosting and moderating user content.” (Lima, C., “Top Democrats unveil bill to rein in tech companies’ ‘malicious algorithms,’” The Washington Post, October 14, 2021.)
  • The KIDS (Kids Internet Design and Safety) Act, introduced by Senator Ed Markey (D-Mass.), proposes new measures to protect children under the age of 16. It would ban push notifications, autoplay of videos, and algorithmic processes that expose children to a wide variety of content deemed harmful (including sexual material, content promoting physical or emotional violence and bullying, and other dangerous, abusive, exploitative, or exclusively commercial content).
  • COPPA 2.0 (the Children’s Online Privacy Protection Act 2.0), also sponsored by Senator Markey, updates the 1998 COPPA, extending current privacy protections to 13- to 15-year-olds, banning targeted advertising to kids, and more.
  • The Platform Accountability and Consumer Transparency (PACT) Act, introduced by Sen. Brian Schatz (D-HI), would give researchers access to social media companies’ data, among other provisions related to defining acceptable use and handling complaints about content.

Where is our mind, individually and collectively, on social media? Does our time online substitute for other more critical ways to grow in depth, relatedness, and humanity? Can society promote healthy ways to interact with tech, or are we left to the whims and drives of profit-hungry corporations? How can we limit harm?

These are complicated questions. We can’t simply cherry-pick ostensibly good outcomes (the ability to share and connect, the capacity to grow a business) and ignore the bad. We have to be aware, and we have to act on our awareness. Lives, and our future, hang in the balance.

(c) 2021 Ravi Chandra, M.D., D.F.A.P.A.


For more:

Chandra R. Facebuddha: Transcendence in the Age of Social Networks. Pacific Heart Books, 2017.

Chandra R. MOSF 16.9: An Ugly Truth: Social Media, the American Psyche, and Preventing Civil War. East Wind eZine, Nov. 27, 2021.

Frenkel S, Kang C. An Ugly Truth: Inside Facebook’s Battle for Domination. Harper Collins, 2021.

Haidt J. The Dangerous Experiment on Teen Girls. The Atlantic, November 21, 2021.