Social media algorithms are engineered for profit, providing endless entertainment for users and generating billions for tech companies, advertisers, and content creators. But beneath the surface, these same algorithms promote misinformation, fuel political polarization, and restrict our freedom of choice by dictating what we see. In doing so, they don’t just shape individual perspectives—they threaten the fabric of American democracy. To protect the integrity of our information ecosystem, we must take action to curb the power algorithms have over our feeds.
In the 1970s, the number of TV channels began to grow rapidly, and polarization has been rising ever since. Before then, most Americans watched the same three networks, a shared experience that encouraged cooperation across party lines. But that has since changed.
More TV channels led to a more fragmented media landscape, making it harder for viewers who watched different programs to find common ground. It also meant misinformation could more easily slip through the cracks.
Before this fragmentation, politicians who tried to rally voters with misinformation were quickly ousted. Joseph McCarthy lost support as soon as the public realized that Democrats were not, in fact, secret communists. But as the number of media outlets grew, so did tolerance for tactics like McCarthy’s. From the ‘70s to the ‘90s, Newt Gingrich enjoyed a wildly successful political career while slandering Democrats and falsely accusing them of corrupt intentions.1
Gingrich succeeded where McCarthy did not because he took advantage of novel media technologies—such as C-SPAN—that enabled him to speak to his constituents without pushback from political opponents.2
Gingrich’s tactics persist today, with politicians threatening the funding of NPR and PBS over alleged bias. Efforts to defund these outlets serve only to delegitimize reporting that is critical of the sitting administration. Such actions erode the strength of the press as a governmental check and fuel media fragmentation by discouraging partisans from engaging with content that contradicts their views. The more fragmented our information landscape, the easier it becomes for misinformation to spread unchecked and for politicians to avoid accountability.
Social media has taken media fragmentation to entirely new heights. In 2025, anybody with a smartphone and an internet connection can assume the role of a news producer. As a result, political misinformation is increasingly prominent. This is especially alarming given that voters are more likely to believe misinformation if it supports their political leaning, which contributes to the polarization America is experiencing today. To combat this, we need to ensure voters have easy access to a broad range of perspectives.
With cable news, viewers can easily switch between channels to see what people on all sides of the political spectrum are saying. With social media, it is far more difficult.
Profit-driven algorithms are designed to keep users online longer, increasing ad exposure and maximizing revenue. When a user engages with a certain type of content for longer, the algorithm interprets this as a preference and responds by showing similar content. Eventually, a user can find themselves in an echo chamber, where they are presented only with content that supports their political leaning, regardless of factuality.
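To make that feedback loop concrete, here is a minimal, purely illustrative sketch. It is not any platform’s actual system: the idea of representing a post’s political lean as a number in [-1, 1], the engagement rule, and names like rank_by_predicted_engagement are all assumptions made for the example.

```python
import random

# Toy model of an engagement-optimized feed: each post has a political
# "lean" in [-1, 1]; the platform keeps an estimate of the user's lean
# and ranks posts by how closely they match it (a stand-in for predicted
# engagement). Purely illustrative; no real platform works this simply.

random.seed(0)

def make_posts(n=200):
    """Candidate pool spanning the full left-right spectrum."""
    return [random.uniform(-1, 1) for _ in range(n)]

def rank_by_predicted_engagement(posts, user_lean_estimate, k=10):
    """Show the k posts closest to the user's inferred preference."""
    return sorted(posts, key=lambda lean: abs(lean - user_lean_estimate))[:k]

def simulate_feed(days=30, true_lean=0.3, learning_rate=0.2):
    """A user with a mild lean engages more with agreeable posts;
    the platform updates its estimate from that engagement."""
    estimate = 0.0  # the platform starts with no information about the user
    for _ in range(days):
        feed = rank_by_predicted_engagement(make_posts(), estimate)
        # The user is more likely to engage with posts near their true lean.
        engaged = [p for p in feed if abs(p - true_lean) < 0.4]
        if engaged:
            avg_engaged = sum(engaged) / len(engaged)
            estimate += learning_rate * (avg_engaged - estimate)
    return estimate

print("Platform's final estimate of the user:", round(simulate_feed(), 2))
# A mild initial preference snowballs: the estimate converges toward the
# user's side, and the feed narrows to one slice of the spectrum.
```

Even in this toy version, a slight preference is enough to pull the feed toward one side, because every round of engagement sharpens the platform’s estimate and every sharpened estimate narrows what the user sees next.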
A 2021 study recruited Facebook users and randomly split them into groups that would watch left- or right-leaning content. Because the assignment was random rather than chosen by the participants, many users ended up subscribing to accounts whose political slant did not match their own, which weakened the signal Facebook’s algorithm relies on to suggest preference-aligned content.
The algorithm began suggesting a wider range of political content to many users, and by the end of the study, participants felt significantly less disdain toward members of the opposite party. This suggests that limiting the influence of social media algorithms can help reduce affective polarization.
Policymakers should pass legislation that reduces the amount of control algorithms have over user feeds. If social media platforms were required to include more randomly selected content in users’ feeds, rather than only showing posts based on past preferences, echo chambers could weaken, misinformation could spread less widely, and polarization could ease.
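One way to picture such a requirement is to reserve a fixed share of each feed for posts sampled without regard to the user’s inferred preferences. The sketch below reuses the toy ranking idea from the earlier example; the random_share parameter is a hypothetical policy knob invented for illustration, not a description of any actual regulation or platform feature.

```python
import random

def blended_feed(posts, user_lean_estimate, k=10, random_share=0.3):
    """Fill most of the feed from the preference-based ranking, but reserve
    a fixed share of slots for posts drawn uniformly from the whole pool."""
    n_random = int(k * random_share)
    ranked = sorted(posts, key=lambda lean: abs(lean - user_lean_estimate))
    preferred = ranked[:k - n_random]
    # The reserved slots ignore the platform's estimate of the user entirely.
    leftovers = [p for p in posts if p not in preferred]
    return preferred + random.sample(leftovers, n_random)

pool = [random.uniform(-1, 1) for _ in range(200)]
feed = blended_feed(pool, user_lean_estimate=0.8)
print(sorted(round(p, 2) for p in feed))
# The forced-random slots pull in posts far from the user's estimated lean,
# so the feed no longer collapses onto one end of the spectrum.
```

The design choice matters: the rule does not tell platforms what to show, only that some portion of the feed must escape the preference loop, which keeps the proposal content-neutral.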
If we want our democracy to thrive, Americans need to hear from each other. We must be encouraged to engage with viewpoints beyond our own. While there is no shortage of perspectives on the internet, algorithms inhibit our ability to access them. Diversifying our feeds will help us recognize misinformation, engage in constructive dialogue, and achieve political cooperation.
While social media is not yet the dominant news source for most voters, it is shaping the political landscape of future generations. Regulating algorithms is a crucial first step in safeguarding our democracy. If we fail to do so, we risk deepening political divides and further damaging trust in our political system.
Footnotes:
1. Steven Levitsky and Daniel Ziblatt, How Democracies Die (New York: Broadway Books, 2018), 146.
2. Levitsky and Ziblatt, How Democracies Die, 147.