Nieman Journalism Lab

Trapped by what they know: Young adults’ algorithmic cynicism


Today’s young adults are the first generation to grow up entirely within a digital world shaped by algorithms. According to Pew Research, 84% of U.S. young adults (ages 18 to 29) use at least one social media platform regularly. For many, these platforms are more than entertainment; they are the primary gateway to news and information.

The problem is that the content they see every day is far from accurate or neutral. Every post, video, or story is filtered through opaque systems designed to maximize engagement. These systems push emotional, sensational, and often misleading content to the top, making false or biased information spread faster, while nudging users into filter bubbles that narrow their worldview.

Educators and policymakers often point to algorithmic literacy as the solution. I was one of those voices. The idea is straightforward: If young people understand how algorithms select, prioritize, and promote content, they can better navigate their news environment. Because they are “digital natives,” the hope is that such education will be both intuitive and effective.

But my recent study, published in the Harvard Kennedy School Misinformation Review, complicates this optimism. Surveying 348 Americans ages 18 to 25, I first found one encouraging result: Young adults who understand how algorithms use their data, what incentives drive platforms’ ranking formulas, and what ethical consequences follow are far more aware of the risks than peers who know less. They recognize that algorithms can amplify misinformation and trap them in filter bubbles.

But here’s the twist: Greater algorithmic knowledge did not necessarily translate into healthier online behavior. Those with stronger algorithmic understanding were actually less likely to correct misinformation or seek out diverse perspectives on social media.

I call this “algorithmic cynicism.” It’s the sense that personal action is futile against massive, profit-driven social media systems designed to capture attention rather than serve truth. This isn’t just apathy; it reflects a broader cultural mood where young people feel paralyzed in the face of a media ecosystem flooded with sensationalism and polarization. When the game feels rigged, why bother playing? Fighting the algorithm can feel as futile as tilting at digital windmills.

But cynicism comes at a cost. A generation that understands manipulation but disengages from action risks undermining the very foundations of a healthy public sphere and democracy itself.

And my data suggest that cynicism isn’t the only problem. Another finding reveals a different side of the challenge: Time spent on social media is linked to lower understanding of how algorithms work. This paradox may stem from a kind of technological familiarity that discourages critical reflection. For many young users, algorithms are simply the air they breathe — so deeply embedded that they fade into the background. Heavy users often take these systems for granted, seeing them as natural parts of the digital experience rather than as mechanisms worth questioning. That everyday ease breeds an illusion of understanding: People feel they “get” how platforms work because they use them constantly, even though their grasp of the underlying logic remains shallow. As a result, overconfidence can dull curiosity and reduce motivation to learn about the very systems shaping their attention.

This pattern isn’t only about screen time; it also cuts across ideology, revealing another kind of disconnection. Young adults who identify toward the conservative end of the political spectrum tend to have lower algorithmic knowledge. This may reflect broader patterns of media distrust among conservative audiences, for whom discussions of bias or platform design can read as partisan arguments rather than neutral explanations. Uneven awareness means uneven power to resist manipulation: Heavy social media users may not recognize how their feeds are being shaped, and conservative-leaning young adults may meet the topic with suspicion rather than curiosity. This compounds the challenge: While some young adults become cynical and disengaged because they know too much about how the system works, others remain unequipped because they know too little.

So what can be done?

First, algorithmic literacy must go beyond teaching how algorithms work to fostering a sense of agency. My data show that most respondents received little to no formal instruction about algorithms in school. That makes early, structured education crucial. It also calls for collaboration between educators and platforms — meeting young adults where they already are, through short in-feed explainers or creator-led content that demystifies algorithms.

Second, education must lead to practice. Awareness alone isn’t enough. Students should be guided to experiment with their own feeds — adjusting settings, auditing recommendations, or subscribing to credible sources — to experience how small acts can alter what they see. These micro-interventions can counter the helplessness that feeds algorithmic cynicism.

Third, algorithmic literacy efforts must be ideologically inclusive. The finding that conservative-leaning young adults show lower algorithmic knowledge reminds us that digital education cannot be framed as a partisan cause. Improving algorithmic literacy across ideological lines means showing how information moves, not just what it says. Teaching young people to trace why certain stories trend or how recommendation systems work can build a shared sense of how truth circulates online.

Being a “digital native” means familiarity with technology, not empowerment. Without coordinated action from educators, policymakers, and platforms, this algorithmic cynicism could harden into the defining feature of a generation’s digital life — with profound consequences for democratic society.

Algorithms don’t just shape what young people see; they shape how they see the world. That means the stakes are far higher than individual feeds. It’s time to move from resignation to action.

Myojung Chung is an assistant professor of journalism and media innovation at Northeastern University and a faculty associate at the Berkman Klein Center for Internet & Society at Harvard University.
