Giving Compass' Take:
- Myojung Chung reports on research into youth algorithmic literacy and cynicism, explaining how we might empower young people with knowledge rather than leaving them feeling hopeless and helpless.
- What steps can donors and funders take to support algorithmic literacy that is not only informative but empowering to young people?
- Learn more about trends and topics related to education.
- Search our Guide to Good for nonprofits focused on education in your area.
Today’s young adults are the first generation to grow up entirely within a digital world shaped by algorithms. According to Pew Research, 84% of U.S. young adults (ages 18 to 29) use at least one social media platform regularly, which makes algorithmic literacy for this generation an urgent concern. For many, these platforms are more than entertainment; they are the primary gateway to news and information.
The problem is that the content they see every day is neither neutral nor reliably accurate. Every post, video, or story is filtered through opaque systems designed to maximize engagement. These systems push emotional, sensational, and often misleading content to the top, helping false or biased information spread faster while nudging users into filter bubbles that narrow their worldview.
Educators and policymakers often point to algorithmic literacy as the solution. I was one of those voices. The idea is straightforward: If young people understand how algorithms select, prioritize, and promote content, they can better navigate their news environment. Because they are “digital natives,” the hope is that such education will be both intuitive and effective.
But my recent study, published in the Harvard Kennedy School Misinformation Review, complicates this optimism. Surveying 348 Americans ages 18 to 25, I first found one encouraging result: Young adults who understand how algorithms use their data, what incentives drive social media algorithms, and what ethical consequences follow are far more aware of the risks than those who know less about algorithms. They recognize that algorithms can amplify misinformation and trap them in filter bubbles.
But here’s the twist: Greater algorithmic knowledge did not necessarily translate into healthier online behavior. Those with stronger algorithmic understanding were actually less likely to correct misinformation or seek out diverse perspectives on social media.
I call this “algorithmic cynicism.” It’s the sense that personal action is futile against massive, profit-driven social media systems designed to capture attention rather than serve truth. This isn’t just apathy; it reflects a broader cultural mood where young people feel paralyzed in the face of a media ecosystem flooded with sensationalism and polarization. When the game feels rigged, why bother playing? Fighting the algorithm can feel as futile as tilting at digital windmills.
Read the full article about youth algorithmic literacy and cynicism by Myojung Chung at Nieman Journalism Lab.