How Algorithms Are Shaping Our Culture (And What to Do About It)
Every major cultural platform in your life is algorithmically curated. Spotify picks your music. Netflix picks your shows. Instagram picks your news. Amazon picks your products. Google picks your information.
These algorithms shape what you see, hear, think about, and buy. And most of us interact with them on autopilot, never questioning why we’re seeing what we’re seeing.
The Recommendation Problem
Recommendation algorithms are trained to maximise engagement. They learn what keeps you on the platform longer and show you more of it.
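Stripped of the machine learning, the core loop can be sketched in a few lines. This is a toy illustration, not any platform's actual system: the item titles and engagement scores below are invented, and real systems predict engagement with learned models rather than fixed numbers.

```python
# Hypothetical candidate items with made-up predicted-engagement scores.
candidates = [
    {"title": "Nuanced policy explainer", "predicted_engagement": 0.12},
    {"title": "Outrage-bait hot take", "predicted_engagement": 0.87},
    {"title": "Familiar pop single", "predicted_engagement": 0.64},
]

def rank_feed(items):
    """Order items by predicted engagement, highest first."""
    return sorted(items, key=lambda item: item["predicted_engagement"], reverse=True)

feed = rank_feed(candidates)
for item in feed:
    print(item["title"])
```

Nothing in that objective cares whether the top item is informative or true, only that it scores highest, which is exactly the problem the next sections describe.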
This creates a few problems.
Homogeneity. When everyone gets recommendations based on what’s popular and engaging, culture converges. The same ten albums dominate. The same types of restaurants thrive. The same aesthetic appears everywhere.
Walk through any trendy neighbourhood in any major city. The cafes look the same. The furniture stores sell the same mid-century modern pieces. The restaurants serve the same brunch menu. Algorithms haven’t caused this entirely, but they’ve accelerated it.
Polarisation. Engagement-optimised content tends to be emotionally charged. Outrage gets clicks. Controversy gets shares. Nuance gets ignored.
News feeds and social media algorithms amplify extreme positions because extreme positions generate engagement. The result is a political and cultural discourse that’s louder and angrier than the population it supposedly represents.
Filter bubbles. The more you consume content that aligns with your existing views, the more you see content that confirms those views. The algorithm creates a personalised reality where your perspective seems universal because everything you see supports it.
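The filter-bubble feedback loop is easy to demonstrate with a toy simulation (the topics and starting weights are invented). Each round, the system recommends whichever topic the user has engaged with most, and engaging with it increases its weight, so a small initial lead compounds into a monoculture.

```python
from collections import Counter

# Hypothetical engagement weights: politics starts with a slight edge.
weights = Counter({"politics": 3, "science": 2, "arts": 2})

def recommend(w):
    """Return the single most-engaged-with topic."""
    return w.most_common(1)[0][0]

history = []
for _ in range(10):
    topic = recommend(weights)  # show the user their top topic
    history.append(topic)
    weights[topic] += 1         # their click feeds back into the ranking

print(history)  # politics ten times in a row: the lead compounds
```

A one-item head start is enough: science and the arts never get recommended again, which is the "personalised reality" effect in miniature.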
The Music Example
Spotify’s algorithm is particularly visible in music. The “Discover Weekly” playlist is eerily accurate at finding songs you’ll like. That’s genuinely useful.
But it also means you’re less likely to encounter music that challenges you. The algorithm doesn’t recommend jazz to a pop listener, even though that person might love jazz if they heard it. It optimises for what you already like, not for what you might benefit from exploring.
Artists suffer from this too. New musicians who don’t fit neatly into algorithmic categories struggle to reach audiences. The artists who succeed are increasingly the ones who create music that the algorithm favours — familiar enough to be recommended, different enough to feel new.
The News Example
Google News and social media feeds are most people’s primary news sources. Both are algorithmically curated.
The algorithm learns what stories you engage with. If you click on political news, you get more political news. If you click on scandal and drama, you get more scandal and drama. Serious, complex reporting about important but “boring” topics (policy, economics, infrastructure) gets less engagement and therefore less visibility.
This isn’t a conspiracy. It’s an optimisation function doing exactly what it’s designed to do. But the downstream effect is a less-informed public consuming more entertainment disguised as news.
The Shopping Example
Amazon and other e-commerce platforms use purchase history and browsing behaviour to recommend products. This is convenient when it surfaces something you actually need.
It’s less convenient when it creates consumption patterns you didn’t intend. “Customers who bought X also bought Y” isn’t neutral information. It’s a suggestion designed to increase your spending.
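A "customers who bought X also bought Y" feature can be approximated by counting how often products co-occur in past orders. This is a simplified sketch with invented orders, not Amazon's actual method, which is far more sophisticated.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase history: each order is a set of products.
orders = [
    {"kettle", "tea", "mug"},
    {"kettle", "tea"},
    {"kettle", "mug"},
    {"tea", "biscuits"},
]

# Count every pair of products bought together, in both directions.
co_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def also_bought(product, k=2):
    """Top-k products most often bought alongside `product`."""
    pairs = [(other, n) for (p, other), n in co_counts.items() if p == product]
    return [other for other, _ in sorted(pairs, key=lambda x: -x[1])][:k]

print(also_bought("kettle"))  # tea and mug, the frequent co-purchases
```

The output is just a frequency count, yet presented as a recommendation it nudges the next customer toward the same basket, which is how "neutral information" becomes a sales suggestion.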
Advertising algorithms are even more targeted. That Instagram ad for the product you mentioned in conversation (which probably appeared because you searched for something related, not because your phone is listening) is an algorithm identifying your purchase intent and serving content to capitalise on it.
What You Can Do
Actively seek content outside your feed. Visit websites directly rather than relying on social media to surface content. Subscribe to newsletters from diverse sources. Browse bookstores and libraries, where discovery is serendipitous.
Audit your information diet. What are you consuming? Is it diverse in perspective, format, and source? If everything you read comes from your social media feed, you’re getting one algorithmically curated view of the world.
Use incognito mode for research. When you’re researching a topic, incognito mode prevents your search history from skewing future results.
Resist the autoplay. Netflix’s autoplay, YouTube’s “up next,” and Spotify’s automatic playlist continuation are designed to keep you consuming. Make conscious choices about what to watch, listen to, and read.
Support independent media. Independent publications, podcasts, and creators are less beholden to algorithms. Supporting them (with subscriptions, donations, or just attention) maintains the media diversity that algorithms are slowly eroding.
The Bigger Picture
Algorithms aren’t evil. They solve real problems: finding relevant content in an ocean of information. The issue is that “relevant” has been defined almost exclusively as “engaging,” and engagement is often at odds with what’s valuable, challenging, or true.
Understanding this dynamic doesn’t mean rejecting technology. It means using it intentionally rather than passively.
You’re already making choices about what to consume. The question is whether those choices are truly yours or whether they’ve been subtly shaped by systems designed to maximise your time on a platform.
Awareness is the first step. Intentional consumption is the second. Neither requires giving up your phone. They just require paying attention to who’s choosing what you see, and occasionally choosing differently.