From: Project Liberty <[email protected]>
Subject: How Algorithms Replaced Human Editors: Yuval Harari’s Warning
Date: March 11, 2025 1:31 PM
  Links have been removed from this email. Learn more in the FAQ.
Are algorithms the new cultural gatekeepers?


March 11th, 2025 // Did someone forward you this newsletter? Sign up to receive your own copy here. ([link removed] )

How Algorithms Replaced Human Editors: Yuval Harari’s Warning

In his recent book Nexus, Yuval Noah Harari suggests that we misunderstand how culture is shaped—not just online today but throughout history.

In his telling, we tend to give the creators of culture—the authors, musicians, philosophers, artists, and contemporary influencers and content creators—too much credit.

Yes, creators play an essential role, Harari concedes. But it is the editors who shape culture.

They are the ones who decide what gets included and excluded, what gets published and what gets left on the cutting room floor. They shape what has the chance to enter into the zeitgeist and what falls away into obscurity.

For centuries, the editing role has been in human hands—shaped by their power, and at times, influenced by their biases and preferences.

But today, as Harari points out, much of that editorial power has shifted to technology. On Instagram, X, or TikTok, it’s no longer human editors making the calls. Instead, a complex web of algorithms—often trained on opaque criteria—determines what gets amplified and what fades into the void.

We are downstream of our algorithms. They don’t so much reflect culture as shape the culture that shapes us.

If all this is true, how might we move upstream to take greater agency over the information we engage with? Is the solution simply consuming more selectively, or is it being more intentional in crafting our digital environment?

In this week’s newsletter, we explore the idea that our algorithms are editors that shape our experience and what we can do about it.

// Globalizing sameness

Algorithms don’t just direct a society’s attention; they also influence buying patterns and individual preferences. It can be hard to tease apart what we genuinely like and what algorithms have told us to like.

- Homogenous listening: After a year of listening to music and podcasts on Spotify, the music streaming service shares each user’s Spotify Wrapped, a personalized summary of their most listened-to songs, creators, and genres. But as Tiffany Ng wrote in MIT Technology Review ([link removed] ) last year, Spotify’s algorithm herds us into familiar listening patterns and makes it harder to discover new music.
- Algorithm-designed spaces: In the book Filterworld, author Kyle Chayka explains how social media algorithms “flatten” and homogenize our culture by making decisions for us. All of a sudden, we all like the same things and have the same taste ([link removed] ) .

// Editors throughout history

Editors have always played an important role in society. In the 4th century, the Catholic bishops who gathered in Carthage (modern-day Tunisia) for the Councils of Carthage ([link removed] ) decided which books should be included in the Bible.

Last month, for its 100th anniversary, The New Yorker ran an article ([link removed] ) about the role of the invisible editor in shaping the articles that ultimately made it onto its pages. At The New Yorker, the editors possess extraordinary power.

In modern times, the producers who oversaw the big three network news channels in the 20th century (NBC, ABC, and CBS) disproportionately influenced the cultural zeitgeist. How the Vietnam War was covered ([link removed] ) had a bearing on how divided and charged the nation became in the 1960s and 1970s.

What receives our attention has always been shaped by gatekeepers. The difference today? The gatekeepers are no longer human.

// Algorithms: The new editors

With algorithms as our modern-day editors, core issues emerge:

- Algorithms, like people, are biased. In previous editions of this newsletter ([link removed] ) , we've explored the risks of algorithmic biases in housing, healthcare, education, and facial recognition.
- Algorithms are upstream from our attention. Chayka’s book Filterworld explains how today’s creators—from interior designers to musicians—are more inclined to optimize for the engagement of an algorithm than for the engagement and interaction of humans themselves.
- Algorithms are opaque. Many algorithms are not transparent about how they work or what types of content they reward and promote.

// Reclaiming choice

Our mission at Project Liberty is to build solutions that help people regain control of their digital lives by reclaiming a voice, choice, and stake in a better internet.

It’s hard to entirely extricate ourselves from the influence of algorithms, but the effort to reclaim control and exercise more choice is unfolding in multiple realms.

- Policy & regulation around algorithmic transparency: You can’t shape what you don’t understand, and today, many algorithms are black boxes controlled by tech companies unwilling to open the hood and show how their algorithms work. Audits of algorithms ([link removed] ) can help. The EU is a leader in regulation with its 2024 AI Act ([link removed] ), which requires that “AI systems are developed and used in a way that allows appropriate traceability and explainability, while making humans aware that they communicate or interact with an AI system, as well as duly informing deployers of the capabilities and limitations of that AI system and affected persons about their rights.” In the U.S. in 2022, senators proposed an Algorithmic Accountability Act, but it hasn’t yet been passed into law.
- New platforms that give users more control: Platforms like Bluesky allow users to choose which algorithms they want to drive their experience on the platform. In a 2023 post ([link removed] ) on algorithmic choice, Bluesky CEO Jay Graber wrote, “We want a future where you control what you see on social media. We aim to replace the conventional ‘master algorithm,’ controlled by a single company, with an open and diverse ‘marketplace of algorithms.’”
- Consumers demanding change: In 2022, widespread protests ([link removed] ) from Instagram users about changes to the algorithm and design of Instagram led the platform to walk back ([link removed] ) changes.
- Consumers finding workarounds to manipulate the algorithm: Some platforms allow users to select chronological feeds as opposed to ones oriented around engagement. On X, users can select different feeds. On YouTube, viewers can manage their recommendations by selecting videos they’re not interested in.

In the book Nexus, Harari suggests that culture doesn’t just happen. It is engineered by those who control the flow of information. Today, that power rests in the hands of algorithms. But technology isn’t destiny. Just as humans once shaped the editorial gatekeeping of the past, we still have the ability to push for transparency, demand choice, and build alternatives to today’s algorithmic editors. It’s still up to us to be the ultimate editors.

Project Liberty in the news

// Project Liberty supported a bill that was passed by the Utah Legislature ([link removed] ) last week, HB418, which grants users the right to control and manage their data.

// At SXSW in Austin, Texas last week, Project Liberty Founder Frank McCourt and investor Kevin O’Leary discussed The People’s Bid for TikTok in a fireside chat. Watch the conversation here ([link removed] ) .

Other notable headlines

// 🖥 Reddit is not perfect, but it may be the best platform on a web of noise and junk, according to an article in The Atlantic ([link removed] ) .

// 🚨 A whistleblower alleged that Meta considered sharing user data with China, according to an article in the Washington Post ([link removed] ) .

// 📺 The vast majority of YouTube's estimated 14.8 billion videos have almost never been seen. Until now, according to an article in the BBC ([link removed] ) .

// 📱 Screen time reports won’t help you put your phone down. An article in Vox ([link removed] ) outlined why you should make your phone boring.

// 📝 Some publishers are considering a long-shot bet to bypass the middlemen of social media, according to an article in the New York Times ([link removed] ) .

// 🎙 A podcast on Tech Policy Press ([link removed] ) with Audrey Tang explored the opportunities and risks of digital public squares.

Partner news & opportunities

// Exploring AI ethics across disciplines

March 19, 12-1pm ET (Monthly Series)

Join a virtual event series, presented by NC State University’s Grand Challenge Scholars Program ([link removed] ) and partners, featuring experts discussing AI’s ethical implications across disciplines. The first session, “AI x History: Ethics, Memory, and the Future of the Past,” will feature Jason Steinhauer and Nathan Lachenmyer. Register here ([link removed] ) .

// TikTok debate at Harvard

Harvard’s Berkman Klein Center ([link removed] ) co-founder Jonathan Zittrain moderated a debate between Alan Rozenshtein and Faculty Associate Anupam Chander on the law requiring ByteDance to sell TikTok to a U.S. firm or face a ban. About the Supreme Court's decision to uphold the law, Chander predicted, “This is a constitutional decision that will reverberate for the next century.” Watch here ([link removed] ) or read the summary in the Harvard Gazette ([link removed] ) .

What did you think of today's newsletter?

We'd love to hear your feedback and ideas. Reply to this email.

/ Project Liberty builds solutions that help people take back control of their lives in the digital age by reclaiming a voice, choice, and stake in a better internet.

Thank you for reading.

Facebook ([link removed] )

LinkedIn ([link removed] )


Instagram ([link removed] )


10 Hudson Yards, Fl 37,

New York, New York, 10001

Unsubscribe ([link removed] ) Manage Preferences ([link removed] )

© 2025 Project Liberty LLC
