The Algorithm: Exposure or Manipulation? A Balanced Take on Accountability


Introduction

In a recent post, Gary Vaynerchuk (Gary Vee) stated:

"Stop 🛑 blaming the algo .. the algo is exposing you not changing you .. the lack of accountability is shocking .. and if you don’t like it .. delete it … it’s time for you to take on the responsibility of your happiness - my friends .. you’re in control."

At first glance, this message is both empowering and true. We are responsible for what we consume. If we engage with certain content, algorithms will naturally serve us more of it. If we dislike what we see, we have the power to change our habits or even remove ourselves from the platform. This is personal responsibility 101. However, the issue is far more complex than simply "stop blaming the algorithm."


Algorithms Don’t Just Expose—They Shape

The assumption that algorithms are neutral and merely reflect user preferences ignores the underlying incentives that drive these systems. Social media platforms prioritize content based on engagement metrics, favoring content that maximizes time spent on the platform.

What Type of Content Gets the Most Engagement?

  • Controversial or polarizing topics
  • Emotionally charged material
  • Tribalistic, “us vs. them” narratives
  • Sensationalized content that reinforces echo chambers

The reality is, you don’t always get served what you consciously want—you get served what you are most likely to engage with, even if it’s unhealthy, misleading, or addictive. The algorithm doesn’t just reflect who you are; it subtly shapes who you become by reinforcing content patterns that keep you hooked.
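To make that distinction concrete, here is a minimal, hypothetical sketch (not any platform's actual code, and the weights are invented) contrasting a neutral chronological feed with an engagement-optimized one. Note how the ranker surfaces whatever drew the most reactions, regardless of recency:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int  # seconds since epoch
    likes: int = 0
    shares: int = 0
    replies: int = 0

def engagement_score(post: Post) -> float:
    # Hypothetical weights: replies and shares often signal heated
    # "discussion" more strongly than likes do.
    return post.likes + 3 * post.shares + 5 * post.replies

def chronological_feed(posts):
    # Neutral baseline: newest first, engagement ignored.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    # Engagement-optimized: the most-reacted-to posts rise to the top,
    # no matter when they were posted.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("a", "calm update", timestamp=300, likes=10),
    Post("b", "divisive hot take", timestamp=100, likes=8, shares=20, replies=40),
    Post("c", "cat photo", timestamp=200, likes=15, shares=1),
]

print([p.text for p in chronological_feed(posts)])
# → ['calm update', 'cat photo', 'divisive hot take']
print([p.text for p in engagement_feed(posts)])
# → ['divisive hot take', 'cat photo', 'calm update']
```

In the toy example, the "divisive hot take" is the oldest post, yet the engagement ranker moves it to the top because replies and shares dominate the score.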

A detailed analysis of how engagement-based ranking can amplify divisive content is explored in this study: Engagement, User Satisfaction, and the Amplification of Divisive Content on Social Media.

This study conducted a pre-registered randomized experiment, comparing Twitter’s engagement-based ranking algorithm with a reverse chronological timeline. Participants were divided into two groups:

  • One group used Twitter’s algorithm, which ranks tweets based on engagement (likes, retweets, and replies).
  • The other group used a reverse chronological feed, where tweets appeared in the order they were posted, without algorithmic ranking.

By analyzing user behavior in both groups, the study found that Twitter’s engagement-based algorithm significantly amplified divisive and emotionally charged content, reinforcing political hostility. Users in the algorithm-based group reported more negative emotions toward political out-groups than those in the reverse chronological group.

This finding demonstrates that algorithms do not merely expose what users are interested in but actively steer them toward highly engaging, often polarizing content.


The Illusion of Full Control

Another flaw in Gary Vee’s perspective is the idea that people are in complete control of their content choices. The moment you open an app like TikTok, Instagram, or YouTube, you are not given a blank slate to search freely. Instead, you're met with a feed of preselected, high-engagement content tailored by the algorithm. Your choices are already curated before you even start browsing.

Even if you make a conscious effort to seek out diverse perspectives, the content recommended to you is still being filtered based on what the platform deems most engaging. Platforms track not just your explicit interactions but also:

  • How many times you view a particular short video (Noble Desktop)
  • How long you watch before scrolling away (Hootsuite)
  • Whether you pause to read the comments (Sprout Social)
  • How frequently you interact with specific creators or topics (Sprout Social)

These subtle behavioral signals shape your feed more than you realize. Even passive engagement—such as pausing on controversial content without liking or commenting—signals to the algorithm that the content is effective at holding your attention, increasing the likelihood of similar material being recommended.
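A rough, hypothetical sketch of how such passive signals might accumulate (the class, signal names, and weights here are all invented for illustration, not any platform's real API): even with zero likes or comments, dwell time, rewatches, and comment-reading can feed a per-topic interest score that drives future recommendations.

```python
from collections import defaultdict

class PassiveSignalTracker:
    """Toy model: accumulates attention-based interest scores per topic."""

    def __init__(self):
        self.topic_score = defaultdict(float)

    def record_view(self, topic: str, watch_seconds: float,
                    video_seconds: float, rewatches: int = 0,
                    opened_comments: bool = False) -> None:
        # How much of the video was watched, capped at 100%.
        completion = min(watch_seconds / video_seconds, 1.0)
        # Invented weights: rewatching and opening comments count as
        # engagement even though the user never liked or commented.
        score = completion + 0.5 * rewatches + 0.3 * opened_comments
        self.topic_score[topic] += score

    def next_topics(self, k: int = 3):
        # Recommend the topics that held attention best.
        return sorted(self.topic_score, key=self.topic_score.get,
                      reverse=True)[:k]

tracker = PassiveSignalTracker()
tracker.record_view("controversy", watch_seconds=30, video_seconds=30,
                    rewatches=2, opened_comments=True)
tracker.record_view("cooking", watch_seconds=5, video_seconds=60)
print(tracker.next_topics())  # "controversy" ranks first on attention alone
```

The user above never tapped a single button on the controversial video, yet pausing, rewatching, and reading the comments were enough to push that topic to the top of the queue.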


Beyond Individualism: A Systemic Perspective on Accountability

Gary Vee’s take aligns with a Stage Orange mindset (in Spiral Dynamics terms)—emphasizing personal hustle, individual responsibility, and self-determination. While this perspective is empowering, it lacks a Stage Yellow systemic awareness of how algorithms operate at scale.

A more balanced approach would recognize both individual and systemic accountability:

  • Personal Agency: Users should cultivate digital literacy, adjust content preferences, and diversify their media consumption.
  • Structural Awareness: Social media platforms optimize for engagement, not truth or well-being—which means they often prioritize controversy and tribalism over balance and accuracy.
  • Platform Responsibility: Tech companies must be held accountable for the types of engagement they amplify, through algorithmic adjustments, improved user controls, and transparency in content ranking.

A More Nuanced Take: Free Will in a Manipulated Environment

Yes, we can retrain our feeds. Yes, we have control over our digital diet. But free will is constrained by engineered incentives that push us toward specific content patterns. Simply telling people, "Take responsibility and delete the app if you don’t like it," ignores the deeper issue:

Social media doesn’t just expose who we are; it actively influences who we become.

Instead of focusing solely on individual responsibility, we should ask:

  • How can algorithms be designed to balance engagement with content quality?
  • How can individuals become more mindful consumers of digital media without falling into algorithmic traps?
  • How can we create healthier social media ecosystems that don’t reward controversy over truth?

These are the real questions worth asking. The conversation shouldn’t be about simply blaming users or avoiding responsibility—it should be about understanding the full picture.


Conclusion

Algorithms don’t just expose us; they shape us. While personal responsibility plays a role in digital media consumption, it’s equally important to recognize that these platforms engineer content experiences that prioritize engagement over well-being. Instead of reducing the conversation to “just take responsibility,” we must advocate for a more sophisticated understanding of how algorithmic incentives shape online behavior—and what can be done to create a more conscious and intentional digital landscape.