
Who's Watching Whom?: The Dangers of Surveillance Capitalism

A new documentary reveals the dark side of the web.

Key points

  • Surveillance capitalism is based on tracking our every move—online and offline—to sell us more stuff.
  • Technology is shaping humanity in ways we cannot yet define.
  • We need a new shared philosophy to understand the mechanisms behind the screen.
Source: David Donnelly/Used with permission

David Donnelly, a U.S. filmmaker based in London, has released a new documentary entitled The Cost of Convenience. While you might think the focus would be solely on our collective digital addiction, that problem is only the gateway to a broader issue the film addresses: the sinister reality of power and thought control in the hands of a few corporations. It is not a conspiracy theory; it is the raw reality of our times. Internet platforms collect sensitive data from users without their knowledge, opening a floodgate of opportunity for exploitation and manipulation. It is one of the reasons TikTok has come under such political scrutiny, as the Chinese government allegedly has access to all of the app's users’ information.

Tracking digital versions of ourselves

The truth is that every move we make is being recorded, online and off. Through our excessive smartphone usage, so-called digital versions of ourselves are being tracked. Algorithms learn our preferences and keep feeding us more of the same in an endless feedback loop. That loop has created a deep rift in society, polarizing us to the point of paralysis. In essence, the film is an exploration of how far down the rabbit hole we are. The business model is about maximizing profits, not optimizing people’s lives. These corporations are beholden to their shareholders, but not to those who share, like, and comment on the platforms they’ve created.

The film unpacks the notion of surveillance capitalism, putting technology into a present-day framework. It asks a simple question: How does this benefit us as individuals and as a society? In the end, the filmmaker addresses the dangers of a no-holds-barred approach, including the downside of artificial intelligence. Drawing a stark comparison to the atomic bomb, Donnelly reveals how rarely we ask ourselves what will happen if we develop our technological potential and what risks it inherently holds. If the Oscar-winning film Oppenheimer shows us anything, it is that we don’t ask that question often enough.

In a recent article, Donnelly suggests five areas of concern arising from Internet platforms:

  • the ubiquitous nature of our devices (the Internet is everywhere);
  • personalized feeds that reinforce our existing beliefs;
  • the exponential growth of computational power, which is exceeding our ability to gauge its impact;
  • a business model of profit maximization at the expense of societal and mental health; and
  • the lack of regulation and accountability on the part of the corporations running these platforms.

The personalization of Google and Facebook ads is just one example of how we are being manipulated. The sites' algorithms show us specific content based on the data they have gathered through our consumption of information. Essentially, how we scroll reflects how we roll in life. The more we search, scroll, and hunt down information, the more solidified our so-called filter bubble becomes. That, in turn, reinforces what we perceive to be true. Googling something does not mean that what you find is true, but it does mean that what you find in your search becomes what you take to be true. It can lead to a severe sense of dissociation.

In the film, Susie Alegre, an international human rights lawyer, states that “the direction that technology is going in is increasingly about getting inside our minds, understanding how they work, and changing that.” Mind control at its best. “That changes the whole future of humanity.”

In a brief interview, Donnelly calls for a cultural paradigm shift worldwide. In his view, Internet users need to know the mechanisms behind the platforms they are using. “We need to build a vocabulary so we can talk about these things. The world is in need of a new philosophy and basic understanding of what is happening behind the screen.”

What we can do

Before you conclude that it is all doom and gloom, know that there are things we can do about how we deal with our everyday devices.

  1. Foster a sense of mindfulness around your digital device usage. Consciously put your phone away for a specified amount of time. Dare to even turn it all the way off. It’s better for you, and ultimately, for your device, too.
  2. Create a no-phone zone. The kitchen or dining room table is a great place to start.
  3. Log your screen time. Most smartphones have a built-in feature that shows you your average screen time per week. Compete with yourself for the lowest number possible, even if you only shave off 10 minutes per day.
  4. Turn off location tracking on your smartphone.
  5. Engage in a spiritual practice, whether it is a walk in the woods, meditation, or playing with your pet. It can be very grounding. Besides, nature is objective. If you are struggling with anxiety and self-esteem issues in our FOMO culture, remember that the tree in your local park or woods doesn’t care if you’ve brushed your hair today or not.
  6. Write to your Congressman or Congresswoman to encourage more transparency and privacy regulations.

We require that people take driver's education classes and exams before getting their licenses. Operating a vehicle is based on a set of mutually agreed-upon rules; otherwise, chaos would ensue, and our roads would become unsafe. Why shouldn't the same idea apply to the Wild West of the Web? After all, knowledge is power. It is time we took ours back.

References

Donnelly, David. “The Last of the Analogs.” Medium, March 4, 2024.
