Is Tech Hacking Us?

Estimated reading time: 3 minutes

“It’s easier to fool people than to convince them that they’ve been fooled.” – Unknown.

It was just another thought that slid into my mind as I was tempted to tap a button in the blue social giant's app, F. Surprisingly, it was the first time I withdrew my thumb from doing so. Then I thought, "Is tech hacking my mind?" So I got myself another topic to write about (yeah, with joy), and I'm rendering my views on it like before. When using technology, we often focus optimistically on all the things it does for us. But I want to show you where it might do the opposite.

Technology exploits our minds' weaknesses simply by handing us a menu of choices designed by "them". When we're given a menu, we rarely ask: "Is this menu empowering my original need, or are the choices actually a distraction?" We assume our phone always offers the most empowering and useful menu to pick from. Does it? The "most empowering" menu is not the same as the menu with the most choices. Yet we blindly surrender to the menus we're given.

Example: when we wake up in the morning and turn our phone over to a screen full of notifications, it frames the experience of "waking up in the morning" around a menu of "all the things I've missed since yesterday."

By shaping the menus we pick from, technology hacks the way we perceive our choices and replaces them with new ones. To maximise addiction, all tech designers need to do is link a user's action with a variable reward. Does this trick really work on people? Yes.

  • We pull to refresh our email, to see what’s new.
  • We swipe down to scroll the News feed, to see what’s next.
  • We swipe faces left/right on Hike’s Match Up, to see if there’s a match.
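The slot-machine mechanic behind all three of those gestures can be sketched in a few lines of code. This is purely my own illustration, not any real app's logic; the function name and the 30% reward rate are assumptions for the sake of the example:

```python
import random

def pull_to_refresh(rng, reward_probability=0.3):
    """Each pull is a lever press: occasionally there's new content (a
    'reward'), often there's nothing. The unpredictability of the payoff
    is what keeps us pulling again and again."""
    if rng.random() < reward_probability:
        return "3 new posts"   # variable reward: sometimes you win
    return "nothing new"       # ...and sometimes you don't

# Over many pulls only a fraction deliver a reward, yet we keep checking.
rng = random.Random(42)  # seeded so the simulation is repeatable
results = [pull_to_refresh(rng) for _ in range(100)]
print(results.count("3 new posts"), "rewarding pulls out of 100")
```

The point of the sketch: the action is identical every time, but the outcome is random, which is exactly the intermittent-reinforcement schedule that makes slot machines so hard to walk away from.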


The tech giants get paid for each and every one of our pulls and swipes.

Fear of missing something important is the gun they use to keep us glued. Apps and websites hack people's minds by introducing a "1% chance of missing something important."

  • This keeps us "subscribed" to channels even when they haven't delivered anything of benefit recently.
  • This keeps us “friended” to people with whom we haven’t spoken in ages.

But if we zoom into that fear, we'll discover it has no end: whenever we stop using something, we will always miss something important. But note this: "We never miss what we don't need."

We're all vulnerable to social approval: the need to belong, to be approved of and appreciated by our peers. But now our social approval is in the hands of tech companies.


Facebook and Instagram suggest all the faces people should tag in a photo (e.g. by showing a box with a one-click confirmation: "Tag Vinesh in this photo?"). So when my friend tags me, he's actually responding to Facebook's suggestion, not making an independent choice. Imagine millions of people getting interrupted throughout their day, running around like chickens with their heads cut off, reciprocating each other, all by design of companies who profit from it.


Then there come the bottomless bowls: infinite feeds and autoplay. Another way to hack people's minds is to let them keep consuming even when they aren't hungry anymore. News feeds are purposely designed to auto-refill with reasons to keep you scrolling, and to eliminate any reason for you to pause, reconsider or leave. It's also why video and social media sites like YouTube, Netflix and Facebook autoplay the next video after a countdown instead of waiting for you to make a conscious choice (in case you won't). A huge share of traffic on these sites is driven by autoplaying the next thing. In the same spirit, apps ask us for a single-click review ("How many stars?") while hiding the three-page survey of questions behind it.

We should also know about instant interruption versus "respectful" delivery. Given the choice, Facebook Messenger (or WhatsApp) would rather design their messaging systems to interrupt recipients immediately (showing a chat head or chat box) than help users respect each other's attention. In other words, interruption is good for business.

Another way apps hack you is by taking your reasons for visiting the app and making them inseparable from the app's business reasons. For example, when you want to look up a Facebook event happening tonight (your reason), the Facebook app doesn't let you reach it without first landing on the news feed (their reason).

We're given inconvenient choices and told that it's enough for businesses to "make choices available."

  • “If you don’t like it, you can always unsubscribe.”
  • “If you’re addicted to our app, you can always uninstall it from your phone.”

For example, one service lets you "make a free choice" to cancel your digital subscription. But instead of simply cancelling when you hit "Cancel Subscription," they send you an email with instructions on how to cancel your account by calling a phone number that's only open at certain times.

When you put the "true cost" of a choice in front of people, you treat your users or audience with dignity and respect. That's why I added "Estimated reading time" to the top of this post. In a Time Well Spent internet, choices would be framed in terms of projected cost and benefit, so people are empowered to make informed choices by default, not by doing extra work.

The ultimate freedom is a free mind, and we need technology that’s on our team to help us live, feel, think and act freely. We need our smartphones, notification screens and web browsers to be exoskeletons for our minds and interpersonal relationships that put our values, not our impulses, first. People’s time is valuable. And we should protect it with the same rigour as privacy and other digital rights.

Words inspired by the thoughts of:

1. Andrew Sullivan (Sep 19th, 2016)

2. Joe Edelman’s work on Human Values and Choice-making.

3. Tristan Harris (former Product Philosopher, Google)


Vinesh Iyappan loves writing about tech and is a second-year student of Instrumentation Engineering.