
How Technology Is Hijacking Your Mind

“It’s easier to fool people than to convince them that they’ve been fooled.”

— Unknown.

The human brain is one of the body’s most remarkable organs. The three-pound mass contains roughly 86 billion neurons and uses about 20% of the body’s energy. It is the body’s control center, the organ that makes every one of our daily activities possible.

But it’s not flawless. It is vulnerable to tech stress, especially now that so much of daily life runs through technology and the internet.

When we use technology, we tend to focus optimistically on all the benefits it provides. Here, we’d like to show how it can work the other way around.

Although technology has driven a great deal of positive progress, it is also taking over your thoughts. The harmful psychological effect of adopting new technologies at work is known as tech stress, or technostress.

So where does technology exploit the weaknesses of the human mind?

American tech ethicist Tristan Harris has spent recent years studying how technology hijacks your mind in pursuit of profit. Harris spent three years at Google as a design ethicist, where he focused on how to design technology in ways that keep it from hijacking people’s minds.

Tristan Harris eventually came to see this as something like a magician’s act. Magicians start by looking for blind spots, edges, vulnerabilities, and limits in people’s perception, so they can influence what people do without them realizing it. Once you know how to push people’s buttons, you can play them like a piano.

This is exactly what product designers do. In the struggle for your attention, they use your psychological weaknesses — both deliberately and unconsciously — against you.

In this article, we will show you some of the ways technology hijacks your mind.

Technology Is Hijacking People’s Minds

Here’s how it happens.

Hijack 1:
If You Control the Menu, You Control the Choices

Western culture is built on the ideals of personal autonomy and freedom. Millions of us fiercely defend our right to make ‘free’ choices while remaining blind to how those choices are shaped upstream by menus we didn’t pick.

This is exactly what magicians do. They give people the illusion of free choice while designing the menu so that they win, whatever you choose. It’s hard to overstate how deep this insight goes.

When presented with a menu of options, people rarely ask, “What’s not on the menu?”, “Why am I being given these options and not others?”, or “Does this menu serve my original need, or are the choices actually a distraction?”

Take Tuesday night, for instance, when you’re out with friends and want to keep the conversation going. You open Yelp to find nearby suggestions and see a list of bars. The group collapses into a scrum of faces staring down at their phones, comparing bars, scrutinizing photos, and comparing cocktails. Is this menu still appropriate for the group’s original goal?

The issue isn’t that bars are a bad choice. It’s that Yelp replaced the group’s original question, “Where can we go to keep talking?”, with a different one: “What’s a bar with good photos of cocktails?”

Moreover, the group falls for the illusion that Yelp’s menu is the complete set of options for where to go. Heads down at their phones, they miss the park across the street where a band is playing live music. They miss the pop-up gallery on the other side of the street serving coffee and crepes. Neither of those appears on Yelp’s menu.

The more technology gives us options in nearly every area of our lives (information, activities, places to go, contacts, relationships, careers), the more we come to assume that our phone is always the most useful and empowering menu to choose from. Is it?

The “most empowering” menu is not the one with the greatest number of options. But when we blindly accept the menus we’re given, it’s easy to miss the difference:

When we wake up in the morning and touch our phone to see a list of notifications — it frames the experience of ‘waking up in the morning’ around a menu of ‘all I’ve missed since yesterday.’

By shaping the menus we pick from, technology hijacks the way we perceive our choices and replaces them with new ones. But the more closely we pay attention to the options we’re given, the more we’ll notice when they don’t actually serve our real needs.

Hijack 2:
A Slot Machine In a Billion Pockets

If you’re an app, how do you keep people coming back? Turn yourself into a slot machine.

The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices?

One of the main reasons is the number-one psychological ingredient in slot machines: intermittent variable rewards.

To maximize addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing at all. Addictiveness is at its peak when the rate of reward is most variable.
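
To make the mechanic concrete, here is a minimal Python sketch of an intermittent variable reward loop; the 30% hit rate and the reward labels are invented for illustration and don’t correspond to any real app.

```python
import random

# Illustrative only: the probability and reward labels below are assumptions,
# not figures from any real product.
def check_phone(reward_probability: float = 0.3) -> str:
    """One 'pull of the lever': sometimes a tempting reward, often nothing."""
    if random.random() < reward_probability:
        return random.choice(["new like", "new message", "new follower"])
    return "nothing"

# Ten simulated checks. The unpredictable mix of hits and misses is what
# keeps people pulling the lever again and again.
for check in range(1, 11):
    print(f"check {check}: {check_phone()}")
```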

Does this technique really hook people? Yes. In the US, slot machines generate more revenue than sports, movies, and theme parks combined. According to NYU professor Natasha Dow Schüll, author of Addiction by Design, people become “problematically involved” with slot machines 3–4 times faster than with other forms of gambling.

The terrible fact is that a few billion people carry a slot machine around in their pocket. When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got. When we pull to refresh our email, we’re playing a slot machine to see what new mail arrived. When we swipe down to scroll a feed, we’re playing a slot machine to see which photo comes next.

Because it’s profitable, apps and websites sprinkle intermittent variable rewards all over their products.

Sometimes, though, slot machines emerge by accident. Email, for instance, wasn’t deliberately turned into a slot machine by some evil corporation; nobody profits when millions of people open their inbox and find nothing there. Nor did the designers at Apple and Google want their phones to work like gambling machines. It happened by accident.

However, companies like Apple and Google now have a responsibility to reduce these effects by converting intermittent variable rewards into less addictive, more predictable ones. For instance, they could give users the option to set predictable times during the day or week to check “slot machine” apps, and shift when new messages are delivered to line up with those times.
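
As a rough sketch of that alternative, here is what batched delivery could look like in Python; the class and method names are hypothetical, not any vendor’s actual API.

```python
from datetime import datetime, time

class BatchedInbox:
    """Hold incoming messages and release them only at user-chosen times."""

    def __init__(self, delivery_times: list[time]):
        self.delivery_times = delivery_times  # e.g. 9:00 and 17:00
        self.pending: list[str] = []

    def receive(self, message: str) -> None:
        # Queue the message instead of interrupting the user immediately.
        self.pending.append(message)

    def deliver_if_due(self, now: datetime) -> list[str]:
        # Release everything queued once a chosen delivery time arrives.
        if any(now.hour == t.hour and now.minute == t.minute
               for t in self.delivery_times):
            delivered, self.pending = self.pending, []
            return delivered
        return []

inbox = BatchedInbox([time(9, 0), time(17, 0)])
inbox.receive("New comment on your photo")
print(inbox.deliver_if_due(datetime(2024, 1, 1, 8, 30)))  # [] -- held back
print(inbox.deliver_if_due(datetime(2024, 1, 1, 9, 0)))   # ['New comment on your photo']
```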

Hijack 3:
Fear of Missing Something Significant

Another way apps and websites manipulate people’s minds is by instilling the sense that there is a “1% chance you could be missing something important.”

If an app can convince you that it is a channel for crucial news, messages, friendships, or potential sexual encounters, it will be hard for you to turn it off, unsubscribe, or delete your account, because (aha, it wins) you might miss something extremely important. That fear is what keeps us subscribed to newsletters we no longer read and “friended” to people we haven’t spoken to in ages.

But if we zoom in on that fear, we find it has no limits: we will always miss something important any time we stop using anything.

And living moment to moment with the fear of missing something isn’t how we’re meant to live.

And it’s amazing how quickly the illusion dissolves once we let go of that fear. When we unplug for more than a day, unsubscribe from those notifications, or go to Camp Grounded, the worries we expected to have simply don’t materialize.

What we don’t see, we don’t miss.

The thought “What if I miss something important?” arises before we unplug, unsubscribe, or turn something off, not after. Imagine if tech companies recognized that and helped us proactively tune our relationships with friends and businesses around what we define as “time well spent” for our lives, rather than around what we might miss.

Hijack 4:
Social Approval

We are all vulnerable to social approval. The need to belong, to be approved of, or to be appreciated by our peers is among the strongest human motivations. But now our social approval is in the hands of tech companies.

When your friend Marc tags you, you assume that he chose to do it on purpose. However, you fail to see how a business like Facebook arranged for him to do that in the first place.

By automatically suggesting all the faces users should tag (for example, by displaying a box with a one-click confirmation, “Tag Tristan in this photo?”), Facebook, Instagram, or Snapchat can control how frequently people get tagged in photos.

So when Marc tags you, he is actually responding to Facebook’s suggestion rather than making an independent choice. Through design decisions like these, Facebook controls the multiplier for how often millions of people feel that their social standing is on the line.

The same thing happens when we change our main profile photo, since Facebook knows that is a moment when we are especially vulnerable to feedback from others. Facebook can rank that update higher in news feeds so it appears more often and draws more likes and comments from friends, and every like or comment pulls us back in.

Everyone has a natural tendency to seek social approval, but certain groups, notably teenagers, are more susceptible than others. That is why it’s crucial to recognize how effective designers are at exploiting this vulnerability.

Hijack 5:
Social Reciprocity (Tit-for-Tat)

We are all vulnerable to the feeling that we should return others’ gestures. But as with social approval, tech companies now control how often we feel that pull.

Sometimes it happens by accident. Email, texting, and messaging apps are factories for social reciprocity. But in other cases, companies exploit this vulnerability on purpose.

The most obvious offender is LinkedIn. LinkedIn wants as many people as possible to create social obligations for one another, because each time someone reciprocates (by accepting a connection, answering a message, or endorsing someone back for a skill), they have to return to linkedin.com, where LinkedIn can capture more of their time.

Like Facebook, LinkedIn exploits an asymmetry in perception. When someone invites you to connect, you assume they did it deliberately, when in reality they most likely responded reflexively to LinkedIn’s list of suggested contacts. In other words, LinkedIn turns your unconscious impulses (to “add” a person) into new social obligations that millions of people feel compelled to repay, all while profiting from the time people spend doing it.

Imagine millions of people being interrupted like this throughout the day, reacting to one another like chickens with their heads cut off, all because companies that profit from it designed it that way.

Welcome to social media.

Hijack 6:
Endless Flows, Infinite Feeds, and Autoplay

Another way to hijack people is to keep them consuming even when they’re no longer hungry.

How? Simple. Take a limited and finite experience and transform it into a boundless stream that never ends.

In his research, Cornell professor Brian Wansink showed that you can trick people into continuing to eat soup by giving them a bottomless bowl that refills as they eat. People eating from bottomless bowls consume 73% more calories than those with normal bowls and underestimate how many calories they ate by 140 calories.

Tech companies apply the same principle. News feeds are deliberately designed to refill automatically with content, keeping you scrolling and removing any natural point to pause, reconsider, or leave.

It also explains why video and social media platforms like Netflix, YouTube, and Facebook autoplay the next video after a countdown instead of waiting for you to make a conscious choice (in case you won’t). A huge share of these sites’ traffic comes from autoplaying the next video.
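
To see why this removes natural stopping points, here is a toy Python sketch of a “bottomless” feed; the page size and post labels are made up for illustration.

```python
import itertools

def endless_feed(posts_per_page: int = 5):
    """Yield posts forever: a new 'page' loads automatically as you near the end."""
    for page in itertools.count(1):
        for item in range(1, posts_per_page + 1):
            yield f"post {item} of page {page}"

feed = endless_feed()
for _ in range(12):   # the reader keeps scrolling; the feed never runs out
    print(next(feed))
```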

Tech companies often claim that “we’re just making it easier for people to watch the video they want to watch,” when they are really serving their business interests. And you can’t blame them, because increasing “time spent” is the currency they compete for.

Instead, picture a world where technology companies gave you the power to consciously shape your experience so that it matched your own definition of “time well spent”: bounding not only how much time you spend, but also what “time well spent” looks like.

Hijack 7:
‘Respectful’ Delivery vs. Instant Interruption

Companies know that messages which interrupt people immediately are more persuasive at getting them to respond than messages delivered asynchronously (like email or any deferred inbox).

Given the choice, Facebook Messenger (or WhatsApp, WeChat, or Snapchat, for that matter) would rather design its messaging system to interrupt recipients instantly (and show a chat box) than help users respect one another’s attention.

In other words, interruption is good for business.

They also have an incentive to heighten the feeling of urgency and social reciprocity. Facebook, for example, automatically tells the sender when you “see” their message, instead of letting you hide whether you’ve read it. And now that they know you’ve seen the message, you feel even more pressure to respond.

Apple, in contrast, more respectfully lets users toggle “Read Receipts” on or off.

The problem is that maximizing interruptions in the name of business creates a tragedy of the commons, destroying global attention spans and causing billions of needless interruptions every day. This is a huge problem that we need to fix with shared design standards.

Hijack 8:
Substituting Your Reasons with Their Reasons

Another way apps manipulate you is by fusing your reasons for using the app (completing a task) with the app’s business goals (maximizing how much you consume once you’re there).

For instance, in the physical world of grocery stores, the top two reasons people visit are to buy milk and to refill prescriptions. But because grocery stores want to maximize how much people buy, the pharmacy and the milk are placed at the back of the store.

In other words, they make what the customer wants (milk, the pharmacy) inseparable from what the business wants. If stores were truly organized to support customers, they would put the most popular items at the front.

Tech companies design their products the same way. For instance, when you want to look up a Facebook event happening tonight (your reason), the Facebook app won’t let you get to it without first landing on the news feed (its reason). Facebook wants to convert every reason you have for using Facebook into its reason: maximizing the time you spend consuming content.

Instead, picture what would happen if…

Hijack 9:
Inconvenient Choices

We’re told that it’s enough for businesses to simply “make choices available.”

Naturally, businesses make the choices they want you to make easier, and the choices they don’t want you to make harder. Magicians do the same thing: you make it easier for the spectator to pick the thing you want them to pick, and harder to pick the one you don’t.

You can “freely choose” to cancel your digital subscription at NYTimes.com, for instance. But instead of just doing it when you click “Cancel Subscription,” they send you an email with instructions on how to cancel your account by calling a phone number that is only staffed during certain hours.

Instead of looking at the world in terms of which choices are available, we should look at it in terms of the friction required to act on them.

Imagine a world in which choices were labeled by how difficult they are to fulfill (like coefficients of friction), and an independent body, such as an industry association or non-profit, assigned those difficulty ratings and set standards for how easy navigation has to be.
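
As a purely speculative sketch of that idea, here is how such difficulty ratings might be computed in Python; every step name and weight below is invented for illustration.

```python
# Hypothetical "coefficient of friction" weights for common steps a site
# can put between you and a choice. These numbers are illustrative only.
FRICTION_WEIGHTS = {
    "single_click": 1,
    "fill_out_form": 3,
    "send_email": 5,
    "phone_call_business_hours_only": 10,
}

def friction_score(steps: list[str]) -> int:
    """Sum the weight of every step a user must complete to act on a choice."""
    return sum(FRICTION_WEIGHTS[step] for step in steps)

# Subscribing takes one click; cancelling takes an email plus a phone call.
print(friction_score(["single_click"]))                                  # 1
print(friction_score(["send_email", "phone_call_business_hours_only"]))  # 15
```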

Hijack 10:
Forecasting Errors and “Foot in the Door” Strategies

Last but not least, apps can exploit people’s inability to forecast the consequences of a click.

People struggle to intuitively forecast the true cost of a click when it’s presented to them. Salespeople use “foot in the door” techniques, starting with a small, innocuous request (“just one click to see which tweet got retweeted”) and escalating from there (“why don’t you stay awhile?”). Virtually all engagement websites use this approach.

Imagine if web browsers and smartphones — the entry points via which individuals make these decisions — were genuinely on the lookout for users and assisted them in anticipating the effects of clicks.

That is why some sites put “estimated reading time” at the top of their posts. Showing people the “true cost” of a choice treats them with respect and dignity. In a “time well spent” internet, choices would be framed in terms of projected cost and benefit, so people would be empowered to make informed decisions by default, without having to do extra work.
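
Here is a small Python sketch of how an “estimated reading time” label can be computed; the 200 words-per-minute figure is a common rule of thumb, not a standard every site uses.

```python
def estimated_reading_time(text: str, words_per_minute: int = 200) -> int:
    """Return the estimated reading time in whole minutes, rounded up."""
    word_count = len(text.split())
    return max(1, -(-word_count // words_per_minute))  # ceiling division

article = "word " * 1234  # stand-in for an article of about 1,234 words
print(f"Estimated reading time: {estimated_reading_time(article)} min")  # 7 min
```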

Bottom Line and Solutions

Are you angry that technology hijacks your agency in these ways? So are we. There are literally thousands of these techniques, and we’ve listed only a few. Imagine whole bookshelves, seminars, workshops, and trainings that teach these techniques to aspiring tech entrepreneurs. Imagine hundreds of engineers whose job, every day, is to invent new tricks to keep you hooked.

The highest form of freedom is a free mind, and in order to live, feel, think, and act freely, we need technology on our side.

We need our smartphones, notification screens, and web browsers to be exoskeletons for our minds and relationships, ones that put our values, not our impulses, first. People’s time is valuable, and it should be protected as rigorously as privacy and other digital rights.
