Estimated reading time: 12 minutes
"It's easier to fool people than to convince them that they've been fooled." – Unknown
As a Design Ethicist at Google, Tristan Harris studies how technology exploits the weaknesses of human psychology, and his job is to design tools, principles, and standards that protect our minds from manipulation we don't even notice.

When using technology, we tend to focus optimistically on all the things it does for us, and forget that it can do the opposite.

So which weaknesses in our minds does technology exploit?

Let's start with a comparison to magicians. Magicians begin by looking for blind spots, edges, vulnerabilities, and limits of people's perception, so they can influence what people do without them even realizing it.

And this is exactly what product designers do to your mind. They play on your psychological vulnerabilities (conscious and unconscious) to capture your attention.

Here is how they have been doing it, and will keep doing it.
Hijack #1: If You Control the Menu, You Control the Choices
Western culture is built around ideals of individual choice and freedom. Millions of us fiercely defend our right to make "free" choices, while we ignore that those choices are manipulated upstream by menus we didn't pick in the first place.

This is exactly what magicians do. They give people the illusion of free choice while architecting the menu so that they win no matter what you choose. It is a deep, effective, and remarkably subtle strategy.
When people are given a menu of choices, they rarely ask:
- What's not on the menu?
- Why am I being given these options and not others?
- Do I know the goals of the person who made this menu?
- Does this menu serve my original need, or are the choices actually a distraction? (e.g. an overwhelming shelf of toothpastes)
For example, imagine you're out with friends on a Tuesday night and want to keep the conversation going. You open Yelp to find nearby recommendations and see a list of bars. The group huddles around the screen comparing bars. They scrutinize the photos of each one, comparing cocktail drinks. Is this menu still relevant to the group's original desire?

It's not that bars are a bad choice; it's that Yelp substituted the group's original question ("Where can we go to keep talking?") with a different question ("Which bar has good photos of cocktails?"), all by shaping the menu.

Moreover, the group falls under the illusion that Yelp's menu represents a complete set of choices for where to go. While they look down at their phones, they don't see the park across the street with a band playing live music. They miss the pop-up gallery on the other side of the street serving crepes and coffee. Neither shows up on Yelp's menu.

The more choices technology gives us in nearly every domain of our lives (information, events, places to go, friends, dating, jobs), the more we assume our phone is always the most empowering and useful menu to pick from. Is it?
The "most empowering" menu is different from the menu that has the most choices. But when we blindly surrender to the menus we're given, it's easy to lose track of the difference:
- "Who's free tonight to hang out?" becomes a menu of the people we texted most recently.
- "What's happening in the world?" becomes a menu of news feed stories.
- "Who's single to go on a date?" becomes a menu of faces to swipe on Tinder (instead of local events with friends, or urban adventures nearby).
- "I have to reply to this email." becomes a menu of keys to type a response (instead of more empowering ways to communicate with a person).
When we wake up in the morning and turn our phone over to see a list of notifications, it frames the experience of waking up around a menu of "all the things I've missed since yesterday."

By shaping the menus we pick from, technology hijacks the way we perceive our choices and replaces them with new ones. But the closer we pay attention to the options we're given, the more we'll notice when they don't actually align with our true needs.
Hijack #2: Put a Slot Machine in a Billion Pockets
If you're an app, how do you keep people hooked? Turn yourself into a slot machine.

The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices?

One major reason is the #1 psychological ingredient in slot machines: intermittent variable rewards.

If you want to maximize addictiveness, all tech designers need to do is link a user's action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward or nothing. Addictiveness is maximized when the rate of reward is most variable.
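As an illustration, that reward schedule can be sketched in a few lines of code (the 1-in-10 reward probability and the 150 daily checks are invented numbers for the sake of the example):

```python
import random

def check_app(reward_probability):
    """One 'lever pull': sometimes there's something new and
    exciting, sometimes there's nothing at all."""
    return random.random() < reward_probability

def day_of_checks(n_checks, reward_probability):
    """Count how many of a day's phone checks actually pay off."""
    return sum(check_app(reward_probability) for _ in range(n_checks))

# Suppose only 1 in 10 of the ~150 daily phone checks is rewarded.
random.seed(0)
hits = day_of_checks(150, 0.1)
print(f"{hits} rewarding checks out of 150")
```

Because the reward is unpredictable rather than merely rare, every check might be the lucky one, and that is precisely what keeps the hand reaching for the pocket.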
Does this technique really work on people? Yes. Slot machines make more money in the United States than baseball, movies, and theme parks combined. Relative to other kinds of gambling, people get "problematically involved" with slot machines 3 to 4 times faster, according to NYU professor Natasha Dow Schull, author of Addicted by Design.

But here's the unfortunate truth: several billion people now carry a slot machine in their pocket:
- When we pull our phone out of our pocket, we're playing a slot machine to see what notifications we got.
- When we pull to refresh our email, we're playing a slot machine to see what new mail arrived.
- When we swipe down to scroll the Instagram feed, we're playing a slot machine to see what photo comes next.
- When we swipe faces left/right on apps like Tinder, we're playing a slot machine to see if we got a match.
- When we tap a red notification badge, we're playing a slot machine to see what's underneath.
Apps and websites sprinkle intermittent variable rewards all over their products because it's good for business.

In other cases, though, slot machines emerge by accident. There is, for example, no malicious company behind email that designed it to be a slot machine; no one profits when millions of people check their email and find nothing new. Apple's and Google's designers didn't intend for phones to work like slot machines either; the effect emerged from many converging factors.

But companies like Apple and Google now have a responsibility to reduce these effects by converting intermittent variable rewards into less addictive, more predictable ones through better design. For example, they could let people set predictable times during the day or week for checking their "slot machine" apps, and adjust when new messages are delivered to line up with those times.
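A minimal sketch of what such a design could look like (the class, the method names, and the schedule are all hypothetical, not a real Apple or Google API): notifications queue silently and are released in one predictable batch at user-chosen times.

```python
from datetime import time

class DigestInbox:
    """Hypothetical inbox that batches notifications into scheduled
    digests, converting a variable reward into a predictable one."""

    def __init__(self, digest_times):
        self.digest_times = sorted(digest_times)  # e.g. 9:00 and 18:00
        self.pending = []

    def receive(self, message):
        # New messages queue silently: no buzz, no badge.
        self.pending.append(message)

    def check(self, now):
        # The whole batch is released only at or after a digest time.
        if any(now >= t for t in self.digest_times):
            batch, self.pending = self.pending, []
            return batch
        return []

inbox = DigestInbox([time(9, 0), time(18, 0)])
inbox.receive("Marc tagged you in a photo")
inbox.receive("3 new connection requests")
print(inbox.check(time(8, 30)))  # [] -- too early, nothing delivered
print(inbox.check(time(9, 5)))   # both messages arrive together
```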
Hijack #3: Fear of Missing Something Important (FOMSI)
Another way apps and websites hijack our minds is by instilling the fear that there's a "1% chance I could be missing something important."

If I can convince you that I'm a channel for important information, messages, friendships, or potential romantic opportunities, it will be hard for you to turn me off, unsubscribe, or delete your account, because (aha, I win) you might miss something important:
- This keeps us subscribed to newsletters even when they stopped delivering much benefit long ago.
- This keeps us "friended" to people we haven't talked to in years ("What if I miss something important from them?").
- This keeps us swiping faces on dating apps, even when we haven't gone on a date in a while ("What if I miss the one hot match who likes me?").
- This keeps us glued to social media ("What if I miss the trend everyone else is talking about?").
But if we look closely at these fears, we discover a truth no one can escape: we will always miss something important at some point, because we have to eat, sleep, bathe, and go to the toilet, and during those moments there is no way to stay caught up on everything.

Yet living every minute of every day in fear of missing out is not how we are meant to live.

And it's amazing how quickly we wake up from the illusion once we drop that fear. When we unplug for more than a day, or unsubscribe from all those notifications, the worries we thought we'd have turn out never to materialize.
We don’t miss what we can’t see.
Hijack #4: Social Approval
We are all vulnerable to social approval. The need to belong, to be approved of, and to be appreciated by the people around us is among the strongest human motivations. But now our social approval is in the hands of tech companies.

When I get tagged by my friend Marc, I imagine he is making a conscious choice to tag me. But I don't see how a company like Facebook orchestrated this from the start.

Facebook, Instagram, and SnapChat can manipulate how often people get tagged in photos by automatically suggesting the faces people should tag (for example, by showing a one-tap confirmation box: "Tag Tristan in this photo?").

So when Marc tags me, he's actually responding to Facebook's suggestion, not making an independent choice. And through design choices like this, Facebook controls how often millions of people experience social approval.

The same thing happens when we change our profile photo. Facebook knows this is exactly the moment we're vulnerable to social approval: "What do my friends think of the new picture?" Facebook can rank that photo higher in the news feed so it appears more often and more friends will like or comment on it. Each time they like or comment, we get pulled right back in.

Everyone innately responds to social approval, but some groups (teenagers, for example) are more vulnerable to it than others. A designer who knows how to exploit this weakness holds a very powerful weapon.
Hijack #5: Social Reciprocity (Tit-for-tat)
- You do me a favor; I owe you one next time.
- You say "thank you"; I have to say "you're welcome."
- You send me an email; it's rude not to reply.
- You follow me; it's rude not to follow you back (especially for teenagers).
We are deeply bound by the need to reciprocate other people's gestures. And like social approval, tech companies now manipulate how often we experience it.

In some cases, this happens by accident. Email, texting, and messaging apps are simply channels that carry social reciprocity. But in other cases, companies exploit this weakness on purpose.

LinkedIn is probably today's most skilled practitioner of social reciprocity. LinkedIn wants as many people as possible creating social obligations for each other, because each time someone reciprocates (by accepting a connection, replying to a message, or endorsing someone's skills) they have to come back to linkedin.com, where the cycle continues and everyone spends more time on the site.

Like Facebook, LinkedIn exploits an asymmetry in perception. When you receive an invitation from someone to connect, you imagine that person making a conscious choice, when in reality they very likely just responded reflexively to LinkedIn's list of suggested contacts. In other words, LinkedIn turns your unconscious impulses (to "add" someone) into new social obligations that millions of people feel compelled to repay. And LinkedIn profits from every minute people spend on those obligations.

Imagine millions of people getting interrupted like this all day, running around like headless chickens, reciprocating each other, all designed by companies that profit from this weakness.

Welcome to social media.

Imagine if technology companies had a responsibility to minimize these social reciprocity loops. Or if there were a neutral organization that represented the public's interests, an industry consortium or an FDA for tech, that tracked when technology companies abused biases like these.
Hijack #6: Bottomless Bowls, Infinite Feeds, and Autoplay
Another way to hijack people is to keep them consuming, even when they aren't hungry anymore.

How? Easy. Take an experience that was bounded and finite, and turn it into a bottomless flow that keeps going forever.

Cornell professor Brian Wansink demonstrated this in a study in which he tricked people into continuing to eat soup by giving them a bottomless bowl that automatically refilled as they ate. With bottomless bowls, people ate 73% more calories than with normal bowls, and underestimated how many calories they had eaten by 140 calories.

Tech companies exploit the same principle. News feeds are purposely designed to auto-refill with reasons to keep you scrolling, and to eliminate any reason for you to pause, reconsider, or leave.

It's also why video and social sites like Netflix, YouTube, and Facebook autoplay the next video after a few seconds instead of waiting for you to make a conscious choice. A huge share of traffic on these sites comes from autoplay.
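The contrast between a bounded experience and a bottomless one can be sketched like this (the feed contents and page size are invented for illustration):

```python
import itertools

def bounded_feed(posts, page_size=3):
    """A finite menu: iteration ends, which gives the reader a
    natural stopping cue."""
    for i in range(0, len(posts), page_size):
        yield posts[i:i + page_size]

def bottomless_feed(page_size=3):
    """An auto-refilling menu: there is always a next page, so the
    reader never gets a reason to stop."""
    for start in itertools.count(step=page_size):
        yield [f"recommended post #{start + j}" for j in range(page_size)]

feed = bottomless_feed()
print(next(feed))  # the first page...
print(next(feed))  # ...and the bowl refills; it never says "you're done"
```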
Tech companies will insist that "we're just making it easier for users to watch the videos they want to see," when they are really serving their own business interests. And you can hardly blame them, because in this era, your "time spent" is the metric every business competes for.
Hijack #7: Instant Interruption vs. "Respectful" Delivery
Companies know that messages which interrupt people immediately are more persuasive at getting people to respond than messages delivered asynchronously (like email or any "read it later" inbox).

Given that psychology, Facebook Messenger (or WhatsApp, WeChat, or SnapChat, for that matter) will always prefer to design its messaging system to interrupt recipients instantly (and pop open a chat box) rather than help users respect each other's attention.

In other words, interruption is good for business.

On top of that, they add a sense of urgency and social reciprocity. For example, Facebook automatically tells the sender when you have "seen" their message, instead of letting you choose whether to disclose that you've read it ("Now that you know I've seen the message, I feel even more obligated to respond.").

By contrast, Apple respects its users more and lets them turn off "read receipts."

The problem is that maximizing interruptions in the name of business creates a tragedy of the commons, ruining everyone's ability to concentrate (have you ever looked up from your phone, after a Facebook notification popped up, and forgotten what you were doing?) and causing billions of unnecessary interruptions every day. This is a huge problem we need to fix with shared design standards.
Hijack #8: Bundling Your Reasons with Their Reasons
Technology manipulates our minds in yet another way: by taking your reasons for opening an app (getting a task done) and making them inseparable from the app's business reasons (maximizing how much you consume once you're there).

For example, in the physical world of grocery stores, the two most popular reasons to visit are picking up a pharmacy refill and buying milk. But grocery stores want to maximize how much each visitor buys, so they put the pharmacy and the milk at the back of the store.

In other words, these stores make the things customers want (milk, medicine) inseparable from what the store itself wants. If stores were truly designed to support people, they would put the most popular items at the front.

Tech companies design their websites the same way. For example, when you want to look up a Facebook event happening tonight (your reason), the Facebook app won't let you reach it without first landing on the news feed (their reason), and this design is entirely intentional. Facebook wants to convert every reason you have for using Facebook into a reason to maximize the time you spend on its site.
Instead, imagine if …
- Twitter gave you a way to post a tweet without having to see the news feed.
- Facebook gave you a way to look up tonight's events without being forced through the news feed.
- Facebook gave you a way to use Facebook Connect as a passport for creating accounts on third-party apps and websites, without being forced to also install the Facebook app, news feed, and notifications.
Hijack #9: Inconvenient Choices
We learned under Hijack #1 that the choices companies offer us are not the full picture. But it doesn't stop there.
- "If you don't like it, you can always switch to another product."
- "If you don't like it, you can unsubscribe at any time."
- "If you're addicted to our app, you can always uninstall it."
Companies naturally want to make the choices they want you to make easy, and the choices they don't want you to make difficult. Magicians do the same thing: you make it easier for a spectator to pick the thing you want them to pick, and harder to pick the thing you don't.

For a concrete example, NYTimes.com lets you "freely choose" to cancel your digital subscription. But instead of letting you simply click "Cancel Subscription," it sends you an email with instructions to cancel your account by calling a phone number that's only open during limited hours of the day.

Imagine if the choices we're given on the internet were labeled with how difficult they are to fulfill, and an independent entity, an industry consortium or a non-profit, rated those difficulties and publicized them for users.
Hijack #10: Forecasting Errors, or the "Foot in the Door" Tactic
Finally, apps can exploit the fact that people cannot forecast the true consequences of a click.

People can't intuitively predict the real cost of a click at the moment they're presented with it. Salespeople use the "foot in the door" technique by starting with a small, innocuous-looking request ("just one click to see which tweet got retweeted") and escalating from there ("why don't you stay a while?"). Virtually every engagement-driven website uses this trick.

Imagine if web browsers and smartphones, the gateways through which people make these choices, truly looked out for people and helped them forecast the consequences of their clicks.

That's why I put the "estimated reading time" at the top of this post. When you put the "true cost" of a choice in front of people, you're treating them with the dignity they deserve. In an ideal internet, our choices would be framed with their honest costs and benefits, so we can evaluate them instead of slipping into the traps businesses set for us.
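A reading-time label like the one at the top of this post can be produced with a few lines of code (the 200 words-per-minute figure is a common rough average, not a standard):

```python
def estimated_reading_minutes(text, words_per_minute=200):
    """Surface the 'true cost' of a click: how long will this take?"""
    words = len(text.split())
    return max(1, round(words / words_per_minute))

article = "word " * 2400  # stand-in for a ~2400-word essay
print(f"Estimated reading time: {estimated_reading_minutes(article)} minutes")
```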
TripAdvisor is a good example of the "foot in the door" technique: it lures users with a one-click star rating, but once you click, an entire three-page survey jumps out to ambush you.
All of this is not just imagination
Timewellspent.io was created with the mission of protecting internet users from being manipulated like puppets: it researches, implements, and advocates for design practices at technology companies that ensure a safe, healthy, user-first internet.

If Time Well Spent succeeds, perhaps in the near future we can turn the values above into reality, and not just imagination.
Source: Medium