The illusion of choice and fear: how technology makers manipulate us
When you grab your smartphone first thing in the morning, that is not really your decision. When notifications constantly distract you at work, that is not your decision either. You are being manipulated relentlessly, yet you don't even notice. Lifehacker writes about this.
When we use this or that technology, we tend to be optimistic about the opportunities it gives us. Tristan Harris, co-founder of the Center for Humane Technology and a former design ethics specialist at Google, talked about the flip side of all this and how technology exploits the vulnerabilities of our minds.
By learning people's blind spots, weaknesses, and perceptual limits, an illusionist can influence them so deftly that a person won't even notice being led by the nose. Find the right "keys" to people and you can play them like a piano.
Product makers do the same with our minds. To win our attention, they play on human psychological weaknesses, consciously or not.
Trick number 1. If you control the menu, you control the choice
Western culture is built on ideals of freedom and personal choice. Millions of people fiercely defend the right to make their own decisions, yet fail to see that they are being manipulated. All this freedom is available only within the bounds of a given menu, and we, of course, did not choose it.
Magicians work the same way. They give people the illusion of free choice while offering only options that guarantee the illusionist's victory.
Given a ready-made list of options, a person rarely wonders what is not on the list, or why exactly these options are on it and not others. What the list's author wanted to achieve, and whether these options satisfy the need or merely distract from it, hardly anyone asks.
Imagine you meet friends on a Tuesday evening and decide to sit somewhere. You open a review aggregator and start looking for something interesting nearby. The whole group instantly buries itself in smartphones and begins comparing bars, studying photos, and sizing up cocktail lists. How exactly did that help solve the task of "sitting somewhere"?
The problem is not the bars but the fact that the aggregator's menu replaces the initial need. "Sit and chat" turns into "find the bar with the coolest cocktail photos." Moreover, the group falls into the illusion that the proposed list contains all available options. While your friends stare at their screens, they don't notice that musicians are playing a live concert in the park nearby, or that across the street there is a café serving pancakes and coffee. And no wonder: the aggregator didn't offer them any of that.
The more choices technology gives us in any sphere of life, whether information, events, places to go, friends, dates, or jobs, the more we believe the smartphone provides an exhaustive list of options. But does it?
A choice that helps us solve a problem is not the same as a choice among a large number of options. But when we blindly trust everything we are handed, this difference is easy to miss. The question "Who can I hang out with tonight?" turns into a choice from the list of people you chatted with recently. "What is happening in the world?" turns into a news feed. "Who should I go on a date with?" is settled by flipping through photos on Tinder, although you could go with friends to a local event or simply head out for an adventure in the city. Finally, the intention "I need to answer this email" comes down to choosing what to write in reply, although there are many other ways to contact a person.
Even our morning starts with checking notifications. We wake up and immediately reach for the smartphone: who knows, maybe we missed something important. But does the notification list show what really matters to us?
By creating a limited list of options to choose from, technologies replace our true preferences with ones that are convenient for them. And if you look closely at what we are offered, you realize that it does not meet our real needs.
Trick number 2. A personal slot machine in everyone’s pocket
How can an application hook a user? It has to become a kind of slot machine. On average, a person checks their smartphone 150 times a day. Can all 150 times really be a conscious choice?
The same mechanism works here as in slot machines: reinforcement with intermittent rewards. If you want to hook users on your product, link their actions to the chance of getting a reward. You pull the lever and either get a prize right away or get nothing. The more variable the reward, the stronger the addiction.
Does this really work? And how. Slot machines in the United States bring in more money than baseball, movies, and amusement parks combined. According to Natasha Dow Schull, a professor at New York University, addiction to slot machines develops three to four times faster than to other types of gambling.
And now the unpleasant truth: billions of people carry a slot machine in their pocket.
We play when we pick up a smartphone and check for fresh notifications. We play when we open our email: any new messages? We play when we scroll the Instagram feed: I wonder what photo comes next? Even flipping through photos on Tinder is a game: what if the next one turns out to be someone you could become a couple with?
Applications and sites use rewards simply because it is good for business. But sometimes the effect arises by accident. Email, for example, is not the product of some evil corporation. Nobody profits from millions of people regularly checking their mail and finding nothing new there. Apple and Google designers did not set out to turn your smartphone into a slot machine. It just happened.
Now large companies should take responsibility and neutralize this effect by making rewards less addictive and more predictable. For example, they could let people choose when they want to check what's new in their applications, and send notifications only during that period.
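The scheduled-delivery idea can be sketched in a few lines of Python. This is a hypothetical illustration, not any real platform's API: the window times, function names, and queue are all invented. Notifications are held in a queue and released only inside time windows the user has chosen, turning an unpredictable drip of rewards into a predictable digest.

```python
from datetime import datetime, time

# Hypothetical delivery windows chosen by the user (not a real API).
USER_WINDOWS = [(time(8, 0), time(8, 30)), (time(18, 0), time(18, 30))]

pending = []  # notifications held until the next delivery window


def in_window(now, windows=USER_WINDOWS):
    """Return True if `now` falls inside one of the user's chosen windows."""
    t = now.time()
    return any(start <= t <= end for start, end in windows)


def receive(notification, now):
    """Queue a notification; deliver the whole batch only inside a window."""
    pending.append(notification)
    if in_window(now):
        batch = list(pending)
        pending.clear()
        return batch  # delivered as one predictable digest
    return []  # held back -- no midday interruption
```

A midday "like" would come back empty (held in the queue), while a check at 18:05 would deliver everything accumulated since the last window in a single batch.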
Trick number 3. Fear of missing out on something important
To manipulate our minds, applications and sites hint that there is a one percent chance you will miss something important. If I can convince you that I am a source of significant information, friendly contacts, or potential romantic partners, you simply won't be able to get rid of me. You won't delete your account or unsubscribe, because you will be afraid of missing something.
That is why we stay subscribed to pointless mailing lists: who knows, maybe the next email will contain something interesting. We keep as "friends" people we haven't talked to in ages, in case we miss something important from them. We sit in dating apps even when we have no plans to meet anyone, because we are afraid of missing the one, or someone who is interested in us. We hang out on social media so as not to miss the news everyone around us will be discussing.
But look closely at this fear and it turns out that we will always miss something anyway.
You won't see a message from an old friend if you don't sit on Facebook for hours at a stretch; you will miss your ideal partner on Tinder if you don't flip through photos there 700 times a day; and you won't answer an urgent call in time, because you cannot be reachable 24/7.
Seriously, we do not live in order to twitch constantly and fear missing something. It is amazing how quickly this fear disappears once you shed the illusions. Try going offline for at least a day and turning off all notifications. Most likely, nothing terrible will happen.
We do not miss what we do not see. The thought that you might overlook something exists only up until the moment you quit the application or unsubscribe from the newsletter. Before, not after. It would be great if technology companies took this into account and helped us build relationships with others in terms of time well spent, instead of frightening us with the illusory chance of missing something important.
Trick number 4. Social approval
Each of us is easy to catch on this bait. The desire to belong to a group and win its recognition is one of the strongest motivators a person has. And now technology companies are in charge of dispensing social approval.
When a friend tags me in a photo, I assume it was his conscious choice. In reality, a company like Facebook nudged him toward that action. Social networks manipulate how often people tag other users in photos by offering them candidates who can be tagged in one click. So my friend did not make a choice; he simply agreed to what Facebook suggested. Through such decisions the company steers millions of people, playing on their desire for social approval.
The same thing happens when we change a profile photo. The social network knows that at this moment we are most vulnerable to the approval of others: we can't help wondering what our friends will say about the new picture. Facebook can push this event up in the news feed so that as many people as possible like or comment on it. And every time someone does, we return to the social network again.
Some groups are especially sensitive to public approval, teenagers above all. That is why it is so important to understand how designers influence us when they exploit this mechanism.
Trick number 5. Social reciprocity, or a favor for a favor
They helped me, so I must help in return. They say "thank you," so I answer "you're welcome." I received an email, so it would be rude not to reply. You followed me, so it would be impolite not to follow you back.
The need to reciprocate other people's actions is another weak spot. Naturally, technology companies will not miss the chance to exploit this vulnerability. Sometimes it happens by accident: email and instant messengers imply reciprocity by their very nature.
But in other situations, companies deliberately exploit our weaknesses for profit.
LinkedIn is probably the most obvious manipulator. The service wants to create as many social obligations between people as possible, so that they return to the site every time they receive a message or a request to connect.
LinkedIn uses the same scheme as Facebook: when you receive a request, you assume it was the person's conscious choice. In reality, he simply accepted, on autopilot, the contact list the service offered him.
In other words, LinkedIn turns unconscious impulses into social obligations, makes millions of people feel indebted, and profits from it.
Just imagine how this looks from the outside. People run around all day like headless chickens, constantly distracted from their work in order to reciprocate, and the company that developed this model profits. What if technology companies took responsibility for reducing social obligations, or an independent organization monitored this to prevent abuse?
Trick number 6. The bottomless bowl, the endless feed, and autoplay
Another way to take control of people's minds is to make them keep consuming even when they are already full. How? Easily. Take a process that is limited and finite, and turn it into an endless stream.
Cornell University professor Brian Wansink showed how this works. Participants in his experiment ate soup from bottomless bowls that automatically refilled again and again. It turned out that under these conditions people consumed 73% more calories than usual, while underestimating how much they had actually eaten.
Technology companies use the same principle. A news feed automatically loads new posts so that you keep scrolling through it. Netflix, YouTube, and Facebook start the next video instead of giving you a chance to make a conscious choice. A significant share of the traffic on these sites comes from autoplay.
Companies often say they are simplifying users' lives this way, when in essence they are only defending their business interests. It's hard to blame them: time spent on the resource is the currency they fight for. But imagine companies putting effort not only into increasing that time, but into improving its quality.
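The difference between a finite process and an endless stream can be made concrete with a small sketch. This is purely illustrative; the function names and the content pool are invented. A bounded feed eventually stops and hands control back to the reader, while an "endless" feed recycles content so that scrolling never reaches a natural stopping point.

```python
import itertools

CATALOG = [f"post-{i}" for i in range(10)]  # hypothetical content pool


def bounded_feed(page_size=3):
    """Finite feed: yields pages, then stops -- a natural stopping cue."""
    for i in range(0, len(CATALOG), page_size):
        yield CATALOG[i:i + page_size]


def endless_feed(page_size=3):
    """Endless feed: when the catalog runs out, wrap around and keep
    serving pages so the scroll never ends."""
    for i in itertools.count(0, page_size):
        yield [CATALOG[(i + j) % len(CATALOG)] for j in range(page_size)]
```

The bounded feed is exhausted after four pages; the endless one serves page ten thousand as readily as page one, which is exactly the "bottomless bowl" effect.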
Trick number 7. An abrupt interruption instead of a polite reminder
Companies know that the most effective messages are those that abruptly interrupt a person. They are answered more readily than a tactful email sitting quietly in the inbox.
Naturally, instant messengers prefer to stop users in their tracks, grab their attention, and immediately show the chat window so that the message is read right away. Interruption is good for business, and so is the feeling that a message needs an urgent reply, which is where social reciprocity kicks in. For example, Facebook shows the sender that you have read his message: like it or not, you have to answer. Apple treats users with more respect and lets them disable read receipts.
By constantly interrupting people, business creates a serious problem: it is hard to concentrate when you are being tugged at a billion times a day for any reason. This problem could be solved with common standards for designing services and applications.
Trick number 8. Your goals are bundled with the goals of the business
To make you easier to manipulate, applications study your goals (for example, completing a task) and couple them with business goals, so that you spend as much time as possible in the application and actively consume content.
For example, people usually go to the supermarket for milk. But the store needs to increase sales, so dairy products sit on shelves at the very back of the hall. Thus the buyer's goal (buy milk) becomes inseparable from the store's goal (sell as much as possible).
If the supermarket really cared about customers, it would not force them to trudge around the hall; it would stock the most popular goods right at the entrance.
Technology companies take the same approach when designing their products. Your task is to open an event page on Facebook. But the application won't let you do that without first opening the news feed. It has a different task: to make you spend as much time as possible on the social network.
In an ideal world, we would be free to do what we need, not what business needs: to post a message on Twitter or open an event page on Facebook without passing through the feed. Imagine a digital Bill of Rights setting out product design standards. Thanks to such standards, billions of users could get straight to what they need instead of wandering through a maze.
Trick number 9. An inconvenient choice
It is assumed that a business should give the client an obvious choice. Don't like one product, use another; don't like the newsletter, unsubscribe; and if you feel addicted to an application, just delete it.
Not quite. A business wants you to make the choice that benefits it. So the actions the business needs are easy to perform, and the ones that cost it money are much harder. For example, you can't simply unsubscribe from The New York Times. They promise there is nothing complicated about it, but instead of an instant unsubscribe you receive a letter with instructions and a phone number you must call at a certain time to finally cancel the subscription.
Instead of talking about the mere possibility of choice, it would be better to account for the effort required to make that choice. Imagine a world where the available options are labeled with their level of difficulty, and all of this is overseen by an independent organization.
Trick number 10. False forecasts and the "foot in the door" strategy
Applications and services exploit people's inability to predict the consequences of a click. We simply cannot intuitively assess the real cost of the action we are invited to perform.
In sales, the "foot in the door" technique is widely used. It all starts with a harmless proposal: "Just one click, and you'll see which tweet got retweeted." Then it escalates: the innocent request is followed by an offer in the spirit of "Why don't you stay here for a while?"
Imagine if browsers and smartphones predicted the consequences of a click, genuinely cared about people, and helped them make informed choices. Online, every option should be presented with its real benefits and costs, so that people can choose knowingly without extra effort.
What to do with all this
Saddened to learn how the creators of technology control you? To gain freedom, you need to free your mind. We need technologies that play on our side and help us live, feel, think, and act freely. Smartphones, notifications, and browsers should become a kind of exoskeleton for our minds and our relationships with others: assistants that give priority to our values, not our impulses.
Our time is valuable. And we must protect it as zealously as privacy and other digital rights.