
How sites manipulate you into clicking


Rzt_Moster / Shutterstock

The vast majority of websites you visit now greet you with a pop-up window. This annoying obstacle to your seamless web browsing is called a “cookie banner,” and it’s there to ensure your consent under online privacy laws for websites to retain information about you between browsing sessions.

The cookie banner purports to give you a choice: consent only to essential cookies that help maintain your browsing functionality, or accept all of them, including cookies that track your browsing history for sale to targeted advertising companies. Because these additional cookies generate additional revenue for the websites we visit, cookie banners are often designed to trick you into clicking “Accept All”.

The UK Information Commissioner recently urged G7 countries to tackle the problem, stressing that fatigued internet users end up agreeing to share more personal data than they would like. But in truth, manipulative cookie banners are just one example of what is known as “dark design” – the practice of creating user interfaces that are intentionally crafted to confuse or deceive the user.

Dark design has proven to be an incredibly effective way to encourage internet users to part with their time, money, and privacy. This in turn has established “dark patterns” – sets of practices designers know they can use to manipulate web users. They’re hard to spot, but they’re becoming more prevalent in the websites and apps we use every day, making manipulative design as persistent and ubiquitous as the pop-ups we are forced to close whenever we visit a new website.

Cookie banners remain the most obvious form of dark design. You’ll notice how big and cheerfully highlighted the “Accept All” button is, catching your cursor a fraction of a second after you land on a website. Meanwhile, the dull, less visible “confirm choices” or “manage settings” buttons – the ones that let us protect our privacy – deter us with the prospect of more time-consuming clicks.

You will know from experience which you tend to click. Or you can try the Cookie Consent Speed-Run, an online game that shows how hard it is to click the right buttons when faced with dark design.

E-commerce websites frequently use dark design as well. Suppose you have found a competitively priced product that you would like to purchase. You dutifully create an account, select your product specifications, enter delivery details, click through to the checkout page – and discover that the final cost, including delivery, is mysteriously higher than it originally appeared. These “hidden costs” aren’t accidental: the designer hopes you’ll simply click “order” rather than spend even more time repeating the same process on another website.

Other dark design elements are less obvious. Free services like Facebook and YouTube monetize your attention by placing ads in front of you as you scroll, browse, or watch. In this “attention economy,” the more you scroll or watch, the more money businesses make. These platforms are therefore intentionally optimized to command and hold your attention, even if you would prefer to close the app and get on with your day. For example, the expertly designed algorithm behind YouTube’s “Up Next” video suggestions can keep us watching for hours if we let it.


Application design

Manipulation of users for commercial purposes is not confined to websites. Currently, over 95% of Android apps on the Google Play Store are free to download and use. Building these apps is an expensive endeavor, requiring teams of designers, developers, artists, and testers. But designers know they’ll recoup that investment once we get hooked on their “free” apps – and they do so using dark design.

In recent research analyzing free app-based games popular with teenagers, my colleague and I identified dozens of examples of dark design. Users are forced to watch advertisements and frequently encounter disguised advertisements that resemble part of the game. They are prompted to share social media posts and, when their friends join the game, are encouraged to make in-app purchases to differentiate their character from those of their peers.

Some of these psychological manipulations seem inappropriate for young users. Teenage girls’ susceptibility to peer influence is exploited to get them to buy clothes for game avatars. Some games promote unhealthy body images, while others actively demonstrate and encourage bullying through indirect aggression between characters.

There are mechanisms to protect young users from psychological manipulation, such as age rating systems, codes of practice, and guidelines that specifically prohibit the use of dark design. But it is up to developers to understand and interpret these instructions correctly – and in the case of the Google Play Store, developers rate their own work, leaving users to report any issues. My research indicates that these measures are not yet fully effective.

Turn on the light

The problem with dark design is that it’s hard to spot. And dark patterns, once established in developers’ toolboxes, spread quickly. They’re hard for designers to resist when free websites and apps vie for our attention and are judged on metrics like “time on page” and “user conversion rate”.

So, while cookie banners are annoying and often dishonest, we need to consider the broader implications of an online ecosystem that is increasingly manipulative by design. Dark design is used to influence our decisions about our time, money, personal data, and consent. But a critical understanding of how dark patterns work and what they hope to achieve can help us detect and overcome their deception.

Google had not responded to a request for comment on this story at the time of its publication.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Daniel Fitton does not work for, consult, own shares in, or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.

