Manipulative Design Practices Online: What Policy Solutions for the EU and the U.S.?


Manipulation has become an ever-present feature of the digital environment. Deceptive design patterns (often known as ‘dark patterns’; TACD and its members have chosen to use the terms manipulative/deceptive design) are applied frequently and in various forms, with the goal of maximising platform engagement and the extraction of personal data.

The risks of manipulative design

Deceptive design deepens the structural risks of the surveillance economy by furthering discrimination, eroding privacy, and entrenching data monopolies. Involuntarily shared location data can be used to deduce very intimate personal information, which can then be applied to manipulate consumer behaviour even further or to actively discriminate.

“It allows discrimination at scale because people can be profiled. It means people can be targeted based on their vulnerabilities. It is an issue of discrimination and manipulation” – Finn Lützow-Holm Myrstad, Director of Digital Policy, Forbrukerrådet and EU Co-Chair of TACD’s digital policy group

Manipulation can also cause financial harm. Consumers can be tricked into purchasing more expensive products, trapped in subscriptions, or deceived by marketing that obscures costs or induces a false sense of urgency.

Manipulative design challenges fundamental human rights, such as freedom of thought, and undermines users’ ability to make decisions in their best interest. Every user of digital platforms is at risk of being manipulated, but particularly vulnerable groups, such as children, face heightened risks.

The online environment is also characterised by an ever-growing power and information imbalance. Technology companies have the capacity to employ data scientists, behavioural psychologists, and design experts with the intention of creating addictive online spaces and obtaining the maximum amount of user data. Consumers are constantly asked to give up their data at a scope and frequency that has caused digital fatigue, which opens the door to the expanding normalisation of permanent surveillance.

Policy challenges and solutions

The EU has taken first steps towards reining in big technology companies with the Digital Services Act and the Digital Markets Act, but it has failed to effectively tackle manipulative online design practices. Enforcement processes in particular have been ineffective and opaque: complaints can go unanswered for years, with no way to gauge their progress or likely success.

Video: Manipulative design practices (CPDP panel)

During TACD’s panel at the Computer, Privacy and Data Protection Conference (CPDP), MEP Kim van Sparrentak outlined the potential of the AI Act in approaching the issue of manipulative design. A strengthened ban on manipulation in the AI Act could in turn also support the Digital Services Act’s capabilities to engage with dark patterns. The upcoming fitness check of EU consumer law provides another opportunity to effectively deal with deceptive design practices and to implement stronger consumer protection mechanisms. In addition, banning surveillance-based advertising could address the root causes of profiling and manipulation, as well as challenge the business model of platforms centred on maximising user data extraction.

Similarly, the U.S. lacks comprehensive privacy legislation to effectively target deceptive design practices online. However, the Federal Trade Commission (FTC) possesses capable investigation and enforcement authority. Business practices considered deceptive or unfair, and thus harmful to consumers, can be investigated by the FTC and may result in civil penalties. Determining that an act is deceptive is comparatively easier for the FTC, but deception claims can also be more easily mitigated by industry through more elaborate privacy disclosures that do not substantially change business practices. Unfair practices are more difficult to establish but hold substantial promise as an enforcement tool. Fundamental change in the surveillance economy, however, requires a cultural shift from technology companies that must be induced through legislation and enforcement. The burden of protection must not rest solely on consumers.

Cooperation as a solution?

The attempts at renewed transatlantic cooperation through the EU-US Trade and Technology Council (TTC) and the Informal Dialogue on Consumer Protection offer the prospect of improving privacy and consumer rights, particularly by coordinating policy and enforcement strategies and by sharing expertise and institutional capacities. Shared values and shared objectives must be translated into effective legislation centred on protecting consumers and transforming the online environment.