The $520 Million Fine Against Fortnite Signals A New Era In The Regulation Of User Interfaces: The Federal Trade Commission charged Epic Games with a range of deceptive business practices and levied a $520 million penalty as part of a broad settlement announced on Monday.
An underlying theme of deceptive design runs throughout the complaint, which touches on everything from alleged invasions of children’s privacy to misleading users into making unintended purchases.
As part of the settlement, Epic agreed to a number of interface changes, including adding friction to the purchase flow to prevent accidental purchases, implementing a new rapid purchase-cancellation system, and disabling voice chat by default for children.
FTC Chair Lina Khan said in a statement that “Epic used privacy-invasive default settings and deceptive interfaces that tricked Fortnite users, including teenagers and children.”
“Protecting the public, and especially minors, from online privacy intrusions and dark patterns is a key concern for the Commission, and these enforcement actions make clear to businesses that the FTC is cracking down on these illegal practices,” the commission stated.
After years of discussion, regulators are turning their attention to the manipulative power of digital interfaces, and the government finally appears ready to act.
According to John Davisson, director of litigation and senior counsel at the Electronic Privacy Information Center, or EPIC (no relation to Epic Games), the FTC has been working on deceptive design practices for years, but this is the biggest escalation in enforcement to date.
Lawmakers now have a keener awareness of the shortcomings of digital design, and interface choices on the web are getting more scrutiny. Last year, dark patterns, a term for deceptive design, were outlawed as part of an amendment to the California Consumer Privacy Act.
In September, California approved the Age Appropriate Design Code, requiring businesses to put children’s safety and well-being first when developing online services.
A similar rule with the same name took effect in the UK last year and has already resulted in a $30 million fine for TikTok, and New York State is currently drafting a bill that takes an even stricter approach to design for children. Federal officials in the United States are also stepping up: in 2021, the FTC hosted a workshop on dark patterns.
According to Justin Brookman, former head of technology research at the FTC and current director of technology policy at Consumer Reports, there has been a trend in favor of regulating design.
Businesses are increasingly expected to weigh other values when building products, and it is now accepted that decisions about platform architecture fall within regulators’ purview. (Disclosure: This reporter previously worked in Consumer Reports’ journalism department, which is independent of Brookman’s position in the advocacy department.)
Regulating design is challenging. Making one button blue and the other red can affect user behavior, but no one wants the government dictating the colors used on websites. In cases like Fortnite’s, though, the problems are more clear-cut.
The FTC claimed that Epic’s “counterintuitive, inconsistent, and confusing button configuration” duped players into making hundreds of millions of dollars in unauthorized purchases. Users could accidentally buy items while waking the game from sleep mode, for example, or by pressing the instant-purchase button, which sits directly next to the item-preview toggle.
Epic reportedly ignored more than a million consumer complaints about the issue. According to the FTC, the company used internal testing to intentionally hide cancel and refund features so they would be harder to find. And when users disputed charges with their credit card companies, Epic would freeze their accounts.
In a statement, Epic addressed the settlement and its plans for resolving the issues the FTC raised. “No developer creates a game with the intention of ending up here,” the company said. “The laws have not changed, but their application has evolved and long-standing industry practices are no longer enough. We accepted this agreement because we want Epic to be at the forefront of consumer protection and provide the best experience for our players.”
According to EPIC’s Davisson, the settlement is going to be a wake-up call: companies will be taking a serious look at what the FTC regards as manipulative design to make sure they are not repeating the same behaviors.
The settlement’s treatment of Fortnite’s voice chat feature is perhaps its most unexpected element. Voice chat was enabled by default, even for children, putting them at risk of harassment or even sexual abuse. The FTC claims this violated the prohibition on unfair business practices. That argument stands out because it treats a default-enabled design choice as inherently harmful, bringing it within the scope of regulatory oversight.
“For the FTC, asserting that turning voice chat on by default is damaging per se is a brand-new premise. I am unable to think of any comparable instances where they said that a particular design decision was damaging by nature,” Brookman remarked.
If this argument extends to other features and services that carry inherent risks, it could have broader ramifications. Consider complaints that TikTok’s algorithm is too addictive, or Instagram’s associations with eating disorders and suicidal ideation in teenage girls.
Insofar as it offers chat functionality, Fortnite is a kind of social media platform, Brookman noted, and the FTC is asserting that companies have a greater duty to design such systems to minimize harm.
Davisson sees the FTC’s move as a positive one, particularly when dark patterns are considered in the context of privacy. There is “growing recognition and acceptance,” he says, that the design of platforms and websites plays a significant role in extractive commercial surveillance, and that the issue needs to be addressed as part of a broader data protection push.