When European Commission vice president Věra Jourová met with YouTube CEO Neal Mohan in California last week, they fell to talking about the long-running conspiracy theory that the moon landings were faked. YouTube has faced calls from some users and advocacy groups to remove videos that question the historic missions. Like other videos that deny accepted science, they have been booted from recommendations and carry a Wikipedia link pointing viewers to debunking context.
But as Mohan spoke about those measures, Jourová made something clear: Fighting lunar lunatics or flat-earthers shouldn’t be a priority. “If the people want to believe it, let them do,” she said. As the official charged with protecting Europe’s democratic values, she thinks it’s more important to make sure YouTube and other big platforms put every euro they can into fact-checking and product changes that curb false or misleading content threatening the EU’s security.
“We are focusing on the narratives which have the potential to mislead voters, which could create big harm to society,” Jourová tells WIRED in an interview. Unless conspiracy theories could lead to deaths, violence, or pogroms, she says, don’t expect the EU to demand action against them. Content like the recent fake news report claiming that Poland was mobilizing its troops in the middle of an election? That better not catch on as truth online.
In Jourová’s view, her conversation with Mohan and similar discussions she held last week with the CEOs of TikTok, X, and Meta show how the EU is helping companies understand what it takes to counter disinformation, as now required under the bloc’s tough new Digital Services Act. Starting this year, the law requires the internet’s biggest platforms, including YouTube, to take steps to combat disinformation or risk fines of up to 6 percent of their global sales.
Civil liberties activists have been concerned that the DSA could ultimately enable censorship by the bloc’s more authoritarian regimes. A strong showing by far-right candidates in the EU’s parliamentary elections later this week could also lead to uneven enforcement of the law.
YouTube spokesperson Nicole Bell says the company is aligned with Jourová on preventing egregious real-world harm and on removing content that misleads voters about how to vote or encourages interference in democratic processes. “Our teams will continue to work around the clock,” Bell says of monitoring problematic videos about this week’s EU elections.
Jourová, who expects her five-year term to end later this year, in part because her Czech political party, ANO, is no longer in power at home in Czechia and so can’t renominate her, contends that the DSA is not meant to enable anything more than appropriate moderation of the most egregious content. She doesn’t expect Mohan or any other tech executive to go a centimeter beyond what the law prescribes. “Overusage, overshooting on the basis of the EU legislation would be a big failure and a big danger,” she says.
On the other hand, she acknowledges that some influential politicians have threatened to seek stiffer rules that could border on outright censorship if the companies aren’t seen to be stepping up to mitigate disinformation. “I hate this idea,” she says. “We don’t want this to happen.”
But with the DSA offering guidelines more than bright lines, how are platforms to know when to act? Jourová’s “democracy tour” in Silicon Valley, as she calls it, is part of facilitating a dialog on policy. And she expects social media researchers, experts, and the press to all contribute to figuring out the fuzzy borders between free expression and destructive disinformation. She jokes that she doesn’t want to be seen as the “European Minister of the Truth,” as tempting as that title may be. Leaving it to politicians alone to define what’s acceptable online “would pave the way to hell,” she says.