If you use Google, Instagram, Wikipedia, or YouTube, you’re going to start noticing changes to content moderation, transparency, and safety features on those sites over the next six months.
Why? It’s down to some major tech legislation that was passed in the EU last year but hasn’t received enough attention (IMO), especially in the US. I’m referring to a pair of laws called the Digital Services Act (DSA) and the Digital Markets Act (DMA), and this is your sign, as they say, to get familiar.
The acts are actually quite revolutionary, setting a global gold standard for tech regulation when it comes to user-generated content. The DSA deals with digital safety and transparency from tech companies, while the DMA addresses antitrust and competition in the industry. Let me explain.
A couple of weeks ago, the DSA reached a major milestone. By February 17, 2023, all major tech platforms in Europe were required to self-report their size, which was used to group the companies into different tiers. The largest companies, those with over 45 million monthly active users in the EU (roughly 10% of the EU population), are creatively called “Very Large Online Platforms” (VLOPs) or “Very Large Online Search Engines” (VLOSEs) and will be held to the strictest standards of transparency and regulation. Smaller online platforms have far fewer obligations, a tiered approach designed to encourage competition and innovation while still holding Big Tech to account.
“If you ask [small companies], for example, to hire 30,000 moderators, you will kill the small companies,” Henri Verdier, the French ambassador for digital affairs, told me last year.
So what will the DSA actually do? So far, at least 18 companies have declared that they qualify as VLOPs or VLOSEs, including most of the well-known players like YouTube, TikTok, Instagram, Pinterest, Google, and Snapchat. (If you want the whole list, London School of Economics law professor Martin Husovec has a great Google doc that shows where all the major players shake out, and he has written an accompanying explainer.)
The DSA will require these companies to assess risks on their platforms, like the likelihood of illegal content or election manipulation, and make plans for mitigating those risks with independent audits to verify safety. Smaller companies (those with under 45 million users) will also have to meet new content moderation standards that include “expeditiously” removing illegal content once flagged, notifying users of that removal, and increasing enforcement of existing company policies.
Proponents of the legislation say it will help bring an end to the era of tech companies self-regulating. “I don’t want the companies to decide what is and what isn’t forbidden without any separation of power, without any accountability, without any reporting, without any possibility to contest,” Verdier says. “It’s very dangerous.”
That said, the bill makes it clear that platforms aren’t liable for illegal user-generated content, unless they are aware of the content and fail to remove it.
Perhaps most important, the DSA requires that companies significantly increase transparency, through reporting obligations for “terms of service” notices and regular, audited reports about content moderation. Regulators hope this will have widespread impacts on public conversations around societal risks of big tech platforms like hate speech, misinformation, and violence.