The Digital Services Act – “Internet Constitution” – comes into effect

On February 17, 2024, the Digital Services Act (DSA), often referred to as the “Internet Constitution”, comes into effect, introducing new obligations for businesses operating online. The attention of businesses and the media has focused primarily on the new rules for displaying ads. “However, even greater changes await us in the area of content moderation and protecting users from harmful content,” says Jakub Pietrasik, counsel heading the IP & TMT practice at the Warsaw office of Wolf Theiss.

The Digital Services Act, i.e., the Regulation of the European Parliament and of the Council on a single market for digital services (“DSA”), introduces a set of rules for conducting business online. Its main goal is to limit harm, counter threats, and provide new measures to ensure user protection.

“While the new rules already apply to global tech players, smaller, local entities must also now ensure compliance with the new regulations,” explains Jakub Pietrasik.

The regulation imposes numerous additional obligations, especially on online platforms – intermediary services that publicly disseminate information provided by users. This covers not only social media platforms, but also e-commerce platforms, comparison portals, and forums with comments and ratings.

The DSA requires online platforms to moderate content that is inconsistent with EU or national law. This applies not only to published content itself, but also to content concerning specific products or services. A failure to react will be tantamount to liability for the distribution of such content and will be subject to penalties. Under the DSA, the upper limit of these penalties is 6% of the intermediary service provider’s annual worldwide turnover in the previous financial year.
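To put that ceiling in concrete terms, here is a minimal arithmetic sketch in TypeScript; the turnover figure is purely hypothetical.

```typescript
// The DSA caps penalties at 6% of the provider's worldwide annual
// turnover from the previous financial year. The figure is hypothetical.
const annualGlobalTurnoverEur = 500_000_000; // assumed turnover: EUR 500 million
const maxPenaltyEur = 0.06 * annualGlobalTurnoverEur;
console.log(`Maximum penalty: EUR ${maxPenaltyEur.toLocaleString()}`); // EUR 30,000,000
```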

Under EU law, illegal content includes incitement to terrorism, discriminatory material, content promoting racism and xenophobia, and material infringing intellectual property rights. Under national law, the definition also covers material that is contrary to the criminal and civil codes.

“The list of illegal content is very broad and includes, among others, photos that could qualify as alcohol advertising, comments encouraging the commission of a crime, and posts containing punishable threats. Another extensive category covers links to pirated copies of films or music, as well as graphics that violate intellectual property law,” explains Kinga Kluszczyńska from the Wolf Theiss law firm.

Moderating illegal content means that online platforms implement measures to prevent or limit users’ access to it, up to and including its removal. Under the DSA, online platform providers are required to establish mechanisms (e.g., special forms or procedures) that allow users to report the presence of illegal content. Publishing illegal content may also be sanctioned by demonetization or by demotion in rankings to limit its visibility. Moderating action may also extend to accounts promoting banned content. At the same time, users will be given tools to appeal the platform’s decision – through a dedicated channel, as well as through the judicial process.
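The DSA leaves the technical form of such reporting mechanisms to each platform. The TypeScript sketch below is only an illustration of what a minimal notice-intake flow could look like; all field and function names are our own assumptions, not anything the regulation prescribes.

```typescript
// Illustrative only: a minimal notice-and-action reporting mechanism.
// Field names and storage are hypothetical; the DSA requires such a
// mechanism to exist but does not prescribe its technical form.

interface IllegalContentNotice {
  explanation: string;    // why the notifier considers the content illegal
  contentUrl: string;     // exact electronic location of the content
  reporterEmail?: string; // contact details of the notifier, where given
  goodFaith: boolean;     // notifier confirms the report is made in good faith
}

interface QueuedNotice {
  id: number;
  notice: IllegalContentNotice;
  receivedAt: Date;
}

const moderationQueue: QueuedNotice[] = []; // hypothetical in-memory queue
let nextId = 1;

// Accept a notice, validate the minimum fields, and queue it for review.
function submitNotice(notice: IllegalContentNotice): QueuedNotice {
  if (!notice.explanation || !notice.contentUrl || !notice.goodFaith) {
    throw new Error("Incomplete notice: explanation, URL, and good-faith confirmation are required");
  }
  const entry: QueuedNotice = { id: nextId++, notice, receivedAt: new Date() };
  moderationQueue.push(entry);
  return entry;
}
```

In practice, a platform would also confirm receipt of the notice and inform the notifier of the decision taken, obligations the DSA likewise imposes.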

Additional categories of content addressed by the DSA include content that breaches a platform’s terms of service and so-called harmful content, which also covers disinformation. Since April 2023, very large online platforms and very large online search engines have been required to take measures reducing the risks associated with such content, including measures ensuring that content which falsely appears genuine or authentic – such as deepfakes – is clearly labeled and identifiable.
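Purely by way of illustration, such labeling could be as simple as attaching a prominent marker to items identified as generated or manipulated; the sketch below is hypothetical, not a prescribed implementation.

```typescript
// Illustrative only: attach a "synthetic media" label to content
// identified as generated or manipulated, so users can recognize it.
interface ContentItem {
  id: number;
  syntheticMedia: boolean; // detected or declared as generated/manipulated
  labels: string[];
}

function labelSyntheticMedia(item: ContentItem): ContentItem {
  if (item.syntheticMedia && !item.labels.includes("synthetic-media")) {
    return { ...item, labels: [...item.labels, "synthetic-media"] };
  }
  return item;
}
```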

The regulations favor users and require service providers to make their moderation policies transparent and readily accessible. When moderating content, providers must act with due diligence, objectivity, and proportionality.

“On the one hand, the DSA gives online portals considerable freedom in shaping their terms and conditions and creating catalogs of unwanted content; on the other, it introduces a number of guidelines aimed at protecting user interests. The intention seems to be to strike a balance between administrators and users, promoting freedom of speech while upholding universal principles, such as the rules of social coexistence,” adds Kinga Kluszczyńska.

One of the most significant changes the DSA introduces for service providers is the requirement to set up dedicated channels or technological solutions enabling any individual or entity to report illegal content. This goes hand in hand with the designation of so-called trusted flaggers, whose reports are handled with priority. These entities are expected to be independent of platform providers and to possess specialized knowledge and skills in detecting, identifying, and reporting illegal content. They will also publish annual reports summarizing the notices they have submitted.
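What priority handling could mean in practice is again left to platforms; the hypothetical sketch below simply orders pending notices so that trusted-flagger reports are reviewed first, with the data structure being our own assumption.

```typescript
// Illustrative only: trusted-flagger notices are reviewed before others;
// within each group, older notices come first.
interface PendingNotice {
  id: number;
  fromTrustedFlagger: boolean;
  receivedAt: Date;
}

function nextForReview(queue: PendingNotice[]): PendingNotice | undefined {
  return [...queue].sort((a, b) => {
    if (a.fromTrustedFlagger !== b.fromTrustedFlagger) {
      return a.fromTrustedFlagger ? -1 : 1; // trusted flaggers first
    }
    return a.receivedAt.getTime() - b.receivedAt.getTime(); // then oldest first
  })[0];
}
```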
