
EMMA-ENPA: Feedback on Commission DSA Art. 28 draft Guidelines

By implementing the DSA obligations and already fulfilling the present guidelines’ core objective of ensuring a high level of privacy, safety and security, editorial online platforms should be explicitly excluded from requirements to implement additional measures prescribed in the guidelines.

The European Magazine Media Association (EMMA) and the European Newspaper Publishers’ Association (ENPA), representing the vast majority of European publishers of newspapers, magazines, periodicals and specialised press, including in digital, welcome this opportunity to provide feedback on the European Commission’s public consultation on its draft guidelines on measures to ensure a high level of privacy, safety and security for minors online pursuant to Article 28(4) of Regulation (EU) 2022/2065 on a Single Market For Digital Services (Digital Services Act, henceforth “DSA”).

A crucial part of the digital offerings of newspapers and magazines is the possibility to give third parties, namely their readers, the ability to comment or contribute to comment sections, editorial forums and communities in the context of their publications. These offerings constitute intermediary services and, in the case of forums and reader communities, also online platforms within the meaning of the DSA. The latter offerings will therefore fall within the scope of the guidelines. While forums and reader communities generally do not provide any direct revenue to publishers, they are not merely optional add-ons that can be discontinued without any effect on a publication's journalistic and economic success. Rather, they are often integral parts of the publication, indispensable for enabling readers to engage in discussions amongst themselves and with the editorial team about topics and articles pertaining to the publication.

Against this background, the additional obligations proposed in the draft guidelines would place impossible burdens on these editorial platforms, burdens that are moreover entirely disproportionate in light of the negligible or non-existent risk these platforms pose to minors.

Fundamentally, the design and features already in place on editorial online platforms, including reader communities and discussion forums, shield minors from risks to their privacy, safety and security, including but not limited to exposure to illegal content. Hence, any requirement to implement specific additional measures such as those prescribed in the draft guidelines would create disproportionate burdens. Indeed, since editorial platforms generally do not provide any direct revenue to publishers, any additional and disproportionate burdens would render them impossible to operate.

Publishers have highlighted that the DSA's far-reaching due diligence obligations for online platforms already tie up significant resources. Yet no general complaints or interventions specific to minors have been necessary under the DSA thus far. This is not surprising, given the purpose and nature of these editorial forums and publishers' clear commitment to ensuring online safety and encouraging healthy discussion around their publications. This commitment is reflected in the measures publishers have implemented, including policies whereby only subscribers are allowed to contribute to forum discussions, as well as publishers' moderation (in some cases even pre-moderation) of those discussions. Other safeguards include locking debates after a certain time, ensuring that comments focus on the debate's subject matter, and blocking group functions and direct messaging between users. These concrete measures illustrate publishers' commitment to protecting minors in an efficient manner.

In light of the negligible or non-existent risk to minors posed by these editorial platforms, and given the already considerable burden of implementing the DSA obligations for these offerings, we call on the Commission to explicitly exclude them from requirements to implement additional measures prescribed in the guidelines. Concrete examples of forums and reader communities, which handle the necessary editorial content control in a situation-specific and efficient manner and which would be economically endangered should the guidelines require them to implement additional and disproportionate safeguard measures, are provided below.

EXAMPLES

The forum of a house and garden magazine serves the primary objective of giving readers the possibility to exchange experiences about gardens, gardening and related topics. It has existed for 15 years and now counts 3.5 million contributions and 185,000 users. The forum has a netiquette to which users must adhere. Problematic contributions are rare.

The forum of a computer magazine has existed since 1998 and today has over 2 million registered accounts and several hundred thousand active users. The forum is very important for readers of both the digital and printed versions of the magazine.

The forum of an IT magazine's online publication has existed for more than 20 years and has around 670,000 registered users, who during this period have posted more than 37 million contributions on all conceivable technical, economic, social and political aspects of the digital world. At peak times of the day, three contributions per second are posted on the forum. Readers of the millions of user posts generate around 30 million page views per month. The posted content is not only used for discussion among readers; it also provides valuable suggestions and corrections for the editorial team. The forum is managed by the editors responsible for the respective topics, who are supported by forum members on a voluntary basis.

The online discussion platform focused on law and taxation has been operated by a specialised press publisher since 2009. The platform, which is designed for professional exchanges amongst experts and hence generates limited economic value, would probably not continue to exist if further controls were imposed.

* * *

We further note that editorial online platforms' implementation of effective content moderation measures predates the DSA. Already under Directive 2000/31/EC[1], they were required to immediately remove or block unlawful content, such as unlawful reader comments, and to take reasonable steps to prevent any recurrence. Significant, proportionate, and entirely sufficient obligations on publishers and their platform offerings are thus already in place, ensuring adequate measures not only for monitoring but also for the often complex factual and legal assessment of user content, and for acting on those assessments on discussion forums, comment sections, editorial communities, etc.

Beyond tackling possible illegal content, the DSA imposed additional due diligence obligations. And irrespective thereof, publishers, in exercising the fundamental right to press and editorial freedom, in practice often apply higher editorial control standards than those prescribed under national and EU law. Adding further obligations, disproportionate in light of the actual risk level posed by such editorial platforms, would be misguided. This is all the more true considering that, even if such editorial platforms were explicitly exempted from the requirement to implement additional measures as set out in the guidelines, the DSA's obligations to protect minors would still apply.

Consequently, such an exclusion would not in any manner create a legal loophole compromising the safety of minors. Instead, it would be a practical solution that ensures online safety while allowing these important editorial platforms to remain operational.

Finally, EMMA-ENPA welcomes the repeated references made in the draft guidelines to the importance of safeguarding children's rights to freedom of expression and information when implementing safeguards. This is a point of key importance, and we call for its explicit mention in the final guidelines with respect to any safeguards prescribed therein. We also welcome the explicit acknowledgement that the proportionality criteria ought to be assessed in light of the specific type of service being provided, its nature, and its intended or current use.

KEY RECOMMENDATIONS

We call on the Commission to exclude all editorial online platforms from the scope of the forthcoming Article 28(4) guidelines. Prescribing additional, disproportionate and unjustified safeguard measures would not materially improve the safety of minors, in light of the type of service provided by editorial online platforms, its nature, and its intended or current use, and considering the key safeguards press publishers have already put in place. Conversely, the economic impact of implementing additional safeguards would be very significant: the disproportionate increase in compliance burdens would both decisively discourage the development of new editorial platforms and endanger the financial sustainability of existing ones and of the press publication as a whole.

The clarifications provided in the guidelines regarding what constitutes an online platform “accessible to minors” must be proportionate and not overly broad. This is necessary in order to differentiate between social media platforms, which pose a tangible risk to minors, and other online platforms, such as editorial forums, where the risk to minors is negligible or non-existent.

The final guidelines should underscore the importance of safeguarding children’s rights to freedom of expression and information when implementing any safeguards.

[1] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce')

Downloads

EMMA ENPA Reply to EC Article 28 DSA Draft Guidelines (english)


Contact

José Guimarães

Senior Legal and Policy Manager - Acting Executive Director