The Digital Services Act, which the European Parliament overwhelmingly approved earlier this week, is now a reality and is poised to set the benchmark for online content governance in the EU. The last remaining hurdle, approval of the text by the European Council of Ministers in September, is merely a formality.
The good news is that the groundbreaking legislation contains some of the most extensive transparency and platform accountability requirements to date. It will give users genuine control over, insight into, and protection from some of the most pervasive and harmful elements of our online spaces.
As the European Commission begins serious work on building the enforcement mechanisms, attention now turns to putting the comprehensive law into practice. The proposed regime has a complicated structure in which the European Commission and national regulators, known here as Digital Services Coordinators (DSCs), share responsibility. It will rely heavily on the creation of new roles, the expansion of existing responsibilities, and seamless cross-border cooperation. It is clear that the institutional resources needed to implement this legislation effectively do not yet exist.
The Commission has offered a “sneak peek” at how it intends to address some of the more glaring implementation challenges, such as how it will monitor major online platforms and how it will try to avoid the problems that plague the GDPR, like out-of-sync national regulators and selective enforcement, but its proposal raises more questions than it answers. Enforcing the extensive new algorithmic transparency and data accessibility obligations will require hiring a large number of new employees, and a new European Centre for Algorithmic Transparency will need to attract top-tier data scientists and experts. According to the Commission’s initial plans, its regulatory responsibilities will be divided into thematic groups, including a societal issues team responsible for overseeing some of the novel due diligence requirements. Worryingly, a lack of resources in this area risks turning these hard-won commitments into meaningless box-ticking exercises.
A crucial illustration is the platforms’ obligation to conduct assessments of the systemic risks posed by their services. This difficult process will need to take into account all of the fundamental rights guaranteed by the EU Charter. To do so, the tech companies will need to develop human rights impact assessments (HRIAs): evaluation processes designed to identify and mitigate potential human rights risks stemming from a service or business, or in this case, a platform. Throughout the negotiation process, civil society urged tech companies to develop HRIAs. It will fall to the Board, however, which will be composed of the DSCs and chaired by the Commission, to evaluate the most significant systemic risks identified each year and to issue guidelines on mitigation measures. As someone who has helped create and evaluate HRIAs, I know this will not be a simple task, even with input from independent auditors and researchers.
If they are to have an impact, the assessments must include thorough baselines, detailed impact analyses, evaluation protocols, and stakeholder engagement strategies. The best HRIAs incorporate a gender-sensitive methodology and give special consideration to systemic risks that will disproportionately affect people from historically marginalized communities.
This is the most practical way to make sure all potential rights violations are taken into account.
Fortunately, the international human rights framework, including the UN Guiding Principles on Business and Human Rights, offers guidance on how to develop these assessments. How well the provision is implemented, though, will depend on how platforms interpret and invest in these assessments, and on how effectively the Commission and national regulators enforce the obligations. The institutions’ current capacity, however, falls far short of what the DSA will require for developing standards and best practices and for evaluating mitigation strategies.