The European Union’s Digital Services Act (“DSA”), finalized this Saturday (23), requires digital platforms, especially social networks and search engines, to comply with EU legislation.
The following are the main points of this legislation:
– Rules for all online platforms — Obligation to designate a legal representative in one of the 27 member states;
– Obligation to act “quickly” to remove illegal content or block access to it;
– Platforms must “quickly” inform judicial authorities in case of suspicion of a “serious criminal offence” that threatens “the life or safety of persons”;
– Once a year, platforms must publish a report detailing their content moderation initiatives and how long they took to act after being notified of illegal content. They will also have to report on disputes with their users and the decisions taken;
– All platforms must offer a free complaints system;
– Platforms must suspend users who “frequently” publish illegal content;
– Online sales platforms must verify the identity of sellers. They will also have to carry out random checks on the products offered;
– Advertising: each user must be able to see the parameters used to target them and who finances the advertisements;
– The use of “sensitive” user data (gender, political affiliation, religious affiliation, etc.) for targeted advertising is prohibited;
– Advertising directed at minors is prohibited;
– Deceptive interfaces (“dark patterns”) that steer users toward certain account settings or paid services are prohibited.
– Obligations for big platforms — Additional obligations are imposed on the “biggest” online platforms, those with more than 45 million active users in the EU, potentially around twenty companies, including Google (and its subsidiary YouTube); Meta (Facebook, Instagram and WhatsApp); Amazon; Microsoft (and its LinkedIn social network); Apple; Twitter; and possibly also TikTok, Zalando and Booking;
– They must analyze the risks associated with their services regarding the dissemination of illegal content and threats to privacy, freedom of expression, public health or safety, and act to mitigate those risks;
– The large platforms must give the regulator access to their data so that it can verify compliance with the regulation;
– They will be audited, once a year, by independent bodies to verify the fulfillment of their obligations;
– They must have an independent internal compliance function to demonstrate that they comply with the regulation;
– They will have an obligation to fight against revenge pornography.
– Supervisory authorities — Each EU member state will designate a competent authority, with powers of investigation and sanction, to enforce the regulation. The 27 authorities will cooperate with each other.
– Possibility for users to file complaints — Users will have the right to file a complaint against a digital service provider with the competent authority;
– Online sales platforms that do not comply with their obligations may be held liable for damages suffered by buyers of unsafe products.
– Sanctions and exemption — Fines may reach 6% of annual worldwide revenue;
– Micro and small companies are exempt from regulatory obligations.