Rules for online content moderation do not depend entirely on the choices of online service providers. The EU and the Member States have been regulating this field for many years, resulting in a patchwork of different measures across the continent.
The research showcased below shows the number of laws an online service provider has to abide by when operating in the EU and in specific Member States. As the DSA is set to become a horizontal framework for tackling illegality online, the DOT Europe team has conducted this research to show the extent of existing laws in the EU on content moderation and where those rules potentially overlap.
The DSA is meant to clarify – rather than confuse – the rules for stakeholders in the online space, and it will not exist in a vacuum. DOT Europe calls on policy-makers to keep the existing body of EU law in mind when discussing the proposal and to consider which rules already exist and how they should be complemented, in order to ensure the coherence of the legal framework.
Please note that this mapping focuses on legislation that actually overlaps with the scope of the DSA and is not supposed to be a list of all legislation that has an effect on the digital sphere.
As this is still an ongoing project, it might not comprise each and every law currently in force. If you would like to contribute to this shared resource, send us your findings via email!
Terrorist Content
EU
Counter-terrorism directive (EU) 2017/541
Scope (Service)
- Hosting service providers (no definition)
Scope (Content)
- Public provocation to commit a terrorist offence (list at Art. 3)
Obligations
- No obligations for HSP in this legislation, only references to the eCommerce Directive liability regime in the Recitals
Regulation on addressing the dissemination of terrorist content online (EU)
Based on the 26/11 compromise text
Scope (Service)
- ‘Hosting service provider’: a provider of information society services consisting in the storage of information provided by and at the request of the content provider
- This Regulation shall apply to hosting service providers offering services in the Union, irrespective of their place of main establishment, which disseminate information to the public
Scope (Content)
- Terrorist content means one or more of the following types of material:
- (a) inciting the commission of terrorist offences of Art. 3(1) CTD, where such material, directly or indirectly such as by the glorification of terrorist acts, advocates the commission of terrorist offences
- (b) soliciting a person or a group of persons to commit or contribute to the commission of terrorist offences
- (c) soliciting a person or a group of persons to participate in the activities of a terrorist group
- (d) providing instruction on the making or use of explosives, firearms or other weapons, or on other specific methods or techniques for the purpose of committing or contributing to the commission of a terrorist offence
- (e) constituting a threat to commit one of the offences referred to in Art. 3(1) CTD.
Obligations
- HSP shall remove terrorist content or disable access to it in all MS as soon as possible and in any event within one hour from receipt of the removal order (Art. 4(2)) and inform the competent authority about it, in particular the time of action (Art. 4(3))
- A HSP is considered exposed to terrorist content when it has received two or more removal orders in the previous 12 months (Art. X(4)). HSPs exposed to terrorist content shall include in their T&Cs provisions to address the misuse of their service for the dissemination of terrorist content online (Art. X(1)) and take specific measures, the choice of which is left entirely to them, to protect their services (Art. X(2)). The HSP shall report on the specific measures it has taken and intends to take within 3 months of receipt of the decision, and on an annual basis thereafter if the competent authority considers it necessary (Art. X(5)).
- HSP shall preserve terrorist content which has been removed/access disabled in case it is necessary for administrative or judicial remedy, complaint-handling, investigation/prevention/prosecution of terrorist offences (Art. 7(1)). Content shall be preserved, and properly safeguarded (Art. 7(3)), for 6 months or further upon request from competent authority or court (Art. 7(2)).
- HSP should set out clearly in their T&C, where appropriate, a meaningful explanation of the functioning of specific measures, including, where appropriate, the use of automated tools (Art. 8(1)).
- Any HSP which has undertaken actions against dissemination of terrorist content or has been required to take action shall make publicly available a transparency report on actions taken in a given calendar year; this report should be published within 2 months from the end of that year (Art. 8(2)).
- HSPs shall establish an effective and accessible mechanism allowing content providers whose content has been removed or disabled as a result of a removal order or specific measures to submit a complaint requesting reinstatement of the content (Art. 10(1)). The HSP shall promptly examine every complaint and reinstate the content without undue delay if the removal is found to be unjustified; the complainant shall be informed by the HSP within 2 weeks of receipt of the complaint, with an explanation where the content is not reinstated (Art. 10(2)).
- A reinstatement of content shall not preclude administrative or judicial measures against the decision of the HSP or of the competent authority (Art. 10(2)).
- If HSPs remove terrorist content, they shall make information on the removal available to the content provider (Art. 11(1)) and, upon request from the content provider, shall inform them of the reasons for the removal and the possibilities to contest the decision, or shall provide a copy of the removal order (Art. 11(2)); however, depending on the context, this information may be withheld for as long as necessary, but not exceeding 6 weeks, renewable where justified reasons exist (Art. 11(3)).
- HSPs shall endeavour to cooperate with competent authorities and Europol (Art. 13)
- Where HSP become aware of terrorist content involving an imminent threat to life or suspected terrorist offences, they shall promptly inform competent authorities (Art. 13(4)).
- HSPs shall establish a point of contact allowing the receipt of removal orders by electronic means, ensure their expeditious processing, and make the information on the point of contact publicly available (Art. 14(1))
Possible penalties
- MS shall ensure that a systematic or persistent failure to comply with obligations is subject to financial penalties of up to 4% of the HSP’s global turnover of the last business year (Art. 18(4))
Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online
Scope (Service)
- ‘Hosting service providers’ (hsp): Providers of information society services consisting of the storage of information provided by the recipient of the service at his or her request, irrespective of their place of establishment, which direct their activities to consumers residing in the Union
Scope (Content)
- ‘terrorist content’: Any information that falls under the definitions set out in Directive (EU) 2017/541 and the law of a Member State concerned, as well as information produced by or attributable to terrorist groups or entities included in the relevant lists established by the Union or by the United Nations
Recommendations
- A hsp should set out in their terms of service that they will not store terrorist content
- A hsp should inform the competent authority or Europol of their decisions on whether or not access to specific content was disabled in response to a referral, as well as the reasoning behind that decision.
- Where appropriate, a hsp should remove or disable access to illegal content within one hour from receipt of a referral.
Directive 2018/1808 on Audiovisual Media Services
Scope (Service)
- ‘Video-sharing platform service’ means a service where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos, or both, to the general public, for which the video-sharing platform provider does not have editorial responsibility, in order to inform, entertain or educate, by means of electronic communications networks and the organisation of which is determined by the video-sharing platform provider, including by automatic means or algorithms in particular by displaying, tagging and sequencing.
- Social media services are in scope and are considered as video-sharing platform services if the provision of programmes and user-generated videos constitutes an essential functionality of that service.
Scope (Content)
- VSPs shall take appropriate measures to protect:
- minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in accordance with Article 6a(1).
- the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence, offences concerning child pornography and offences concerning racism and xenophobia.
Obligations
- VSPs shall comply with requirements on audiovisual commercial communications, “taking into account the limited control exercised by those video-sharing platforms over those audiovisual commercial communications”. The measures imposed by MS shall be practicable and proportionate, taking into account the size of the video-sharing platform service and the nature of the service that is provided. Those measures shall not lead to any ex-ante control measures or upload-filtering of content.
- These measures shall consist of:
- (a) including and applying in the terms and conditions of the video-sharing platform services the requirements referred to in paragraph 1;
- (b) including and applying in the terms and conditions of the video-sharing platform services the requirements set out in Article 9(1) for audiovisual commercial communications that are not marketed, sold or arranged by the video-sharing platform providers;
- (c) having a functionality for users who upload user-generated videos to declare whether such videos contain audiovisual commercial communications as far as they know or can be reasonably expected to know;
- (d) establishing and operating transparent and user-friendly mechanisms for users of a video-sharing platform to report or flag to the video-sharing platform provider concerned the content referred to in paragraph 1 provided on its platform;
- (e) establishing and operating systems through which video-sharing platform providers explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point (d);
- (f) establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
- (g) establishing and operating easy-to-use systems allowing users of video-sharing platforms to rate the content referred to in paragraph 1;
- (h) providing for parental control systems that are under the control of the end-user with respect to content which may impair the physical, mental or moral development of minors;
- (i) establishing and operating transparent, easy-to-use and effective procedures for the handling and resolution of users’ complaints to the video-sharing platform provider in relation to the implementation of the measures referred to in points (d) to (h);
- (j) providing for effective media literacy measures and tools and raising users’ awareness of those measures and tools.
- Member States shall entrust the assessment of those measures to the national regulatory authorities or bodies.
- Out-of-court redress mechanisms shall be available for the settlement of disputes between users and video-sharing platform providers.
- Video-sharing platform providers are encouraged to exchange best practices on co-regulatory codes of conduct.
Possible penalties
- Penalties determined by Member States.
Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)
Scope (Service)
- ‘Intermediary service’ means one of the following services:
- a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
- a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request;
- a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service;
- ‘Online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
- ‘Very large online platforms’ means online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million.
Scope (Content)
- ‘Illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law.
Proposals
- Obligations for all intermediary service providers:
- Providers need to establish a single point of contact for direct communication with authorities (e.g., MS authorities and the Commission).
- Providers which are not established in the Union but offer services there need to designate a legal representative in one of the Member States where their service is offered. This designated legal representative can be held liable for non-compliance with obligations under this Regulation.
- Providers’ terms and conditions must include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review.
- At least once a year, providers must publish a report on any content moderation they engaged in during the relevant period, including the number of orders received from Member States, the number of notices submitted, information on own content moderation initiatives and the number of complaints received through internal complaints mechanisms.
- Obligations for hosting service providers, including online platforms:
- Providers must put in place notice and action mechanisms allowing any individual or entity to submit notices. Complete notices confer actual knowledge of the specific item of information. The submitter of the notice shall be informed of the decision taken, with information on redress. If automated means are used for the processing or decision-making, this must be stated.
- When access to specific items is removed or disabled, the provider shall inform the poster about it, stating, among other things, the basis for taking the decision, information on the use of automated means in taking the decision, a reference to the legal basis for cases in which content is considered illegal and information on the redress possibilities. Providers shall publish the decisions and the statements of reasons (excluding personal data) in a publicly accessible database managed by the Commission.
- Obligations for all online platforms, excluding micro- and small enterprises:
- Online platforms must provide an internal complaint-handling system through which decisions taken by the online platform on content that is illegal or incompatible with the terms and conditions can be contested for six months after the decision is taken. Where there are sufficient grounds for the platform to reconsider, the decision shall be reversed. Decisions must not be taken by purely automated means. In any case, complainants shall be informed of the decision taken by the platform and about the possibility of out-of-court dispute settlement and other available redress possibilities.
- Complainants are entitled to select any certified out-of-court dispute settlement body in order to resolve disputes. Platforms are bound by the decision of that body. If the body decides in favour of the complainant, the platform shall reimburse the recipient for any expenses related to the dispute settlement. If the body decides in favour of the platform, the recipient shall not be required to reimburse the platform.
- Online platforms shall prioritise notices submitted by trusted flaggers. Applicants must fulfil specific criteria and are awarded trusted flagger status by the Digital Services Coordinator of establishment. Where a trusted flagger has submitted a significant number of inadequate notices, the platform shall communicate that information to the Digital Services Coordinator that awarded the trusted flagger status.
- Online platforms shall, after a warning, suspend their services to users that frequently provide manifestly illegal content. Similarly, online platforms shall, after a warning, suspend the processing of complaints by users which frequently submit unfounded notices. This policy is to be set out in the terms and conditions.
- Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State which is suspected to be affected of its suspicion and provide all relevant information available.
- If an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained a range of information such as contact details, an ID copy, bank details and the trade registry number. Upon receiving the information, platforms should make efforts to assess the information for validity and completeness. The information must be stored for the duration of their contractual relationship with the trader and subsequently deleted.
- Beyond the transparency reporting obligations of Article 13, online platforms must also supply information on out-of-court dispute settlements, suspensions, and the use of automated means for the purpose of content moderation. In addition, platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State. This information can also be requested by the Digital Services Coordinator of establishment at any moment.
- Platforms that display advertising must allow users to identify, for each specific advertisement displayed to each individual recipient, the natural or legal person on whose behalf the advertisement is displayed, as well as meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
- Obligations for very large online platforms:
- Very large platforms shall identify, at least once a year, any significant systemic risks stemming from the functioning and use made of their services. This risk assessment shall include the dissemination of illegal content, any negative effects for the exercise of fundamental rights and intentional manipulation of their services with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse or electoral processes and public security. This shall also account for how content moderation and recommender systems influence any of these risks.
- Very large platforms shall put in place mitigation measures tailored to the specific systemic risks, for example by adapting content moderation or recommender systems, introducing targeted measures aimed at limiting the display of advertisements in association with the service they provide, or initiating cooperation with other online platforms through the codes of conduct and the crisis protocols.
- Very large platforms are subject, at their own expense and at least once a year, to audits to assess compliance with obligations set out in this chapter and commitments undertaken under codes of conduct. If an audit report is not positive, the platform must take the necessary measures to implement recommendations for improvement. One month after receiving the recommendations, an audit implementation report shall be adopted which sets out those measures. If operational recommendations are not implemented, platforms shall justify the reasons for not doing so and set out any alternative measures they may have taken.
- Very large platforms which use recommender systems shall set out the main parameters used in their recommender systems as well as any options for the user to modify or influence those main parameters in their terms and conditions. This shall include at least one option which is not based on profiling.
- Very large platforms that display advertising shall make publicly available a repository containing, for each advertisement: its content; the natural or legal person on whose behalf it is displayed; the period during which it was displayed; whether it was intended to be displayed specifically to particular groups and, if so, which parameters were used for that purpose; and the total number of recipients reached, specifying, where applicable, aggregate numbers for the groups which were targeted specifically. Entries in this repository shall be kept until one year after the advertisement was displayed for the last time on the platform’s online interfaces (an illustrative sketch of such a repository entry follows this list of proposals).
- Very large platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation.
- Very large platforms shall appoint one or more qualified compliance officers responsible for monitoring their compliance with this Regulation.
- Very large platforms must publish their transparency reports every six months, as well as annual risk assessment reports and the results of their audits (all of which must also be sent to the Digital Services Coordinator in their country of establishment). The platform may redact these reports where there is a concern that, e.g., trade secrets would be affected; however, complete reports must still be sent to the relevant Digital Services Coordinator.
- Other proposals:
- Where significant systemic risks emerge which concern several very large online platforms, the Commission may invite these platforms, other online platforms and other providers of intermediary services, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct.
- The Board can establish crisis protocols for extraordinary circumstances affecting public security or health. Platforms and very large platforms will have to work with the Commission to draw up and test these protocols, as well as with other stakeholders where necessary, and with Member State authorities. Work will be done to define what constitutes a crisis and how the protocols are to be put in place.
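To make the advertisement repository obligation above more concrete, the following is a minimal sketch of what a single repository entry could look like as a data structure. The class name `AdRepositoryEntry` and all field names are illustrative assumptions, not terminology taken from the proposal.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class AdRepositoryEntry:
    """Illustrative sketch of one entry in a very large platform's ad repository.

    Fields mirror the elements listed in the proposal; names are assumptions."""
    ad_content: str                       # the content of the advertisement
    displayed_on_behalf_of: str           # natural or legal person on whose behalf it is displayed
    display_start: date                   # start of the period during which it was displayed
    display_end: date                     # last day on which it was displayed
    targeted_to_specific_groups: bool     # whether it was intended for particular groups
    targeting_parameters: Optional[dict] = None                # parameters used for targeting, if any
    total_recipients: int = 0             # total number of recipients reached
    recipients_per_group: dict = field(default_factory=dict)   # aggregate numbers per targeted group

def retention_expired(entry: AdRepositoryEntry, today: date) -> bool:
    """Entries are to be kept until one year after the ad was last displayed."""
    return (today - entry.display_end).days > 365
```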
Possible penalties
- The maximum amount of penalties imposed for a failure to comply shall not exceed 6% of the annual income or turnover.
- Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover.
- Maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover.
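Purely as an illustration of the arithmetic behind these ceilings, the sketch below computes the maximum amounts for a hypothetical provider. The turnover figures are invented for the example and the function name is an assumption, not part of the proposal.

```python
def dsa_penalty_caps(annual_turnover_eur: float, average_daily_turnover_eur: float) -> dict:
    """Maximum penalty ceilings under the DSA proposal (illustrative arithmetic only)."""
    return {
        # up to 6% of annual income or turnover for failure to comply
        "non_compliance_cap": 0.06 * annual_turnover_eur,
        # up to 1% for incorrect, incomplete or misleading information
        "information_failure_cap": 0.01 * annual_turnover_eur,
        # periodic penalty payments capped at 5% of average daily turnover (per day)
        "periodic_penalty_cap_per_day": 0.05 * average_daily_turnover_eur,
    }

# Hypothetical provider with EUR 2 billion annual turnover
print(dsa_penalty_caps(2_000_000_000, 2_000_000_000 / 365))
```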
EU Internet Forum – EU Crisis Protocol
Scope (Service)
- Online service providers (on a voluntary basis)
Scope (Content)
- Dissemination of terrorist content online in the event of a terrorist attack
Obligations
- In the event of a crisis, law enforcement authorities and online service providers will share relevant information on the online content (e.g., URLs, audio-visual media, and metadata) on a voluntary basis, in a secure way and in real time.
- OSP shall also alert Member States affected, Europol and other service providers.
- A joint assessment of the response by all the actors involved shall be drafted to identify any gaps and lessons learnt for the future, while ensuring transparency towards the public on actions taken.
Austria
Communication Platforms Act [not finalized]
Scope (Service)
- ‘Service provider’: the natural or legal person who operates a communication platform.
- ‘Communication platform’: An information society service, the main purpose or an essential function of which is to enable the exchange of messages or presentations with intellectual content in written, aural or visual form between users and a larger group of other users by way of mass dissemination.
- Excluded are operators of platforms which, over the course of the previous quarter, did not exceed an average of 100 000 registered users and the turnover of which did not exceed EUR 500 000 in the previous year in Austria.
Scope (Content)
- ‘Illegal content’: 14 specific ‘criminal offences’ listed in the Austrian criminal code (StGB)
- Offences related to terrorist content listed in e.g. paragraphs 278f and 282a
Obligations
- Service providers shall set up an effective and transparent procedure for handling and processing reports on allegedly illegal content available on the platform.
- In response to a report, if the illegality of a piece of content is evident to a legal layperson without further investigation, it has to be removed or have access to it disabled no later than 24 hours after receipt of the report. If its illegality becomes apparent only after a detailed examination, it has to be removed or have access to it disabled after completion of the examination but no later than within 7 days of receipt of the report.
- In the case of illegalities being found, the content concerned, the time of its creation and the data required to identify the author have to be stored for evidence purposes, including for purposes of criminal prosecution, for up to 10 weeks.
- Service providers are obliged to prepare a report on the handling of reports of alleged illegal content on an annual basis (on a quarterly basis for communication platforms with over one million registered users) and to submit it to the supervisory authority no later than 1 month after the end of the calendar year covered in the report. In addition, the report shall be permanently made available and easily accessible on the service provider’s own website at the same time as it is sent.
Possible penalties
- The supervisory authority shall impose a fine of up to EUR 10 million on a service provider, depending on the severity of the violation.
France
Décret n° 2015-125 du 5 février 2015
Scope (Service)
- ‘Public online communication service’: Any non-private transmission of digital data resulting from an individual request through an electronic communication process which allows for a reciprocal exchange of information between the sender and the receiver.
Scope (Content)
- ‘Illegal content’: two specific provisions laid down in the French penal code.
- Offences related to terrorist content listed in paragraph 421-2-5 of the penal code
Obligations
- Upon being notified by the National Police Office in charge of fighting digital crimes and without the need for a court order, services need to take down websites related to illegal content.
Décret n° 2015-253 du 4 mars 2015
Scope (Service)
- Search engines
Scope (Content)
- ‘Illegal content’: two specific provisions laid down in the French penal code.
- Offences related to terrorist content listed in paragraph 421-2-5 of the penal code
Obligations
- Within forty-eight hours of the receipt of a notification from the relevant authorities, the operators of search engines shall take all reasonable measures to stop the listing of websites containing the aforementioned criminal offences.
Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique (Law No. 2004-575 of 21 June 2004 on confidence in the digital economy)
Scope (Service)
- ‘Public online communication service’: Any non-private transmission of digital data resulting from an individual request through an electronic communication process which allows for a reciprocal exchange of information between the sender and the receiver.
Scope (Content)
- ‘Illegal content’: numerous specific provisions laid down in other legislation.
- Offences related to terrorist content listed in e.g. paragraphs 227-24 and 421-2-5 of the penal code
Obligations
- Public online communication services must put in place an easily accessible and visible system enabling anyone to bring to their attention any content falling under the aforementioned scope. They are also obliged to promptly inform the competent authorities of any of the aforementioned illegal activities which may be reported to them.
- Public online communication services must also make public the resources they devote to the fight against these illegal activities.
- In the specific cases of child sexual abuse material and terrorist content, the competent authorities may request the service provider to remove access to specific content. If such action is not taken within 24 hours, the authorities can issue a further request to the service provider to remove access to the content in question, with which the service provider must comply without delay.
Possible penalties
- Imprisonment of up to one year and a fine of EUR 75,000
Finland
Act on provision of information society services (2002)
Scope (Service)
- ‘information society services’: Services which are provided as distance services, i.e. without the parties being present at the same time; electronically, i.e. by sending and receiving services via devices handling information electronically; as data transfers requested personally by recipients of services and usually against a payment.
Scope (Content)
- Information, the making available or transmission of which is punishable or serves as a basis for civil liability – presumably based on the Finnish criminal code.
- Under the presumption that the illegality of content is based on the Finnish criminal code, the code lists numerous offences linked to terrorist content, notably in chapter 34a.
Obligations
- A court may order the service provider to disable access to certain information based on its criminal nature. In such a process, the service provider and the content producer must be consulted except if the consultation cannot be arranged as quickly as the urgency of the matter requires.
- The court order must also be notified to the content producer. If the content producer is not known, the court may order the service provider to take care of notification.
- The service provider must immediately notify the content producer of prevention of access to the material supplied by him/her and to supply the content producer with a copy of the notification on the basis of which prevention was made
Possible penalties
- Laid out within the Penalty Payment Act (1113/1990).
Germany
Network Enforcement Act – NetzDG
Scope (Service)
- ‘Telemedia service providers’: Internet platforms which, for profit-making purposes, are designed to enable users to share any content with other users or to make such content available to the public (social networks). Platforms offering journalistic or editorial content, the responsibility for which lies with the service provider itself, shall not constitute social networks. The same applies to platforms which are designed to enable individual communication or the dissemination of specific content.
Scope (Content)
- 22 specific ‘criminal offences’ listed in the German criminal code (StGB)
- Offences related to terrorist content listed in e.g. sections 129a and 129b
Obligations
- Platforms must put in place an effective and transparent procedure that is easily recognisable, directly accessible and permanently available to users
- Platforms must take immediate note of submitted complaints and examine whether the reported content is unlawful and whether it should be removed or blocked
- After receiving a complaint, the platform has seven days to remove or block the content in question, except:
- 1. When the illegality of a content depends on identifiable factual circumstances, the online platform may give the user the opportunity to comment on the complaint before taking a decision
- 2. When the online platform requires an ad hoc regulated self-regulatory body to rule on the illegality of a content
- In cases of obviously illegal content, the time limit is reduced to 24 hours
- Online platforms are obliged to inform and justify their decisions without delay to the complainant and the content’s author
- In the event of the removal of illegal content, the online platform must ensure that it is secured and kept for evidence purposes for ten weeks
Possible penalties
- Fines ranging from EUR 500,000 to EUR 5 million, depending on the type of infringement
Hungary
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Scope (Service)
- ‘Information society service’: Services provided electronically – normally for financial consideration – at a distance and at the individual request of the recipient of the services
- ‘Service provider’: Any natural or legal person, or business association lacking the legal status of a legal person providing an information society service
Scope (Content)
- ‘Illegal information’ – presumably based on the Hungarian criminal code.
- Under the presumption that the illegality of information is based on the Hungarian criminal code, the code lists numerous offences linked to terrorist content, notably in sections 314-318.
Obligations
- If in the course of criminal proceedings the court has ordered the rendering of electronic information inaccessible, the service provider shall take measures within one working day upon receipt of the court order for disabling access to the electronically published illegal information.
- The party ordered to render electronic information temporarily or permanently inaccessible shall inform the users, disclosing the name of the court and the number of the ruling, as to the legal basis for the removal of the content in question, for disabling access to such content.
- If the court orders the service provider to restore access to such electronic information upon the conclusion of the proceedings, the service provider shall take measures to restore access to such information within one working day of receipt of the court order.
Possible penalties
- In the event of non-compliance, the court can impose fines, the amount of which are prescribed in the Act on Criminal Proceedings, or the Act on the Execution of Penal Measures and Sanctions.
Lithuania
Law No. XII-1428 on Cyber Security of 11 December 2014
Scope (Service)
- ‘Digital services’: A group of services based on communications and information technology which encompasses the services of e-market, web search and/or cloud computing.
- ‘Digital service provider’: A legal entity which provides digital services in the Republic of Lithuania and/or other EU Member States.
Scope (Content)
- ‘Criminal offences’: Laid out in the Lithuanian criminal code
- Offences related to terrorist content listed in e.g. Art. 249(1) – 250(6)
Obligations
- While investigating criminal offences taking place on their sites, law enforcement may order the aforementioned services to limit the provision of their services for up to 48 hours without requiring a court sanction. With a court sanction, this duration can be extended.
- Further, during an investigation, law enforcement can require services to preserve certain information collected during the provision of services, such as a mail address, geographical location address, phone or any other access number, as well as information about accounts and payments made on the basis of the service agreement.
Luxembourg
Loi du 14 août 2000 relative au commerce électronique (Law of 14 August 2000 on electronic commerce)
Scope (Service)
- ‘Information society service’: Any service provided, normally for remuneration, remotely by electronic means and at the individual request of a recipient of services
- ‘Service provider’: Any natural or legal person who provides an information society service;
Scope (Content)
- 2 specific articles of the Luxembourgish penal code
- Offences related to hate speech listed in article 457-1.
Obligations
- Hosting service providers are subject to specific monitoring obligations in order to detect possible infringements relating to specific illegal content.
Malta
Electronic Commerce Act
Scope (Service)
- ‘information society service’: Any service which is provided at a distance, by electronic means and at the individual request of a recipient of the service, whether such service is provided for consideration or not.
- ‘service provider’: Any person established in Malta providing an information society service.
Scope (Content)
- Illegal activity – presumably based on the Maltese criminal code
- Offences related to terrorist content listed in e.g. section 328
Obligations
- Information society service providers shall promptly inform the public authorities competent in the matter of any alleged illegal activity undertaken or information provided by recipients of their service and shall grant to any such authority upon request information enabling the identification of recipients of their service with whom they have storage agreements.
Possible penalties
- Any person contravening the aforementioned obligations shall be liable to a fine of not more than EUR 250,000 or to imprisonment not exceeding two years or both. In the case of a continuous offence a fine not exceeding EUR 2,500 for each day during which the offence continues can be imposed.
The Netherlands
Penal Code
Scope (Service)
- An intermediary which provides a telecommunication service that consists of the transfer or storage of data from a third party
Scope (Content)
- Criminal offences – based on the Dutch criminal code
- Offences related to terrorist content listed in e.g. sections 115, 117, 117b, 121 to 123, 157, 161, 162, 162a, 164, 166, 168, 170, 172, 173a, 285, 287, 288, 289, 350, 350a, 351, 352, 354, 385b and 385d
Obligations
- An intermediary, by order of a public prosecutor, shall take all measures that may be reasonably required of it in order to disable data related to criminal offences.
Poland
Law on Counterterrorism of 10 June 2016
Scope (Service)
- IT and communications system
Scope (Content)
- ‘Events of a terrorist nature’: An act committed with the aim of seriously terrorising a large number of people; forcing a public authority of the Republic of Poland, or of another state or international organisation, to take or not to take a certain course of action; or causing a serious disturbance in the political system or the economy of the Republic of Poland, or of another state or international organisation; or a threat to commit such an act.
Obligations
- Upon receiving a written request or order from the Head of the Internal Security Agency, the administrator of the IT and communications system shall block access to specified IT data or IT and communications services which are related to the event of a terrorist nature, for a time no longer than 30 days, in order to prevent, counteract, and detect offences of a terrorist nature, as well as to prosecute their perpetrators.
Law on the protection of freedom of speech on online social networking sites
[Based on the draft version of 15th January 2021]
Scope (Service)
- ‘Online networking sites’: understood as a service provided electronically within the meaning of the law of 18 July 2002 on the provision of electronic services, which allows users to share any content with other users or the general public, and which is used by at least one million registered users in the Republic of Poland.
- ‘Service provider’: understood as the provider of online social networking services, consisting in storing on the online social network information provided by the user at his request, with at least one million registered users.
Scope (Content)
- Illegal content: understood as content that violates personal rights, disinformation, criminal content, as well as content that violates decency, in particular, disseminating or praising violence, suffering or humiliation
- Disinformation: understood as false or misleading information, produced, presented and disseminated for profit or in violation of the public interest.
- Criminal content: understood as content that praises or incites to commit prohibited acts, specified in Arts.: 117-119, 127, 130, 133, 134-135, 137, 140, 148-150, 189-189a, 190a, 194-204, 222-224a, 249-251, 255, 258, 343 of the law of 6 June 1997 – Penal Code (Journal of Laws of 2020, items 1444 and 1517) or which constitutes a prohibited act.
Obligations
- A service provider which receives over 100 user complaints in a calendar year regarding the dissemination of illegal content, restriction of access to content or limitation of access to the user’s profile must prepare a report in Polish every six months on how these complaints are resolved and publish it on the online social networking site no later than one month from the end of the given six months.
- Reports posted on an online social networking site must be clearly visible, directly and permanently accessible.
- The Service Provider must appoint at least one, but not more than three, representatives in the country.
- The Service Provider must immediately inform the President of the UKE about the appointment or change of a representative in the country, and provide their information, including their email address and address for service. Where the representative in the country is a legal person, the service provider shall provide details of natural persons authorised to act on behalf of that legal person.
- The service provider must publish on the online social network site, in a clearly visible, directly and permanently accessible manner:
- full details of the representative in the country, including the electronic address and address for service, and if the representative in the country is a legal person, also details of natural persons authorised to act on behalf of that legal person;
- details of the service provider, including the exact name of the person running the online social networking site or their name and surname, registration or residential address, delivery address, registration data and email address.
- Persons who conduct internal review proceedings shall be regularly provided with training in Polish by the service provider, at least every six months.
- If the service provider ceases to operate, they are obliged to provide the President of the UKE with the data referred to in Art. 18 of the law of 18 July 2002 on the provision of electronic services, for further storage, sharing and protection.
- The service provider must establish an effective and understandable internal review procedure in Polish in matters relating to user complaints regarding:
- restriction of access to content;
- restriction of access to the user’s profile;
- disseminating illegal content.
- The service provider must publish the rules of the online social network site available to all users on the online social network site in Polish, which must also contain the rules of conducting internal review proceedings. The rules cannot be inconsistent with the provisions of generally applicable law.
- The service provider is obliged to ensure that complaints can be sent to internal review proceedings in a clearly visible, directly and permanently accessible manner.
- A representative in the country shall immediately send confirmation of receipt of the submitted complaint via the email address indicated in the complaint.
- A representative in the country shall examine the user’s complaint and inform them through the indicated email address about the method of consideration, within 48 hours of lodging the complaint.
- If the complaint is accepted, the service provider shall:
- restore restricted access to content, or
- restore limited access to the user’s profile, or
- prevent the distribution of illegal content.
- Information on the method of considering the complaint should include the reasons for the resolution, indicating in particular the legal grounds with reference to the rules of the online social network site and the actual grounds for the decision with justification. The information should also contain instructions on the possibility of submitting a complaint to the Council, as well as the date and manner of filing a complaint.
- The service provider shall inform the user about the possibility of pursuing claims through civil proceedings and the possibility of notifying about the commission of an offence.
- In case a user submits a complaint to the Freedom of Speech Council, because he/she is dissatisfied with the way a complaint was handled by the internal review proceedings, the representative in the country shall provide the Council with the materials collected in the internal review proceedings within 24 hours from the notification of the complaint submitted by the user.
- The service provider shall execute the Council’s decision immediately, and not later than within 24 hours of its delivery.
- A representative in the country, within 24 hours from the deadline for implementing the decision referred to in Para. 2, shall inform the Council about the manner of its implementation. If the service provider fails to implement the decision, the Council shall immediately notify the President of the UKE.
- The service provider may not again limit access to content that was the subject of examination by the Council.
- In the event that criminal content is found, a prosecutor may ask the service provider or representative in the country to send the necessary information, in particular regarding data specifying the user and publications posted on the online social networking site.
- If criminal content is found to contain a publication with pornographic content involving a minor, or content that praises or incites the commission of acts of a terrorist nature, or if further access to this publication creates a risk of significant damage or of effects that would be difficult to reverse, the prosecutor shall immediately issue a decision ordering the service provider to prevent access to this content.
- Within 3 months from the entry into force of the law, service providers shall submit a report for the previous year, containing information on the methods of counteracting disinformation and disseminating illegal content to date.
[Modification of Art. 111 of the law of 5 January 2011 – Election Code (Journal of Laws of 2020, item 1319)]
- After § 1, §1a is added as follows: “§ 1a. The candidate or representative of the election committee concerned has the right to submit to the district court a request for a decision to prohibit the service provider within the meaning of the law of … on the protection of freedom of speech on online social networking sites (Journal of Laws of … item …) of disseminating illegal content within the meaning of the law, produced with the intention of deliberately violating the integrity of elections.”
Penalties
- For a breach by the service provider of any of its obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
- For a breach by a representative in the country of any of his/her obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
United Kingdom
Terrorism Act (2006)
Scope (Service)
- Any provider of an electronic service
Scope (Content)
- ‘Terrorist publication’: direct or indirect encouragement or other inducement to the commission, preparation or instigation of acts of terrorism; or information that is useful in the commission or preparation of such acts and is shared mainly in that interest; or content that is likely to be understood by a reasonable person as indirectly encouraging the commission or preparation of acts of terrorism.
Obligations
- Upon being notified of the existence of terrorist content by a constable, a provider has 2 working days to take down the specified content, otherwise the provider is considered to have endorsed the content. If the original poster publishes the same material again, a notice is not required for the provider to be considered to have endorsed the content.
- Liability is only avoided upon showing that every reasonable step was taken by the provider to prevent a repeat statement from becoming available to the public and to ascertain whether it does, and that the provider was not aware of the publication of the repeat statement.
Online Harms White Paper [not finalized]
Scope (Service)
- Companies (explicitly including search engines) whose services host user-generated content which can be accessed by users in the UK, or which facilitate public or private online interaction between service users, one or more of whom is in the UK. The regulatory framework will also apply to public communication channels and services where users expect a greater degree of privacy, e.g. online instant messaging services and closed social media groups.
- Excluded are business-to-business services, services which play a functional role in enabling online activity (such as internet service providers), and low-risk businesses.
- The legislation will take a tiered approach differentiating between smaller and larger services based on the audience and the functionalities offered.
Scope (Content)
- ‘Illegal content’: Presumably based on the UK’s criminal code, specific mentions were made of child sexual exploitation and abuse material as well as terrorist content.
- ‘Harmful content’: The legislation will provide a general definition of harmful content. As of yet it is only stated that for content to be considered harmful, it must give rise to a “reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals”
Obligations
- The legislation will require companies to prevent the proliferation of illegal content and activity online and ensure that children who use their services are not exposed to harmful content. The duty of care will also cover disinformation and misinformation that could cause significant harm to an individual.
- Depending on the size of the audience and the functionalities offered by a service, larger platforms face additional requirements regarding content that is legal but harmful to adults.
- The regulator, Ofcom, will issue codes of practice with which companies must comply, or else demonstrate an equally effective alternative approach.
Possible penalties
- In cases of non-compliance, Ofcom will be able to issue fines of up to £18 million or 10% of global annual turnover, whichever is higher.
Child Sexual Abuse Material
EU
Child Sexual Abuse and Exploitation Directive 2011/93/EU
Scope (Service)
- Mention of web pages, social networking websites and chat rooms in the recitals (no definitions)
Scope (Content)
- Child pornography means:
- (i) any material that visually depicts a child engaged in real or simulated sexually explicit conduct;
- (ii) any depiction of the sexual organs of a child for primarily sexual purposes;
- (iii) any material that visually depicts any person appearing to be a child engaged in real or simulated sexually explicit conduct or any depiction of the sexual organs of any person appearing to be a child, for primarily sexual purposes; or
- (iv) realistic images of a child engaged in sexually explicit conduct or realistic images of the sexual organs of a child, for primarily sexual purposes (Art. 2(c)).
- Pornographic performance means a live exhibition aimed at an audience, including by means of information and communication technology, of:
- (i) a child engaged in real or simulated sexually explicit conduct; or
- (ii) the sexual organs of a child for primarily sexual purposes (Art. 2(e))
Obligations
- No obligation for online services
Proposal for a Regulation on a temporary derogation from certain provisions of Directive 2002/58/EC
Scope (Service)
- ‘number-independent interpersonal communications service’ means a service as defined in Article 2(7) of Directive (EU) 2018/1972 (Art. 2(1)).
- ‘number-independent interpersonal communications service’ means an interpersonal communications service which does not connect with publicly assigned numbering resources, namely, a number or numbers in national or international numbering plans, or which does not enable communication with a number or numbers in national or international numbering plans (Electronic Communications Code)
- The derogation is limited to technologies regularly used by number-independent interpersonal communications services for the purpose of detecting and reporting child sexual abuse online and removing child sexual abuse material.
Scope (Content)
- Material constituting child pornography as defined in Article 2, point (c), of Directive 2011/93/EU of the European Parliament and of the Council;
- Solicitation of children for the purpose of engaging in sexual activities with a child or of producing child pornography by any of the following:
- (i) luring the child by means of offering gifts or other advantages;
- (ii) threatening the child with a negative consequence likely to have a significant impact on the child;
- (iii) presenting the child with pornographic materials or making them available to the child.
- (c) ‘pornographic performance’ as defined in Article 2(e) of Directive 2011/93/EU.
Obligations
- The providers should publish annual reports on the undertaken processing including on the type and volumes of data processed, number of cases identified, measures applied to select and improve key indicators, numbers and ratios of errors (false positives) of the different technologies deployed, measures applied to limit the error rate and the error rate achieved, the retention policy and the data protection safeguards applied (Art. 3(e))
-
European Strategy for a Better Internet for Children COM(2012) 196
Child Sexual Abuse Material → EU
European Strategy for a Better Internet for Children COM(2012) 196
Scope (Service)
- Internet industry (no definition)
Scope (Content)
- Child sexual abuse material (no definition)
Proposals
- Industry:
- will be encouraged to take steps, including proactive measures, to remove child sexual abuse material from the internet.
- should reinforce cooperation with law enforcement bodies and hotlines to refine notice and takedown procedures and to establish benchmarks.
- will be encouraged to develop and use tools to increase the effectiveness of the identification of child sexual abuse images, notice and takedown procedures, and the prevention of re-uploading (p. 14/15)
-
Directive 2018/1808 on Audiovisual Media Services
Child Sexual Abuse Material → EU
Directive 2018/1808 on Audiovisual Media Services
Scope (Service)
- ‘Video-sharing platform service’ means a service where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos, or both, to the general public, for which the video-sharing platform provider does not have editorial responsibility, in order to inform, entertain or educate, by means of electronic communications networks and the organisation of which is determined by the video-sharing platform provider, including by automatic means or algorithms in particular by displaying, tagging and sequencing.
- Social media services are in scope and are considered as video-sharing platform services if the provision of programmes and user-generated videos constitutes an essential functionality of that service.
Scope (Content)
- VSPs shall take appropriate measures to protect:
- minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in accordance with Article 6a(1).
- the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence, offences concerning child pornography and offences concerning racism and xenophobia.
Obligations
- VSPs shall comply with requirements on audiovisual commercial communications, “taking into account the limited control exercised by those video-sharing platforms over those audiovisual commercial communications”. The measures imposed by MS shall be practicable and proportionate, taking into account the size of the video-sharing platform service and the nature of the service that is provided. Those measures shall not lead to any ex-ante control measures or upload-filtering of content.
- These measures shall consist of:
- (a) including and applying in the terms and conditions of the video-sharing platform services the requirements referred to in paragraph 1;
- (b) including and applying in the terms and conditions of the video-sharing platform services the requirements set out in Article 9(1) for audiovisual commercial communications that are not marketed, sold or arranged by the video-sharing platform providers;
- (c) having a functionality for users who upload user-generated videos to declare whether such videos contain audiovisual commercial communications as far as they know or can be reasonably expected to know;
- (d) establishing and operating transparent and user-friendly mechanisms for users of a video-sharing platform to report or flag to the video-sharing platform provider concerned the content referred to in paragraph 1 provided on its platform;
- (e) establishing and operating systems through which video-sharing platform providers explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point (d);
- (f) establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
- (g) establishing and operating easy-to-use systems allowing users of video-sharing platforms to rate the content referred to in paragraph 1;
- (h) providing for parental control systems that are under the control of the end-user with respect to content which may impair the physical, mental or moral development of minors;
- (i) establishing and operating transparent, easy-to-use and effective procedures for the handling and resolution of users’ complaints to the video-sharing platform provider in relation to the implementation of the measures referred to in points (d) to (h);
- (j) providing for effective media literacy measures and tools and raising users’ awareness of those measures and tools.
- Member States shall entrust the assessment of those measures to the national regulatory authorities or bodies.
- Out-of-court redress mechanisms shall be available for the settlement of disputes between users and video-sharing platform providers.
- Video-sharing platform providers are encouraged to exchange best practices on co-regulatory codes of conduct.
Possible penalties
- Penalties determined by Member States.
-
Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)
Child Sexual Abuse Material → EU
Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)
Scope (Service)
- ‘Intermediary service’ means one of the following services:
- a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
- a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request;
- a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service;
- ‘Online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
- ‘Very large online platforms’ means online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million.
Scope (Content)
- ‘Illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law.
Proposals
- Obligations for all intermediary service providers:
- Providers need to establish a single point of contact for direct communication with authorities (e.g., MS authorities and the Commission).
- Providers which are not established in the Union but offer services in the Union need to designate a legal representative in one of the Member States where their services are offered. This designated legal representative can be held liable for non-compliance with obligations under this Regulation.
- Providers’ terms and conditions must include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review.
- At least once a year, providers must publish a report on any content moderation they engaged in during the relevant period, including the number of orders received from Member States, the number of notices submitted, information on own content moderation initiatives and the number of complaints received through internal complaints mechanisms.
- Obligations for hosting service providers, including online platforms:
- Providers must put in place notice and action mechanisms through which notices can be submitted. Complete notices confer actual knowledge of the specific item of information. The submitter of the notice shall be informed of the decision taken, providing information on redress. If automated means are used for the processing or decision-making, it must be stated that this was the case.
- When access to specific items is removed or disabled, the provider shall inform the poster about it, stating, among other things, the basis for taking the decision, information on the use of automated means in taking the decision, a reference to the legal basis for cases in which content is considered illegal and information on the redress possibilities. Providers shall publish the decisions and the statements of reasons (excluding personal data) in a publicly accessible database managed by the Commission.
- Obligations for all online platforms, excluding micro- and small enterprises:
- Online platforms must provide an internal complaint-handling system to contest decisions taken by the online platform on content that is illegal or incompatible with the terms and conditions for six months after the decision is taken. Where there are sufficient grounds for the platform to reconsider, the decision shall be reversed. Decisions must not be taken by purely automated means. In any case, complainants shall be informed of the decision taken by the platforms and about the possibility of out-of-court dispute settlements and other available redress possibilities.
- Complainants are entitled to select any certified out-of-court dispute settlement body in order to resolve disputes. Platforms are bound by the decision of that body. If the body decides in favour of the complainant, the platform shall reimburse the recipient for any expenses related to the dispute settlement. If the body decides in favour of the platform, the recipient shall not be required to reimburse the platform.
- Online platforms shall prioritize notices submitted by trusted flaggers. Applicants must fulfil specific criteria and are selected by the Digital Services Coordinator of establishment. Where a trusted flagger has submitted a significant number of inadequate notices, the online platform shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger.
- Online platforms shall, after a warning, suspend their services to users that frequently provide manifestly illegal content. Similarly, online platforms shall, after a warning, suspend the processing of complaints by users which frequently submit unfounded notices. This policy is to be set out in the terms and conditions.
- Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State which is suspected to be affected of its suspicion and provide all relevant information available.
- If an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained a range of information such as contact details, an ID copy, bank details and the trade registry number. Upon receiving the information, platforms should make efforts to assess the information for validity and completeness. The information must be stored for the duration of their contractual relationship with the trader and subsequently deleted.
- Beyond Article 13, online platforms must also supply information on out-of-court dispute settlements, suspensions, and the use of automatic means for the purpose of content moderation. In addition, platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State. This information can also be requested by the Digital Services Coordinator of establishment at any moment.
- Platforms that display advertising must allow users to identify, for each specific advertisement displayed to each individual recipient, the natural or legal person on whose behalf the advertisement is displayed, as well as meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
- Obligations for very large online platforms:
- Very large platforms shall identify, at least once a year, any significant systemic risks stemming from the functioning and use made of their services. This risk assessment shall include the dissemination of illegal content, any negative effects for the exercise of fundamental rights and intentional manipulation of their services with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse or electoral processes and public security. This shall also account for how content moderation and recommender systems influence any of these risks.
- Very large platforms shall put in place mitigation measures tailored to specific systemic risks, for example by adapting content moderation or recommender systems, introducing targeted measures aimed at limiting the display of advertisements in association with the service they provide, or initiating cooperation with other online platforms through the codes of conduct and the crisis protocols.
- Very large platforms are subject, at their own expense and at least once a year, to audits to assess compliance with obligations set out in this chapter and commitments undertaken under codes of conduct. If an audit report is not positive, the platform must take the necessary measures to implement recommendations for improvement. One month after receiving the recommendations, an audit implementation report shall be adopted which sets out those measures. If operational recommendations are not implemented, platforms shall justify the reasons for not doing so and set out any alternative measures they may have taken.
- Very large platforms which use recommender systems shall set out the main parameters used in their recommender systems as well as any options for the user to modify or influence those main parameters in their terms and conditions. This shall include at least one option which is not based on profiling.
- Very large platforms that display advertising shall make publicly available a repository containing the content of each advertisement, the natural or legal person on whose behalf the advertisement is displayed, the period during which the advertisement was displayed, whether the advertisement was intended to be displayed specifically to particular groups and, if so, which parameters were used for that purpose, and the total number of recipients, specifying, where applicable, aggregate numbers for the groups which were targeted specifically. Entries on this list shall be kept until one year after the advertisement was displayed for the last time on their online interfaces.
- Very large platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation.
- Very large platforms shall appoint one or more qualified compliance officers responsible for monitoring their compliance with this Regulation.
- Very large platforms must publish the transparency reports referred to above every six months, as well as annual risk assessment reports and the results of their audits (all of which must also be sent to the Digital Services Coordinator in their country of establishment). The platform can edit the published versions of these reports where there is a concern that, for example, trade secrets would be affected. However, complete reports must still be sent to the relevant Digital Services Coordinator.
- Other proposals:
- Where significant systemic risks emerge which concern several very large online platforms, the Commission may invite these platforms, other online platforms and other providers of intermediary services, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct.
- The Board can establish crisis protocols for extraordinary circumstances affecting public security or health. Platforms and very large platforms will have to work with the Commission to draw up and test these protocols, as well as other stakeholders where necessary, and Member State authorities. Work will be done to define what a crisis is, and how the protocols are to be put in place.
Possible penalties
- The maximum amount of penalties imposed for a failure to comply shall not exceed 6% of the annual income or turnover.
- Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover.
- Maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover.
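To illustrate how these three caps relate to a provider's turnover, the sketch below computes the ceilings for a purely hypothetical provider. The turnover figures and the assumption that all caps are assessed against the same annual turnover are illustrative only and are not part of the proposal.

```python
# Minimal illustrative sketch, not a statement of how penalties would be set in practice.
# All figures are hypothetical; the caps mirror the three bullets summarised above.

def dsa_penalty_ceilings(annual_turnover_eur: float, avg_daily_turnover_eur: float) -> dict:
    """Return the maximum amounts permitted under each cap for the given turnover figures."""
    return {
        # Failure to comply with an obligation: at most 6% of annual income or turnover
        "non_compliance_cap": 0.06 * annual_turnover_eur,
        # Incorrect, incomplete or misleading information, failure to reply or to submit
        # to an on-site inspection: at most 1% of annual income or turnover
        "information_offence_cap": 0.01 * annual_turnover_eur,
        # Periodic penalty payments: at most 5% of average daily turnover (per day)
        "daily_periodic_penalty_cap": 0.05 * avg_daily_turnover_eur,
    }

# Hypothetical provider with EUR 2 billion annual turnover
print(dsa_penalty_ceilings(2_000_000_000, 2_000_000_000 / 365))
```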
-
WeProtect Global Alliance
Child Sexual Abuse Material → EU
WeProtect Global Alliance
Scope (Service)
- Voluntary countries and stakeholders
Scope (Content)
- Child sexual abuse material
Proposals:
- Endorsing the Declaration on the Launch of the Global Alliance against child sexual abuse online, countries and stakeholders commit to four key policy targets:
- enhancing efforts to identify victims and ensuring that they receive the necessary assistance, support and protection;
- enhancing efforts to investigate cases of child sexual abuse online and to identify and prosecute offenders;
- increasing awareness among children, parents, educators and the community at large about the risks;
- reducing the availability of child pornography online and the re-victimization of children.
-
Alliance to better protect minors online
Child Sexual Abuse Material → EU
Alliance to better protect minors online
Scope (Service)
- Voluntary stakeholders
Scope (Content)
- harmful content (e.g. violent or sexually exploitative content)
- harmful conduct (e.g. cyberbullying), and
- harmful contact (e.g. sexual extortion).
Proposals
- In addition to endorsing the Statement of Purpose, the signatory companies made individual commitments in line with their business activity:
- User empowerment
- Enhanced collaboration
- Awareness-raising
Austria
-
Communication Platforms Act [not finalized]
Child Sexual Abuse Material → Austria
Communication Platforms Act [not finalized]
Scope (Service)
- ‘Service provider’: the natural or legal person who operates a communication platform.
- ‘Communication platform’: An information society service, the main purpose or an essential function of which is to enable the exchange of messages or presentations with intellectual content in written, aural or visual form between users and a larger group of other users by way of mass dissemination.
- Excluded are operators of platforms which, over the course of the previous quarter, did not exceed an average of 100 000 registered users and the turnover of which did not exceed EUR 500 000 in the previous year in Austria.
Scope (Content)
- ‘Illegal content’: 14 specific ‘criminal offences’ listed in the Austrian criminal code (StGB)
- Offences related to child sexual abuse material listed in e.g. paragraphs 207a and 208a
Obligations
- Service providers shall set up an effective and transparent procedure for handling and processing reports on allegedly illegal content available on the platform.
- In response to a report, if the illegality of a piece of content is evident to a legal layperson without further investigation, it has to be removed or have access to it disabled no later than 24 hours after receipt of the report. If its illegality becomes apparent only after a detailed examination, it has to be removed or have access to it disabled after completion of the examination but no later than within 7 days of receipt of the report.
- In the case of illegalities being found, the content concerned, the time of its creation and the data required to identify the author have to be stored for evidence purposes, including for purposes of criminal prosecution, for up to 10 weeks.
- Service providers are obliged to prepare a report on the handling of reports of alleged illegal content on an annual basis (on a quarterly basis for communication platforms with over one million registered users) and to submit it to the supervisory authority no later than 1 month after the end of the calendar year covered in the report. In addition, the report shall be permanently made available and easily accessible on the service provider’s own website at the same time as it is sent.
Possible penalties
- The supervisory authority shall impose a fine of up to EUR 10 million on a service provider, depending on the severity of the violation.
France
-
Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique
Child Sexual Abuse Material → France
Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique
Scope (Service)
- ‘Public online communication service’: Any non-private transmission of digital data resulting from an individual request through an electronic communication process which allows for a reciprocal exchange of information between the sender and the receiver.
Scope (Content)
- ‘Illegal content’: numerous specific provisions laid down in other legislation.
- Offences related to child sexual abuse material listed in e.g. paragraph 227-23 of the penal code
Obligations
- Public online communication services must put in place an easily accessible and visible system enabling anyone to bring to their attention any content falling under the aforementioned scope. They are also obliged to promptly inform the competent authorities of any of the aforementioned illegal activities which may be reported to them.
- Public online communication services must also make public the resources they devote to the fight against these illegal activities.
- In the specific cases of child sexual abuse material and terrorist content, the competent authorities may request the service provider to remove access to specific content. If such action is not taken within 24 hours, the authorities can repeat the request, with which the service provider has to comply without delay.
Possible penalties
- Failure to comply with the obligations can lead to imprisonment of up to one year and a fine of EUR 75,000.
-
Décret n° 2015-125 du 5 février 2015
Child Sexual Abuse Material → France
Décret n° 2015-125 du 5 février 2015
Scope (Service)
- ‘Public online communication service’: Any non-private transmission of digital data resulting from an individual request through an electronic communication process which allows for a reciprocal exchange of information between the sender and the receiver.
Scope (Content)
- ‘Illegal content’: two specific provisions laid down in the French penal code.
- Offences related to child sexual abuse material listed in paragraph 227-23 of the penal code
Obligations
- Upon being notified by the National Police Office in charge of fighting digital crimes and without the need for a court order, services need to take down websites related to illegal content.
-
Décret n° 2015-253 du 4 mars 2015
Child Sexual Abuse Material → France
Décret n° 2015-253 du 4 mars 2015
Scope (Service)
- Search engines
Scope (Content)
- ‘Illegal content’: two specific provisions laid down in the French penal code.
- Offences related to child sexual abuse material listed in paragraph 227-23 of the penal code
Obligations
- Within forty-eight hours of the receipt of a notification from the relevant authorities, the operators of search engines shall take all reasonable measures to stop the listing of websites containing the aforementioned criminal offences.
Finland
-
Act on provision of information society services (2002)
Child Sexual Abuse Material → Finland
Act on provision of information society services (2002)
Scope (Service)
- ‘information society services’: Services which are provided as distance services, i.e. without the parties being present at the same time; electronically, i.e. by sending and receiving services via devices handling information electronically; as data transfers requested personally by recipients of services and usually against a payment.
Scope (Content)
- Information, the making available or transmission of which is punishable or gives rise to civil liability – presumably based on the Finnish criminal code.
- Under the presumption that the illegality of content is based on the Finnish criminal code, the code lists numerous offences linked to child sexual abuse material, e.g. in chapter 17, sections 18a and 19, as well as chapter 10, sections 6 and 7.
Obligations
- A court may order the service provider to disable access to certain information based on its criminal nature. In such a process, the service provider and the content producer must be consulted except if the consultation cannot be arranged as quickly as the urgency of the matter requires.
- The court order must also be notified to the content producer. If the content producer is not known, the court may order the service provider to take care of notification.
- The service provider must immediately notify the content producer of the prevention of access to the material supplied by him/her and supply the content producer with a copy of the notification on the basis of which the prevention was made.
Possible penalties
- Laid out within the Penalty Payment Act (1113/1990).
Germany
-
Network Enforcement Act – NetzDG
Child Sexual Abuse Material → Germany
Network Enforcement Act – NetzDG
Scope (Service)
- ‘Telemedia service providers’: Internet platforms which, for profit-making purposes, are designed to enable users to share any content with other users or to make such content available to the public (social networks). Platforms offering journalistic or editorial content, the responsibility for which lies with the service provider itself, shall not constitute social networks. The same applies to platforms which are designed to enable individual communication or the dissemination of specific content.
Scope (Content)
- 22 specific ‘criminal offences’ listed in the German criminal code (StGB)
- Offences related to child sexual abuse material listed in e.g. sections 184b and 184d
Obligations
- Platforms must put in place an effective and transparent procedure that is easily recognisable, directly accessible and permanently available to users
- Platforms should immediately take note of submitted complaints and assess whether the content at issue is illegal and whether it should be removed or blocked
- After receiving a complaint, the platform has seven days to remove or block the content in question, except:
- 1. When the illegality of a content depends on identifiable factual circumstances, the online platform may give the user the opportunity to comment on the complaint before taking a decision
- 2. When the online platform requires an ad hoc regulated self-regulatory body to rule on the illegality of a content
- In cases of obviously illegal content, the time limit is reduced to 24 hours
- Online platforms are obliged to inform the complainant and the content’s author of their decisions without delay and to justify them
- In the event of the removal of illegal content, the online platform must ensure that it is secured and kept for evidence purposes for ten weeks
Possible penalties
- Fines ranging from EUR 500,000 to EUR 5 million, depending on the type of infringement
Hungary
-
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Child Sexual Abuse Material → Hungary
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Scope (Service)
- ‘Information society service’: Services provided electronically – normally for financial consideration – at a distance and at the individual request of the recipient of the services
- ‘Service provider’: Any natural or legal person, or business association lacking the legal status of a legal person providing an information society service
Scope (Content)
- ‘Illegal information’ – presumably based on the Hungarian criminal code.
- Under the presumption that the illegality of information is based on the Hungarian criminal code, the code lists offences linked to child sexual abuse material, notably in section 204.
Obligations
- If in the course of criminal proceedings the court has ordered the rendering of electronic information inaccessible, the service provider shall take measures within one working day upon receipt of the court order for disabling access to the electronically published illegal information.
- The party ordered to render electronic information temporarily or permanently inaccessible shall inform users of the legal basis for the removal of, or for disabling access to, the content in question, disclosing the name of the court and the number of the ruling.
- If the court orders the service provider to restore access to such electronic information upon the conclusion of the proceedings, the service provider shall take measures to restore access to such information within one working day of receipt of the court order.
Possible penalties
- In the event of non-compliance, the court can impose fines, the amount of which are prescribed in the Act on Criminal Proceedings, or the Act on the Execution of Penal Measures and Sanctions.
Lithuania
-
Law No. XII-1428 on Cyber Security of 11 December 2014:
Child Sexual Abuse Material → Lithuania
Law No. XII-1428 on Cyber Security of 11 December 2014:
Scope (Service)
- ‘Digital services’: A group of services based on communications and information technology which encompasses the services of e-market, web search and/or cloud computing.
- ‘Digital service provider’: A legal entity which provides digital services in the Republic of Lithuania and/or other EU Member States.
Scope (Content)
- ‘Criminal offences’: Laid out in the Lithuanian criminal code
- Offences related to Child Sexual Abuse Material listed in e.g. Art. 152; 162; 309
Obligations
- While investigating criminal offences taking place on their sites, law enforcement may order the aforementioned services to limit the provision of their services for up to 48 hours without requiring a court sanction. If a court sanction is obtained, the duration can be extended.
- Further, during an investigation, law enforcement can require services to preserve certain information collected during the provision of services, such as a mail address, geographical location address, phone or any other access number, as well as information about accounts and payments made on the basis of a service agreement.
Luxembourg
-
Loi du 14 août 2000 relative au commerce électronique
Child Sexual Abuse Material → Luxembourg
Loi du 14 août 2000 relative au commerce électronique
Scope (Service)
- ‘Information society service’: Any service provided, normally for remuneration, remotely by electronic means and at the individual request of a recipient of services
- ‘Service provider’: Any natural or legal person who provides an information society service;
Scope (Content)
- 2 specific articles of the Luxembourgish penal code
- Offences related to child sexual abuse material listed in article 383bis.
Obligations
- Hosting service providers are subject to specific monitoring obligations in order to detect possible infringements relating to specific illegal content.
Malta
-
Electronic Commerce Act
Child Sexual Abuse Material → Malta
Electronic Commerce Act
Scope (Service)
- ‘information society service’: Any service which is provided at a distance, by electronic means and at the individual request of a recipient of the service, whether such service is provided for consideration or not.
- ‘service provider’: Any person established in Malta providing an information society service.
Scope (Content)
- Illegal activity – presumably based on the Maltese criminal code
- Offences related to child sexual abuse material listed in e.g. section 208a
Obligations
- Information society service providers shall promptly inform the public authorities competent in the matter of any alleged illegal activity undertaken or information provided by recipients of their service and shall grant to any such authority upon request information enabling the identification of recipients of their service with whom they have storage agreements.
Possible penalties
- Any person contravening the aforementioned obligations shall be liable to a fine of not more than EUR 250,000 or to imprisonment not exceeding two years, or to both. In the case of a continuous offence, a fine not exceeding EUR 2,500 for each day during which the offence continues may be imposed.
The Netherlands
-
Penal Code
Child Sexual Abuse Material → The Netherlands
Penal Code
Scope (Service)
- An intermediary which provides a telecommunication service that consists of the transfer or storage of data from a third party
Scope (Content)
- Criminal offences – based on the Dutch criminal code
- Offences related to child sexual abuse material listed in e.g. section 240b
Obligations
- An intermediary, by order of a public prosecutor, shall take all measures that may be reasonably required of it in order to disable data related to criminal offences.
Poland
-
Law on the protection of freedom of speech on online social networking sites
Child Sexual Abuse Material → Poland
Law on the protection of freedom of speech on online social networking sites
[Based on the draft version of 15th January 2021]
Scope (Service)
- ‘Online networking sites’: understood as a service provided electronically within the meaning of the law of 18 July 2002 on the provision of electronic services, which allows users to share any content with other users or the general public, and which is used by at least one million registered users in the Republic of Poland.
- ‘Service provider’: understood as the provider of online social networking services, consisting in storing on the online social network information provided by the user at his request, with at least one million registered users.
Scope (Content)
- Illegal content: understood as content that violates personal rights, disinformation, criminal content, as well as content that violates decency, in particular by disseminating or praising violence, suffering or humiliation
- Disinformation: understood as false or misleading information, produced, presented and disseminated for profit or in violation of the public interest.
- Criminal content: understood as content that praises or incites to commit prohibited acts, specified in Arts.: 117-119, 127, 130, 133, 134-135, 137, 140, 148-150, 189-189a, 190a, 194-204, 222-224a, 249-251, 255, 258, 343 of the law of 6 June 1997 – Penal Code (Journal of Laws of 2020, items 1444 and 1517) or which constitutes a prohibited act.
Obligations
- A service provider which receives over 100 user complaints in a calendar year regarding the dissemination of illegal content, restriction of access to content or limitation of access to the user’s profile, must prepare a report in Polish every six months on how these complaints are resolved and publish it on the online social networking site no later than one month from the end of the given six months.
- Reports posted on an online social networking site must be clearly visible, directly and permanently accessible.
- The Service Provider must appoint at least one, but not more than three, representatives in the country.
- The Service Provider must immediately inform the President of the UKE about the appointment or change of a representative in the country, and provide their information, including their email address and address for service. Where the representative in the country is a legal person, the service provider shall provide details of natural persons authorised to act on behalf of that legal person.
- The service provider must publish on the online social network site in a clearly visible, directly and permanently accessible manner:
- full details of the representative in the country, including the electronic address and address for service, and if the representative in the country is a legal person, also details of natural persons authorised to act on behalf of that legal person;
- details of the service provider, including the exact name of the person running the online social networking site or their name and surname, registration or residential address, delivery address, registration data and email address.
- Persons who conduct internal review proceedings shall be regularly provided with training in Polish by the service provider, at least every six months.
- If the service provider ceases to operate, they are obliged to provide the President of the UKE with the data referred to in Art. 18 of the law of 18 July 2002 on the provision of electronic services, for further storage, sharing and protection.
- The service provider must establish an effective and understandable internal review procedure in Polish in matters relating to user complaints regarding:
- restriction of access to content;
- restriction of access to the user’s profile;
- disseminating illegal content.
- The service provider must publish the rules of the online social network site available to all users on the online social network site in Polish, which must also contain the rules of conducting internal review proceedings. The rules cannot be inconsistent with the provisions of generally applicable law.
- The service provider is obliged to ensure that complaints can be sent to internal review proceedings in a clearly visible, directly and permanently accessible manner.
- A representative in the country shall immediately send confirmation of receipt of the submitted complaint via the email address indicated in the complaint.
- A representative in the country shall examine the user’s complaint and inform them through the indicated email address about the method of consideration, within 48 hours of lodging the complaint.
- If the complaint is accepted, the service provider shall:
- restore restricted access to content, or
- restore limited access to the user’s profile, or
- prevent the distribution of illegal content.
- Information on the method of considering the complaint should include the reasons for the resolution, indicating in particular the legal grounds with reference to the rules of the online social network site and the actual grounds for the decision with justification. The information should also contain instructions on the possibility of submitting a complaint to the Council, as well as the date and manner of filing a complaint.
- The service provider shall inform the user about the possibility of pursuing claims through civil proceedings and the possibility of notifying about the commission of an offence.
- In case a user submits a complaint to the Freedom of Speech Council, because he/she is dissatisfied with the way a complaint was handled by the internal review proceedings, the representative in the country shall provide the Council with the materials collected in the internal review proceedings within 24 hours from the notification of the complaint submitted by the user.
- The service provider shall execute the Council’s decision immediately, and not later than within 24 hours of its delivery.
- A representative in the country, within 24 hours from the deadline for implementing the decision referred to in Para. 2, shall inform the Council about the manner of its implementation. If the service provider fails to implement the decision, the Council shall immediately notify the President of the UKE.
- The service provider may not again limit access to content that was the subject of examination by the Council.
- In the event that criminal content is found, a prosecutor may ask the service provider or representative in the country to send the necessary information, in particular regarding data specifying the user and publications posted on the online social networking site.
- If it is found that criminal content contains a publication with pornographic content involving a minor, or content that praises or incites the commission of acts of a terrorist nature, or that further access to this publication creates the risk of causing significant damage or effects difficult to reverse, the prosecutor shall immediately issue a decision ordering the service provider to prevent access to this content.
- Within 3 months from the entry into force of the law, service providers shall submit a report for the previous year, containing information on the methods of counteracting disinformation and disseminating illegal content to date.
[Modification of Art. 111 of the law of 5 January 2011 – Election Code (Journal of Laws of 2020, item 1319)]
- After § 1, §1a is added as follows: “§ 1a. The candidate or representative of the election committee concerned has the right to submit to the district court a request for a decision to prohibit the service provider within the meaning of the law of … on the protection of freedom of speech on online social networking sites (Journal of Laws of … item …) of disseminating illegal content within the meaning of the law, produced with the intention of deliberately violating the integrity of elections.”
Penalties
- For a breach by the service provider of any of its obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
- For a breach by a representative in the country of any of his/her obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
Sweden
-
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Child Sexual Abuse Material → Sweden
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Scope (Service)
- ‘Electronic bulletin boards’: A service for conveyance of electronic messages in text, picture, sound or other information formats. Excluded are services which only provide a network or other connections for the transmission of messages or other services necessary to use a network or other transmission channel. Also excluded are the mediation of messages within a government agency or between government agencies or within a company or a legal group of companies, e-mails, and services which are covered by the regulations in the Freedom of the Press Act or the Fundamental Law on Freedom of Expression (printed media, film, radio, television and sound recordings).
Scope (Content)
- A number of fixed offences listed in the Swedish penal code, including provisions on child sexual abuse material (chapter 16 section 10a).
Obligations
- If a user submits a message to an electronic bulletin board which contains the aforementioned illegalities, the supplier must remove the message or in other ways make it inaccessible. In order to be able to fulfil this obligation, the service provider is allowed to check the content of messages in the service.
Possible penalties
- Fines or a prison sentence of not more than six months unless the crime is considered severe, in which case the prison sentence shall be no more than two years. Minor infringements are not punished.
United Kingdom
-
Online harms White Paper [not finalized]
Child Sexual Abuse Material → United Kingdom
Online harms White Paper [not finalized]
Scope (Service)
- Companies (explicitly including search engines) whose services host user-generated content which can be accessed by users in the UK or facilitate public or private online interaction between service users, one or more of whom is in the UK. The regulatory framework will also apply to public communication channels and services where users expect a greater degree of privacy, e.g. online instant messaging services and closed social media groups.
- Excluded are business-to-business services, services which play a functional role in enabling online activity, such as internet service providers and low-risk businesses.
- The legislation will take a tiered approach differentiating between smaller and larger services based on the audience and the functionalities offered.
Scope (Content)
- ‘Illegal content’: Presumably based on UK criminal law; specific mentions were made of child sexual exploitation and abuse material as well as terrorist content.
- ‘Harmful content’: The legislation will provide a general definition of harmful content. As of yet it is only stated that for content to be considered harmful, it must give rise to a “reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals”
Obligations
- The legislation will require companies to prevent the proliferation of illegal content and activity online and ensure that children who use their services are not exposed to harmful content. The duty of care will also cover disinformation and misinformation that could cause significant harm to an individual.
- Depending on the size of the audience and the functionalities offered by a service, larger platforms face additional requirements regarding content that is legal but harmful to adults.
- The regulator, Ofcom, will issue codes of practice; companies must either comply with them or demonstrate an equally effective alternative approach.
Possible penalties
- Ofcom will be able to issue fines of up to £18 million or 10% of global annual turnover, whichever is higher, in cases of non-compliance.
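Because the cap is expressed as "whichever is higher", the applicable ceiling depends on the company's turnover. The short sketch below, using purely hypothetical turnover figures, shows how that maximum would be computed.

```python
# Minimal illustrative sketch with hypothetical figures; it only restates the rule
# summarised above: up to GBP 18 million or 10% of global annual turnover,
# whichever is higher.

def max_ofcom_fine(global_annual_turnover_gbp: float) -> float:
    return max(18_000_000, 0.10 * global_annual_turnover_gbp)

print(max_ofcom_fine(500_000_000))  # GBP 50 million: the 10% figure exceeds GBP 18m
print(max_ofcom_fine(100_000_000))  # GBP 18 million: the fixed amount is higher
```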
Hate Speech
EU
-
Counter-Racism Framework Decision 2008/913/JHA
Hate Speech → EU
Counter-Racism Framework Decision 2008/913/JHA
Scope (Service)
- Not applicable
Scope (Content)
- Certain forms of conduct, as outlined below, are punishable as criminal offences:
- public incitement to violence or hatred directed against a group of persons or a member of such a group defined on the basis of race, colour, descent, religion or belief, or national or ethnic origin;
- the above-mentioned offence when carried out by the public dissemination or distribution of tracts, pictures or other material;
- publicly condoning, denying or grossly trivialising crimes of genocide, crimes against humanity and war crimes as defined in the Statute of the International Criminal Court (Articles 6, 7 and 8) and crimes defined in Article 6 of the Charter of the International Military Tribunal, when the conduct is carried out in a manner likely to incite violence or hatred against such a group or a member of such a group.
- Instigating, aiding or abetting in the commission of the above offences is also punishable.
Obligations
- Provides that Member States must make sure that racist and xenophobic hate speech is punishable by law.
-
Code of Conduct on countering illegal hate speech online
Hate Speech → EU
Code of Conduct on countering illegal hate speech online
Scope (Service)
- Voluntary code signed by Facebook, Microsoft, Twitter and YouTube, later also joined by Snapchat, Instagram, Dailymotion, Jeuxvideos.com and TikTok.
Scope (Content)
- “Illegal hate speech” – referring to the Counter-Racism Framework Decision
Obligations
- Platforms need to have in place clear and effective processes to review notifications regarding illegal hate speech on their services so they can remove or disable access to such content.
- Platforms must review most valid notifications for removal of illegal hate speech in less than 24 hours and, if necessary, remove or disable access to such content.
- Platforms shall work together with Member States to improve the speed and effectiveness of notifications and on disabling access to or removal of illegal hate speech online.
- Platforms must provide regular training to their staff on current societal developments and to exchange views on the potential for further improvement. Further, they shall intensify cooperation between themselves and other platforms and social media companies to enhance best practice sharing.
-
Directive 2018/1808 on Audiovisual Media Services
Hate Speech → EU
Directive 2018/1808 on Audiovisual Media Services
Scope (Service)
- ‘Video-sharing platform service’ means a service where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos, or both, to the general public, for which the video-sharing platform provider does not have editorial responsibility, in order to inform, entertain or educate, by means of electronic communications networks and the organisation of which is determined by the video-sharing platform provider, including by automatic means or algorithms in particular by displaying, tagging and sequencing.
- Social media services are in scope and are considered as video-sharing platform services if the provision of programmes and user-generated videos constitutes an essential functionality of that service.
Scope (Content)
- VSPs shall take appropriate measures to protect:
- minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in accordance with Article 6a(1).
- the general public from programmes, user-generated videos and audiovisual commercial communications containing incitement to violence or hatred directed against a group of persons or a member of a group based on any of the grounds referred to in Article 21 of the Charter.
- the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence, offences concerning child pornography and offences concerning racism and xenophobia.
Obligations
- VSPs shall comply with requirements on audiovisual commercial communications, “taking into account the limited control exercised by those video-sharing platforms over those audiovisual commercial communications”. The measures imposed by MS shall be practicable and proportionate, taking into account the size of the video-sharing platform service and the nature of the service that is provided. Those measures shall not lead to any ex-ante control measures or upload-filtering of content.
- These measures shall consist of:
- (a) including and applying in the terms and conditions of the video-sharing platform services the requirements referred to in paragraph 1;
- (b) including and applying in the terms and conditions of the video-sharing platform services the requirements set out in Article 9(1) for audiovisual commercial communications that are not marketed, sold or arranged by the video-sharing platform providers;
- (c) having a functionality for users who upload user-generated videos to declare whether such videos contain audiovisual commercial communications as far as they know or can be reasonably expected to know;
- (d) establishing and operating transparent and user-friendly mechanisms for users of a video-sharing platform to report or flag to the video-sharing platform provider concerned the content referred to in paragraph 1 provided on its platform;
- (e) establishing and operating systems through which video-sharing platform providers explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point (d);
- (f) establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
- (g) establishing and operating easy-to-use systems allowing users of video-sharing platforms to rate the content referred to in paragraph 1;
- (h) providing for parental control systems that are under the control of the end-user with respect to content which may impair the physical, mental or moral development of minors;
- (i) establishing and operating transparent, easy-to-use and effective procedures for the handling and resolution of users’ complaints to the video-sharing platform provider in relation to the implementation of the measures referred to in points (d) to (h);
- (j) providing for effective media literacy measures and tools and raising users’ awareness of those measures and tools.
- Member States shall entrust the assessment of those measures to the national regulatory authorities or bodies.
- Out-of-court redress mechanisms shall be available for the settlement of disputes between users and video-sharing platform providers.
- Video-sharing platform providers are encouraged to exchange best practices on co-regulatory codes of conduct.
Possible penalties
- Penalties determined by Member States.
Austria
-
Communication Platforms Act [not finalized]
Hate Speech → Austria
Communication Platforms Act [not finalized]
Scope (Service)
- ‘Service provider’: the natural or legal person who operates a communication platform.
- ‘Communication platform’: An information society service, the main purpose or an essential function of which is to enable the exchange of messages or presentations with intellectual content in written, aural or visual form between users and a larger group of other users by way of mass dissemination.
- Excluded are operators of platforms which, over the course of the previous quarter, did not exceed an average of 100 000 registered users and the turnover of which did not exceed EUR 500 000 in the previous year in Austria.
Scope (Content)
- ‘Illegal content’: 14 specific ‘criminal offences’ listed in the Austrian criminal code (StGB)
- Offences related to hate speech listed in e.g. paragraphs 188 and 283
Obligations
- Service providers shall set up an effective and transparent procedure for handling and processing reports on allegedly illegal content available on the platform.
- In response to a report, if the illegality of a piece of content is evident to a legal layperson without further investigation, it has to be removed or have access to it disabled no later than 24 hours after receipt of the report. If its illegality becomes apparent only after a detailed examination, it has to be removed or have access to it disabled after completion of the examination but no later than within 7 days of receipt of the report.
- In the case of illegalities being found, the content concerned, the time of its creation and the data required to identify the author have to be stored for evidence purposes, including for purposes of criminal prosecution, for up to 10 weeks.
- Service providers are obliged to prepare a report on the handling of reports of alleged illegal content on an annual basis (on a quarterly basis for communication platforms with over one million registered users) and to submit it to the supervisory authority no later than 1 month after the end of the calendar year covered in the report. In addition, the report shall be permanently made available and easily accessible on the service provider’s own website at the same time as it is sent.
Possible penalties
- The supervisory authority shall impose a fine of up to EUR 10 million on a service provider, depending on the severity of the violation.
France
-
Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique
Hate Speech → France
Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique
Scope (Service)
- ‘Public online communication service’: Any non-private transmission of digital data resulting from an individual request through an electronic communication process which allows for a reciprocal exchange of information between the sender and the receiver.
Scope (Content)
- ‘Illegal content’: numerous specific provisions laid down in other legislation.
- Offences related to hate speech listed in e.g. paragraphs 7 and 8 of article 24 of the law of July 29, 1881 on freedom of the press
Obligations
- Public online communication services must put in place an easily accessible and visible system enabling anyone to bring to their attention any content falling under the aforementioned scope. They are also obliged to promptly inform the competent authorities of any of the aforementioned illegal activities which may be reported to them.
- Public online communication services must also make public the resources they devote to the fight against these illegal activities.
Possible penalties
- Imprisonment of up to one year and a fine of EUR 75,000
Finland
-
Act on provision of information society services (2002)
Hate Speech → Finland
Act on provision of information society services (2002)
Scope (Service)
- ‘information society services’: Services which are provided as distance services, i.e. without the parties being present at the same time; electronically, i.e. by sending and receiving services via devices handling information electronically; and as data transfers requested personally by recipients of services, usually against payment.
Scope (Content)
- Information, the making available or transmission of which is punishable or constitutes a basis for civil liability – presumably based on the Finnish criminal code.
- Under the presumption that the illegality of content is based on the Finnish criminal code, the code lists numerous offences linked to hate speech, e.g. in chapter 11, section 3.
Obligations
- A court may order the service provider to disable access to certain information based on its criminal nature. In such a process, the service provider and the content producer must be consulted except if the consultation cannot be arranged as quickly as the urgency of the matter requires.
- The court order must also be notified to the content producer. If the content producer is not known, the court may order the service provider to take care of notification.
- The service provider must immediately notify the content producer of the prevention of access to the material supplied by him/her and supply the content producer with a copy of the notification on the basis of which access was prevented.
Possible penalties
- Laid out within the Penalty Payment Act (1113/1990).
Germany
-
Network Enforcement Act – NetzDG
Hate Speech → Germany
Network Enforcement Act – NetzDG
Scope (Service)
- ‘Telemedia service providers’: Internet platforms which, for profit-making purposes, are designed to enable users to share any content with other users or to make such content available to the public (social networks). Platforms offering journalistic or editorial content, the responsibility for which lies with the service provider itself, shall not constitute social networks. The same applies to platforms which are designed to enable individual communication or the dissemination of specific content.
Scope (Content)
- 22 specific ‘criminal offences’ listed in the German criminal code (StGB)
- Offences related to hate speech listed in e.g. section 166
Obligations
- Platforms must put in place an effective and transparent procedure that is easily recognisable, directly accessible and permanently available to users
- Platforms must take note of submitted complaints immediately and assess whether the reported content is illegal and whether it must be removed or blocked
- After receiving a complaint, the platform has seven days to remove or block the reported content (the time limits are illustrated in the sketch after this list), except:
- 1. When the illegality of the content depends on identifiable factual circumstances, the online platform may give the user the opportunity to comment on the complaint before taking a decision
- 2. When the online platform refers the decision on the illegality of the content to a recognised body of regulated self-regulation
- In cases of obviously illegal content, the time limit is reduced to 24 hours
- Online platforms are obliged to inform and justify their decisions without delay to the complainant and the content’s author
- In the event of the removal of illegal content, the online platform must ensure that it is secured and kept for evidence purposes for ten weeks
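As an illustration of how the time limits above interact, the following hedged Python sketch computes a removal deadline from the moment a complaint is received. The function name and the boolean flags are our own simplification, not terminology from the NetzDG, and the two exceptions above (user consultation, referral to a self-regulation body) are modelled only as a single flag that suspends the standard seven-day limit.

```python
# Illustrative simplification of the NetzDG time limits described above.
# The category names and the 'exception_applies' flag are our assumptions.
from datetime import datetime, timedelta
from typing import Optional

def removal_deadline(received_at: datetime,
                     obviously_illegal: bool,
                     exception_applies: bool = False) -> Optional[datetime]:
    """Return the latest removal/blocking time, or None if an exception
    (user consultation or referral to a self-regulation body) suspends
    the standard seven-day limit."""
    if obviously_illegal:
        return received_at + timedelta(hours=24)
    if exception_applies:
        return None
    return received_at + timedelta(days=7)

# Example: a complaint about obviously illegal content received on 1 March, 09:00
print(removal_deadline(datetime(2021, 3, 1, 9, 0), obviously_illegal=True))
# -> 2021-03-02 09:00:00
```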
Possible penalties
- Fines of between EUR 500,000 and EUR 5 million, depending on the type of infringement
Hungary
-
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Hate Speech → Hungary
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Scope (Service)
- ‘Information society service’: Services provided electronically – normally for financial consideration – at a distance and at the individual request of the recipient of the services
- ‘Service provider’: Any natural or legal person, or business association lacking the legal status of a legal person providing an information society service
Scope (Content)
- ‘Illegal information’ – presumably based on the Hungarian criminal code.
- Under the presumption that the illegality of information is based on the Hungarian criminal code, the code lists offences linked to hate speech, notably in section 332.
Obligations
- If in the course of criminal proceedings the court has ordered the rendering of electronic information inaccessible, the service provider shall take measures within one working day upon receipt of the court order for disabling access to the electronically published illegal information.
- The party ordered to render electronic information temporarily or permanently inaccessible shall inform users of the legal basis for removing or disabling access to the content in question, disclosing the name of the court and the number of the ruling.
- If the court orders the service provider to restore access to such electronic information upon the conclusion of the proceedings, the service provider shall take measures to restore access to such information within one working day of receipt of the court order.
Possible penalties
- In the event of non-compliance, the court can impose fines, the amounts of which are prescribed in the Act on Criminal Proceedings or the Act on the Execution of Penal Measures and Sanctions.
Lithuania
-
Law No. XII-1428 on Cyber Security of 11 December 2014:
Hate Speech → Lithuania
Law No. XII-1428 on Cyber Security of 11 December 2014:
Scope (Service)
- ‘Digital services’: A group of services based on communications and information technology which encompasses the services of e-market, web search and/or cloud computing.
- ‘Digital service provider’: A legal entity which provides digital services in the Republic of Lithuania and/or other EU Member States.
Scope (Content)
- ‘Criminal offences’: Laid out in the Lithuanian criminal code
- Offences related to Hate Speech listed in e.g. Art. 170 & 170(1)
Obligations
- While investigating criminal offences taking place on their sites, law enforcement may order the aforementioned services to limit the provision of their services for up to 48 hours without requiring a court sanction. If a court sanction is available the duration can be extended.
- Further, during an investigation, law enforcement can require services to preserve certain information collected during the provision of services, such as a mail address, geographical location address, phone or any other access number, as well as information about accounts and payments made on the basis of a service agreement.
Luxembourg
-
Loi du 14 août 2000 relative au commerce électronique
Hate Speech → Luxembourg
Loi du 14 août 2000 relative au commerce électronique
Scope (Service)
- ‘Information society service’: Any service provided, normally for remuneration, remotely by electronic means and at the individual request of a recipient of services
- ‘Service provider’: Any natural or legal person who provides an information society service.
Scope (Content)
- 2 specific articles of the Luxembourgish penal code
- Offences related to hate speech listed in article 457-1.
Obligations
- Hosting service providers are subject to specific monitoring obligations in order to detect possible infringements relating to specific illegal content.
Malta
-
Electronic Commerce Act
Hate Speech → Malta
Electronic Commerce Act
Scope (Service)
- ‘information society service’: Any service which is provided at a distance, by electronic means and at the individual request of a recipient of the service, whether such service is provided for consideration or not.
- ‘service provider’: Any person established in Malta providing an information society service.
Scope (Content)
- Illegal activity – presumably based on the Maltese criminal code
- Offences related to hate speech listed in e.g. section 82a
Obligations
- Information society service providers shall promptly inform the public authorities competent in the matter of any alleged illegal activity undertaken or information provided by recipients of their service and shall grant to any such authority upon request information enabling the identification of recipients of their service with whom they have storage agreements.
Possible penalties
- Any person contravening the aforementioned obligations shall be liable to a fine of not more than EUR 250,000 or to imprisonment not exceeding two years or both. In the case of a continuous offence a fine not exceeding EUR 2,500 for each day during which the offence continues can be imposed.
The Netherlands
-
Penal Code
Hate Speech → The Netherlands
Penal Code
Scope (Service)
- An intermediary which provides a telecommunication service that consists of the transfer or storage of data from a third party.
Scope (Content)
- Criminal offences – based on the Dutch criminal code
- Offences related to hate speech listed in e.g. sections 137c – 137f.
Obligations
- An intermediary, by order of a public prosecutor, shall take all measures that may be reasonably required of it in order to disable data related to criminal offences.
Poland
-
Law on the protection of freedom of speech on online social networking sites
Hate Speech → Poland
Law on the protection of freedom of speech on online social networking sites
[Based on the draft version of 15th January 2021]
Scope (Services)
- ‘Online social networking sites’: understood as a service provided electronically within the meaning of the law of 18 July 2002 on the provision of electronic services, which allows users to share any content with other users or the general public, and which is used by at least one million registered users in the Republic of Poland.
- ‘Service provider’: understood as the provider of online social networking services, consisting in storing on the online social network information provided by the user at his request, with at least one million registered users.
Scope (Content)
- Illegal content: understood as content that violates personal rights, disinformation, criminal content, as well as content that violates decency, in particular content disseminating or praising violence, suffering or humiliation
- Disinformation: understood as false or misleading information, produced, presented and disseminated for profit or in violation of the public interest.
- Criminal content: understood as content that praises or incites to commit prohibited acts, specified in Arts.: 117-119, 127, 130, 133, 134-135, 137, 140, 148-150, 189-189a, 190a, 194-204, 222-224a, 249-251, 255, 258, 343 of the law of 6 June 1997 – Penal Code (Journal of Laws of 2020, items 1444 and 1517) or which constitutes a prohibited act.
Obligations
- A service provider which receives over 100 user complaints in a calendar year regarding the dissemination of illegal content, the restriction of access to content or the restriction of access to a user’s profile must prepare a report in Polish every six months on how these complaints are resolved and publish it on the online social networking site no later than one month from the end of the given six-month period.
- Reports posted on an online social networking site must be clearly visible, directly and permanently accessible.
- The Service Provider must appoint at least one, but not more than three, representatives in the country.
- The Service Provider must immediately inform the President of the UKE about the appointment or change of a representative in the country, and provide their information, including their email address and address for service. Where the representative in the country is a legal person, the service provider shall provide details of natural persons authorised to act on behalf of that legal person.
- The service provider must publish on the online social network site, in a clearly visible, directly and permanently accessible manner:
- full details of the representative in the country, including the electronic address and address for service, and if the representative in the country is a legal person, also details of natural persons authorised to act on behalf of that legal person;
- details of the service provider, including the exact name of the person running the online social networking site or their name and surname, registration or residential address, delivery address, registration data and email address.
- Persons who conduct internal review proceedings shall be regularly provided with training in Polish by the service provider, at least every six months.
- If the service provider ceases to operate, they are obliged to provide the President of the UKE with the data referred to in Art. 18 of the law of 18 July 2002 on the provision of electronic services, for further storage, sharing and protection.
- The service provider must establish an effective and understandable internal review procedure in Polish in matters relating to user complaints regarding:
- restriction of access to content;
- restriction of access to the user’s profile;
- disseminating illegal content.
- The service provider must publish the rules of the online social network site available to all users on the online social network site in Polish, which must also contain the rules of conducting internal review proceedings. The rules cannot be inconsistent with the provisions of generally applicable law.
- The service provider is obliged to ensure that complaints can be sent to internal review proceedings in a clearly visible, directly and permanently accessible manner.
- A representative in the country shall immediately send confirmation of receipt of the submitted complaint via the email address indicated in the complaint.
- A representative in the country shall examine the user’s complaint and inform them through the indicated email address about the method of consideration, within 48 hours of lodging the complaint.
- If the complaint is accepted, the service provider shall:
- restore restricted access to content, or
- restore limited access to the user’s profile, or
- prevent the distribution of illegal content.
- Information on the method of considering the complaint should include the reasons for the resolution, indicating in particular the legal grounds with reference to the rules of the online social network site and the actual grounds for the decision with justification. The information should also contain instructions on the possibility of submitting a complaint to the Council, as well as the date and manner of filing a complaint.
- The service provider shall inform the user about the possibility of pursuing claims through civil proceedings and the possibility of notifying about the commission of an offence.
- In case a user submits a complaint to the Freedom of Speech Council, because he/she is dissatisfied with the way a complaint was handled by the internal review proceedings, the representative in the country shall provide the Council with the materials collected in the internal review proceedings within 24 hours from the notification of the complaint submitted by the user.
- The service provider shall execute the Council’s decision immediately, and not later than within 24 hours of its delivery.
- A representative in the country, within 24 hours from the deadline for implementing the decision referred to in Para. 2, shall inform the Council about the manner of its implementation. If the service provider fails to implement the decision, the Council shall immediately notify the President of the UKE.
- The service provider may not again limit access to content that was the subject of examination by the Council.
- In the event that criminal content is found, a prosecutor may ask the service provider or representative in the country to send the necessary information, in particular regarding data specifying the user and publications posted on the online social networking site.
- If it is found that criminal content contains a publication with pornographic content involving a minor or content that praises or incites the commission of acts of a terrorist nature, or that further access to this publication creates the risk of significant damage or of effects that would be difficult to reverse, the prosecutor shall immediately issue a decision ordering the service provider to prevent access to this content.
- Within 3 months from the entry into force of the law, service providers shall submit a report for the previous year, containing information on the methods of counteracting disinformation and disseminating illegal content to date.
[Modification of Art. 111 of the law of 5 January 2011 – Election Code (Journal of Laws of 2020, item 1319)]
- After § 1, §1a is added as follows: “§ 1a. The candidate or representative of the election committee concerned has the right to submit to the district court a request for a decision to prohibit the service provider within the meaning of the law of … on the protection of freedom of speech on online social networking sites (Journal of Laws of … item …) of disseminating illegal content within the meaning of the law, produced with the intention of deliberately violating the integrity of elections.”
Penalties
- For a breach by the service provider of any of its obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
- For a breach by a representative in the country of any of his/her obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
Portugal
-
Decree-Law no. 7/2004, of 7 January
Hate Speech → Portugal
Decree-Law no. 7/2004, of 7 January
Scope (Service)
- ‘Information society services’: Any service provided at a distance by electronic means, for remuneration or at least in the scope of an economic activity at the individual request of a recipient of services.
- ‘Intermediary networking service providers’: The providers that render technical services for the access, availability and use of online information or services, which are independent of the generation of the information or service themselves.
Scope (Content)
- Illegal content is understood as content that damages or threatens to damage:
- Human dignity or public policy, including the protection of minors and the fight against any incitement to hatred for reasons of race, sex, religion or nationality, namely on grounds of the prevention of, or combat against, crimes or offences against public policy;
- Public health;
- Public security, namely the safeguarding of national security and defence;
- Consumers, including investors.
Obligations
- Intermediary service providers shall promptly inform competent authorities when they become aware of illegal activities undertaken via services rendered.
- Upon request of the competent authorities, intermediary services should identify recipients of their services with whom they have entered into storage agreements. In addition, upon request, intermediary services should supply lists of owners of hosted websites.
- Intermediary services are required to promptly comply with instructions aiming to terminate or prevent an offence, namely to remove or disable access to given information.
- Service providers only have to disable access to disputed content if its illegality is obvious.
- In case of such a decision, any interested party may appeal to the relevant supervisory entity, which shall present a preliminary settlement within forty-eight hours and notify it immediately to all those involved by electronic means.
- Whatever the decision may be, the intermediary service provider shall not be liable for having or not having removed the contents or disabled the access based on a mere request, where the illegality is not obvious.
Possible penalties
- Service providers that fail to comply with the aforementioned obligations are liable to a fine of EUR 5,000 to EUR 100,000.
Sweden
-
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Hate Speech → Sweden
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Scope (Service)
- ‘Electronic bulletin boards’: A service for the conveyance of electronic messages in text, picture, sound or other information formats. Excluded are services which only provide a network or other connections for the transmission of messages, or other services necessary to use a network or other transmission channel. Also excluded are the mediation of messages within a government agency, between government agencies or within a company or a legal group of companies; emails; and services which are covered by the regulations in the Freedom of the Press Act or the Fundamental Law on Freedom of Expression (printed media, film, radio, television and sound recordings).
Scope (Content)
- A number of fixed offences listed in the Swedish penal code, including provisions on hate speech (chapter 16 section 8).
Obligations
- If a user submits a message to an electronic bulletin board which contains the aforementioned illegalities, the supplier must remove the message or otherwise make it inaccessible. In order to be able to fulfil this obligation, the service provider is allowed to check the content of messages in the service.
Possible penalties
- Fines or a prison sentence of not more than six months, unless the crime is considered severe, in which case the prison sentence shall be no more than two years. Minor infringements are not punished.
Intellectual Property Rights
EU
-
Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market
Intellectual Property Rights → EU
Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market
Scope (Service)
- ‘Online content-sharing service provider’ means a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes.
- Providers of services, such as not-for-profit online encyclopedias, not-for-profit educational and scientific repositories, open source software-developing and -sharing platforms, providers of electronic communications services as defined in Directive (EU) 2018/1972, online marketplaces, business-to-business cloud services and cloud services that allow users to upload content for their own use, are not ‘online content-sharing service providers’ within the meaning of this Directive.
Scope (Content)
- Copyright-protected works or other protected subject matter uploaded by users
Obligations
- When online content-sharing service providers give the public access to copyright-protected works or other protected subject matter uploaded by their users, they shall obtain an authorisation from the relevant rightsholders, for instance by concluding a licensing agreement.
- Online content-sharing service providers shall put in place an effective and expeditious complaint and redress mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them.
-
Memorandum of Understanding on the sale of counterfeit goods via the Internet
Intellectual Property Rights → EU
Memorandum of Understanding on the sale of counterfeit goods via the Internet
Scope (Service)
- Voluntary commitment signed by 28 signatories representing rights owners, internet platforms and associations. The “internet platforms” which have signed the commitment are Alibaba, Amazon, eBay, Facebook Marketplace, Allegro, OLX and Priceminister Rakuten.
Scope (Content)
- The MoU aims to prevent the sale of counterfeit goods which are defined as “non-original physical goods manufactured without the consent of the Rights Owner which infringe IPR, pursuant to applicable Member State or EU law”.
Obligations
- Platforms commit themselves to notice and takedown systems which allow rightsholders to flag items they deem counterfeit.
- If such a claim is found to be valid by platforms, they commit to removing the offer without undue delay.
- Further, Internet Platforms commit to take commercially and technically reasonable steps to request and verify the contact information provided by sellers.
- Internet Platforms also commit to take appropriate, commercially reasonable and technically feasible measures to proactively identify and/or prevent the sale of (obvious) Counterfeit Goods, and to prevent such goods being offered or sold through their services.
- The measures taken by Internet Platforms are up to their discretion.
- Internet Platforms and Rights Owners commit to provide appropriate means to consumers to identify and report Offers of Counterfeit Goods, prior to, or after purchase, to Internet Platforms and to Rights Owners.
-
Memorandum of Understanding on online advertising and IPR
Intellectual Property Rights → EU
Memorandum of Understanding on online advertising and IPR
Scope (Services)
- Signatories represent parties involved in placing, buying, selling and/or facilitating advertising and other parties with an interest in fighting IPR infringement.
Scope (Content)
- Advertising on websites and mobile applications that infringe copyright or that disseminate counterfeit goods, on a commercial scale.
- Counterfeit good: should be understood as goods which are the subject of an act infringing a trade mark in the country where they are found and bear without authorisation a sign which is identical to the trade mark validly registered in respect of the same type of goods, or which cannot be distinguished in its essential aspects from such a trade mark, as well as any packaging, label, sticker, brochure, operating instructions, warranty document or other similar item, even if presented separately, which is the subject of an act infringing a trade mark, which includes a sign, name or term which is identical to a validly registered trade mark, or which cannot be distinguished in its essential aspects from such a trade mark, and which can be used for the same type of goods as that for which the trade mark has been registered.
Obligations
- Advertisers should minimise the placement of their advertising on websites and applications which have been found to infringe copyright or to disseminate counterfeit goods on a commercial scale, and on websites and applications for which advertisers have reasonably available evidence of copyright infringement or the dissemination of counterfeit goods on a commercial scale.
- Advertisers should commit to adopt an IPR policy and make it publicly available.
- Advertising Intermediaries undertake to allow, in their contractual agreements, advertisers or other media buyers to use and/or require the use of tools and safeguards with the aim that the advertising placed through or with support of the Advertising Intermediaries’ services is not placed on websites and mobile applications that infringe IPR on a commercial scale.
- Advertising Intermediaries may also individually allow Advertisers to use and/or require the use of tools and safeguards with the aim that the advertising placed through —or with the support of —the Advertising Intermediaries’ services is not placed on websites or mobile applications that have been identified by Advertisers in their advertising placement policies.
- Advertising Intermediaries should commit to adopt an IPR policy and make it publicly available.
-
Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)
Intellectual Property Rights → EU
Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)
Scope (Service)
- ‘Intermediary service’ means one of the following services:
- a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
- a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request;
- a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service;
- ‘Online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
- ‘Very large online platforms’ means online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million.
Scope (Content)
- ‘Illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law.
Proposals
- Obligations for all intermediary service providers:
- Providers need to establish a single point of contact for direct communication with authorities (e.g., MS authorities and the Commission).
- Providers which are not established in the Union but offer services in the Union need to designate a legal representative in one of the Member States where their service is offered. This designated legal representative can be held liable for non-compliance with obligations under this Regulation.
- Providers’ terms and conditions must include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review.
- At least once a year, providers must publish a report on any content moderation they engaged in during the relevant period, including the number of orders received from Member States, the number of notices submitted, information on own content moderation initiatives and the number of complaints received through internal complaints mechanisms.
- Obligations for hosting service providers, including online platforms:
- Providers must put in place notice and action mechanisms allowing the submission of notices. Complete notices confer actual knowledge of the specific item of information. The submitter of the notice shall be informed of the decision taken, providing information on redress. If automated means are used for the processing or decision-making, this must be stated.
- When access to specific items is removed or disabled, the provider shall inform the poster about it, stating, among other things, the basis for taking the decision, information on the use of automated means in taking the decision, a reference to the legal basis for cases in which content is considered illegal and information on the redress possibilities. Providers shall publish the decisions and the statements of reasons (excluding personal data) in a publicly accessible database managed by the Commission.
- Obligations for all online platforms, excluding micro- and small enterprises:
- Online platforms must provide an internal complaint-handling system through which users can contest decisions taken by the online platform on content that is illegal or incompatible with the terms and conditions, for six months after the decision is taken. Where there are sufficient grounds for the platform to reconsider, the decision shall be reversed. Decisions must not be taken by purely automated means. In any case, complainants shall be informed of the decision taken by the platform and about the possibility of out-of-court dispute settlement and other available redress possibilities.
- Complainants are entitled to select any certified out-of-court dispute settlement body in order to resolve disputes. Platforms are bound by the decision of that body. If the body decides in favour of the complainant, the platform shall reimburse the recipient for any expenses related to the dispute settlement. If the body decides in favour of the platform, the recipient shall not be required to reimburse the platform.
- Online platforms shall prioritise notices submitted by trusted flaggers. Applicants must fulfil specific criteria and are selected by the Digital Services Coordinator of establishment. Where a trusted flagger has submitted a significant number of inadequate notices, the online platform shall communicate that information to the Digital Services Coordinator that awarded the trusted flagger status.
- Online platforms shall, after a warning, suspend their services to users that frequently provide manifestly illegal content. Similarly, online platforms shall, after a warning, suspend the processing of complaints by users which frequently submit unfounded notices. This policy is to be set out in the terms and conditions.
- Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State which is suspected to be affected of its suspicion and provide all relevant information available.
- If an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained a range of information such as contact details, an ID copy, bank details and the trade registry number. Upon receiving the information, platforms should make efforts to assess the information for validity and completeness. The information must be stored for the duration of their contractual relationship with the trader and subsequently deleted.
- Beyond Article 13, online platforms must also supply information on out-of-court dispute settlements, suspensions, and the use of automatic means for the purpose of content moderation. In addition, platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State. This information can also be requested by the Digital Services Coordinator of establishment at any moment.
- Platforms that display advertising must allow users to identify, for each specific advertisement displayed to each individual recipient, the natural or legal person on whose behalf the advertisement is displayed, as well as meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
- Obligations for very large online platforms:
- Very large platforms shall identify, at least once a year, any significant systemic risks stemming from the functioning and use made of their services. This risk assessment shall include the dissemination of illegal content, any negative effects for the exercise of fundamental rights and intentional manipulation of their services with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse or electoral processes and public security. This shall also account for how content moderation and recommender systems influence any of these risks.
- Very large platforms shall put in place mitigation measures tailored to specific systemic risks, for example by adapting content moderation or recommender systems, introducing targeted measures aimed at limiting the display of advertisements in association with the service they provide, or initiating cooperation with other online platforms through the codes of conduct and the crisis protocols.
- Very large platforms are subject, at their own expense and at least once a year, to audits to assess compliance with obligations set out in this chapter and commitments undertaken under codes of conduct. If an audit report is not positive, the platform must take the necessary measures to implement recommendations for improvement. One month after receiving the recommendations, an audit implementation report shall be adopted which sets out those measures. If operational recommendations are not implemented, platforms shall justify the reasons for not doing so and set out any alternative measures they may have taken.
- Very large platforms which use recommender systems shall set out the main parameters used in their recommender systems as well as any options for the user to modify or influence those main parameters in their terms and conditions. This shall include at least one option which is not based on profiling.
- Very large platforms that display advertising shall make publicly available a repository containing the content of each advertisement, the natural or legal person on whose behalf the advertisement is displayed, the period during which the advertisement was displayed, whether the advertisement was intended to be displayed specifically to particular groups and if so, which parameters were used for that purpose and the total number of recipients specifying, where applicable, aggregate numbers for the groups which were targeted specifically. Entries on this list shall be kept until one year after the advertisement was displayed for the last time on their online interfaces.
- Very large platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation.
- Very large platforms shall appoint one or more qualified compliance officers responsible for monitoring their compliance with this Regulation.
- Very large platforms must publish the Article 13 transparency reports every six months, as well as annual risk assessment reports and the results of their audits (all of which must also be sent to the Digital Services Coordinator in their country of establishment). The platform can edit the published versions of these reports where there is a concern that, e.g., trade secrets would be affected. However, complete reports must still be sent to the relevant Digital Services Coordinator.
- Other proposals:
- Where significant systemic risks emerge which concern several very large online platforms, the Commission may invite these platforms, other online platforms and other providers of intermediary services, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct.
- The Board can establish crisis protocols for extraordinary circumstances affecting public security or health. Platforms and very large platforms will have to work with the Commission to draw up and test these protocols, as well as with other stakeholders where necessary, and with Member State authorities. Work will be done to define what a crisis is, and how the protocols are to be put in place.
Possible penalties
- The maximum amount of penalties imposed for a failure to comply shall not exceed 6% of the annual income or turnover.
- Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover.
- The maximum amount of a periodic penalty payment shall not exceed 5% of the average daily turnover (the three caps are illustrated in the sketch below).
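To make the relationship between the three ceilings concrete, here is a minimal Python sketch; the turnover figures are invented for illustration and the variable names are ours, not terms from the proposal. The actual fine within each cap would be set by the competent authority.

```python
# Illustrative arithmetic for the DSA proposal's penalty ceilings described above.
# Turnover figures are hypothetical, chosen only for the example.
annual_turnover_eur = 2_000_000_000            # hypothetical annual income or turnover
avg_daily_turnover_eur = annual_turnover_eur / 365

max_fine_non_compliance = 0.06 * annual_turnover_eur          # 6% cap
max_fine_incorrect_information = 0.01 * annual_turnover_eur   # 1% cap
max_periodic_penalty_per_day = 0.05 * avg_daily_turnover_eur  # 5% of avg daily turnover

print(f"Non-compliance cap:      EUR {max_fine_non_compliance:,.0f}")
print(f"Information-failure cap: EUR {max_fine_incorrect_information:,.0f}")
print(f"Daily periodic cap:      EUR {max_periodic_penalty_per_day:,.0f}")
```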
Finland
-
Act on provision of information society services (2002)
Intellectual Property Rights → Finland
Act on provision of information society services (2002)
Scope (Service)
- ‘information society services’: Services which are provided as distance services, i.e. without the parties being present at the same time; electronically, i.e. by sending and receiving services via devices handling information electronically; and as data transfers requested personally by recipients of services, usually against payment.
Scope (Content)
- Infringements of copyright and neighbouring rights.
Obligations
- A holder of copyright or his/her representative may request a service provider to prevent access to material infringing copyright. The same applies to a holder of neighbouring rights and his/her representative if it concerns material infringing this right.
- A request must be presented to the content producer whose material the request concerns. If the content producer cannot be identified or if he/she does not remove the material or prevent access to it expeditiously, the request may be submitted to the service provider.
Possible penalties
- Laid out within the Penalty Payment Act (1113/1990).
Hungary
-
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Intellectual Property Rights → Hungary
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Scope (Service)
- ‘Information society service’: Services provided electronically – normally for financial consideration – at a distance and at the individual request of the recipient of the services
- ‘Service provider’: Any natural or legal person, or business association lacking the legal status of a legal person providing an information society service
Scope (Content)
- Any infringements on copyright
Obligations
- Within twelve hours following receipt of a notification by a rightsholder, the service provider shall take the measures necessary to remove or disable access to the content in question. Within three working days it shall also inform in writing the recipient of the service who provided the infringing information, indicating the basis on which the information was taken down (the successive deadlines are summarised in the sketch after this list).
- Upon receipt of the notice, the provider of the information has eight days to lodge an objection to the decision.
- Upon receipt of an objection, the service provider shall proceed without delay to restore access to the information in question, and shall simultaneously send a copy of the objection to the rightsholder.
- If the rightsholder opposes the objection and files a complaint within 10 days after receipt of the notice from the service provider, the service provider shall disable access to the content again within 12 hours of receiving the court decision and inform the recipient of the service within 1 working day, supplying the court decision.
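To keep the successive deadlines of this notice-and-counter-notice procedure in view, the sketch below simply records them as data; the step labels are our own shorthand and the representation is purely illustrative, not part of the Act.

```python
# Illustrative summary of the deadlines in the Hungarian notice-and-takedown
# procedure described above. Step labels are our own shorthand.
from datetime import timedelta

HU_COPYRIGHT_TAKEDOWN_DEADLINES = {
    "remove_or_disable_after_rightsholder_notice": timedelta(hours=12),
    "inform_uploader_in_writing": timedelta(days=3),         # working days
    "uploader_may_object_within": timedelta(days=8),
    "rightsholder_may_file_complaint_within": timedelta(days=10),
    "disable_again_after_court_decision": timedelta(hours=12),
    "inform_uploader_of_court_decision": timedelta(days=1),   # working day
}

for step, limit in HU_COPYRIGHT_TAKEDOWN_DEADLINES.items():
    print(f"{step}: {limit}")
```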
Italy
-
AGCOM Regulations regarding Online Copyright Enforcement, 680/13/CONS, December 12, 2013 [Currently being amended by Resolution No 540/20/CONS]
Intellectual Property Rights → Italy
AGCOM Regulations regarding Online Copyright Enforcement, 680/13/CONS, December 12, 2013 [Currently being amended by Resolution No 540/20/CONS]
Text only available in Italian
Scope (Service)
Scope (Content)
- Any copyright violation
Obligations
- Rightsholders can submit a complaint, after which all parties concerned (e.g., service providers, uploaders, page/site owners) are invited to present relevant documentation.
- Where an actual infringement of the copyright law is attested in the online environment, AGCOM can adopt different measures depending on the location of the server hosting the content: in cases where the server is located in Italy, AGCOM may order the hosting provider to remove the digital work from the website; if the server is located outside Italy, AGCOM may intervene only towards mere conduit providers who may be ordered to disable the access to the website.
- With regard to AVMS providers, on-demand providers may be ordered to remove illegal content from their catalogues and linear service providers may be ordered to refrain from retransmitting illegal works in their future schedules.
Possible penalties
- In cases of non-compliance with the orders, AGCOM can impose a fine from EUR 10,000 up to EUR 250,000.
Lithuania
-
Law No. XII-1428 on Cyber Security of 11 December 2014:
Intellectual Property Rights → Lithuania
Law No. XII-1428 on Cyber Security of 11 December 2014:
Scope (Service)
- ‘Digital services’: A group of services based on communications and information technology which encompasses the services of e-market, web search and/or cloud computing.
- ‘Digital service provider’: A legal entity which provides digital services in the Republic of Lithuania and/or other EU Member States.
Scope (Content)
- ‘Criminal offences’: Laid out in the Lithuanian criminal code
- Offences related to Intellectual Property Rights listed in Chapter 29
Obligations
- While investigating criminal offences taking place on their sites, law enforcement may order the aforementioned services to limit the provision of their services for up to 48 hours without requiring a court sanction. If a court sanction is available the duration can be extended.
- Further, during an investigation, law enforcement can require services to preserve certain information collected during the provision of services, such as a mail address, geographical location address, phone or any other access number, as well as information about accounts and payments made on the basis of a service agreement.
Malta
-
Electronic Commerce Act
Intellectual Property Rights → Malta
Electronic Commerce Act
Scope (Service)
- ‘information society service’: Any service which is provided at a distance, by electronic means and at the individual request of a recipient of the service, whether such service is provided for consideration or not.
- ‘service provider’: Any person established in Malta providing an information society service.
Scope (Content)
- Illegal activity – presumably based on the Maltese criminal code
- Offences related to intellectual property rights listed in e.g. section 298b
Obligations
- Information society service providers shall promptly inform the public authorities competent in the matter of any alleged illegal activity undertaken or information provided by recipients of their service and shall grant to any such authority upon request information enabling the identification of recipients of their service with whom they have storage agreements.
Possible penalties
- Any person contravening the aforementioned obligations shall be liable to a fine of not more than EUR 250,000 or to imprisonment not exceeding two years or both. In the case of a continuous offence a fine not exceeding EUR 2,500 for each day during which the offence continues can be imposed.
Sweden
-
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Intellectual Property Rights → Sweden
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Scope (Service)
- ‘Electronic bulletin boards’: A service for the conveyance of electronic messages in text, picture, sound or other information formats. Excluded are services which only provide a network or other connections for the transmission of messages, or other services necessary to use a network or other transmission channel. Also excluded are the mediation of messages within a government agency, between government agencies or within a company or a legal group of companies; emails; and services which are covered by the regulations in the Freedom of the Press Act or the Fundamental Law on Freedom of Expression (printed media, film, radio, television and sound recordings).
Scope (Content)
- Copyright infringements
Obligations
- If a user submits a message to an electronic bulletin board which contains the aforementioned illegalities, the supplier must remove the message or otherwise make it inaccessible. In order to be able to fulfil this obligation, the service provider is allowed to check the content of messages in the service.
Possible penalties
- Fines or a prison sentence of not more than six months, unless the crime is considered severe, in which case the prison sentence shall be no more than two years. Minor infringements are not punished.
Product Safety
EU
-
Directive 2009/48/EC of the European Parliament and of the Council of 18 June 2009 on the safety of toys
Product Safety → EU
Directive 2009/48/EC of the European Parliament and of the Council of 18 June 2009 on the safety of toys
Scope (Service)
- “Distributor” means any natural or legal person in the supply chain, other than the manufacturer or the importer, who makes a toy available on the market.
Scope (Content)
- “Toys”: products designed or intended, whether or not exclusively, for use in play by children under 14 years of age.
Obligations
- Before making a toy available on the market, distributors shall verify that the toy bears the required conformity marking, that it is accompanied by the required documents and by instructions and safety information in a language or languages easily understood by consumers in the Member State in which the toy is to be made available on the market, and that the manufacturer and the importer have complied with the requirements set out in Article 4(5) and (6) and Article 6(3).
- Distributors shall ensure that, while a toy is under their responsibility, storage or transport conditions do not jeopardise its compliance with the requirements set out in Article 10 and Annex II.
- Distributors who consider or have reason to believe that a toy which they have made available on the market is not in conformity with the relevant Community harmonisation legislation shall make sure that the corrective measures necessary to bring that toy into conformity, to withdraw it or recall it, if appropriate, are taken. Furthermore, where the toy presents a risk, distributors shall immediately inform the competent national authorities of the Member States in which they made the toy available to that effect, giving details, in particular, of the non-compliance and of any corrective measures taken.
- Distributors shall, further to a reasoned request from a competent national authority, provide that authority with all the information and documentation necessary to demonstrate the conformity of the toy. They shall cooperate with that authority, at its request, as regards any action taken to eliminate the risks posed by toys which they have made available on the market.
Possible penalties
- Penalties determined by Member States.
-
Regulation 2019/1148 on the marketing and use of explosives precursors
Product Safety → EU
Regulation 2019/1148 on the marketing and use of explosives precursors
Scope (Service)
- “Economic operator(s)”: any natural or legal person or public entity or group of such persons and/or bodies which delivers products or services on the market.
Scope (Content)
- Offers for the sale of certain substances, or of articles containing those substances, which can be used in the manufacturing of explosives (listed in the Regulation’s Annexes).
Obligations
- “Suspicious transactions involving the substances listed in the Annexes, or involving mixtures or substances containing them, shall be reported in accordance with this Article”. This provision could imply a monitoring obligation for platforms to ensure that goods containing the aforementioned substances are not sold to the general public.
Beyond These Categories
EU
-
Directive 2018/1972 establishing the European Electronic Communications Code
Beyond These Categories → EU
Directive 2018/1972 establishing the European Electronic Communications Code
Scope (service)
- ‘Number-independent interpersonal communications service’ means an interpersonal communications service which does not connect with publicly assigned numbering resources, namely, a number or numbers in national or international numbering plans, or which does not enable communication with a number or numbers in national or international numbering plans.
- ‘Micro-enterprises’ providing number-independent interpersonal communications services are exempted from many of the obligations.
Scope (content)
- Not applicable
Obligations
- Member States shall ensure that providers of public electronic communications networks or of publicly available electronic communications services take appropriate and proportionate technical and organisational measures to appropriately manage the risks posed to the security of networks and services (encryption where appropriate).
- The Commission can adopt implementing acts detailing the technical and organisational measures referred to in paragraph 1.
- Member States shall ensure that competent authorities have the power to require providers of public electronic communications networks or publicly available electronic communications services to: (a) provide information needed to assess the security of their networks and services, including documented security policies; and (b) submit to a security audit carried out by a qualified independent body or a competent authority and make the results thereof available to the competent authority; the cost of the audit shall be paid by the provider.
- National regulatory authorities, or other competent authorities shall be able to impose in justified cases, where end-to-end connectivity between end-users is endangered due to a lack of interoperability between interpersonal communications services, and to the extent necessary to ensure end-to-end connectivity between end-users, obligations on relevant providers of number-independent interpersonal communications services which reach a significant level of coverage and user uptake, to make their services interoperable.
- Possibility that digital platform service providers might in the future contribute towards the costs of meeting the universal service obligation.
- Information requirements for contracts, including a concise and easily readable contract summary.
- Transparency, comparison of offers and publication of information.
- National regulatory authorities in coordination with other competent authorities may require providers of internet access services and of publicly available interpersonal communications services to publish comprehensive, comparable, reliable, user-friendly and up-to-date information for end-users on the quality of their services. National regulatory authorities in coordination with other competent authorities may also require providers of publicly available interpersonal communication services to inform consumers if the quality of the services they provide depends on any external factors, such as control of signal transmission or network connectivity.
- Member States shall ensure that all end-users, including users of public pay telephones, are able to access the emergency services through emergency communications free of charge and without having to use any means of payment, by using the single European emergency number ‘112’ and any national emergency number specified by Member States.
- Member States shall ensure that all providers of number-based interpersonal communications services which attribute numbers from a numbering plan meet all reasonable requests to make available, for the purposes of the provision of publicly available directory enquiry services and directories, the relevant information in an agreed format, on terms which are fair, objective, cost oriented and non-discriminatory.
Penalties
- Penalties determined by Member States “only where an undertaking or public authority knowingly or grossly negligently provides misleading, erroneous or incomplete information.”
-
Directive 2018/1808 on Audiovisual Media Services
Beyond These Categories → EU
Directive 2018/1808 on Audiovisual Media Services
Scope (Service)
- ‘Video-sharing platform service’ means a service where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos, or both, to the general public, for which the video-sharing platform provider does not have editorial responsibility, in order to inform, entertain or educate, by means of electronic communications networks and the organisation of which is determined by the video-sharing platform provider, including by automatic means or algorithms in particular by displaying, tagging and sequencing.
- Social media services are in scope and are considered as video-sharing platform services if the provision of programmes and user-generated videos constitutes an essential functionality of that service.
Scope (Content)
- VSPs shall take appropriate measures to protect:
- minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in accordance with Article 6a(1).
- the general public from programmes, user-generated videos and audiovisual commercial communications containing incitement to violence or hatred directed against a group of persons or a member of a group based on any of the grounds referred to in Article 21 of the Charter.
- the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence, offences concerning child pornography and offences concerning racism and xenophobia.
Obligations
- VSPs shall comply with requirements on audiovisual commercial communications, “taking into account the limited control exercised by those video-sharing platforms over those audiovisual commercial communications”. The measures imposed by MS shall be practicable and proportionate, taking into account the size of the video-sharing platform service and the nature of the service that is provided. Those measures shall not lead to any ex-ante control measures or upload-filtering of content.
- These measures shall consist of:
- (a) including and applying in the terms and conditions of the video-sharing platform services the requirements referred to in paragraph 1;
- (b) including and applying in the terms and conditions of the video-sharing platform services the requirements set out in Article 9(1) for audiovisual commercial communications that are not marketed, sold or arranged by the video-sharing platform providers;
- (c) having a functionality for users who upload user-generated videos to declare whether such videos contain audiovisual commercial communications as far as they know or can be reasonably expected to know;
- (d) establishing and operating transparent and user-friendly mechanisms for users of a video-sharing platform to report or flag to the video-sharing platform provider concerned the content referred to in paragraph 1 provided on its platform;
- (e) establishing and operating systems through which video-sharing platform providers explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point (d);
- (f) establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
- (g) establishing and operating easy-to-use systems allowing users of video-sharing platforms to rate the content referred to in paragraph 1;
- (h) providing for parental control systems that are under the control of the end-user with respect to content which may impair the physical, mental or moral development of minors;
- (i) establishing and operating transparent, easy-to-use and effective procedures for the handling and resolution of users’ complaints to the video-sharing platform provider in relation to the implementation of the measures referred to in points (d) to (h);
- (j) providing for effective media literacy measures and tools and raising users’ awareness of those measures and tools.
- Member States shall entrust the assessment of those measures to the national regulatory authorities or bodies.
- Out-of-court redress mechanisms shall be available for the settlement of disputes between users and video-sharing platform providers.
- Video-sharing platform providers are encouraged to exchange best practices on co-regulatory codes of conduct.
Penalties
- Penalties determined by Member States.
-
Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)
Beyond These Categories → EU
Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)
Scope (Service)
- ‘Intermediary service’ means one of the following services:
- a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
- a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request;
- a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service;
- ‘Online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
- ‘Very large online platforms’ means online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million.
Scope (Content)
- ‘Illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law.
Proposals
- Obligations for all intermediary service providers:
- Providers need to establish a single point of contact for direct communication with authorities (e.g., MS authorities and the Commission).
- Providers which are not established in the Union but offer services in the Union need to designate a legal representative in one of the Member States where their services are offered. This designated legal representative can be held liable for non-compliance with obligations under this Regulation.
- Providers’ terms and conditions must include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review.
- At least once a year, providers must publish a report on any content moderation they engaged in during the relevant period, including the number of orders received from Member States, the number of notices submitted, information on own content moderation initiatives and the number of complaints received through internal complaints mechanisms.
- Obligations for hosting service providers, including online platforms:
- Providers must put in place notice and action mechanisms to submit notices. Complete notices confer actual knowledge of the specific item of information. The submitter of the notice shall be informed of the decision taken, providing information on redress. If automated means are used for the processing or decision-making, it must be stated that this was the case.
- When access to specific items is removed or disabled, the provider shall inform the poster about it, stating, among other things, the basis for taking the decision, information on the use of automated means in taking the decision, a reference to the legal basis for cases in which content is considered illegal and information on the redress possibilities. Providers shall publish the decisions and the statements of reasons (excluding personal data) in a publicly accessible database managed by the Commission.
- Obligations for all online platforms, excluding micro- and small enterprises:
- Online platforms must provide an internal complaint-handling system to contest decisions taken by the online platform on content that is illegal or incompatible with the terms and conditions, for six months after the decision is taken. Where there are sufficient grounds for the platform to reconsider, the decision shall be reversed. Decisions must not be taken by purely automated means. In any case, complainants shall be informed of the decision taken by the platforms and about the possibility of out-of-court dispute settlements and other available redress possibilities.
- Complainants are entitled to select any certified out-of-court dispute settlement body in order to resolve disputes. Platforms are bound by the decision of that body. If the body decides in favour of the complainant, the platform shall reimburse the recipient for any expenses related to the dispute settlement. If the body decides in favour of the platform, the recipient shall not be required to reimburse the platform.
- Online platforms shall prioritize notices submitted by trusted flaggers. Applicants must fulfil specific criteria and are selected by the Digital Services Coordinator of establishment. Where a trusted flagger has submitted a significant number of inadequate notices, the online platform shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger.
- Online platforms shall, after a warning, suspend their services to users that frequently provide manifestly illegal content. Similarly, online platforms shall, after a warning, suspend the processing of complaints by users which frequently submit unfounded notices. This policy is to be set out in the terms and conditions.
- Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State which is suspected to be affected of its suspicion and provide all relevant information available.
- If an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained a range of information such as contact details, an ID copy, bank details and the trade registry number. Upon receiving the information, platforms should make efforts to assess the information for validity and completeness. The information must be stored for the duration of their contractual relationship with the trader and subsequently deleted.
- Beyond Article 13, online platforms must also supply information on out-of-court dispute settlements, suspensions, and the use of automatic means for the purpose of content moderation. In addition, platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State. This information can also be requested by the Digital Services Coordinator of establishment at any moment.
- Platforms that display advertising must allow users to identify, for each specific advertisement displayed to each individual recipient, the natural or legal person on whose behalf the advertisement is displayed as well as meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
- Obligations for very large online platforms:
- Very large platforms shall identify, at least once a year, any significant systemic risks stemming from the functioning and use made of their services. This risk assessment shall include the dissemination of illegal content, any negative effects for the exercise of fundamental rights and intentional manipulation of their services with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse or electoral processes and public security. This shall also account for how content moderation and recommender systems influence any of these risks.
- Very large platforms shall put in place mitigation measures tailored to specific systemic risks, for example by adapting content moderation or recommender systems, introducing targeted measures aimed at limiting the display of advertisements in association with the service they provide, or initiating cooperation with other online platforms through the codes of conduct and the crisis protocols.
- Very large platforms are subject, at their own expense and at least once a year, to audits to assess compliance with obligations set out in this chapter and commitments undertaken under codes of conduct. If an audit report is not positive, the platform must take the necessary measures to implement recommendations for improvement. One month after receiving the recommendations, an audit implementation report shall be adopted which sets out those measures. If operational recommendations are not implemented, platforms shall justify the reasons for not doing so and set out any alternative measures they may have taken.
- Very large platforms which use recommender systems shall set out the main parameters used in their recommender systems as well as any options for the user to modify or influence those main parameters in their terms and conditions. This shall include at least one option which is not based on profiling.
- Very large platforms that display advertising shall make publicly available a repository containing the content of each advertisement, the natural or legal person on whose behalf the advertisement is displayed, the period during which the advertisement was displayed, whether the advertisement was intended to be displayed specifically to particular groups and if so, which parameters were used for that purpose and the total number of recipients specifying, where applicable, aggregate numbers for the groups which were targeted specifically. Entries on this list shall be kept until one year after the advertisement was displayed for the last time on their online interfaces.
- Very large platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation.
- Very large platforms shall appoint one or more qualified compliance officers responsible for monitoring their compliance with this Regulation.
- Very large platforms must publish their transparency reports every six months, as well as annual risk assessment reports and the results of their audits (all of which must also be sent to the Digital Services Coordinator in their country of establishment). The platform may redact these public reports where there is a concern that, for example, trade secrets would be impacted. However, complete reports must still be sent to the relevant Digital Services Coordinator.
- Other proposals:
- Where significant systemic risks emerge which concern several very large online platforms, the Commission may invite these platforms, other online platforms and other providers of intermediary services, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct.
- The Board can establish crisis protocols for extraordinary circumstances affecting public security or health. Platforms and very large platforms will have to work with the Commission, Member State authorities and, where necessary, other stakeholders to draw up and test these protocols. Work will be done to define what a crisis is and how the protocols are to be put in place.
Possible penalties
- The maximum amount of penalties imposed for a failure to comply shall not exceed 6% of the annual income or turnover.
- Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or turnover.
- Maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover.
Austria
-
Communication Platforms Act [not finalized]
Beyond These Categories → Austria
Communication Platforms Act [not finalized]
Scope (Service)
- ‘Service provider’: the natural or legal person who operates a communication platform.
- ‘Communication platform’: An information society service, the main purpose or an essential function of which is to enable the exchange of messages or presentations with intellectual content in written, aural or visual form between users and a larger group of other users by way of mass dissemination.
- Excluded are operators of platforms which, over the course of the previous quarter, did not exceed an average of 100 000 registered users and the turnover of which did not exceed EUR 500 000 in the previous year in Austria.
Scope (Content)
- ‘Illegal content’: 14 specific ‘criminal offences’ listed in the Austrian criminal code (StGB), namely paragraphs 105; 107; 107c; 113; 115; 120a; 144; 188; 207a; 208a; 278; 278f; 282a; 283
Obligations
- Service providers shall set up an effective and transparent procedure for handling and processing reports on allegedly illegal content available on the platform.
- In response to a report, if the illegality of a piece of content is evident to a legal layperson without further investigation, it has to be removed or have access to it disabled no later than 24 hours after receipt of the report. If its illegality becomes apparent only after a detailed examination, it has to be removed or have access to it disabled after completion of the examination but no later than within 7 days of receipt of the report.
- In the case of illegalities being found, the content concerned, the time of its creation and the data required to identify the author have to be stored for evidence purposes, including for purposes of criminal prosecution, for up to 10 weeks.
- Service providers are obliged to prepare a report on the handling of reports of alleged illegal content on an annual basis (on a quarterly basis for communication platforms with over one million registered users) and to submit it to the supervisory authority no later than 1 month after the end of the calendar year covered in the report. In addition, the report shall be permanently made available and easily accessible on the service provider’s own website at the same time as it is sent.
Possible penalties
- The supervisory authority shall impose a fine of up to EUR 10 million on a service provider, depending on the severity of the violation.
France
-
Projet de loi nº 3649 confortant le respect des principes de la République
Beyond These Categories → France
Projet de loi nº 3649 confortant le respect des principes de la République
[Article 19bis tabled by the French government, amending Loi n° 2004‑575 du 21 juin 2004 pour la confiance dans l’économie numérique]
Scope (Service)
- Online platform operators offering an online communication service to the public based on the classification, referencing or sharing of content published online by third parties, and whose activity on French territory exceeds a threshold number of connections determined by decree, whether or not they are established on French territory, contribute to the fight against the public dissemination of the content described below.
- Social networks, video or image sharing platforms and search engines are in scope, unlike marketplaces, ISPs and cloud providers, which are excluded.
Scope (Content)
- Apology for crimes, war crimes, crimes against humanity, enslavement or exploitation of an enslaved person, incitement to hatred or violence, discrimination, racist or homophobic insults and negationist speech.
Proposals
- Online platform operators shall implement proportionate procedures and human and technological means that allow:
- To inform, as soon as possible, the judicial or administrative authorities of the actions they have taken following the injunctions issued by the latter relating to illegal content;
- To acknowledge receipt without delay of requests from the judicial or administrative authorities for the communication of data in their possession which could enable the identification of users who have placed online the content mentioned in the first paragraph, and to inform these authorities as soon as possible of the action taken on these requests;
- To temporarily store content that has been reported to them as contrary to the provisions mentioned in the first paragraph and that they have removed or made inaccessible, for the purpose of making it available to the judicial authority for the purposes of investigating, establishing and prosecuting criminal offences; the duration and methods of storage of such content shall be defined by decree in the Conseil d’Etat, after obtaining the opinion of the Commission nationale de l’informatique et des libertés.
- They shall designate a single contact point in charge of communication with the public authorities for the implementation of the provisions of the present article, to which may in particular be addressed by electronic means the requests submitted by the Conseil supérieur de l’audiovisuel;
- They shall make available to the public, in an easily accessible manner, the general terms and conditions of use of the service they offer; they shall include provisions prohibiting the posting of illegal content; they describe in clear and precise terms their moderation system aimed at detecting, identifying and processing such content, detailing the procedures and the human or automated means employed for this purpose as well as the measures they implement affecting the availability, visibility and accessibility of such content; they indicate the measures they implement with regard to users who have placed such content online, as well as the domestic and legal remedies available to such users;
- They shall report to the public on the means implemented and the measures adopted to combat the dissemination of the content referred to in the said first paragraph, through the publication, in accordance with procedures and at intervals laid down by the Conseil supérieur de l’audiovisuel, of information and quantified indicators defined by the latter, relating in particular to the processing of notifications received and users’ internal appeals and, where appropriate, the criteria for the selection of trusted third parties whose notifications are given priority treatment and the arrangements for cooperation with such third parties;
- They shall put in place an easily accessible and user-friendly mechanism enabling any person to bring to their attention, by electronic means, content which they consider to be contrary to the provisions referred to in the first subparagraph, allowing them to specify clearly their location and the reasons why they consider that such content should be considered illegal and to provide contact information, and informing them of the penalties incurred in the event of improper notification;
- They implement proportionate procedures and human and technological means that allow:
- to acknowledge receipt without delay of notifications relating to the contents mentioned in the first paragraph, subject to having the necessary information to contact their author;
- to ensure their appropriate examination within a short period of time;
- to inform their authors of the action taken on them and of the domestic and legal remedies available to them, subject to having the necessary information to contact them;
- When they decide to remove or make inaccessible content on the ground that it disregards the provisions mentioned in the first paragraph, to inform the user who published it, subject to the availability of the necessary contact information:
- indicating the reasons for this decision;
- by specifying whether this decision was taken using an automated tool;
- by informing him/her of the domestic and legal remedies available to him/her;
- and informing them that civil and criminal penalties are incurred for the publication of illegal content.
- They implement internal recourse mechanisms allowing:
- the author of a notification relating to a content mentioned in the first subparagraph to challenge the decision adopted by the operator in response to that notification;
- the user who is at the origin of the publication of content that has been the subject of a decision mentioned in paragraph 6(d) to challenge that decision;
- a user who has been the subject of a decision based on an infringement of the operator’s terms and conditions to challenge that decision.
- They shall ensure that these mechanisms are easily accessible and user-friendly and that they permit the appropriate handling of appeals as soon as possible, not relying solely on the use of automated means, the prompt informing of the user of the decision adopted, and the cancellation without delay of measures relating to the content in question or to the user implemented by the operator where the appeal leads the operator to consider that the contested decision was not justified;
- When they decide to implement such procedures, they shall set out in their conditions of use, in clear and precise terms, the procedures leading them:
- (a) To suspend or, in the most serious cases, terminate the accounts of users who have repeatedly placed online content contrary to the provisions mentioned in the same first subparagraph;
- (b) To suspend access to the notification system for users who have repeatedly submitted manifestly unfounded notifications relating to the content mentioned in the first paragraph.
- Where such procedures are implemented, they shall provide for a case-by-case examination aimed at objectively characterising the existence of conduct referred to in (a) or (b), taking into account in particular:
- the number of unlawful contents mentioned in the first subparagraph, or manifestly unfounded notifications initiated by the user during the past year, both in absolute terms and as a proportion of the total number of contents or notifications initiated by the user;
- and the seriousness and consequences of such abuse.
- When implemented, these procedures shall provide that the measures referred to in (a) and (b) are proportionate to the seriousness of the conduct in question and, in the case of a suspension, that it is for a reasonable period of time. They shall provide for prior warning of the user and information on the domestic and judicial remedies available to him.
- Obligations for online platform operators with a number of connections above a certain threshold:
- They shall carry out an annual assessment of the systemic risks linked to the operation and use of their services with regard to the dissemination of the content referred to in the first subparagraph and with regard to infringements of fundamental rights, in particular freedom of expression;
- They shall implement reasonable, proportionate and effective measures to mitigate the risks of dissemination of such content, which may relate in particular to the procedures and human and technological means used to detect, identify and process such content, while taking care to prevent the risks of unjustified withdrawal in accordance with applicable law and their general conditions of use;
- They shall report to the public, according to modalities and at intervals to be determined by the Conseil supérieur de l’audiovisuel, on the assessment of these systemic risks and the risk mitigation measures implemented;
Penalties
- The Conseil supérieur de l’audiovisuel may impose a financial penalty, the amount of which shall take into consideration the seriousness of the breaches and, where applicable, their repeated nature, but may not exceed 20 million euros or 6% of the total worldwide annual turnover of the previous financial year, whichever amount is higher.
Note: The provisions of this Article shall apply until 31 December 2023.
View Legislation
-
Loi n° 2018-1202 du 22 décembre 2018 relative à la lutte contre la manipulation de l’information
Beyond These Categories → France
Loi n° 2018-1202 du 22 décembre 2018 relative à la lutte contre la manipulation de l’information
Scope (Service)
- Online platform operators are natural or legal persons offering, in a professional capacity, whether remunerated or not, an online communication service to the public based on:
- The classification or referencing, by means of computer algorithms, of content, goods or services offered or put online by third parties;
- Or the bringing together of several parties with a view to the sale of a good, the provision of a service or the exchange or sharing of content, goods or services.
- Online platforms operators reaching 5 million unique visitors per month on the French territory (number calculated on the basis of the last calendar year) or earning 100 euros before tax per advertising campaign, for each publication containing information related to a debate of general interest.
Scope (Content)
- No definition of what constitutes “information related to a debate of general interest”.
Obligations
- The online platform operators implement measures to combat the dissemination of false information likely to disturb public order or undermine the integrity of the ballot.
- They set up an easily accessible and visible mechanism enabling their users to report such information, particularly when it comes from content promoted on behalf of a third party.
- They shall also implement additional measures that may relate in particular to:
- The transparency of their algorithms;
- The promotion of content from press companies and agencies and from audiovisual communication services;
- The fight against accounts that massively propagate false information;
- The information of users on the identity of the natural person or the corporate name, the registered office and the corporate purpose of legal entities paying them remuneration in return for the promotion of information content related to a debate of general interest;
- Information to users on the nature, origin and methods of dissemination of content;
- Media and information education.
- These measures, as well as the means they devote to them, are made public. Each operator shall address each year to the Conseil supérieur de l’audiovisuel a declaration specifying the modalities of implementation of the said measures.
- The online platform operators shall appoint a legal representative to act as a point of contact on French territory for the application of the provisions set out in this Title.
- The online platform operators who use algorithms for recommending, classifying or referencing information content related to a debate of general interest shall publish aggregated statistics on their operation; for each piece of content, the following shall be mentioned:
- The share of direct access, without recourse to recommendation, classification or referencing algorithms;
- Indirect access shares attributable, on the one hand, to the platform’s internal search engine algorithm, where applicable, and, on the other hand, to the platform’s other recommendation, ranking or referencing algorithms that have been used to access the content.
- These statistics are published online and accessible to all, in a free and open format.
- During the three months preceding the first day of the month of general elections and until the date on which the ballot is concluded, online platform operators whose activity exceeds a given threshold of the number of connections on French territory are required to:
- To provide the user with fair, clear and transparent information on the identity of the natural person or on the company name, registered office and corporate purpose of the legal entity and of the legal entity on whose behalf, where applicable, it has declared itself to be acting, which pays to the platform remuneration in return for the promotion of information content related to a debate of general interest;
- To provide users with fair, clear and transparent information on the use of their personal data in the context of the promotion of information content related to a debate of general interest;
- To make public the amount of remuneration received in return for the promotion of such information content when the amount exceeds a specified threshold.
- This information is aggregated in a register made available to the public by electronic means, in an open format, and regularly updated during the period mentioned in the first paragraph of this Article.
Penalties (if applicable)
- The Conseil supérieur de l’audiovisuel may, after formal notice, pronounce the sanction of unilateral termination of the agreement concluded with a legal entity controlled by a foreign State or placed under the influence of such a State if the service which is the subject of the said agreement prejudices the fundamental interests of the Nation, including the regular functioning of its institutions, in particular through the dissemination of false information.
View Legislation
-
Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique
Beyond These Categories → France
Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l’économie numérique
Scope (Service)
- ‘Public online communication service’: Any non-private transmission of digital data resulting from an individual request through an electronic communication process which allows for a reciprocal exchange of information between the sender and the receiver.
Scope (Content)
- Paragraphs 5, 7 and 8 of article 24 of the law of July 29, 1881 on freedom of the press
- Articles 222-33, 225-4-1, 225-5, 225-6, 227-23, 227-24 and 421-2-5 of the penal code
Obligations
- Public online communication services must put in place an easily accessible and visible system enabling anyone to bring to their attention any content falling under the aforementioned scope. They are also obliged to promptly inform the competent authorities of any of the aforementioned illegal activities which may be reported to them.
- Public online communication services must also make public the resources they devote to the fight against these illegal activities.
Penalties (if applicable)
- Imprisonment of up to one year and a fine of EUR 75,000
Finland
-
Act on provision of information society services (2002)
Beyond These Categories → Finland
Act on provision of information society services (2002)
Scope (Service)
- ‘Information society services’: Services which are provided as distance services, i.e. without the parties being present at the same time; electronically, i.e. by sending and receiving services via devices handling information electronically; as data transfers requested personally by recipients of services and usually against a payment.
Scope (Content)
- Information, the making available or transmission of which is punishable or may serve as a basis for civil liability – presumably based on the Finnish criminal code.
Obligations
- A court may order the service provider to disable access to certain information based on its criminal nature. In such a process, the service provider and the content producer must be consulted except if the consultation cannot be arranged as quickly as the urgency of the matter requires.
- The court order must also be notified to the content producer. If the content producer is not known, the court may order the service provider to take care of notification.
- The service provider must immediately notify the content producer of the prevention of access to the material supplied by him/her and supply the content producer with a copy of the notification on the basis of which the prevention was made.
Possible penalties
- Laid out within the Penalty Payment Act (1113/1990).
Germany
-
Network Enforcement Act – NetzDG
Beyond These Categories → Germany
Network Enforcement Act – NetzDG
Scope (Service)
- ‘Telemedia service providers’: Internet platforms which, for profit-making purposes, are designed to enable users to share any content with other users or to make such content available to the public (social networks). Platforms offering journalistic or editorial content, the responsibility for which lies with the service provider itself, shall not constitute social networks. The same applies to platforms which are designed to enable individual communication or the dissemination of specific content.
Scope (Content)
- 22 specific ‘criminal offences’ listed in the German criminal code (StGB), namely sections 86; 86a; 89a; 91; 100a; 111; 126; 129 to 129b; 130; 131; 140; 166; 184b in connection with 184d; 185 to 187; 241; 269
Obligations
- Platforms must put in place an effective and transparent procedure that is easily recognisable, directly accessible and permanently available to users
- Platforms must take immediate note of submitted complaints and assess whether the content in question is unlawful and whether it must be removed or blocked
- After receiving a complaint, the platform has seven days to remove or block the content, except:
- 1. When the illegality of a content depends on identifiable factual circumstances, the online platform may give the user the opportunity to comment on the complaint before taking a decision
- 2. When the online platform refers the decision on the illegality of the content to a recognised institution of regulated self-regulation
- In cases of obviously illegal content, the time limit is reduced to 24 hours
- Online platforms are obliged to inform and justify their decisions without delay to the complainant and the content’s author
- In the event of the removal of illegal content, the online platform must ensure that it is secured and kept for evidence purposes for ten weeks
Possible penalties
- Fine of between EUR 500,000 and EUR 5 million depending on the type of infringement
Hungary
-
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Beyond These Categories → Hungary
Act CVIII of 2001 on Electronic Commerce and on Information Society Service
Scope (Service)
- ‘Information society service’: Services provided electronically – normally for financial consideration – at a distance and at the individual request of the recipient of the services
- ‘Service provider’: Any natural or legal person, or business association lacking the legal status of a legal person providing an information society service
Scope (Content)
- ‘Illegal information’ – presumably based on the Hungarian criminal code.
Obligations
- If in the course of criminal proceedings the court has ordered the rendering of electronic information inaccessible, the service provider shall take measures within one working day upon receipt of the court order for disabling access to the electronically published illegal information.
- The party ordered to render electronic information temporarily or permanently inaccessible shall inform the users, disclosing the name of the court and the number of the ruling, as to the legal basis for the removal of the content in question, for disabling access to such content.
- If the court orders the service provider to restore access to such electronic information upon the conclusion of the proceedings, the service provider shall take measures to restore access to such information within one working day of receipt of the court order.
Possible penalties
- In the event of non-compliance, the court can impose fines, the amount of which are prescribed in the Act on Criminal Proceedings, or the Act on the Execution of Penal Measures and Sanctions.
Ireland
-
Online Safety and Media Regulation Bill
Beyond These Categories → Ireland
Online Safety and Media Regulation Bill
Still in draft form
View Legislation
Lithuania
-
Law No. XII-1428 on Cyber Security of 11 December 2014:
Beyond These Categories → Lithuania
Law No. XII-1428 on Cyber Security of 11 December 2014:
Scope (Service)
- ‘Digital services’: A group of services based on communications and information technology which encompasses e-marketplace, web search and/or cloud computing services.
- ‘Digital service provider’: A legal entity which provides digital services in the Republic of Lithuania and/or other EU Member States.
Scope (Content)
- ‘Criminal offences’: Laid out in the Lithuanian criminal code
Obligations
- While investigating criminal offences taking place on their sites, law enforcement may order the aforementioned services to limit the provision of their services for up to 48 hours without requiring a court sanction. With a court sanction, the duration can be extended.
- Further, during an investigation, law enforcement can require services to preserve certain information collected during the provision of services, such as a mail address, geographical location address, phone or any other access number, as well as information about accounts and payments made on the basis of the service agreement.
-
Regulation on Denial of Access to Information which was Acquired, Created, Modified or Used Illegally, Resolution No. 881 of August 22, 2007
Beyond These Categories → Lithuania
Regulation on Denial of Access to Information which was Acquired, Created, Modified or Used Illegally, Resolution No. 881 of August 22, 2007
Text only available in Lithuanian
View Legislation
Malta
-
Electronic Commerce Act
Beyond These Categories → Malta
Electronic Commerce Act
Scope (Service)
- ‘Information society service’: Any service which is provided at a distance, by electronic means and at the individual request of a recipient of the service, whether such service is provided for consideration or not.
- ‘Service provider’: Any person established in Malta providing an information society service.
Scope (Content)
- Illegal activity – presumably based on the Maltese criminal code
- If so, this includes the spread of ‘fake news’ under section 82
Obligations
- Information society service providers shall promptly inform the public authorities competent in the matter of any alleged illegal activity undertaken or information provided by recipients of their service and shall grant to any such authority upon request information enabling the identification of recipients of their service with whom they have storage agreements.
Penalties (if applicable)
- Any person contravening the aforementioned obligations shall be liable to a fine of not more than EUR 250,000 or to imprisonment not exceeding two years or both. In the case of a continuous offence a fine not exceeding EUR 2,500 for each day during which the offence continues can be imposed.
The Netherlands
-
Notice-and-Take-Down Code of Conduct
Beyond These Categories → The Netherlands
Notice-and-Take-Down Code of Conduct
Scope (Service)
Scope (Content)
Obligations
- Participation is voluntary
-
Dutch Code of Conduct on Transparency of Online Political Advertisements
Beyond These Categories → The Netherlands
Dutch Code of Conduct on Transparency of Online Political Advertisements
Scope (Service)
- ‘Online platforms’: both platforms that allow paid political advertisements and platforms that only offer organic content.
Scope (Content)
- Paid online political advertising.
- The signatories recognize the important role of unpaid (so-called organic) content for fair and democratic online political campaigns.
- Partly covers unpaid political reporting.
Obligations
- Participation is voluntary
- The responsibility to use online political advertisements conscientiously so as to maintain the integrity of elections.
- Promote and adhere to transparency of online political advertisements, to the extent possible.
- The need to avoid the dissemination of misleading content, hate speech and messages that incite violence.
- As far as online platforms offer political advertising, develop and enforce relevant transparency mechanisms.
- Provide relevant advertiser and advertisement details in publicly available libraries with easily downloadable and comparable data.
- In order to ensure transparency and scrutiny of political advertisement placements, require and enforce registration by and verification of political advertisers.
- Report adequately and clearly on “paid for by” and spending range.
- Ban cross-border political advertisements from outside the European Union.
- Establish transparency on online political advertisements, including on spend and reach.
- Raise awareness on this code of conduct by drawing the attention of relevant employees to the commitments made; ensure that situations that could affect compliance negatively are addressed in a timely manner.
- Put in place a user-friendly response mechanism to answer questions or address issues related to the Dutch elections; have clear and transparent rules for placing and removing messages and advertisements; respond quickly and accurately to enquiries about placement and removal of messages and advertisements by competent authorities and political parties.
- Develop and enforce platform policies on the safety and privacy of users and electoral integrity in relation to online political advertising; remove political advertisements or any content that incites violence or hate speech within the shortest possible period of time once identified.
- Consider countering any inaccurate information on the electoral process, such as voter, voting process and polling station information, in case such information is identified.
- Consider sharing experience and lessons from previous elections in other countries with Dutch political parties and relevant election authorities.
- Consider conducting a post-election review on the Dutch electoral campaign and compliance with the code of conduct; consider the possibility of publishing a report in case there have been any relevant incidents in relation to the Dutch elections and the correlated platform actions.
Poland
-
Law on the protection of freedom of speech on online social networking sites
Beyond These Categories → Poland
Law on the protection of freedom of speech on online social networking sites
[Based on the draft version of 15th January 2021]
Scope (Services)
- ‘Online networking sites’: understood as a service provided electronically within the meaning of the law of 18 July 2002 on the provision of electronic services, which allows users to share any content with other users or the general public, and which is used by at least one million registered users in the Republic of Poland.
- ‘Service provider’: understood as the provider of online social networking services, consisting in storing on the online social network information provided by the user at his request, with at least one million registered users.
Scope (Content)
- Illegal content: understood as content that violates personal rights, disinformation, criminal content, as well as content that violates decency, in particular content disseminating or praising violence, suffering or humiliation.
- Disinformation: understood as false or misleading information, produced, presented and disseminated for profit or in violation of the public interest.
- Criminal content: understood as content that praises or incites to commit prohibited acts, specified in Arts.: 117-119, 127, 130, 133, 134-135, 137, 140, 148-150, 189-189a, 190a, 194-204, 222-224a, 249-251, 255, 258, 343 of the law of 6 June 1997 – Penal Code (Journal of Laws of 2020, items 1444 and 1517) or which constitutes a prohibited act.
Obligations
- A service provider which receives over 100 user complaints in a calendar year regarding the dissemination of illegal content, the restriction of access to content or the limitation of access to a user’s profile must prepare a report in Polish every six months on how these complaints are resolved and publish it on the online social networking site no later than one month from the end of the given six months.
- Reports posted on an online social networking site must be clearly visible, directly and permanently accessible.
- The Service Provider must appoint at least one, but not more than three, representatives in the country.
- The Service Provider must immediately inform the President of the UKE about the appointment or change of a representative in the country, and provide their information, including their email address and address for service. Where the representative in the country is a legal person, the service provider shall provide details of natural persons authorised to act on behalf of that legal person.
- The service provider must publish on the online social network site in a clearly visible, directly and permanently accessible manner:
- full details of the representative in the country, including the electronic address and address for service, and if the representative in the country is a legal person, also details of natural persons authorised to act on behalf of that legal person;
- details of the service provider, including the exact name of the person running the online social networking site or their name and surname, registration or residential address, delivery address, registration data and email address.
- Persons who conduct internal review proceedings shall be regularly provided with training in Polish by the service provider, at least every six months.
- If the service provider ceases to operate, they are obliged to provide the President of the UKE with the data referred to in Art. 18 of the law of 18 July 2002 on the provision of electronic services, for further storage, sharing and protection.
- The service provider must establish an effective and understandable internal review procedure in Polish in matters relating to user complaints regarding:
- restriction of access to content;
- restriction of access to the user’s profile;
- disseminating illegal content.
- The service provider must publish the rules of the online social network site available to all users on the online social network site in Polish, which must also contain the rules of conducting internal review proceedings. The rules cannot be inconsistent with the provisions of generally applicable law.
- The service provider is obliged to ensure that complaints can be sent to internal review proceedings in a clearly visible, directly and permanently accessible manner.
- A representative in the country shall immediately send confirmation of receipt of the submitted complaint via the email address indicated in the complaint.
- A representative in the country shall examine the user’s complaint and inform them through the indicated email address about the method of consideration, within 48 hours of lodging the complaint.
- If the complaint is accepted, the service provider shall:
- restore restricted access to content, or
- restore limited access to the user’s profile, or
- prevent the distribution of illegal content.
- Information on the method of considering the complaint should include the reasons for the resolution, indicating in particular the legal grounds with reference to the rules of the online social network site and the actual grounds for the decision with justification. The information should also contain instructions on the possibility of submitting a complaint to the Council, as well as the date and manner of filing a complaint.
- The service provider shall inform the user about the possibility of pursuing claims through civil proceedings and the possibility of notifying about the commission of an offence.
- In case a user submits a complaint to the Freedom of Speech Council, because he/she is dissatisfied with the way a complaint was handled by the internal review proceedings, the representative in the country shall provide the Council with the materials collected in the internal review proceedings within 24 hours from the notification of the complaint submitted by the user.
- The service provider shall execute the Council’s decision immediately, and not later than within 24 hours of its delivery.
- A representative in the country, within 24 hours from the deadline for implementing the decision referred to in Para. 2, shall inform the Council about the manner of its implementation. If the service provider fails to implement the decision, the Council shall immediately notify the President of the UKE.
- The service provider may not limit access to the content that was the subject of the examination by the Council again.
- In the event that criminal content is found, a prosecutor may ask the service provider or representative in the country to send the necessary information, in particular regarding data specifying the user and publications posted on the online social networking site.
- If it is found that criminal content includes a publication with pornographic content involving a minor or content that praises or incites the commission of acts of a terrorist nature, or that further access to this publication creates the risk of causing significant damage or effects that are difficult to reverse, the prosecutor shall immediately issue a decision ordering the service provider to prevent access to this content.
- Within 3 months from the entry into force of the law, service providers shall submit a report for the previous year, containing information on the methods of counteracting disinformation and disseminating illegal content to date.
[Modification of Art. 111 of the law of 5 January 2011 – Election Code (Journal of Laws of 2020, item 1319)]
- After § 1, §1a is added as follows: “§ 1a. The candidate or representative of the election committee concerned has the right to submit to the district court a request for a decision to prohibit the service provider within the meaning of the law of … on the protection of freedom of speech on online social networking sites (Journal of Laws of … item …) of disseminating illegal content within the meaning of the law, produced with the intention of deliberately violating the integrity of elections.”
Penalties
- For a breach by the service provider of any of its obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
- For a breach by a representative in the country of any of his/her obligations, the Council shall impose, by way of a decision, a fine in the amount of PLN 50,000 to PLN 50,000,000.
Portugal
-
Decree-Law no. 7/2004, of 7 January
Beyond These Categories → Portugal
Decree-Law no. 7/2004, of 7 January
Scope (Service)
- ‘Information society services’: Any service provided at a distance by electronic means, for remuneration or at least in the scope of an economic activity at the individual request of a recipient of services.
- ‘Intermediary networking service providers’: The providers that render technical services for the access, availability and use of online information or services, which are independent of the generation of the information or service themselves.
Scope (Content)
- Illegal content is understood as content that damages or threatens to damage:
- Human dignity or public policy, including the protection of minors and the fight against any incitement to hatred for reasons of race, sex, religion or nationality, namely on grounds of the prevention of, or combat against, crimes or offences against public policy;
- Public health;
- Public security, namely the safeguarding of national security and defence;
- Consumers, including investors.
Obligations
- Intermediary service providers shall promptly inform competent authorities when they become aware of illegal activities undertaken via services rendered.
- Upon request of the competent authorities, intermediary services should identify recipients of their services with whom they have entered into storage agreements. In addition, upon request, intermediary services should supply lists of owners of hosted websites.
- Intermediary services are required to promptly comply with instructions aiming to terminate or prevent an offence, namely to remove or disable access to a given information
- Service providers only have to disable access to disputed content if its illegality is obvious.
- In case of such a decision, any interested party may appeal to the relevant supervisory entity, which shall present a preliminary settlement within forty-eight hours and notify it immediately to all those involved by electronic means.
- Whatever the decision may be, the intermediary service provider shall not be liable for having or not having removed the contents or disabled the access based on a mere request, where the illegality is not obvious.
Possible penalties
- Non-compliance by service providers with the aforementioned obligations is punishable by a fine of EUR 5,000 to EUR 100,000.
Spain
-
Royal Decree 1889/2011 on the functioning of the Commission for the protection of IPR
Beyond These Categories → Spain
Royal Decree 1889/2011 on the functioning of the Commission for the protection of IPR
Text only available in Spanish
Sweden
-
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Beyond These Categories → Sweden
Act on Responsibility for Electronic Bulletin Boards (1998:112)
Scope (Service)
- ‘Electronic bulletin boards’: A service for the conveyance of electronic messages in text, picture, sound or other information formats. Excluded are services which only provide a network or other connections for the transmission of messages, or other services necessary to use a network or other transmission channel. Also excluded are the mediation of messages within a government agency, between government agencies, or within a company or a legal group of companies, e-mails, and services covered by the regulations in the Freedom of the Press Act or the Fundamental Law on Freedom of Expression (printed media, film, radio, television and sound recordings).
Scope (Content)
- A number of specified offences listed in chapter 16 of the Swedish Penal Code, namely sections 5, 8 and 10.
- Copyright infringements
Obligations
- If a user submits a message to an electronic bulletin board which contains the aforementioned illegalities, the supplier must remove the message or otherwise make it inaccessible. In order to be able to fulfill this obligation, the service provider is allowed to check the content of messages on the service.
Possible penalties
- Fines or a prison sentence of not more than six months, unless the crime is considered severe, in which case the prison sentence shall be no more than two years. Minor infringements shall not be punished.
United Kingdom
-
Online Harms White Paper [not finalized]
Beyond These Categories → United Kingdom
Online Harms White Paper [not finalized]
Scope (Service)
- Companies (explicitly including search engines) whose services host user-generated content which can be accessed by users in the UK, or which facilitate public or private online interaction between service users, one or more of whom is in the UK. The regulatory framework will also apply to public communication channels and services where users expect a greater degree of privacy, e.g. online instant messaging services and closed social media groups.
- Excluded are business-to-business services, services which play a functional role in enabling online activity (such as internet service providers), and low-risk businesses.
- The legislation will take a tiered approach differentiating between smaller and larger services based on the audience and the functionalities offered.
Scope (Content)
- ‘Illegal content’: Presumably based on the UK’s criminal code; specific mention was made of child sexual exploitation and abuse material as well as terrorist content.
- ‘Harmful content’: The legislation will provide a general definition of harmful content. As of yet, it is only stated that for content to be considered harmful, it must give rise to a “reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals”.
Obligations
- The legislation will require companies to prevent the proliferation of illegal content and activity online and ensure that children who use their services are not exposed to harmful content. The duty of care will also cover disinformation and misinformation that could cause significant harm to an individual.
- Depending on the size of their audience and the functionalities offered, larger platforms will face additional requirements regarding content that is legal but harmful to adults.
- The regulator, Ofcom, will issue codes of practice with which companies must comply, or else demonstrate an equally effective alternative approach.
Possible penalties
- Ofcom will be able to issue fines of up to £18 million or 10% of global annual turnover, whichever is higher, in cases of non-compliance.
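The “whichever is higher” cap can be read as a simple maximum of two figures. The minimal sketch below (Python, purely illustrative; the function name and the sample turnover figure are our own assumptions, not taken from the White Paper) shows how that cap would be computed for a given global annual turnover.

    # Illustrative only: upper bound of an Ofcom fine under the proposed framework,
    # i.e. the higher of GBP 18 million and 10% of global annual turnover.
    def max_fine_gbp(global_annual_turnover_gbp: float) -> float:
        return max(18_000_000, 0.10 * global_annual_turnover_gbp)

    # Hypothetical example: a turnover of GBP 1 billion gives a cap of GBP 100 million.
    print(max_fine_gbp(1_000_000_000))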