Digital Services Act, JURI's Opinion to IMCO
Consultation closed (ended 27 Apr)
Less than a year ago, the European Commission announced its plan for the Digital Services Act. This will be a legislative instrument, either a Directive or a Regulation, that will recast the current E-Commerce Directive. The E-Commerce Directive dates back to 2000 and is the framework on the basis of which Member States have adjusted their national laws regarding the functioning, responsibilities and rights of “information society services” in the internal market. More specifically, it addresses issues such as the “country of origin” principle for information society services, the tackling of “unsolicited commercial communication”, electronic contracts, the limited liability of online intermediaries and the prohibition of any general obligation to monitor the information they transmit or store.
Ahead of this legislative proposal, which is expected by the end of this year, three Committees in the European Parliament have been assigned the drafting of own-initiative Reports that will serve as a first opinion of the Parliament and a valuable guide for the Commission’s proposal. These Reports are “Digital Services Act: Improving the functioning of the Single Market” in the Internal Market and Consumer Protection Committee (IMCO), “Digital Services Act and fundamental rights issues posed” in the Civil Liberties, Justice and Home Affairs Committee (LIBE) and “Digital Services Act: adapting commercial and civil law rules for commercial entities operating online” in the Legal Affairs Committee (JURI). In addition to these Reports, the relevant Committees of the Parliament are assigned Opinions on them, so that the final text responds to the needs of EU citizens, taking into account all the relevant angles.
In this process, I have been appointed Rapporteur in JURI for the Opinion on the IMCO Report, which is the subject of this call. Apart from that, I am Shadow Rapporteur for the Report in JURI, the Report in LIBE, and the LIBE Opinion on IMCO’s Report.
The Opinion in JURI, for which I am the Rapporteur, focuses solely on the Report prepared in IMCO. This means that, taking into account the competences of the JURI Committee, the Opinion should provide useful and effective proposals to be taken into account by the IMCO Rapporteur. With the above in mind, I have addressed very specific points in my draft Opinion. Also, due to the length limit set by the administration for the first draft, the text cannot be extended at the moment; however, your input is valuable and could be taken into consideration later, when I consider tabling Amendments.
As you can see, the Opinion is structured as follows:
Right to privacy online and anonymous use of digital services: requests anonymity online where technically possible and reasonable; privacy and data protection laws should be respected
General monitoring and automated tools: asks for a ban on any general monitoring obligation and on mandatory upload filters
Contract terms and conditions: demands that terms and conditions be fair and comply with fundamental rights standards; otherwise, those terms shall be non-binding
Addressing illegal content: sets out a “notice and action” procedure based on the intermediaries’ duty of care. Keeps the current “limited liability” regime, under which intermediaries are deemed liable and can be requested to remove content only when they have actual knowledge of the illegal activity. Any “notice and action” procedure shall remain clear, proportionate and respectful of fundamental rights, especially the freedom of expression and the right to privacy.
Addressing the spread of unwanted content: clarifies the difference between “illegal” and “harmful” content and calls for alternative methods to tackle what intermediaries would deem “harmful”. Platforms shall not act as the “internet police”, and content shall be removed on the basis of existing laws and judicial orders in order to respect the rights of both users and businesses online.
Interconnectivity of platforms: calls for giving users the opportunity to choose which platform to use while still staying connected with users who decide to use different platforms, which could be achieved via cross-platform interaction through API access.
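To make the interconnectivity idea above more concrete, here is a minimal illustrative sketch in Python. It is a toy model, not any proposed standard: the `Platform` class, the `deliver`/`send` methods and the message format are all assumptions standing in for what would, in practice, be HTTP endpoints exposed by each service (existing federation protocols such as the W3C's ActivityPub work along broadly similar lines). The point it shows is that each platform exposes a delivery API that peer platforms can call on behalf of their own users, so a user never has to leave their chosen platform to reach someone elsewhere.

```python
# Hypothetical sketch of API-based interconnectivity between two
# independent platforms. All names here are illustrative assumptions.

class Platform:
    """A social platform exposing a minimal federation API."""

    def __init__(self, name):
        self.name = name
        self.inboxes = {}  # user -> list of received messages
        self.peers = {}    # peer platform name -> Platform (stands in for an API endpoint)

    def register(self, user):
        self.inboxes[user] = []

    def connect(self, other):
        # In practice this would involve API discovery and authorisation
        # between the two services; here we just link the objects.
        self.peers[other.name] = other
        other.peers[self.name] = self

    def deliver(self, to_user, message):
        # The public API endpoint a peer platform calls to hand over a message.
        self.inboxes[to_user].append(message)

    def send(self, from_user, to_platform, to_user, text):
        message = {"from": f"{from_user}@{self.name}", "text": text}
        if to_platform == self.name:
            self.deliver(to_user, message)
        else:
            # Cross-platform hop: call the peer's delivery API.
            self.peers[to_platform].deliver(to_user, message)


# Alice on "blue" messages Bob on "green" without leaving her platform.
blue, green = Platform("blue"), Platform("green")
blue.register("alice")
green.register("bob")
blue.connect(green)
blue.send("alice", "green", "bob", "Hello across platforms!")
print(green.inboxes["bob"])  # [{'from': 'alice@blue', 'text': 'Hello across platforms!'}]
```

The design choice the paragraph argues for is visible in `send`: whether the recipient is local or remote, the sender's experience is the same, because the receiving platform's API does the cross-platform work.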
As mentioned above, the aim is to have all Reports adopted in plenary by September 2020. The European Commission will evaluate these Reports and will issue its legislative proposal by December 2020. After that, the European Parliament and the Council will negotiate their positions so that a final legislative text can be adopted and implemented across the EU. This means that this is only the beginning of one of the most significant “digital” files of this mandate. We would therefore like to invite you to be part of this fruitful discussion through your feedback on my draft. My team and I remain at your disposal for any further questions or concerns.
MOST DISCUSSED PARAGRAPHS
As much as we support the content provider's right to a counter-notice, there are reasonable exceptions in which it might be legitimate to refrain from informing the provider about blocking or removing their content. After the sentence 'that the content provider shall be heard before disabling access to their content', something along these lines could thus be added: 'unless this would risk impeding criminal investigations in exceptional cases (e.g., the sharing of child sexual abuse material)'.
Given the ongoing development of automated technologies, and their already established application in some areas (e.g., recognising child sexual abuse material), we would suggest the wording 'automated tools struggle to differentiate' instead of 'are unable to differentiate' in the first sentence, to make the paragraph more future-proof. In addition, we support the comment by Access Now.
- Highlights that in order to protect fundamental rights and to ensure legal certainty, the Digital Services Act shall not use the legally undefined concept of “harmful content”, but shall address the publication of content that is illegal by criminal or civil law standards; emphasizes that the spreading of false and racist information on social media should be contained by giving users control over content proposed to them; stresses that curating content on the basis of tracking user actions shall require the user’s consent; proposes that users of social networks should have a right to see their timeline in chronological order; suggests that dominant platforms shall provide users with an API to have content curated by software or services of their choice.
- Stresses that in order to overcome the lock-in effect of centralised networks and to ensure competition and consumer choice, users of dominant social networks and messaging services shall be given a right to cross-platform interaction via API access (interconnectivity); highlights that these users shall be able to interact with users of alternative services, and that the users of alternative services shall be allowed to interact with them.