
Digital Services Act, JURI's Opinion to IMCO

Starting: 16 April, Ending: 27 April


Description

Less than a year ago, the European Commission announced its plan for the Digital Services Act, a legislative instrument, either a Directive or a Regulation, that will recast the current E-Commerce Directive. The E-Commerce Directive dates back to 2000 and is the framework on the basis of which Member States have adjusted their national laws regarding the functioning, responsibilities and rights of “information society services” in the internal market. More specifically, it addresses issues such as the “country of origin” principle for information society services, the tackling of “unsolicited commercial communication”, electronic contracts, the limited liability of online intermediaries and the prohibition of any general obligation to monitor the information transmitted or stored by them.

Ahead of this legislative proposal, which is expected by the end of this year, three Committees in the European Parliament have been assigned the drafting of own-initiative Reports, which will serve as a first opinion of the Parliament and a valuable guide for the Commission’s proposal. These Reports are “Digital Services Act: Improving the functioning of the Single Market” in the Internal Market and Consumer Protection Committee (IMCO), “Digital Services Act and fundamental rights issues posed” in the Civil Liberties, Justice and Home Affairs Committee (LIBE) and “Digital Services Act: adapting commercial and civil law rules for commercial entities operating online” in the Legal Affairs Committee (JURI). In addition to these Reports, the relevant Committees of the Parliament have been assigned Opinions on them, so that the final text responds to the needs of EU citizens and takes all relevant angles into account.

In this process, I have been appointed Rapporteur in JURI for the Opinion on the IMCO Report, which is the subject of this call for input. In addition, I am Shadow Rapporteur for the Report in JURI, the Report in LIBE, and the LIBE Opinion on IMCO’s Report.

The JURI Opinion, for which I am the Rapporteur, focuses solely on the Report prepared in IMCO. This means that, taking into account the competences of the JURI Committee, the Opinion should provide useful and effective proposals to be taken into account by the IMCO Rapporteur. With this in mind, I have addressed very specific points in my draft Opinion. Due to the length limit set by the administration for the first draft, the text cannot be extended at the moment; your input remains valuable, however, and could be taken into consideration later, when I consider tabling Amendments.

As you can see, the Opinion is structured as follows:

  1. Right to privacy online and anonymous use of digital services: requests anonymity online where technically possible and reasonable; privacy and data protection laws should be respected

  2. General monitoring and automated tools: asks for a ban on any general monitoring obligation and mandatory upload-filters

  3. Contract terms and conditions: demands that terms and conditions be fair and comply with fundamental rights standards; otherwise they shall not be binding

  4. Addressing illegal content: sets out a “notice and action” procedure based on the duty of care of intermediaries. Keeps the current “limited liability” regime, under which intermediaries are deemed liable and can be requested to remove content only when they have actual knowledge of the illegal activity. Any “notice and action” procedure shall remain clear, proportionate and respectful of fundamental rights, especially the freedom of expression and the right to privacy.

  5. Addressing the spread of unwanted content: clarifies the difference between “illegal” and “harmful” content and calls for alternative methods to tackle what intermediaries would deem “harmful”. Platforms shall not act as the “internet police”, and content shall be removed on the basis of existing laws and judicial orders in order to respect the rights of both users and businesses online.

  6. Interconnectivity of platforms: calls for giving users the opportunity to choose which platform to use while staying connected with users who decide to use different platforms, which could be achieved via API access enabling cross-platform interaction.

As mentioned above, the aim is to have all Reports adopted in plenary by September 2020. The European Commission will evaluate these Reports and issue its legislative proposal by December 2020. After that, the European Parliament and the Council will negotiate their positions on a final legislative text to be adopted and implemented across the EU. This means that this is only the beginning of one of the most significant “digital” files of this mandate. I would therefore like to invite you to be part of this fruitful discussion through your feedback on my draft. My team and I remain at your disposal for any further questions or concerns.


Status: Closed
Privacy: Public
Dr. Patrick Breyer. Digital freedom fighter and Member of the European Parliament for the Piratenpartei and the European Pirate Party. Homepage: https://www.patrick-breyer.de

CONTRIBUTORS (21)


P1

SUGGESTIONS


P2

The Committee on Legal Affairs calls on the Committee on the Internal Market and Consumer Protection to incorporate the following suggestions:


P3

A. Whereas the rules enshrined in Directive 2000/31/EC on electronic commerce have allowed for the development of the Internet and of digital services in the EU for two decades, and are key to protecting fundamental rights as well as to safeguarding an innovative business environment; whereas their revision should not be envisaged without thorough scrutiny and utmost caution.


P4

Right to privacy online and anonymous use of digital services


P5

  1. Stresses that wherever it is technically possible and reasonable, intermediaries shall be required to enable the anonymous use of their services and payment for them, as anonymity effectively prevents unauthorised disclosure, identity theft and other forms of abuse of personal data collected online; where the Directive on Consumer Rights requires commercial traders to communicate their identity, providers of major marketplaces could be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld.

P6

  2. [possibly address single-sign-on service]

P7

  3. Notes that since the online activities of an individual allow for deep insights into their personality and make it possible to manipulate them, the collection and use of personal data concerning the use of digital services shall be subject to a specific privacy framework and limited to the extent necessary to provide and bill for the use of the service. Public authorities shall be given access to Internet subscriber data and metadata only to investigate suspects of serious crime with prior judicial authorisation.

P8

General monitoring and upload-filtering of content


P9

  4. Reiterates that hosting service providers or other technical intermediaries shall not be obliged to generally monitor user-generated content.

P10

  5. Notes that automated tools are unable to differentiate illegal content from content that is legal in a given context; highlights that a review of automated reports by service providers, their staff or their contractors does not solve this problem, as private staff lack independence, qualification and accountability; therefore stresses that the Digital Services Act shall explicitly prohibit any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation; content moderation procedures used by providers shall not lead to any ex-ante control measures or upload-filtering of content;

P11

Contract terms and conditions


P12

  6. Underlines that the fairness of terms and conditions imposed by intermediaries on the users of their services, and their compliance with fundamental rights standards, shall be subject to judicial review. Terms and conditions unduly restricting users’ fundamental rights, such as the right to privacy and to freedom of expression, shall not be binding.

P13

Addressing illegal content


P14

  7. Highlights that, in order to constructively build upon the rules of the e-Commerce Directive and to ensure legal certainty, the Digital Services Act shall exhaustively and explicitly spell out the obligations of digital service providers rather than imposing a general duty of care; highlights that the existing legal regime for digital providers’ liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers.

P15

  8. Stresses that the responsibility for enforcing the law, deciding on the legality of speech online and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent and democratically accountable public authorities; only a hosting service provider that has actual knowledge of illegal content and its illegal nature shall be subject to content removal obligations.


P16

  9. Underlines that illegal content should be removed at its source, and that access providers shall not be required to block access to content.

P17

  10. Suggests that major commercial hosting service providers should provide a publicly and anonymously accessible notice and action mechanism for reporting allegedly illegal content published on their platform; that notices should be examined by qualified staff based on clear criteria; that the content provider shall be heard before access to their content is disabled; and that adequate redress mechanisms, both via dispute settlement bodies and via judicial authorities, should be made available, while applying reasonable time-frames; highlights that persons who systematically and repeatedly submit wrongful or abusive notices shall be sanctioned; underscores that smaller commercial and non-commercial providers shall not be subject to these obligations.


P18

  11. Calls on the Commission to consider obliging major hosting service providers to report serious crime to the competent law enforcement authority, upon obtaining actual knowledge or awareness of such a crime.

P19

  12. Stresses that proportionate sanctions should be applied to violations of criminal and civil law, which shall not encompass excluding individuals from digital services.

P20

  13. Highlights that, in order to protect freedom of speech standards, to avoid conflicts of laws, to avert unjustified and ineffective geo-blocking and to aim for a harmonised digital single market, hosting service providers shall not be required to remove or disable access to information that is legal in their country of origin.