Since the Electronic Commerce Directive was adopted in 2000, the digital space has evolved significantly with the emergence of social media platforms, OTT and sharing-economy players, the convergence of media consumption, and the explosion of user-generated content. Concerns about developments in the digital economy, including the rapid proliferation of online hate speech and disinformation, have spurred calls for a revision of the current rules, with critics arguing that self-regulation is not having the necessary impact. In this context, the European Commission is expected to launch a review of the existing E-Commerce Directive through a proposed Digital Services Act, which would “upgrade our liability and safety rules for digital platforms, services and products, and complete our Digital Single Market.”
This Forum Europe conference will discuss the objectives and potential provisions of any proposed Digital Services Act, addressing issues relating to liability, content moderation and regulatory oversight of intermediary service providers. It will assess what is required to create conditions that foster fair competition for digital platforms and the media sector to thrive, and to safeguard an open and safe online space, ensuring the Internet continues to be a key driver for innovation, economic growth, democratic discourse and social progress.
Margrethe Vestager, Executive Vice President of the European Commission responsible for “A Europe Fit For The Digital Age”, has agreed to be our first keynote speaker.
She will follow this opening keynote with an interview with our chairman, Paul Adamson.
Executive Vice President, A Europe Fit for the Digital Age
Deputy Head of Unit Data Policy and Innovation, DG CNECT
Director-General of DG DIGIT
Business Strategist for Artificial Intelligence
With Margrethe Vestager, Executive Vice President, “A Europe Fit for the Digital Age”, European Commission
Setting the scene for the rest of the day, this session will explore the extent to which new rules governing the responsibilities of online platforms are needed and will discuss what the best policy and regulatory instruments may be so that the new rules remain fit-for-purpose in the future.
An expected provision of the Digital Services Act would oblige online platforms to exercise a “duty of care” over content that is hosted and shared on their sites. In recent years, policymakers have called for online platforms to take a more proactive role in addressing illegal content, through initiatives such as the ‘Code of Conduct on Countering Illegal Hate Speech Online’. It is argued, however, that voluntary self-regulation has not proved effective, with platforms increasingly criticised for failing to tackle illegal content, hate speech, activities infringing copyright rules and the spread of disinformation. This has led to calls for clearer rules on platform responsibilities and to questioning of the liability exemptions that platforms have enjoyed under the e-Commerce Directive, known as ‘Safe Harbour’. In an era where online content is increasingly user-generated, this session will discuss the extent to which tech companies can be incentivised to be more proactive in removing illegal material without undermining freedom of expression, hindering innovation or worsening the user experience.
The E-Commerce Directive (ECD) sought to deal with issues of liability relating to any illegal activity by users of internet platforms. When the ECD was originally drawn up, much of this centred on goods sold online. While hate speech and illegal content online attract much of the focus from policymakers and the media, counterfeit and unsafe products sold on internet platforms remain a priority area for e-commerce players, brands and consumer advocates, amongst others. This session will look at how the DSA will seek to address and update policy on the relationships between platforms, their users and consumers, brands and manufacturers, focusing on liability, competition, trust and cross-sectoral cooperation.
Availability of and access to trusted information is at the core of the functioning of our democracies. In recent years, however, the increasing use of online platforms to spread so-called ‘fake news’ and ideological content in rapid, repetitive patterns has impacted the integrity of political discourse, leading the European Commission to establish a ‘Code of Practice on Disinformation’ to address these issues. Following the publication of the first annual self-assessment reports by the signatories of the Code last October, this session will debate the main findings, explore the concerns that remain to be addressed and discuss the role that the Digital Services Act provisions can play in creating the right conditions to improve the integrity of media services.
Applies to: Corporate Organisations
Applies to: NGO/Not for Profit, Academic/Student, National Government/Regulator, Diplomatic Mission to the EU
Applies to: European Commission/Parliament/Council, EU Permanent Representations, Journalist/Press
* Please note that fees do not include Belgian VAT at 21%; this amount will be added to the total price when you are invoiced.
Group discounts are available when registering multiple delegates on the same booking, as shown below.
|Number of delegates|Discount|
|3 – 5|10%|
|6 – 8|20%|