On 20 April 2021, EU Watch and ECLA organised an online panel debate with four experts on the subject “Free speech under attack? Regulating online content in Europe and beyond”. Following is an edited transcript of the debate.
HELGA TRÜPEL: That ban followed an illegal action – the storming of the US Capitol. It was issued in the context of calls for violence and the subversion of democracy in the United States. From this point of view, it was the right decision by Twitter and Facebook to ban Trump. Generally, though, I would never allow private companies to go for such wide-ranging decisions without any political regulation and good laws in place.
In Europe, we need good rules on how the new digital companies communicate with the general public, and a good approach to what their impact means for our democracies.
«I would never allow private companies to go for such wide-ranging decisions without any political regulation and good laws in place» – HELGA TRÜPEL
BILL ECHIKSON: I guess a couple of years ago I would have questioned this decision, but now I see the beneficial impact of banning Donald Trump and how it has calmed the political discourse in the United States. What he was saying violated even American free-speech law. As a Supreme Court justice once said: You can’t go into a theater and scream “fire”. Trump was literally doing that, only that it was not a theater, but Congress.
On the larger question, it is problematic that private companies have this responsibility. Unfortunately, I don’t see that governments will be very effective in stepping up and finding the right formula. I often worry that they might go too far, as in Turkey or Russia, or be ineffective, as I would say the German hate-speech law has been.
«I don’t see that governments are finding the right formula» – BILL ECHIKSON
Germany has tried to craft a hate-speech law that respects free speech and is democratic. However, the government didn’t really enforce it, and so it didn’t change the takedown levels. A lot of the good ideas in Europe are either too complicated or ineffective at really making a difference. Debate is healthy, and we need to strike a balance between free speech and privacy, or between free speech and responsibility.
PHILIPPE COËN: Since we are in democracies, the idea is that free speech is something that is supposed to be fueled and fulfilled by the consciousness of the citizens. Depending on where you sit and where you speak, the definition of free speech may vary, and that’s fine. We don’t have a single concept of free speech.
«Depending on where you sit, the definition of free speech may vary, and that’s fine» – PHILIPPE COËN
This is going to remain a moving target. Hence, regulating it is a real challenge. The EU is only one piece of the globe, and it is trying to agree on the fundamentals of what is harmful and what should be free. So we need to define “respect for human dignity” in the online sphere. Only when you have gathered these concepts can you define what free speech is.
PAUL NEMITZ: Let’s not have a discussion today where we are learning from America what free speech is. I think much has gone wrong there: A country which had a president who, according to the Washington Post, made 20,000 false or misleading claims is really not in a position to lecture others about free speech. Get your house in order, adopt the democratic laws that are necessary to come back, and be a vibrant democracy! Then we can continue talking. In rankings on freedom of speech and freedom of the press, the US regularly ranks behind Europe. Why is that? Because of lawlessness. People have to fear that when they speak out freely they will be chased and threatened. That is not what free speech is about.
Free speech is also about the plurality of the press, and the press in the US has been going downhill. Press plurality in America today is much lower than in Europe.
I am proud of the European legacy of plurality of the press. I would just say that there is no external control of what’s happening inside America on free speech. However, all countries in Europe are subject to such control. Free-speech law in Europe is largely made and controlled by the European Court of Human Rights. That court is external to all our countries and external to the European Union. It controls and protects individuals against overreach by the state when it comes to free speech. I am fed up with getting lessons from America about free speech. I would say this: Clean up your own shop, make laws, and subject yourself to external control by courts!
«Free speech is also about the plurality of the press» – PAUL NEMITZ
When it is about repression, the United States is very quick to join international conventions. For example, the US has joined a convention of the Council of Europe, the Budapest Convention on cyber-crime. But when it is about protecting individual rights, for example data protection, the US has not joined the relevant multilateral agreement of the Council of Europe. It could do so, though.
In the meantime, America should not get in the way of Europe as we get our own shop in order. The laws we are now proposing here in Europe are good for Americans too, because they solve a lot of problems which America does not seem able to solve itself. I would find it right if the new American government and Congress supported the hard work which is now before the European Parliament and the Council, to make sure that the outcomes are beneficial also to the United States. I went through this experience when we did the GDPR in Europe. President Obama tried twice to get what was called a Baseline Privacy Act adopted in Congress. He failed twice. There’s a lot of lecturing from America, but I say this: Get your house in order, and don’t try to destroy the good work others are doing!
HELGA TRÜPEL: This is not unique to free speech, we are faced with the same issue a lot in lawmaking. We don’t always have the same traditions, emotions and political perspectives in the various European member states. One of the consequences is that we strive for “minimum harmonization” of European laws, which allows member states to go further if they wish. The problem is that whereas illegal content is more or less clearly defined, that is not the case with so-called harmful content.
HELGA TRÜPEL: Some still have the philosophy of John Perry Barlow of 20 years ago, who advocated the “independence of cyberspace”. I think it is outdated. The EU’s lawmaking was always geared towards new companies, and for a certain period of time there were good reasons to proceed like this. However, with 20 years of experience of what it meant not to have proper rules for what are now digital giants, we need to rethink our philosophy on how to deal with them. They have had a huge impact on our democracies, on our economies and on the mindset of people. Therefore, we must discuss freedom and the abuse of freedoms. How do we combine political and philosophical freedom with responsibility? Democracies don’t function if we don’t have a match between freedom on the one hand and responsibility on the other.
PAUL NEMITZ: The EU cannot solve all the problems of this world. Plus, laws are never perfect. The engineering view of this world – the one which Silicon Valley spreads – is that we want perfect laws, or no laws at all. That’s not okay. Every democratic law is a matter of compromise. No law is perfect. Everybody knows this! It is very dangerous and even anti-democratic to put the bar too high when it comes to laws concerning the Internet.
We are trying to put down rules for a society that respects the fundamental rights of its individuals, operates according to the rule of law, and is a functioning democracy in the age of artificial intelligence. We are doing that with the Action Plan on the Digital Single Market and the resulting proposals for legislation, and with the Action Plan on Democracy. They go forward in parallel because the Commission recognizes that what we do in the digital sphere is not only an issue of commerce, the internal market and competition; it is also vital for democracy.
Let me move quickly through the legislative proposals which are on the table: The first is the Data Governance Act. This act is an effort to set up independent intermediaries which make data available in the public interest, both for the public interest in terms of market activities (i.e. businesses) and for the public interest in the narrower sense of the word, namely civil society and government activities. We currently have a dominance of data holdings by American companies. The cloud market, for example, like many parts of the digital economy, is dominated by the big five US companies, which is healthy neither for the market itself nor for democracy. Hence, we need independent intermediaries. Second are the Digital Markets Act (DMA) and the Digital Services Act (DSA). They serve to discipline the giants and the actors in the market and in democracy. The DMA is a measure to improve our competition law so that dominant positions can be prevented and we don’t have to wait until the problem is there.
«We have to be careful about this technology of speech artificial intelligence, as it makes it possible to dominate people and whole democracies» – PAUL NEMITZ
Authorities can already act when they see something coming. This is very important because innovation and business development, including the speed at which a successful business comes to dominate a market, are very fast nowadays. It’s a winner-takes-all economy on the platforms. Therefore, it is important to preempt those dominant positions rather than wait, because afterwards it will be very hard to turn the clock back.
The DSA is about behaviour on the platforms – commercial behaviour as well as behaviour regarding free speech. This is, however, not only about speech. It is also about commerce, selling and self-preferencing – all the things we see on Amazon Marketplace, for example. Moreover, we have proposals on Artificial Intelligence (AI). Supposedly, it will one day be the most ubiquitous all-purpose technology. It will be as present in our lives as electricity. It will take decisions on speech, on schooling, health and policing – areas which are all highly sensitive. There is great money to be made with speech AI. However, it is also about power – power over people. It is about the ability to nudge people in a certain direction: to nudge them to buy and participate in commerce, of course, but maybe also to nudge them in terms of democracy? We have to be careful about this technology of speech artificial intelligence, as it makes it possible to dominate people and whole democracies. We are proposing rules which ensure that this technology is compatible with the requirements that define our constitutional order, namely respect for fundamental rights, the rule of law and democracy.
PHILIPPE COËN: It’s very ambitious, which is great, but it can be perfected. In which sense? One, this is all about big players and gatekeepers. Every six to nine months, one large company organizes some accountability measure, but then another player comes along from anywhere in the world, with encrypted content, and serves as the new conduit for hate speech, radicalization or incitement to terrorism. So we must not only focus on large players but make sure that 100 percent of the players who operate in the EU member states become accountable.
«We need to promote positive values instead of only fighting negative aspects like hate speech, or trying to sanction gatekeepers» – PHILIPPE COËN
We also need to try to promote positive values, instead of only fighting negative aspects like hate speech or trying to sanction gatekeepers. We need to insist a little bit more on education and prevention. The EU institutions need to be the engine and make sure the 27 member states provide the right budget. That is as important as sanctioning bad actors or worrying about non-compliance. We are trying to use all the upsides of EU legislation with the “privacy by design” concept, and we are trying to transform it into “respect by design”. The GDPR, which has been an exemplary success and achievement of the EU institutions, is now the benchmark worldwide for what privacy should look like, including in the US.
Finally, we need to create ethical employment. We just talked about Microsoft AI. We are talking about how to distinguish between human and AI monitoring of content. However, when you want to moderate what is called “grey content”, you need human eyes at some stage. It is always a matter of context, and context cannot be fully processed by a machine. Moderating has mostly been happening outside the EU. The idea would be to repatriate this profession to Europe in order to create employment and make sure that, in the DSA, there is a real standard of what moderation should be. The companies should not be the only ones able to explain how content should be moderated. You need independent experts and NGOs. That should be in the draft law which is now under review.
HELGA TRÜPEL: What is the character of those companies we’re talking about? For two decades, more or less, they have been saying that they are neutral platforms on which people upload content, and that they, the platforms, are not responsible for that content. That is obviously not true, because a lot of the big commercial companies communicate to the general public, recommend content, and monetize advertising on that content. They are not only access providers; they are de facto very often content providers. And when you are a content provider, rules must apply to you, because your impact on society and the quality of democracy is potentially huge.
«Platforms are not only access providers, they are de facto very often content providers» – HELGA TRÜPEL
Algorithms alone aren’t enough, because there are a lot of cases where they cannot judge correctly on what is going on. That means that you have to hire people who are well educated and can decide if what was uploaded is okay or not. That means that the companies have to be ready to pay for those people because they know the cultural and political context of the things that are uploaded. It is also about the transparency of algorithms. How do they work? How are they programmed? That must be part of the information given by the companies. But then we have to discuss how we find a good approach to the question between illegal content and harmful content.
BILL ECHIKSON: The EU needs to find the right balance between free speech and responsibility. It really should stick to illegal content. Trying to define harmful content and how it should be treated is very difficult. The idea that there should be special rules just for large companies and no rules or responsibilities for smaller ones is faulty, too. I worry that the focus is too much on the DMA. I think antitrust laws have been too slow; that doesn’t work well. While I understand the need to speed up and accelerate, I think the DMA might end up reducing competition by basically underestimating the amount of competition between the big companies and even from small companies.
«The EU should stick to regulating illegal content and not try to define harmful content» – BILL ECHIKSON
PAUL NEMITZ: With different activities come different duties. First of all, the legislation is also applicable to smaller players, but some of the obligations are only applicable to the big players. I think that’s perfectly okay! We have to get used to asymmetrical legislation in this world. The fact is we have some extremely dominant players, dominant not only in the marketplace but also in democracy. After all, if you look at where people get their political opinions from, both in America and in Europe, at least 40 percent now get their opinions from these dominant networks. We have to take size and power seriously. Control of power is a central function of democracy. This must include technological power. There can be purely passive technology providers, and that’s one thing.
But when we talk about the platforms in our discussion so far, we’re talking about Facebook and YouTube, platforms that decide through their algorithms what you get to see, and that enrich it. They organise this, and let’s not kid ourselves: the reality is that these business models can only work with upload filters. The upload filter tells the company what is there in terms of content so that the company can place advertising in an optimal way right next to the content. These companies all have upload filters; that’s how their business model works.
«We should not decide according to technological, but according to societal and legal criteria» – PAUL NEMITZ
We have to come at this from a different angle. What does the functioning of democracy require? Then we have to talk with editors and newspaper publishers about the economic problems which have been created, in part, by these platforms. We have to talk with the victims, who often have nothing to do with digital rights. For the people who are subjected to antisemitism, or the refugees who are subjected to violence incited on these platforms, this is not about technology. I wish that in the debate on these laws the technology people will not be the ones who decide in the end.
It should be a discussion about real life and the impact on democracy, fundamental rights, and so on. We cannot rely on the technological information or the business information the big companies give us. There are just too many examples, including one where the Commission had to impose a $120 million fine on one of these companies simply for not telling the truth. I would caution against relying on what they say about their business model and what they’re doing. We cannot be certain that they tell us the truth, so we have to find other sources of information in this debate.
We should not decide according to technological criteria, but according to societal and legal criteria. These rules are not, in the first or second place, technological rules; they are rules about how society, democracy and fundamental rights will function in the future, or not. That also means we must engage people who have nothing to do with technology and are actually not keen to get into it. They must still have a say in these laws, because their lives will also be determined by this legislation.
PHILIPPE COËN: One of the ideas to build trust with stakeholders is to impose the existence of a “respect officer” within the digital companies – someone with an independent role within the company who would also be the contact person on EU territory. What we want, in other words, is to create an equivalent of a DPO in each platform and digital company. One of the first points we debated is what companies can or should not do. For instance, banning someone forever. Even in the case of a devilish person who posts hateful content all the time, the idea that a company could forever render some kind of private justice and ban someone for life is, at least in a lawyer’s mind, problematic. Even courts do not rule against, ban or sanction citizens forever. So the idea that a ban can be definitive pains me, although I am a defender of the victims of hate speech.
HELGA TRÜPEL: I am not in favour of putting harmful content into European laws. I think you really have to distinguish between illegal and harmful. The point is not only that, as a platform communicating to the general public, you have your own channel where you can say what you think; it is that you have to make sure that we as citizens get very diverse information to make up our minds, and don’t just switch from the general media to our friends’ platforms. Instead, we should look at a code of conduct and ask: what is responsibility, how are young people educated, what do they upload? And we should look at ourselves, too. Before I upload things, I should at least think for some seconds and ask: Is it right? Is it correct? Do I harm other people? Is it nuanced enough?
We also should not go for the dominance of tech discourse. In other fields of society, concerning mobility, energy, nanotechnology, bioenergetics and so on, we have a common debate in society: what should the common rules concerning our common interests be? Compare that with the idea of the independence of cyberspace. It was a sphere, and it was rebellious. It was wonderful. It was right. It was not the old world. A lot of youngsters were very much in favour. But we have now seen the consequences. Seeing that is the starting point of all this law-making.
Again, the question is: what is our assessment of the technologies and the discourse we are facing? Many still defend this concept of tech discourse. They say to people of my generation: you are too old to understand the Internet. I doubt it, because even if I don’t understand all the details, as a political woman it’s important for me to ask what the consequences for society are. Therefore, we have to question this concept of the dominance of technology and reflect on its consequences.
Q13 – What kind of content should be taken down?
BILL ECHIKSON: I still believe that the Internet gave all of us a voice in a way we did not have before. It allowed us to share information more than ever before. I still worry that governments around the world are actually scared of giving all of us a voice. They don’t like that freedom. While that freedom has gone too far and Europe is trying to find the balance, I remember my time at Google: I would wake up and there would be another country banning this product or that. Usually, it was YouTube clips, speech that governments did not like being online. I think that’s also dangerous. I do like the idea of trusted flaggers. But that, too, can be abused, and I am worried about the definition in the DSA, because it would basically allow anybody, including competitors, to complain about content or products found on marketplaces. That can be abused as well. We need regulation and trusted inspectors, but those could be independent reviewers. It is kind of ironic that the only one trying to get an independent group to do that right now is Facebook, with its own “Supreme Court” of sorts. I think this is an area where we could work to make the system better. Nonetheless, we have to be careful.
«I still worry that governments around the world are actually scared about giving all of us a voice» – BILL ECHIKSON
PAUL NEMITZ: We are not talking only about the Internet here, but also about the press and about freedom of speech in the wider sense. As far as the Internet is concerned, on the one hand we have an issue of the functioning of the EU internal market, and it is logical that, as far as commerce on the Internet is concerned, we cannot have member states just run off and establish their own rules. That is already the situation today: in terms of commerce, the Internet is part of the internal market legislation, and rightly so. When it comes to a well-functioning democracy, we have to consider the nexus between commerce and the internal market on the one hand, and democracy, freedom of speech and the press on the other. This is a complicated matter.
The DSA has taken the line that we do not harmonize what is illegal; that remains a matter for member states’ legislation. But the DSA harmonizes the methodology of how illegal content has to be taken down and which defences exist, and that’s a middle way. It is complicated and we have to see how this works, but it is a good start. In these matters of cutting-edge legislation, the legislative process itself, the great adventure of democracy, is a discovery process. This is not about positions, not about smart talking, but about struggling to find the best solution for society. I hope that will be the attitude in the ongoing debates in the European Parliament and in the Council. It is not by chance that, when it comes to the free press in a free and democratic country, certain types of self-regulation exist which tend to make sure that the press follows a certain code, e.g. reporting the truth.
Democracy cannot function if we just say “Sorry, but we can’t do anything about it” in the face of all the lies and organized propaganda being spread, whether privately or publicly financed, whether from third states or from within. We must make sure that we continue to have a vibrant, privately-financed free press that is able to function as the Fourth Estate, namely to control power, whether public or private, in our society. That is why the Democracy Action Plan contains elements where member states have to do more. We must make sure that we have a broad plurality of the press.
«Democracy cannot function if we say sorry, but we can’t do anything about it» – PAUL NEMITZ
As far as the protection of the election process is concerned, where democracy and the debate on truth really become crucial, the European Commission will make a legislative proposal to regulate the transparency of election advertising. Beyond that, the obligations to make sure that platforms are not abused, as set out in the DSA, go beyond the question of pure illegality. I am well aware that this is a sensitive matter. But in a democratic society, to say that we just do not care that Russia spreads lies in an organized way, or that President Trump told 20,000 lies, is a little bit too easy. In this way, democracies go down the drain. We need more of an intellectual effort, more honesty, and we need to have the guts to address this issue in a way that is in line with our democratic rules. We have addressed this issue in the context of the free press without establishing a “Ministry of Truth”.
HELGA TRÜPEL: We do need to be very careful. However, it is too easy to say that we are not able to do anything, that we have to let it go, when we see what has happened in terms of economic behavior and the impact of troll farms and hostile governments on the European Union, the US and so forth. I think it is really worthwhile to dig deeper and have a look at what the Commission will propose. Obviously, for Internet platforms which are de facto content providers, we need laws similar to those for the press. In democracies, the fact that you have press laws does not mean you abolish the freedom of the press. We have to figure out what we can do, always taking into consideration that it is a slippery road, so we have to be careful.
PHILIPPE COËN: Most of this discussion has been about how we discipline the platforms, or some platforms. But what about education, which is one of the roles of the EU institutions? It is not only about being able to fine US companies; it’s also about helping education and building new tools. One of the educational tools would be that, if you want to have any subscriber in Europe, you must have easy-to-understand terms and conditions about what can be posted, what is legal to post and what is not. A kind of one-page T&C, not a 40-page-long one. A one-pager about the charter of respect online. This is what is missing. We should talk about the fact that we have 15,000 people, mostly single mothers, somewhere in the Philippines, exposed to hate for 14 hours a day to make a living. But we must also talk about how it is possible that there are only 15,000 of them! Moreover, most platforms have only a couple of hundred! My point is: we need to reduce the flow of hate.
«Hate is a result of the total absence of freedom of speech» – PHILIPPE COËN
The role of the platforms should also be to encourage a conversation that includes empathy and listening skills. Hate is a result of the total absence of freedom of speech. It starts with the inability to listen to others and to their viewpoints and opinions. What happens when you are given a smartphone as a nine-year-old child, a phone with an Internet subscription? Nothing is said in the DSA or the DMA in this regard. At the very least, you ought to pass a two-minute tutorial to understand what the rules are and that you must not hurt someone else’s feelings. It is the role of the EU institutions to oblige all the ISPs to do that. In other words, parents could not give their kids a smartphone if the kids had not passed the two-minute test. This is also about democracy, because democracy is education first and foremost.
BILL ECHIKSON: I agree that the platforms we are talking about need to take on more responsibilities. We just have to be careful that we don’t set liability so high that the platforms simply take down everything with automatic filters and no one can post anything anymore. There would be no nuance. Finding that balance is going to be tough. I think there’s a lot more we can do on content filters, trusted flaggers and so forth.
Helga Trüpel is a co-founder of the German Green Party. She served as one of the first ministers of her party at the regional level in the early 1990s in the city state of Bremen, in Germany. From 2004 to 2019, she was a member of the European Parliament where she dealt with a number of digital issues, including the GDPR and the Copyright Directive. She was also a member of the Delegation for Relations with China and the Delegation for Relations with the United States. Helga Trüpel is one of the co-founders of EU Watch.
Paul Nemitz serves as principal adviser in DG Justice and Consumers in the European Commission. Until 2018, he was the director of Fundamental Rights and Union citizenship in the same directorate-general. During that time, he spearheaded the General Data Protection Regulation (GDPR) and led the Privacy Shield negotiations with the United States. He also helped to set up the voluntary Code of Conduct against hate speech on the Internet. Recently, Paul co-authored a book called Prinzip Mensch – Macht, Freiheit und Demokratie im Zeitalter der Künstlichen Intelligenz, which discusses the impact of new technology on our democracies.
Bill Echikson is head of the Digital Forum at the Centre for European Policy Studies. From 2008 to 2015, Bill worked at Google in Brussels, where he was responsible for coordinating communications for the European Union, the Middle East and Africa. He also handled the company’s high-profile antitrust and other policy-related issues. Before Google, Bill worked for three decades as a foreign correspondent in Europe for a series of US publications, including the Christian Science Monitor, the Wall Street Journal, Fortune, and BusinessWeek. From 2001 until 2007, he headed the Brussels bureau of Dow Jones. Bill is a historian by training and graduated magna cum laude from Yale University.
Philippe Coën is a French lawyer with degrees from Paris and Harvard universities. For a long time, he served as president of the European Company Lawyers Association (ECLA), of which he is now honorary president. Apart from working as senior legal counsel for Walt Disney in Europe, he also founded the NGO Respect Zone and is a member of the French Observatory of Online Hatred. Philippe Coën has also written a book called Internet contre Internhate – Plaidoyer pour le respect.