2015 11 12 WS 128 Mitigate Online Hate Speech and Youth Radicalisation Workshop Room 9 FINISHED

The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 




>> MODERATOR: Okay. Good afternoon. Welcome to this UNESCO session on trying to understand hate speech, and also radicalization of youth. Very welcome to the third UNESCO session. We are very happy to see that our sessions are quite popular, and it's really a great pleasure to be with you today. The reason why we are organizing this session, and on this very important topic, is that although there are a lot of opportunities that come up with the internet, the reality is that we also have challenges.


And clearly, one of the challenges is hate speech online and youth radicalization. And when we look from the perspective of UNESCO, from a perspective of freedom of expression, we think that what we need to do is to share ideas. We need to enter into a dialogue. And we have to make sure that counter‑speech is better than repression. And that's why, in order to understand the complexity of this phenomenon online, and also whether we are indeed promoting the right approach to countering online hate speech, we produced a publication.


We just published it in our series. It tries to analyze the complexity of the issues, the best practices, experiences, and really the measures that can be taken in order to protect our young people, the people who participate in platforms, and really to make sure that both hate speech and radicalization of youth become a minor problem online. But as you can imagine, because it's a complex topic, we thought that it would be very nice during the IGF forum to really have a group of experts who have been dealing with some of these issues in different ways, and to listen to them and to see what more they can tell us.


And also, very importantly, to listen to you and see what your experiences in this topic are, and how can we, from the point of view of UNESCO, which works with public policies, which works with international standards, how can we support, really, the movement that is trying to reduce or to mitigate the effects of this phenomenon on the internet. And therefore, I really have a very prestigious group of speakers. And I'd really like to start with them, because it's their time now, to do a five‑minute intervention.


And then we will open to the floor. And I hope that we can then interact together to see, how can we understand better this phenomenon, and what are the experiences that we can put forward. So, the first speaker is the Executive Director of Robert Kennedy Human Rights in New York, and works for freedom of expression. And, Frank, give us some ideas. Thank you.


>> FRANK: Thank you very much, Lydia, and UNESCO, for this opportunity to share some ideas. I think this particular topic is very important to make it a dialogue, because when we talk about youth and experiences, what we're really talking about is sharing the good practices or the difficulties we're all finding in this work. Let me begin by saying that first of all, radicalization of youth is kind of a strange name. We were mentioning it here ourselves before.


It is true that radicalization can go the wrong way, but radicalization is normal with youth. And radical thoughts always come from young people. Remember the '68 movements and the university reforms, and all the different intellectual reforms that were provoked by young people. When we talk about radicalization, we must emphasize that we're not talking about radicalization in philosophies, ideas, thoughts, or proposals.


We're talking ‑‑ possibly what we're meaning there is radicalization that takes to violent acts and to violence. My perspective is very much marked from being from Latin America, where we actually developed several youth violence prevention programs. And let me say that one part has to deal with language. And I'm not sure it's always hate language. I mean, sometimes radicalization can come disguised not as hate language, but as even friendly language toward the young people.


But taking them in the wrong direction. So it is true that when we talk about freedom of expression, we do have to have very clear boundaries. If we're talking about language that deals with criminal activities, then, yes, we're talking about security agencies and security forces intervening. If we're talking about incitement to terrorism, or putting online the manual of how to build a bomb, or how to become a member of ISIS, yes, those are serious concerns.


And I know that this is growing around the world in different parts of the world. In the West, but also in the East. But at the same time, there can be a preoccupation that oftentimes, because we want to limit criminal activity ‑‑ which would include other forms of activity like child pornography, or incitement to genocide ‑‑ many states may go a little bit over the human rights standard of what is the extreme cases that should bring the attention of the state.


Whether it is hate speech or the protections established in article 19 of the ICCPR. And here is where we are dealing with dangerous ground. And we have to be ‑‑ I think it is important to have this discussion. And I'm not making light of this topic. This is a very serious topic. Some of these expressions can actually lead us to understand what young people are thinking. But at the same time, we don't want the state to begin monitoring all communications, and then have it become a justification for surveillance, as we were talking about before, because then what you're doing is justifying an illegal act of the state, which is unauthorized and unlawful surveillance.


And secondly, you don't want the state to be preoccupied with what could be political expressions or rejection of those in power, criticism of those leading the country, criticism of public policies, criticisms on the basis of corruption of the public leaders. And in many countries, I am sure this will be seen as the radicalization of the opposition, as very dangerous for national security, or for the security of those in power.


So, I insist here that we have to take very much a human rights approach to what really is hate speech, which is difficult to define, but it should be a very small, minuscule exception to the rule of openness of freedom of expression in general, but also openness in the use of the net. And finally, I believe that, as we have said with hate speech, the best alternative is prevention, and positive speech. The plan of action that Mrs. Pelia promoted was precisely that: how can we bring more counter‑speech.


And I have said several times in many panels, we are trying to protect the freedom of expression online from the intervention of the state, but on the other hand we have neglected the other side, which is we do want to provoke a very intense debate from the bottom, from the society. Society has to debate what are the ideas being shared, what does the media, offline or online, say. What is being discussed nationally.


And finally, to finish this, violence amongst young people has many reasons, not only language. Language is the physical expression of that radicalization in many cases, but oftentimes, it comes from structural reasons. The fact is that many young kids feel that they have been ghettoized, to use a term, in their own country. Many of them are sons and daughters of migrants. They were born in the particular countries where some of these acts of violence may be committed, but they still don't identify with the nationality and the nation as such, because they were sort of put aside as second‑class citizens without access ‑‑ full access to education, health services, or jobs.


One of the things we have to look at is, what are the conditions? Normally this process of recruitment by terrorist groups, or this form of radicalization, happens exactly among those young people who have been abandoned. We have felt ‑‑ the first element is how to integrate them into the services of society. Maintaining their identity. When I talk about integrating, I'm not thinking of erasing identity. Maintaining their identity, language, heritage, but integrated into the benefits and services of society.


And in that sense, also allowing them to speak. One of our best experiences in Guatemala was the fact that we brought young people to do theater, to do popular theater. And it was incredible what difference it made for them to see that, for the first time, they were being listened to. They had an audience; they could stand and have a message. And instead of radicalizing, this released their feelings and made a big difference in their lives.


More speech, more freedom of expression, is also sometimes necessary.


>> MODERATOR: Thank you very much, Frank. And I think you are saying that in that respect, UNESCO is going the right way. It is trying to see how we can ‑‑ instead of repressing ‑‑ really counteract this type of speech and behavior. Our next speaker is Sunil Abraham, the Executive Director of the Centre for Internet and Society in Bangalore, India. Without trying to disrupt, because Frank put some questions and definitions: what is, for you, hate speech online?


And how can we deal with it? Yesterday, they said there were 200 definitions. But since we are in a panel ‑‑ and I know that ISIS, and some of the issues that Frank raised as ways of marginalizing youth, are your concern. But what exactly, in your perspective, is hate speech? And how can we deal with it effectively?


>> SUNIL ABRAHAM: First of all, I should just point to a fact that on this nine‑member panel, we have no youth. And we are talking about the youth. So I apologize.




>> PANELIST: I can see some gray hairs.




>> SUNIL ABRAHAM: So I was wondering. (Laughter.) No wrinkles, but gray hair. The thing to realize ‑‑ just to go beyond Frank's definition, it may not be explicit calls for acts of violence against certain people. It can be speech that is very seductive, that might radicalize you. There is yet another category. In India, a couple of years ago, there was the spread of atrocity literature. People from the northeastern states of India, who were in the southern states, were shown pictures of Muslims being killed in Myanmar.


They were false pictures taken from some completely different context. And then a rumor was spread saying that at the end of Ramadan, the Muslims in Bangalore and other southern cities are going to exact revenge for the killing of their brothers in Myanmar. So, this is atrocity literature. It may not be classically considered hate speech, but it has almost the very same impact of precipitating communal violence.


So, I'm an engineer, not a lawyer. I work with lawyers. I'm surrounded by them. But I wouldn't dare to provide a comprehensive definition, because as in my example, it's very difficult to understand what exactly is or isn't hate speech. The next thing for us to understand is the nexus between speech and disruption of public order, between speech and incitement of violence. When a particular form of information and communication technology is new within a population, then its impact is much more dramatic.


Let me give you an example. When the first movie was screened to the general public, within that movie clip there was a scene of a train approaching the audience. And the audience was unable to distinguish between reality and the movie, and ran out of the movie theater. Similarly, when the internet penetrates a completely new population, the credibility of that medium, the perceived truth of that medium, carries unprecedented weight within that population.


And therefore, those that want to radicalize young people, or provoke violence within a population, are able to leverage it. And therefore, in those populations, we may need to be a little more circumspect, and have extra safeguards to protect the population. I know Frank will find this extremely controversial, but this is a reality in countries with very low internet penetration.


The next thing to say is, it is not just the internet and speech online that contribute to the radicalization of youth. Frank already made this point. Indonesia is the country with the largest population of Muslims, and India is the country with the third‑largest population of Muslims. The success rate in both these countries for ISIS recruiters has been abysmal. And that is because in both these countries, somehow, the youth ‑‑ and in India, Muslims are a minority ‑‑ do not feel alienated. Somehow feel integrated.


But it isn't as if that is a permanent state. Today, there is increasing tension, for example, the government has tried to ban the consumption of beef in various states. And something as indirect to the religious question as the consumption of a particular meat could lead to the youth feeling alienated. Apart from counter‑speech ‑‑ and I haven't read the research output ‑‑ there is one more technique that has been promoted as a method to undermine radicalization online.


And that is to dismantle the credibility of those who promote these types of campaigns. The people who are running these ISIS campaigns claim that they have the authority to speak for the really true Islam. But what they do in their personal lives is not in compliance with the principles of Islam. And exposing them for what they really are helps undermine the credibility or truthiness of their speech. So that is something else we should take into consideration.


The last thing is, counter‑speech is also, in other words, propaganda. And every time the state establishes infrastructure for propaganda, we should ask how they would abuse that infrastructure, and, therefore, we must ask for lots of comprehensive checks and balances. Thank you.


>> MODERATOR: Thank you very much, for not only bringing in other aspects to the debate, but also very concrete examples that show once again the complexity of the topic. Our next speaker is Gabrielle Guillemin. She is the senior legal officer at Article 19, an international free speech organization based in London. She has also been a lawyer at the European Court of Human Rights. So, Gabrielle, after Frank and Sunil, can you bring also the perspective of your own region, and try to contextualize, in a more global environment, how this complex issue is being dealt with, and really, maybe, explore some of the examples that can show ways of dealing with it.


>> GABRIELLE GUILLEMIN: Well, I think that, like Frank, I'd like to talk about, first of all, definitions and what we're talking about here, hate speech online, and radicalization. Then, secondly, I'd like to talk briefly about some of the measures that have been adopted by some governments to deal with this, because a lot of governments believe that in order to counter, sort of, a radicalization process of young people, access to certain websites must be blocked.


I'd like to touch briefly on that. And finally, look a little bit at what's happening in relation to surveillance. Now, Frank has already touched on this issue of definitions. I think it's very interesting, because first of all, hate speech online is a very contested area. There's no real agreement as to what hate speech means. Nonetheless, we've seen some criteria developed at the international level, and there is a plan of action to help define what constitutes incitement to violence, which is speech that must be prohibited by states under the International Covenant on Civil and Political Rights.


Now, I think what's really interesting when talking about radicalization is that it's actually much more of a process. I was looking at a dictionary definition of radicalization, and it was more of a process of people being rendered less tolerant, which I thought was interesting. The other point is that in all those discussions, very often, the underlying assumption is, when we're talking about radicalization, it's also about young people joining terrorist groups like Daesh.


And so here we're faced with another definitional problem, because extremism is not defined, either. And when you look at the word itself, it qualifies an opinion without saying what the opinion might be. Why is that a problem under international law? It's a problem because although free expression is a fundamental right in a democracy, and freedom of expression protects speech which offends, shocks, or disturbs, at the same time, it can be restricted.


But when it can be restricted is subject to three conditions. And one is it must be defined by law, meaning that the law must be sufficiently clear as to what it's trying to do, or what it's trying to restrict or prohibit. It must also be proportionate. Now, if I turn to the sort of measures that you've seen being adopted in different countries or being discussed, particularly in relation to internet intermediaries, that is a problem.


So, for instance, at the moment the UK government is developing its counter‑extremism strategy, but they defined extremism as "the vocal or active opposition to our fundamental values, including democracy, individual liberty, and the mutual respect and tolerance of different faiths and beliefs." It's really broad. And on the basis of that, the UK government wants intermediaries to work with the industry and police to remove terrorist and extremist content.


But yet, with this definition, there's all sorts of perfectly legitimate expression, access to which could be removed. So, that is a problem. We also see a similar issue in countries like France where the government recently passed regulations whereby access to terrorist material can be blocked, or material which is publicly condoning terrorism, which, again, is very ill‑defined. And so all that it takes for that material to be removed or blocked is for the police to alert the ISP and say, hey, well, this is publicly condoning terrorism, without any definition of what that means.


And the content must then be removed. And if the content producer fails to remove the content, then the entire site can be blocked. So that's a real problem. It's also a real problem because it's an attractive solution. It looks like a magic bullet to deal with issues, but it fails to address the root causes of why radicalization happens. So, the third point I wanted to raise in relation to this is also in relation to privacy and surveillance.


I was asked recently to prepare a presentation on a similar topic, and I wanted to give examples of what might be extremist tweets. And then I thought, okay, should I go on a website and try and find out what terrorist groups are saying, effectively, or if some of this ISIS material is available. But then, I thought that in doing that, maybe my Twitter account would be flagged in some sort of way, would be alerted to the authorities. And then when I did the presentation, I explained my problem to the individuals taking part in the session, including law enforcement.


Their reaction made me think that actually, I was right to be worried about the fact that my account might be flagged for some reason, which I found very disturbing. And I think it's an example of the sort of chilling effect that these sorts of measures can have on free expression, particularly when we're talking about the solution to these problems being in counter‑narratives, counter‑speech. It's very important to understand what the other side is saying; there are lots of academics and other people trying to find solutions, and therefore, it's very important to also get access to that information. So, I'll just close there. Thank you.


>> MODERATOR: Thank you very much, Gabrielle. And I think you brought, also, the other side of the coin. And you mentioned very strongly government, security, legislation. Maybe for our next speaker, and the younger one ‑‑ the young person in the panel, even if he has white hairs ‑‑ it happens to young people, too, not only to old people. He is the cofounder of Radio in Syria, and the young person. And my question is, what are the other actors?


I mean, we are talking in this forum about the importance of the multistakeholder approach and participation. What is the role of other stakeholders, and of young people themselves, in really doing things? So, the floor is yours.


>> PANELIST: First of all, this is genetic. Ladies and gentlemen, thank you for being here. Aside from being the youngest member on this panel, I'm also cofounder and programming manager at a Syrian grassroots community development radio station. We deal with a lot of issues related to human rights violations, to radicalization, reconciliation, and advocacy, which makes it a perfect platform to bring together a lot of opinions, debates, and dialogue.


Along with that comes a lot of hate speech and dangerous speech. And actually, with a lot of this hate speech, we have learned throughout the last couple of years to differentiate and recognize what kind of speech we are dealing with. So we differentiate the kind of speech directed towards us, or towards other commentators, or a public figure. We also differentiate what could be addressed directly to you, which is, blah, blah, blah, I'm going to do things to you and your cats.


Or it could be simply, I hate you. There is a big difference between all of these, and we have to deal with each comment on a different level. We have learned, and we have seen ‑‑ we asked ourselves when we started the project: do we alienate ourselves and ignore it, and increase the victimization and radicalization in the society, or should we engage and try to ease this gap and not make it explode even more?


We have seen a lot of discussions on local and international levels, such as the UNESCO Youth Radicalization Conference in Paris, where we discussed this on an international level. We found that the methodology we've been using is similar to the one found in the UNESCO publication already mentioned, Countering Hate Speech Online. Information is very important. It's important to have enough information about what's happening, what's being done.


There are a lot of movements fighting hate speech. It was important to analyze our situation. We don't have to reinvent the wheel. A part of that was taking hate speech, putting it through a text mining tool, and finding that this hate speech has a pattern; it's repetitive. This helps us be proactive and produce engaging alternative material to bring the youth to listen to us more, to interact with us more, and to have somewhat a similar vision to ours.


And number 3, there's the action. And it's about responding to what's happening. And we have this plan. It's somewhat of a rule of thumb; it's not really very specific. But wherever there's hate speech, or wherever there's a comment, we try to react and ask people to rethink what they're saying. Sometimes people just repeat things. We ask them to be held accountable for it. Sometimes we take it private, in a friendly, client‑service style, and just try to talk with people on an intimate, friendly level.


We try to remind them there's a link between online and real life; there are consequences for what they're saying online. And, of course, we also challenge their language, whether it's religious, nationalist, you name it. We reason with them and do a lot of things. One of the interesting things that we did once, publicly, was a campaign. We would take a hate speech comment, for instance, "X, Y, Z is an effing effer," anyway. And we would remove the name of the person or the group and ask people to fill in the blank, and replace it with their own name or a group they like.


People would be able to see, as in a mirror, how it would sound if they were the ones being addressed. We tried a lot of that. And it was a bit controversial, so we didn't continue with it; it was provocative, as well. Anyway, there is also a lot to do on that level. Hate speech needs an environment. If you don't have the environment, if you have content that would not trigger hate speech and provoke people, people would really just get along.


We try to be aware of our effect online and offline. In Syria, for instance, the number one social media platform is WhatsApp, and we don't have any control once people have downloaded the material. We don't have any control over it. We also see our impact on other media and how we affect them: the transparency, the ethical code of conduct. We try to be creative. Whenever we have an opinion that is provocative, we use it in drama.


We try to speak to communities, to people in their own languages, in French and English sometimes. This is what we try to do. If you have any questions, please don't hesitate at the end of the session. Thank you.


>> MODERATOR: Thank you very much for bringing another perspective. Our next speaker, Judith Lichtenberg, is the Executive Director of the Global Network Initiative, which is also a multistakeholder platform that tries to bring together companies, civil society organizations, investors, and academics, really to advance freedom of expression and privacy. What would you add to this discussion in terms of how other stakeholders can, indeed, also support the efforts that we heard about before from governments and, in this case, community‑based organizations?


>> JUDITH LICHTENBERG: Thank you. I'm at the age where you put a lot of effort into disguising your gray hair. (Chuckling.) UNESCO actually called for collective solutions to the problem of online hate speech. And I read the report. It describes how internet intermediaries interact with civil society organizations, and how those interactions led to changes in the way companies dealt with abusive speech on their platforms in specific situations. Finding collective solutions in the area of privacy and freedom of expression is what GNI is about.


Let me say a word or two on GNI. As was said, it is a global multistakeholder organization. We bring together companies, investors, academia, and a variety of civil society organizations together to forge a common approach to free expression and privacy in the ICT sector in relation to government demands that can impact those rights.


And currently, our company members include Google, Microsoft, Yahoo, LinkedIn, and Facebook. As the report, as well as my fellow panelists, already explained, ICT companies who host or transmit content globally face serious challenges regarding speech on their platforms. What Gabrielle was already describing is that we are increasingly seeing governments use company terms of service to ensure removal of content without legal process.


And in a way that's not transparent for users and the public at large. And the second issue, which was raised already by a couple of you, is the definition, and how difficult it actually is to determine what speech is hate speech, because it's so contextual. So UNESCO actually recommended to companies facing those challenges that they should operate in a manner that is consistent with the U.N. Guiding Principles on Business and Human Rights.


GNI was founded in 2008, with that same commitment. The first thing, what our members did was to come up with a set of principles and implementation guidance that are based on international human rights standards. And companies can use GNI's framework to guide responsible decisions. What we also do is we provide a confidential space for our members to exchange experiences and ideas.


A couple of years ago, we had a robust internal discussion when one of our company members was asked by several governments to remove a video clip, because it was perceived as very insulting by groups of people. And it also caused violence. And what we learned from that is actually that there are no easy answers to these questions, and that you don't have a one‑size‑fits‑all approach. What we have done recently is we launched a policy dialogue about extremist content, and we try to bring together government authorities and other stakeholders to discuss what the role of governments and ICT companies should be in responding to extremist content in a way that is transparent and respectful of privacy and freedom of expression.


We had our first discussion a month ago in London, and one of the outcomes, actually, of that roundtable session supports UNESCO's caveat that more research is needed on the links between online and offline violence. The participants generally agreed that there is not a clear link between the consumption of extremist content online, on the one hand, and the radicalization of individuals on the other.


And the same actually goes for evidence supporting the effectiveness of removing or restricting access to extremist content. And, indeed, it was also felt that we need more credible narratives to dissuade individuals from supporting or committing extremist acts. So I think that the UNESCO report is very timely to also feed into that discussion. And so, GNI will continue to focus on this topic in the coming months as part of our shared learning and policy engagement. Thank you.


>> MODERATOR: Thank you very much, Judith, for again bringing out the complexity, but also some possibilities for action, and, again, the importance of multistakeholder participation. Our next speaker is a Federal Prosecutor in the Federal Circuit, responsible for coordinating the work against cybercrime at the Federal Prosecution Service. But what is interesting about you is that you are also working on a public prosecution project for digital education in schools, bringing in a component that was touched on in the discussion until now, but not really put forward. So, maybe give us your contribution to this debate.


>> PANELIST: Good afternoon. I'd like to start by apologizing for my English. I've been practicing, but I'm not as fluent as I would like to be. Anyway, I will do my best. (Chuckling.) I've been working for a long time with cybercrimes. And we notice that repression is inadequate; prevention ‑‑ creating awareness among people, children, and adolescents ‑‑ is the best way forward. Among the risks are online grooming, dissemination of pornography involving children or adolescents, and cyberbullying, including hate speech.


The Federal Prosecution Service maintains a technical cooperation agreement for educational activities with the NGO SaferNet Brazil, a private nonprofit organization that joins forces with others to fight human rights violations on the internet. Between 2009 and 2013, several workshops on the safe and responsible use of the internet were carried out for teachers from public and private schools. The first workshops took place in Rio and several other places.


And because of the success of these workshops, we decided this year, 2015, in partnership with the NGO SaferNet and sponsored by the Internet Steering Committee in Brazil, that the Federal Prosecution Service, under the coordination of the Office of the National Ombudsman, would launch the project in schools.


The Courage project aims at bringing safety, advocacy, and citizenship education to ten Brazilian cities in a year, including João Pessoa. The next steps are to extend the reach of the project to all major cities of the country. The workshops will be presented in public and private schools. The first action is to promote a meeting with the agents of the municipal and state education departments and of municipal and state social assistance, along with SaferNet and the federal prosecutor, for project presentation and scheduling of the first workshop.


On workshop day, preferably held at the federal prosecutor's office, teaching materials are handed out to help teachers introduce the subject of safe web navigation in the classroom. There is also a brief presentation by the local federal prosecutor about the federal prosecutor's responsibilities in federal cybercrime investigation and prosecution. If present, the state attorney can explain the related crimes under state jurisdiction.


With this work we try to ensure the effective application of Article 26 of the Brazilian Civil Rights Framework for the Internet, our recent law about the internet, which states that compliance with the constitutional duty of the state to provide education, at all educational levels, includes integrated training and other educational practices for the safe, conscious, and responsible use of the internet as a tool for the exercise of citizenship and the promotion of technological development.


Let me show the material that we distribute to the teachers, these booklets. This is the campaign that we show at the workshops, and that also airs on TV. These are images from the workshops, the auditoriums with the teachers, which we hold across Brazil. Here you can get to know our work better. And this is my contact information. (Chuckling.)


>> MODERATOR: Thank you very much for bringing in this part about educating citizens to use the internet in a responsible way. And that brings me to you, Eve. Eve Salomon is from the Internet Watch Foundation, and she has a lot of experience in self-regulation. How would you see this empowerment of people, of youth, and of parents and teachers too, to really take control of how they use the internet? So . . .


>> EVE SALOMON: Thank you, Lydia. I'd like to object to the identification ‑‑ well, to the lack of identification of the fact that I'm actually the youngest person on the panel.




>> EVE SALOMON: I've just had a really tough life.




>> EVE SALOMON: It's hard to come in at the end of this discussion, because everyone, I think, has made all the important points. But I'd just like to talk a little bit from my experience chairing the Internet Watch Foundation, which is the body responsible for the takedown of child sexual abuse content in the UK, and which now has a global presence.


Now, as others have pointed out, Article 20 of the ICCPR places responsibility for the prevention of hate speech firmly on states. However, particularly when it comes to extremist hate speech, let's call it that, speech which could lead to the radicalization of young people, in my experience, states very rarely exercise their responsibility directly. In fact, they tend to transfer responsibility, as Gabrielle mentioned earlier, to private actors like Facebook or the ISPs.


Why? Because for the state to deal with it itself, it costs a lot of money. It's a slow process. So instead, they hand the responsibility over. And they will ask the ISPs or Facebook if the material is hosted, you know, on their platform, to remove it. If it's not hosted on their platform, they'll ask the ISPs to block it. There is no accountability or transparency whatsoever in that process. And importantly, there is no appeal. So, if material has been arguably taken down incorrectly, there is no mechanism to appeal that decision.


And I would very much like to see fora like this one really addressing that issue and having conversations with the main actors, to try to find a way of getting a dialogue going for more transparency and accountability in how these actors carry out these duties. Now, when it comes to takedown and blocking of hate speech, it's actually totally ineffective. Blocking child sexual abuse content is relatively effective in that it prevents inadvertent access to that content.


And because there's an international consensus that this content is criminal, it more or less works. However, there is no international consensus for what constitutes hate speech. And in any event, all blocking does, as I say, it stops people who don't want to look at this material from finding it. But if you do want to find the material, it is still there, and you can use VPNs, you can use proxy servers. You can find it.


And I think that in terms of young people, as a mother of two young people, making something forbidden but still available actually makes it more attractive. And I think that's a real issue. And it's a problem with blocking. It's a political solution. The governments can say, we're doing something. But in fact, it's completely ineffective. As others have said, you know, hate speech, online speech is one of the many, many factors that leads to the radicalization of youth.


In the UK, as you probably know, there's a lot of concern now about disaffected youth joining jihadi movements. It's not only a matter of what they're reading on the internet; that is one of many contributing factors, and we have to remember that. People talk about the need for good counter-speech. I agree with that, but we lack systemic, structured opportunities for counter-speech. At the moment, we are seeing a relative silence, particularly from the Muslim community, in countering this sort of speech.


You can understand why. Not only do we need a good place for the counter‑speech to appear, we need it to be safe and secure for those people who do want to post things that counter the extremism. And, as Sunil pointed out, the safe environment must not be provided by the government, because that will make it suspect. Those are some of my thoughts. Thank you.


>> MODERATOR: Thank you very much, Eve. Again, you brought up interesting issues: accountability and transparency in all these processes, and the policies that do or do not support that accountability and transparency. And we couldn't have a better last speaker to address some of Eve's comments than Mattias, the Director of Policy Planning for the Council of Europe, which represents 47 states in Europe.


The Council has quite an interesting program on the internet, on internet freedom, and all the complex issues that go with it. So, Mattias.


>> MATTIAS: We do a lot of things, and it's difficult to squeeze them into five minutes. I'll try not to test your patience after so many speakers. I won't go back to the definition of hate speech. There is a narrow category of speech which requires intervention from the state in criminal law. This category has to be narrow, and maybe even different in different regions, depending on context.


Now, even with all the safeguards, there is still risk associated with excessive attention being paid to criminalizing hate speech. First, even with all the safeguards, there will always be attempts to use this as a pretext and as a tool to stifle and restrict legitimate speech. The second risk, when we are talking in the context of young people, is that a heavy-handed approach has the opposite effect. In the segments of the population particularly susceptible to these processes, it will simply confirm the perception and feeling that they already have, that society is rejecting them, and it will reinforce the feeling of alienation. It will backfire.


Third, excessive attention to the criminalized segment takes away attention, energy, and effort from confronting sub-criminal hate speech, which is the biggest problem, as many people have said. As I said, I watched all of this extremely attentively after the horrible events at Charlie Hebdo; I live in France, I'm the father of four children, and I worried about how all this would play with them. And, you know, counter-narrative is a good thing, but there are also so many things that can go wrong.


And it can turn into a counterproductive narrative very quickly. One of the things that we saw after Charlie Hebdo were statements that the response should be for French schools at all levels to start their school days by singing the national anthem and signing a declaration of allegiance to the French state. I grew up in an authoritarian country in eastern Europe, and I'm sure that many other people in this room who grew up in other parts of the world know the effect of shoving that down your throat.


It doesn't matter if it's a good ideology; the effect is going to be the same: rejection, revolt. So, that's not going to work. We have to be extremely careful about that. So, what can we do? One thing is role models, in music, culture, sports. They can have a very important effect. Politicians should be ‑‑ (chuckling) ‑‑ should be very careful what they say. Education is the second thing; it's very important. The Minister of Education in France admitted: yes, we are the ones who would need to work on that, but our teachers don't have the skills or knowledge to handle this.


One of the things we are doing in the Council of Europe is to develop, alongside what we have done for teaching foreign languages, a very simple common reference framework for teaching democratic citizenship. It is basically about not giving pupils a ready-made ideological position, but giving them the knowledge to develop positions themselves. Education should be an important part of this, and it isn't always.


That's one essential point that was made before. The last point that I'll make: all of this is not going to work if you don't have a bottom-up approach as well. And I think, paradoxically, there we should perhaps look at what the recruiters of these movements are doing and how they are operating, and perhaps try to emulate and copy some of it. Because what they are doing is reaching out to young people who feel alienated and offering them an identity, a feeling of belonging, an association with something that is bigger than themselves.


And it's basically the same way gangs operate in the U.S. and other places in the world. And I think that if you want to counter that effectively, young people have to be offered alternative associations, identities, and belongings that are going to come from them, where they will have a sense of ownership. A very important element of that is skills, because they are going to be confronted with somebody who is very skilled in that respect, and without skills they won't be able to compete.


And there again, you are up against the recruiters. That's why, at the Council of Europe, alongside the legal approach, we also support the grassroots effort through the No Hate Speech campaign, which is based on the kinds of initiatives that someone was talking about before. That is essential, and perhaps the most important part of it. Thank you.


>> MODERATOR: Thank you very much to all of you, and also for keeping to the time, because it gives us quite a few minutes, almost half an hour, to really engage in the dialogue and exchange of ideas, and, of course, to ask questions of our panelists. So, I would like to open the floor to you. The idea is that you come up, introduce yourself, and ask your question or make your comment, as briefly as you can. If you want to direct it to any panelist, you are welcome. I will take a few, and then I will rotate between you and the panel, okay? Perfect. Please.


>> AUDIENCE: Very quickly: I'm part of the IGF program, and that's the reason why I'm here. My takeaway from this is, yes, it is about empathy. Empathy is a long-term bet, and as a civil society person, I accept the bet. We do it through theater, by changing the names on the paper; we create modules and invest a lot in finding creative ways to promote empathy, to make young people empathetic. But we haven't addressed another problem of dangerous speech that we are already facing, which is the violence based on it.


And I work with young people who are facing this violence already. I can think of 55 empathy interventions in a school or outside school, but today I don't know what to do for people who are already facing this violence. And in the entire forum, I haven't heard ideas for interventions for the victims of dangerous speech who exist already, who are not getting the start in life that they deserve, and whom we exclude from these plans when we focus only on empathy in the long term. Can you please speak about those interventions, about what to do for victims today?


>> RODRIGO: Hi, everyone. I'm Rodrigo, from SaferNet Brazil. Just to share that in Brazil, for example, we don't have this same hate speech problem; our region is different from Europe. But nowadays we have a really strong internal hate speech discourse, like homophobia. We run the hotline for human rights crimes, and last year racism generated the highest level of reports, much more than child pornography.


And just to add to that, looking at the last speaker, I am really concerned about how child protection issues are treated, even at the IGF this year. We are talking a lot about child sexual exploitation, and we work with that also. But it's really important to see and include a human rights and child rights approach. And maybe I'll point the question to Frank, to perhaps share with us other experiences of the many violations of children's rights, including freedom of expression and the right to privacy, that are being committed in the name of child online protection.


>> (Off mic.)


>> AUDIENCE: Okay. Thank you very much. I'm part of a digital rights organization in Mexico. In 2012, I was part of a student movement which started a protest against one political party, and it started in a peaceful way. In the end, we radicalized and committed riots and acts of violence; at least, that is what the media said, and a process of criminalization was set upon us.


This also reached the internet with the case of a certain page that was blocked and taken down. And if you search for it on the internet nowadays, there is a page with health information or something like that, so censorship also happened. And unfortunately, our demands in the case of Mexico have ‑‑


(Portion of audio lost due to internet disruption.)


>> But let's not let states use that as an excuse to censor communication. If there is a decision to transfer responsibility to ISPs and ICT platforms, the state is handing over its responsibility to private actors. It's not fair to the platforms themselves. And they will probably be more restrictive than the state would have been, because they don't want to fall into legal problems with the state.


It's an unfair situation for everyone, and a way for states to get out of their responsibility. And here, what Gabrielle said at the beginning was very clear: you have a three-part test. All restrictions have to be established by law, a clear, unambiguous law, different from the one she read. That was on restrictions. My second quick comment is on what someone said, which I think was very important: hatred doesn't come from the internet. It comes from those who are using it and who learn how to use it to provoke hatred and violence.


And this is an interesting comment and reflection, because while it means that some people use it for bad, it also means that we can learn to use it for good. I mean, UNESCO talks about building a culture of peace. Peace should also be a culture.


>> We still speak about it.


>> And building a culture of peace using these mechanisms and media. This is important. One of the aspects of radicalization we talked about is the credibility of the interlocutors. If it is government officials promoting hatred against the LGBT population, as in Africa, or promoting hatred against minorities, which happens in some cases because of migration, or religious leaders promoting hatred against those of a different religion, then, because of the credibility of the speaker, it has a much bigger impact than simple comments or tweets.


We have to look at that, and at that response to information. Why do public figures have a bigger degree of responsibility? Precisely because they have a bigger degree of credibility. The level of privacy an ordinary citizen enjoys is superior to the privacy, and the latitude to act in this way, of a public official.


>> Gabrielle.


>> GABRIELLE GUILLEMIN: Thanks. I'd like to do two things. First, go back to the term hate speech, because I think the term has been used a number of times meaning different things. And in practice, it would be qualified in different ways by domestic law, and, again, potentially differently by the international law that we were discussing earlier. By this I mean that sometimes we talk about hate speech when the phenomenon being described is bullying.


But in the majority of countries, I think, bullying is not something that is unlawful. And therefore, the responses are usually to have support groups and dialogue mechanisms to deal with it. Now, if bullying goes a step further, depending on what the person has said, it may well amount to harassment. If it's harassment we're talking about, it may well be proportionate, for example, to have the criminal law deal with it.


And the same goes for credible threats of violence. Here again, you need to distinguish. You have threats, and people say lots of stupid things on the internet. You've seen cases of individuals making threats, for example, about blowing up an airport, that were meant as jokes. So, again, here we need a more granular understanding, and we need to look at what the standards are domestically and internationally.


We also have incitement to violence, which has been touched upon. Finally, you have insults. That's interesting, again, because here, I think, we need first of all to remember that free expression also covers speech which is offensive, or even grossly offensive. And even when you look at how it's dealt with domestically, sometimes what we call online abuse may not even amount to defamation. Yet we definitely see defamation law used sometimes to deal with these kinds of cases.


So, I'm sorry. So I'm just going to finish here. Thank you.


>> (Off mic.)


>> Just to say thank you to the gentleman from Brazil who gave us examples of racism and homophobia, and to say that, thanks to that example, we can see that it is not just Islam that is cast as the problem; in Brazil, it is the Christian church that is cast as the problem. And on the question of proportionality and power, the question asked by the gentleman from Singapore, Frank has already answered that: Edward Snowden told us that transparency requirements in the law should be directly proportionate to power.


And I add a kind of corollary, which is that privacy protection should be inversely proportionate to power. And similarly, when speech is targeted at somebody who is powerful, say, 150 people tweeting at somebody who is very powerful, the harm is much less than that of doing the same to somebody who is not powerful.


>> PANELIST: To the person who brought that dimension into this discussion, with all respect: in general, I doubt that in Asia or anywhere else politicians are at greater risk from hate speech by common people, or even by minorities. But it is true, I think, that this points to a real problem that will always complicate these discussions about what is hate speech, what isn't hate speech, and what to do about it.


People will always bring their own cultural and political sensitivities and preferences, and they will adjust their attitudes and responses to those, and that makes things more complicated. I come from Slovenia, the latest country facing an influx of refugees, in particular from the war in Syria, and not doing very well with it, by the way; I'm ashamed of that. But what you see in the public debate is a perverted interpretation of what freedom of speech is, advocated by much of the political class and even the media.


An interpretation that basically says that freedom of speech requires tolerance of intolerance, respect for intolerance; that we have to give a platform to all these different points of view and extremes. I was talking to editors who said: we have to give space both to those who love refugees and to those who hate them; you won't have a roundtable without a person peddling lies about things, because supposedly that's what freedom of expression is about.


It's hard. But ask these people whether they would take the same position on a subject where there is a greater social consensus in the country, for example, gender equality. Would you invite to a roundtable a person who hates women and says they shouldn't have any rights? Or, as an editor, would you say, of course not? So, there is seriously a problem of cultural, ideological, and political preferences affecting things.


>> PANELIST: Very briefly, I believe that we should be more optimistic about the role of the internet and of youth in this. Hate speech might not be related directly to violence on the ground, and on the internet you cannot actually prevent it. But try to think of a world where these counter-initiatives on the internet do not exist. This hate speech would really prepare an enabling environment: people would not be violent themselves, but would support somebody doing the violence on the ground. We should support these initiatives more; as young people, we have the ability.


I sense that some people were pessimistic about the whole thing. You could be more optimistic at very small cost. It costs zero dollars to create a page on Facebook today, create an advocacy campaign, and do whatever you wish, and people will follow you. The predators of freedom are the same everywhere, and they are going to be the same all the time. But we have the power to change this, and the internet is one of the most powerful tools today.




>> MODERATOR: I like that. I think we can change the situation. And that's really a very important call. Thank you very much.


>> PANELIST: Okay. I'd like to answer the first question, about the victims. I think the help lines are very important, and in Brazil, SaferNet offers a help line where the victim can speak anonymously; it's very good. And our state authorities work with victims to offer psychological and social assistance, and so on. Mmhmm.


>> MODERATOR: (Off mic.) To support victims of violence. And I'm sure it's both online and offline. We had some other questions, so I would open the floor and ask you to be really quick, because we have about six minutes. So, there was a young lady. Do we have any other questions for the speakers? Okay. I guess you are the only one. Okay.


>> AUDIENCE: Hello. I'll be extremely brief. The Brazilian government announced ten days ago that they are launching an app to monitor conversations, identify hate speech, and disclose users' data to enable prosecutions and so forth. I would like to hear the panel's thoughts about this.


>> MODERATOR: This is for you. (Laughter.) That's a very Brazilian issue.


>> It will be launched this month.


>> MODERATOR: (No English interpretation.)


>> PANELIST: I don't know about this. Without the decision of a judge, I think it's ‑‑


>> AUDIENCE: It comes from the ministry that deals with discrimination, with women's issues, etc.


>> PANELIST: No, no. Without the justice system, it's impossible; user data cannot be obtained without the decision of a judge. And after the decision of the judge, prosecutors take it up, examine, analyze, and inquire about it.


>> MODERATOR: I think this is probably very much a Brazilian issue; maybe we can talk after the meeting. Since we have no more questions, let's give a little more time to the panelists for one single strong message each. And mine is, first of all, that you can find the UNESCO publication online; get one of these copies, or go to our website. Make a difference; act. Also contact us: you have the email and numbers here. From our point of view it's an ongoing discussion on a very complex issue, and we need to continue it. But let's do a very quick round to hear each strong message.


>> PANELIST: That's extremely unfair. First I have to be the last on the panel, and now you give me the concluding remark. Thank you very much.




>> PANELIST: The balancing act I was talking about. Okay, one remark. In our continuing efforts to find effective ways to mitigate hate speech, especially in this context, let's not repeat the mistake we committed in putting together this panel: let's not do it without young people, without the participation of young people.


>> PANELIST: I'd like to repeat that I'd really like to see a serious dialogue going on with intermediaries, with the platform providers, with the Facebooks and Googles, to increase transparency and accountability so that we really know what's going on.


>> PANELIST: It's going to sound a little trite, but I'd say, more speech, more dialogue, more education.


>> PANELIST: I would say, briefly, what we said before: the internet is very powerful. It can be used for ill, but it can also be used for good purposes, for promoting a culture of peace, understanding, and better knowledge among people. Young people want to be heard as well. They want to receive content, and they want to project their own ideas and content. This is what we should be using the internet for: a solid, big, substantial debate.


>> PANELIST: In the words of my generation: make love, not hate speech.




>> PANELIST: Counter it. Haters going to hate, trolls going to troll.


>> PANELIST: I don't have any inspiring message to deliver, but I owe an apology. The gentleman who asked me the question comes from Hong Kong, not Singapore. I was thinking of the young man from Singapore, Amos, part of whose conviction was for hate speech against the Christian community, which was actually a very legitimate criticism of Christian dogma. I got confused; my apologies.


>> PANELIST: I thank you for this meeting. I very much liked participating. (Chuckling.)


>> MODERATOR: Thank you very much. I think we got interesting ideas: clearly, more dialogue, more speech, involving all the stakeholders, using the internet for good, and, like you said, let's act, and let's do it together. I think this is what it's all about. This forum, this session, is about how to build a dynamic movement that can indeed start shifting the use of the internet toward the best it can bring, which is including people and bringing the voices of the voiceless to the world.


And I think all of us have a tremendous responsibility there, not just you young people; we, the older ones, have a lot of responsibility too. But we definitely count on you to help us bring out the transformative power of this technology. I sometimes feel very frustrated, because it is one of the technologies with the most amazing transformative social power, and we don't use it right. So, let's stay together, please, please, yes.


>> PANELIST: Just a very small thing, and I'm really serious: contact me if you need anything. I'm a technologist, so if you need help with anything, please contact me, and I will be very happy to help.


>> MODERATOR: Okay, great. So let's have a round of applause.




>> MODERATOR: Thank you very much to the panelists, and really, thank you, thank you for being with us during this time. And let's act together. Thanks.




(End of Session, 17:32)






This text is being provided in a rough draft format.  Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.