Internet Governance Forum


Seventh Annual Meeting of the Internet Governance Forum

6-9 November 2012, Baku, Azerbaijan

8 November 2012



The following is the output of the real-time captioning taken during the Seventh Meeting of the IGF, in Baku, Azerbaijan. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.


Chengetai Masango:  Okay, excellencies, ladies and gentlemen, we would like to resume the meeting.  Mr Mammadov, head of department at the Ministry of Communication and ICT, apologises for being stuck in traffic and is going to join us later.  I would like to open this afternoon's session dealing with security, openness and privacy.

As you know, this afternoon's session will continue the long-standing IGF dialogue on the topic of security, openness and privacy, and I am pleased to welcome Jonathan Charles, BBC foreign correspondent, to moderate the session.

As you know, cyber security and cyber crime have caused major problems for governments around the world.  We can use our time now to discuss how we can best secure the internet, while also ensuring freedom of expression, access to knowledge and privacy.  There is also a human rights aspect to this discussion that we should be sensitive to.

We will take advantage of this unique multi‑stakeholder gathering to continue to build some consensus around these issues.

Finally, we will also cover issues derived from the workshops through discussion with the workshop organisers.  In order to manage this session, it is wonderful to have Jonathan Charles to moderate it.

JONATHAN CHARLES:  Thank you, I spent 30 years as a BBC foreign correspondent.  I have now just left the BBC.  I am Director of Communications at the European Bank for Reconstruction and Development which is very active in Azerbaijan where we are right now and many other countries.

We are here this afternoon to discuss the issue of security, openness and privacy.  I think we probably all agree there isn't a single one of us in the room who isn't concerned about some aspect of this particular topic.

All of us probably on a daily basis have reason to be concerned about our online security.  Some people here probably have issues with openness.  We know that many corporations are facing issues of cyber attack; some government agencies are facing accusations of being involved in cyber attack.  These are all issues we will discuss over the next couple of hours.

Also of course it will be a chance for you to ask your questions and we will be hearing as well from the feeder workshops as they report back into our discussions.

We have our panel here, I will introduce them all one by one, perhaps I will start at the far end over there, Kirsty Hughes on my far right ‑‑ probably not politically though ‑‑ is the Chief Executive of Index on Censorship, they of course pay great attention to these issues of openness and privacy.

Then we have Carlton Samuels, the academia and civil society representative, Vice Chair of the At-Large Advisory Committee (ALAC) for ICANN.

Sherif Hashem is the Senior Cyber Security Advisor to the Minister of Communication and Information Technology in Egypt.

Next to me, I have Christopher Painter, who is coordinator for cyber issues at the US State Department.  Then next to him is Eleonora Rabinovich, the Director of Freedom of Expression at the Association for Civil Rights in Argentina.

Then as I go over to my left is Jonathan Zuck, the President of the Association for Competitive Technology.

We will be joined as well shortly by a member of the European Parliament, Marietje Schaake.  She is an MEP and the Parliament's Rapporteur for the digital freedom strategy.  She should be joining us in a few minutes.

I would like to start by asking all of our panel here, one by one, to consider one question: as they look at this issue of security, openness and privacy, what is the one major concern, the real concern, they have?

Perhaps, Chris Painter, I will start with you.

CHRISTOPHER PAINTER:  Thank you, I think for me, especially because I have seen this area evolve over the last 20, 22 years in various capacities, I think for me it is a lack of understanding that these issues, the issues you have articulated in the Panel, are really very, very much interrelated and to raise awareness of how these things work together and how as you address these as a policy matter you address them together and not in separate buckets.

For the US, we had an international strategy for cyber space and, significantly, that wasn't an international strategy for cyber security; it was a strategy that talked about the economic issues; the freedom of expression issues; the security issues; the international security issues; the full panoply of issues, and tried to weave those together.  As we said, our goal is an open, interoperable, secure and reliable information and communications infrastructure, altogether, not just one or the other.

If you think about this issue as a pyramid, the things you are trying to achieve are the openness and free flow of information and economic innovation and growth.  Security is not an end in itself.  It is one of the enablers.  It is the base of the pyramid.

That leads to some other issues.  There has been a lot of talk or a lot of concern there are no rules, no law; no understanding of what applies in the space and I think that that is well overstated.

We do have things like, for instance, the Budapest Convention on Cyber Crime, which is a great model that many countries are adopting to deal with cyber criminals.

Recently, we had the UN Human Rights Council affirming a resolution that the rights you have online are the same rights you have offline.  There is no distinction.  You have those rights, including things like the Universal Declaration of Human Rights; you have the same rights online.  That's significant, I think.

Even in the conflict area, the kind of crunchy bit, if you will, even there, many states have affirmed that international humanitarian law, the law of armed conflict, applies in cyber space.  That has meaning.  That means there is a legal structure.

That said, we need to continue to socialise these ideas and to build a consensus around what the norms are in this space.  While we are doing that, we need to do some practical things.  That includes having practical confidence building measures between countries to build transparency and trust, and it includes countries around the world having national strategies, raising the awareness again here.

Those national strategies should include what they are doing structurally within their government to protect themselves, how they are talking together as one government on these issues, and how they are dealing with each other internationally.

Then we have to make sure we are doing capacity building around the world to bring everyone in this conversation, including the developing world.

Finally, there is a real diplomacy aspect.  We need to raise this to a foreign policy priority among all of our countries so we have this discussion at a real policy level and not just a technical level.

There are a lot of challenges but I think there are a lot of good things that can come out of this.  There is great discussion and attention now; we need to keep that momentum going.

JONATHAN CHARLES:  Chris, thank you very much indeed.

Kirsty Hughes, perhaps I could ask you next to look at this issue.  As you look at this what is your one major concern?

KIRSTY HUGHES:  Thank you very much, Jonathan.  Good afternoon, everybody.  We have a lot of concerns but I will stick to your request to look at one for the moment.

JONATHAN CHARLES:  I will allow you two, if you'd like.

KIRSTY HUGHES:  Okay, one and an interconnected one maybe.  I think one big concern that I have and that we have as a human rights organisation is the trend towards mass surveillance and mass population‑wide data collection.

Index is an organisation that focuses on freedom of speech, so I actually wanted to start by saying something about how and why I think privacy and free expression go together very much online.  Sometimes we are used to talking about privacy and free expression being in conflict, and where there are public interest issues, perhaps invading someone's privacy, perhaps a politician's, they can conflict.  But I think online it is very simple.  We know we are on the record here.  If we are having a private conversation at home or by the bar and we think we are being bugged or monitored, someone recording our conversation, that is not just an invasion of privacy; fairly obviously it is also going to chill our conversation.  It is going to lead directly to self‑censorship.

That is a huge concern because when you look at how much the internet has opened our ability to converse, to debate, to share information, we want to be able to choose whether we are doing that in a public space or not.

I would argue that there is no justification at least in any society that respects or that claims to respect human rights for mass surveillance in any form, whether off line or on, and that includes whole population data retention.

What is particularly worrying is that, if we look around the world, we can not only draw lessons or examples from somewhere like China, with its extraordinary army of monitors, human monitors as well as technological monitoring.  Take an example from the country I'm from, the UK, where the proposed Communications Data Bill, currently being looked at and scrutinised by the UK parliament, would actually lead to that sort of mass surveillance, and would be the worst form of mass surveillance we've seen so far on the internet in any democracy today.  So you won't be surprised we are campaigning against that.

I think what we are seeing here and I think it is an interesting question for debate in this session, is we are seeing people, there is a kind of elision and eliding of technological ease, the ease of collecting this massive amount of intrusive data, with the question of need.

One of the web host companies said to me about the UK Communications Data Bill, it is a sledgehammer to crack a nut.  Where there are genuine police or security reasons for collecting that data, we need to see that it is much more targeted and done with a court order.

I wanted to just link to that a couple of comments about security and free expression because what we sometimes hear and I have heard it in some sessions here, but certainly not in all, is security and free expression being put in opposition to each other and I was pleased that Christopher Painter just now precisely didn't put it like that.

I don't think we are mostly in the business of balancing security and free expression.  If we have free expression we can hold to account those with economic and those with political power; we can expose criminality and wrong doing, so a lot of the time the two are complementary, not in opposition.  If they are in opposition, if we want to sort of inhibit or limit or regulate speech that is incitement to violence, again we shouldn't be talking about balance, we should be talking about very limited exceptions to the general principle of free speech.

If I can conclude with one last comment that I think links to this surveillance and monitoring issue, it is to do with social media.  What we are seeing, again in the UK as well as in more authoritarian countries, is a tendency towards criminalisation of speech.  In the UK we have had people making bad jokes, really tasteless or sick jokes, on Twitter or on Facebook, and actually being taken to court, in some cases being given Community Service Orders or even going to gaol.

We seem to be suddenly in a situation where there is only kind of a right sort of speech that is going to be allowed.  We seem to be in a situation where some people seem to think we have a right not to be offended.  We don't have a right not to be offended.  If you tried to not offend anybody from all the countries that are at the IGF here today, we would have no space left for freedom of speech.

I think if we continue blocking and inhibiting the development of social media and other creative developments we see on the internet today, we are really going to be both undermining human rights and doing a lot of harm to that creative and open potential of the net, thank you.

JONATHAN CHARLES:  Kirsty, thank you very much indeed, very interesting points there.

I would like to welcome Marietje Schaake.  We will be hearing from her in a few minutes.

Let me turn to you, Carlton Samuels.  What worries you?

CARLTON SAMUELS:  Maybe it is a question of what is important to us.  In my area of the world, the internet is a development paradigm, and so openness is top of mind for us.  It is important for each of us to be able to be joined to all of us, and so, in our view, maybe it is aspirational at this stage, we are for a global internet, globally accessible to all and with the ability for all of us to participate.

We are not unaware that there are some issues with security and privacy.  But, in our view, a security issue must be responded to with what is necessary and proportionate.  What do I mean by that?

It is true for us that there are people who will come to the internet with hearts and minds laced with larceny, and we believe that it is rational, indeed it is our duty, to protect the public from those persons.

Privacy is very important; it is not that we are not concerned about privacy.  But with respect to security, if we think of the internet as a commons then, surely, there are going to be points where these two principles chafe, rub up against each other, and in this respect we would always opt for the least objectionable option, the one that is in the best public interest.

With the internet being global, we have a concern that the approaches to management and security not be seen from a narrow, nationalistic position.  The fact is, since it's transnational, the Westphalian model would not be particularly suited for response.  In our view, a multi‑stakeholder, multinational approach to the security and privacy issues is what is required.  There cannot be any space in a global internet for hegemony or exceptionalism.

It is, it is in our view, a matter that all of us have an interest in making sure that all the voices are heard and all interests are addressed at some point.  Thanks.

JONATHAN CHARLES:  [microphone not working]

[inaudible] Power problem here.

Just bear with us a minute.  Yes, there, thank you.

Very good.  All right, thank you very much indeed yes, that was a little electrical censorship there.  Thank you very much indeed Carlton.

We have heard a lot about the priority that should be given to freedom of expression, so maybe this is a good moment to bring you in, Eleonora, to talk about this; this is an area that you would undoubtedly target as being a concern.  Eleonora Rabinovich.

ELEONORA RABINOVICH:  Okay, thank you very much for the invitation.  I will continue the perspective of the Index on Censorship representative.  First of all, I liked the link between privacy, security and freedom of expression, because they are closely linked; they are not separate issues, and you can't achieve freedom of expression if you don't have privacy and security.  So, that is the first comment.

I will talk from the perspective of a human rights organisation working ‑‑ can you hear me okay?  Yes ‑‑ from the perspective of a human rights organisation working in Latin America, and I would like to highlight that recently we have seen a couple of problematic bills and policy decisions that have caught the attention of activists and NGOs working in human rights ‑‑ it's okay, because I lost my connection, my own connection.

‑‑ okay, so we have seen several policy decisions and bills proposed in many countries that have caught the attention of NGOs working in human rights.  They are basically in two areas, copyright and cyber crime, and both bring big threats to openness, free speech, access to information, access to knowledge and other fundamental rights.  For example ‑‑ [inaudible] you can go on with this later ‑‑ in Peru, the Congress has drafted new cyber crime legislation that could, for example, eliminate anonymity, which is a fundamental part of free speech; I mean, it is important that we can guarantee anonymous speech in order to guarantee free speech.  And that legislation also gives the police an enormous amount of power to request user data.

So that is an example.  Also, in another country, Costa Rica, we have seen another cyber crime bill that criminalised the spreading of false news, but under the umbrella of the inter‑American and universal human rights frameworks you can spread false information; I mean, that is allowed.

So these are some examples of really problematic proposals that affect human rights.  Also in the area of copyright, as a result of the implementation of free trade agreements with the US, countries like Panama or Colombia have approved new copyright frameworks that violate many human rights principles and could affect openness and access to knowledge with a great impact.  In the case of Panama, the community of activists in Latin America has great concerns because the law creates an administrative body that can basically accuse people of copyright infringement without due process of law, something you have to follow in every proposal even if it has a legitimate aim such as combating crime or enforcing copyright.

So my concern is that policy makers are not aware of the impact of these policies and bills on human rights; they should take that into account, and they should look to international and regional human rights instruments as a guide.

For example, the inter‑American standards on freedom of expression ‑‑ I don't want to go into detail, but they are very specific and very solid in the way they protect freedom of expression and in how restrictions on freedom of expression are allowed ‑‑ and also the joint declaration by all the special rapporteurs on freedom of expression, of the OAS, the UN, the Organization for Security and Co‑operation in Europe and the African Commission on Human and Peoples' Rights, which gives us a very specific guideline about the way we have to regulate the internet without affecting freedom of expression and other rights.

So, my big concern is that we have to be very cautious when we think about new policies and legislation and really evaluate the impact that they have on human rights, even though governments can sometimes have very good purposes, like protecting children, enforcing intellectual property online or preventing crime and terrorism.

JONATHAN CHARLES:  Eleonora, thank you very much indeed.  Yes, it is difficult when good intentions rub up against bad consequences, isn't it?  That's really one of the problems.  Sherif, perhaps I could ask you to address this point?

SHERIF HASHEM:  I would like to start with the point on the emerging threats and security.  We have seen new patterns and sophisticated tools and attacks, especially in the Middle East now, with new virus attacks and [inaudible] and others.  These attacks are sometimes suspected to be, or attributed to, state actors as well as non-state actors; it is quite challenging, and this problem will really stay with us moving forward.

My main concern, aside from the security concern, is that the security community and the human rights and freedom of expression community don't talk to each other a lot and don't communicate these issues together.  You cannot face a security problem just based on technical solutions.  We have to rely on society, the support of society, in our states as well as across borders.  We cannot do this and build this partnership and trust except when governments work with the private sector and NGOs and international organisations and discuss both issues, or both lines of approach, together.

This panel is an example, bringing together openness, security and privacy and really discussing these issues.  So I hope through this dialogue we initiate a process, and we have to be innovative in the processes that we create to bring together the partners that otherwise don't work together in tackling the challenges that we see, because even existing principles that are agreed upon internationally, such as the principle of proportionality, whether within states or across states, come into play in dealing with cyber warfare and the security of infrastructure.  We have seen examples in the Middle East where sometimes football fans would launch attacks on government sites across countries if they are not happy with the result of a game.

We have seen political debates trigger attacks on infrastructure by opposition parties that don't like each other's views.  So here the challenge is how to have a way or mechanisms whereby we really discuss the legislation, the regulation, the operation, the technical requirements and the human resources.  Chris Painter mentioned capacity building and awareness raising within society to really address these challenges, because these challenges are really staying with us, unfortunately, and the sophistication of the new patterns is really alarming from a technical point of view.  So as a society we have to deal with this together.

JONATHAN CHARLES:  Sherif, thank you very much indeed.

Marietje, I know the European Parliament spends a lot of time looking at these issues.  It is often very easy, is it not, to pinpoint the challenges, but the solutions are somewhat more difficult.

MARIETJE SCHAAKE:  Absolutely, thank you so much.  I wish the European Parliament would spend an awful lot of time thinking about these topics, because I don't think we yet do so enough.

In answering the question of what I believe are the key priorities, you will understand my professional déformation as a people's representative.  I believe the key priority when dealing with technologies and the internet is that we ensure that people come first and that we embrace the opportunities through which technologies can help empower people, give them a free voice, let them access information and also give people opportunities for economic development.

But the representation in the development of technologies, the representation of people or even the democratic aspects I think are going be a key challenge.

When we think about an open and safe internet we should continue to think, we do this not for the sake of the internet itself but really for the sake of the users.

I have heard it many times when companies or other stakeholders say, let's make sure we do not over regulate the internet and I am a liberal politician, I am not in favour of over regulation or bureaucratisation but it doesn't mean we should not act and sit idly by, we do need to update our laws when necessary and when we do it needs to be based on knowledge and evidence.

Especially, when it comes to technology, there is still a lack of understanding and appreciation of the revolutionary impact of technologies on the part of policy makers and sometimes there is also a lack of evidence based policy making which I think is a problem.

In all of this, when we need to update our policies or laws, we should do so on the basis of key principles of human rights; of competition and free trade, of democracy and providing security for people.

The challenges are significant.  There are plenty of actors who are afraid of the empowerment of individuals with the help of technologies.  They seek to reclaim control and power.  Power when it comes to governments or economic influence when it comes to corporations.

In all of this I think increasingly internet users or the public ‑‑ the internet public, we may want to call it ‑‑ can be overrun and can be excluded from decision making processes.

Sometimes this is also a consequence of the global character of the internet, because decision making in one country can have a huge impact on people on the other side of the world and I would like to see that more factored into the thinking behind certain laws and decisions so the global impact needs to be incorporated in the national level decision making when it comes to internet aspects.  It comes with responsibilities of a different kind for those who can make decisions also for corporations by the way.

We have to understand that what we do at home impacts our credibility abroad so, in other words, we should look at the impact of technologies and the different contexts within which they are going to be used; whether the country where they may end up in has the rule of law or it doesn't; whether technologies that are used in a safe and open way in one country can be used as a weapon in another country.

In this process of thinking about how to empower people, to protect people and to ensure that their voices are heard, I think it doesn't really help to go on a sort of witch hunt vis‑à‑vis powerful corporations or governments that are using repression against people, but to engage in thinking about incentives to do the right thing, incentives to make sure that freedom isn't a zero sum game and that we do not end up in a stand‑off with different actors vis‑à‑vis each other, but that we really try to co‑operate as much as we can in a multi‑stakeholder fashion and that we get ahead of the curve instead of reacting to these rapidly developing technologies.  That we try to think in terms of scenarios, think in the research and development phase of technologies about what the impact could be down the line for the security and well‑being of people.  Perhaps we should even start implementing human rights by design, considering people first when those engineers are developing these highly technical solutions.

I will end there because I look forward mostly to a discussion.  I think we have some time and a great audience for that.

So, thank you very much.

JONATHAN CHARLES:  Marietje, thank you, we do have a lot of time for discussion.

Let's get an industry view now, Jonathan.

JONATHAN CHARLES:  Just loosen up, don't worry about us.

JONATHAN ZUCK:  It is a long day.

"Give me liberty or give me death!"

It felt good to say that.  That is a quote from a rather famous figure in the American War of Independence named Patrick Henry.  Like all rhetoric, it feels really good to say.  I am not exactly sure what I meant by it when I said it, but standing up and saying it felt really, really good.

As we finish up the third day or for some of us the fourth day of the IGF, one of the things that really stands out in stark contrast here at the IGF are discussions that lead us down a path of aspiration versus those discussions that lead us down a path of practical solutions.

What is interesting about that of course is that we all want these aspirational discussions and proclamations to be impactful and to have an ultimate influence on the practical solutions that we find ourselves at eventually.  But unfortunately a lot of the debate seems to happen in a more siloed way.  We have our aspiration discussions, separate from our practical discussions and don't necessarily identify where the intersection of the two occurs.

Looking back at some of the principles of the past, for example the Sullivan principles, created in 1977 as a way for industry to have an impact on the racist regime in South Africa called apartheid: the irony is that I went back and did a lot of research to try to figure out what impact they had, and I had a very difficult time finding any.  There were 150 companies that signed on.  What it led to was a lot of companies pulling out of South Africa and very little change occurring that could be attributed to the use of those principles, so instead of the companies staying and bringing about change, they left in protest, having signed on to those principles.

I am forced to ask whether another incident in 1977, the release of the movie Star Wars may have had a bigger impact on the ultimate downfall of apartheid than the much vaunted Sullivan principles.

We have to ask ourselves what is the ultimate impact of principles and the interplay of these rhetorical discussions when we are really trying to come up with practical solutions.

Yesterday at a feeder workshop I heard someone say, "I don't know the answer to coming up with a multi‑lateral agreement, but I do know that the answer is not to form bilateral and trilateral agreements and then impose them on trading partners; that is bad."

Of course, that same advocate not long ago would have said we shouldn't build a trade agreement with a country that has bad labour policies.  So the ends justify the means, I guess, when I am in favour of the ends.  How we marry this sort of facile rhetoric with the practical solutions that need to come about is, I think, really the biggest challenge in this area.

If we think about the result of rhetoric, at least in my experience, whether in the United States or in the more European style of governments, rhetoric often leads to placebo-style legislation.  It leads to policy makers trying to find a way of placating the rhetoricians such that it is not a practical outcome, so we enact aspirational regulation that doesn't have a practical effect.

I think that finding practical solutions has got to be our priority, and we have to make sure that principles, while they have a role, are taken into consideration as a utopian view towards which we are moving, as opposed to a revolutionary view, because in a revolutionary view, very often when I ask for liberty, it is really the liberty to choose my own oppressor.

I think, in the area of security, openness and privacy, we keep hearing that they are intertwined, and that is something that concerns me as well, because it feels rhetorically cogent to say that these things are inextricably intertwined.  Unfortunately, when that is translated into political action it means that I have a security problem and I am going to solve it with a privacy law, for example.

I think that that can be a dangerous outcome as well.  I think they are interdependent but before we can actually examine their interdependence, we need to examine them independently, so we know what we are talking about so that interdependence is better understood.

Where rhetoric meets the real world of policy, I think it is probably the biggest issue facing us as an industry, trying to create mobile apps ‑‑ which is the people I represent ‑‑ going forward and making sure that just because we are small and just because we don't have a big voice in government, we don't end up becoming the victims of what I would call placebo legislation as politicians struggle to respond to mounting voices that are rhetorical in nature, thank you.

JONATHAN CHARLES:  Jonathan, thank you very much, you will be glad to hear, you got a small thumbs up from our lawmaker, Marietje here, when you were talking about the need to marry aspiration and practicality.

Let's turn to our last speaker for this first round of interventions.

Zahid Jamil is a barrister.  He spends a lot of time looking at these issues, in particular commercial law issues as they apply to information technology.  Zahid.

ZAHID JAMIL:  Sorry to be a little tardy, we were in a terrorism workshop.  I think I brought some interesting concepts from there.  It is wonderful what you learn through the day but I won't waste time.  Let me say I am also responsible ‑‑ and for the transcript, yes, you got it right ‑‑ I am basically responsible for the current draft of the cyber crime legislation that is standing before Pakistan at the moment.  It was, in principle, approved two days ago.  It was legislation that basically dealt with trying to be more compliant with human rights standards and protections and safeguards.  We can have a discussion about that in some other place at another time.

I have two basic points I would like to develop.  One comes from a workshop this morning, which was: what do we mean by cyber security, when we talk about security?  In the context of the WCIT and many other things, we have seen certain governments start expanding the definition of what is security or cyber security.

To me, and this is something somebody else said this morning also, it should be restricted to the operational threats to a network; it is really more about those.  Instead we are seeing some governments use that terminology to also apply to activities which they think are illegal.  For instance, if there is blasphemy, or if they feel that somebody is blogging against the government, what you are seeing is, basically, actions taken under the guise of what they call security, in a preventative fashion of sorts, without due process, without proper legislation around it.  An example of that was developed this morning by Yara Salam [sp] from Egypt: when the revolutionaries went into the state security building they found that a lot of the bloggers had actually had surveillance conducted against them that they didn't even know about, and all sorts of things were happening.

One of the points and it's the first one I want to develop is, we need to respond internationally to that.

Companies ‑‑ there was a UK company called Gamma.  I am not in favour of overly regulating business, because I come from business.  But we need to take account: just as you have scrutiny of arms trafficking, the behaviour of companies in trying to provide these things should be coming under scrutiny somewhere.  ...  It is something we have to deal with; where corruption is an issue, we have the FCPA, and what we are seeing here is even more damaging in its impact.

That is my first point.  The second one: coming from a developing country, I am concerned with what I see around the world, where there are challenges we are facing with international... on cyber crime.  Coming from the law enforcement angle, we need to do more work to bring people together on one platform, one that does exist, and get them signed on.  We cannot wait years on end; we need to do this now and we need to get it done immediately so we can save people from the threats that exist currently.

JONATHAN CHARLES:  Zahid, thank you very much indeed.  Before the questions from the audience, two things really struck me ‑‑ but first, we have been joined by our Chairman, from Azerbaijan.  Would you like to say a few words before we carry on?

CHAIR:  I would like to welcome all participants and express my apologies; because of traffic and other things I couldn't come earlier.  I believe we will continue with what we started, and I will put all my [inaudible] towards the efficient managing of this meeting.

JONATHAN CHARLES:  Thank you very much indeed, Chairman.  As I was saying, I will take your questions in a minute.  There were two issues that struck me from everything that has been said.  Chris Painter, if I could ask you: we heard a lot about freedom of expression, and there is obviously a clash there with national security needs, and it is difficult to get it right.  Let's face it, we heard from Kirsty, for example, about legislation that is going through in the United Kingdom at the moment, American... can trawl for information, particularly that which is private, online.  But how do you reconcile them?  Where does the line, where can it be drawn, and how can we be sure that the line is being placed in the reason of... ‑‑ where do you draw the line?

CHRISTOPHER PAINTER:  If you have a security regime that is stifling freedom ‑‑ you have to be cognizant that one of the key things that underlies any approach is really the rule of law.  Due process and the rule of law.  We heard about some of the cyber crime legislation around the world and how that has to be structured so that when governments access data ‑‑ true of the US and other places ‑‑ procedural protections are put in place, so that there is a rule of law construct for this.  That goes to a broader issue.  Jamil mentioned the issue of how people are defining cyber security in this way, and there are some people defining cyber security as information security.  That touches on the issues you raise.  It means it is not so much the technical security of the networks, which you need to have good free expression ‑‑ if you are doing your job right in terms of law enforcement you are protecting privacy ‑‑ but information security could mean really looking at the content, keeping people from expressing their views, and that is a real concern.  There have been some efforts around the world to have a governmental approach to the internet, one which imposes governmental control and substitutes, for what allows us to thrive, concepts of monitoring and censorship that are a real problem.

We have heard other panelists say there are certain lines, and I think every society recognises them, including the U.S.  ...  It is important when we make these policies that we don't use security as a way to hamper free expression, and that we make sure those goals coexist.

JONATHAN CHARLES:  Just one quick follow up.  We all ought to assume, shouldn't we, that if we write anything online, if we interact anywhere online, somebody somewhere may at some stage be viewing it?

CHRISTOPHER PAINTER:  Certainly, if you are writing anything to send someone, you intend them to view it.  The problem is with some of the cyber crime and the hackers that are out there.  This is where, again, if you are protecting networks you are protecting people's privacy, and we have seen a lot of cases where people take not just credit card information but all kinds of personal information, and we need to make sure that doesn't happen.  We need better security so people have that level of trust and privacy, and unfortunately we are not there.  Building good legal regimes that have this idea of due process at their core is important.

JONATHAN CHARLES:  Very quickly, Eleonora, would you buy that, that the line is drawn in the right place by national security considerations?

ELEONORA RABINOVICH:  I think national security has been used as an excuse to deprive people of the exercise of Human Rights in many legislations around the world.  For example, and it is not absolutely linked with this issue, but access to information policies usually have these security or national security exceptions, which are usually used to not allow citizens to access some public information.

The same happens with cyber crime or, for example ‑‑ yes, with cyber crime legislation and also other issues.  But I believe that the key is that, as he said, that we have to follow the framework of the rule of law and we also have to follow the framework of international human rights standards.

Those standards ‑‑ and I would like to partially disagree with your presentation.  The principles are there for us to follow and they are not vacuous formulas.  They are very strict principles about how our legislation and our policies should be.  When we are talking about the rule of law, when we are talking about, for example, the need to have proportionate or appropriate limitations, or the need to ban prior censorship, or the need to allow criticism of public officials, or issues related to freedom of expression, these issues that we are discussing are not vacuous formulas.

There is very strict guidance that policy makers should take into account when they design cyber crime legislation as well.

JONATHAN CHARLES:  Zahid, you are just raising your hand to comment.

ZAHID JAMIL:  Yes I am just having a thought actually we are talking about intelligence and talking about national security, a thought that basically some of us have resigned ourselves to the concept of don't say something in private if you are not going to be okay with saying it in public.  That is one of those rules that people are taught, at least they were usually.  It is getting less and less and then I think the youth needs to understand that in any case, it doesn't matter whether we are in an information age or otherwise but that shouldn't detract from the right to be able to have freedom of expression.

The thought that came to me was that intelligence does actually do that all over the world.  We know this; they have their systems.  In some countries, very few of them, you actually have legislation that controls how intelligence functions, and that may be a better standard.  Then there are those countries in which there is absolutely no legislation, no regulation, no transparency, no systems by which we know anything about what is happening there.  So that is your intelligence aspect.

But where it starts impacting us mostly is when that intelligence starts being used by certain governments to start saying, "Arrest this person.  Shut his account down.  Block his e‑mails", or do something else against him and when you cross that line, which is from intelligence into real life impact on people which should actually come under due process, civil liberties, legislation, he has rights.  That is where the problem arises and I think we see more and more of that in some developing countries and some parts of my region.

JONATHAN CHARLES:  Kirsty first and then Marietje.  Kirsty?

KIRSTY HUGHES:  Two quick points.  I think, as Zahid was just saying, we shouldn't be naive about assuming we are always in private when we may not be but I think we shouldn't either give up the idea and the importance of privacy.

If you look at the role of anonymity and the challenges facing Human Rights defenders in some of the more authoritarian countries, their ability to have the technology to defend themselves against their states trying to snoop on them, arrest them and so on is absolutely crucial.

I would also agree with Eleonora and her comments on Jonathan.  I don't think we can sit here and say Human Rights are some ideal but the real world is practical.

If you look at American democracy, Obama got elected, not Romney; that's the real application of democratic principles.  If you look at this country, one of my colleagues went yesterday here in Baku to the trial of a journalist, Avas Zanalli, who has been in prison for a year while his trial is delayed.

I think he would quite likely say rights are very real and practical to him in his gaol cell so let's not create false divisions here either.


MARIETJE SCHAAKE:  Yes, I agree that this is a sort of false dichotomy between principles and practice.  Of course we have to step beyond rhetoric; that goes without saying, and I think we do so every day in the European Parliament where I work.  But the notion that principles are agreed, or that they mean the same thing rhetorically everywhere, is of course not the case.

The problem is that what a law in one country may mean in terms of the rights and security of a person means something else under the exact same name in a country where the rights are abused or the laws are applied creatively to say the least.

So the issue is to make sure that the practice and the application of principles (for example, universal human rights) are tweaked and applied in a more appropriate and relevant way as technology and time advance because this is a process.  It is not a static concept.

I wanted to touch on one issue that was mentioned, namely the export of technologies that can be used for mass surveillance, mass censorship, and for tracking, tracing and monitoring individuals, particularly Human Rights defenders but also journalists, dissidents and activists.

If you look at the principle of universal Human Rights and the purpose and the mission of the European Union to be a promoter and defender of Human Rights, it means that you have to update your export regulations to stop this selling and exporting of digital arms to countries where they may be used to violate Human Rights.

This is where practice and principle meet.  This is what we were able to pass in the European Parliament last week to get important updates to make sure that digital arms can no longer be exported but we are not there yet.  This is a work in progress and so we will continue in the following year to more specifically ensure that we have the appropriate regulation.

Another issue is that of lawful interception which is legally required in technologies made in the European Union so that in the context of the rule of law the police when they have the appropriate court orders and authorisation can, for example, place a tap on the phone calls of a suspect in a criminal investigation.

But these same technologies, with the same "lawful intercept capabilities", are exported all over the world, and when they are used in a country where there is no rule of law, we should really question what such a concept of lawful intercept means.

In other words, I think that even if we agree on principles ‑‑ and hopefully we can agree on some universal Human Rights, as I believe they are ‑‑ this needs ongoing updating, and we need to continuously consider the context within which technologies are used.  We need to continue to work on the rule of law, human rights, development and security, and, you know, the country we are in, Azerbaijan, is a constant reminder of the need to improve those very basic principles for the well‑being of the people.

JONATHAN CHARLES:  Marietje, thank you very much indeed.  I thought Jonathan's point was a very interesting one, actually, and just let me be clear what I think you are saying, because it was one of the points that struck me: there is this clash between aspiration and practicality, but actually sometimes ‑‑ and I think this is perhaps what you were saying as well ‑‑ it is a clash between aspiration and implementation of a law.

In other words, the law may have been drafted in what is thought to be a practical way, and is needed in a practical way, and fits in with the pragmatics of life, but then implementation is an issue?  Is that one of the points you were drawing out?

MARIETJE SCHAAKE:  I think there were two points I was making.  On the one hand, I think principles are not merely rhetorical.  You need to know where you are going, what aspiration of the guarantee of rights and freedoms you want to achieve through your policy and through your laws, in order to engage people and find a majority, such as in a parliament, to make sure that people are moving in that direction, so that you find common ground and support for an issue.  That is one part.

On the other hand, I was trying to highlight how meeting the practical implementation of your principled laws is a work in progress.  You know, even though, for example, I live in the Netherlands, a country where the quality of life and the guarantee of Human Rights, thankfully, is among the highest ranking in the world, we are constantly in a process of updating our own practice to match our promise, to make sure that there is as small a hypocrisy gap as possible between what we're practicing and what we're preaching.

The Netherlands has been called out for its treatment of asylum seekers, for example.  Recently our Minister of Justice came up with a proposal in which he sought authorisation and a mandate to proactively hack or to hack back.  Of course there was a lot of criticism but, you know, this is an ongoing debate so my point was on both of these balances between practice and principle.

JONATHAN CHARLES:  I understand that now.

Carlton, very quickly and then let us go to our audience.

CARLTON SAMUELS:  Thank you.  I wanted to catch up with the issue of the technology because in this case the technology is agnostic.  It really does not care what the law says.

Let me give you a practical example of this.  There is a system that is used for controlling access to certain on‑line resources, and it is used, say, in the United States in a certain way, and it ended up filtering some websites wrongly ‑‑ that was said to be a mistake.

That same system was used in Iraq to do the exact same thing because it has the capability built in to do that.  It came with a list of capabilities and the response was, well, it's because the law there allows them to do it.  This is a bit of the hypocrisy gap that we are talking about here.  The technology cannot be blamed for that, technology is agnostic.

JONATHAN CHARLES:  Thank you.  The innocence of technology.  Okay, one last word from Jonathan as long as he doesn't stand up and shout it.

JONATHAN ZUCK:  Depriving me of my freedom of speech and expression!

Obviously, I was speaking in fairly broad terms in my discussion, but I still want to take people back to some practical issues.  The very same advocates who are in favour of an open internet are the ones that justifiably have been lobbying governments for the last 100 years for consumer protection laws, to make sure that cars don't blow up when you drive into them, or that airbags work, or many other related things.

We've asked as citizens to have impositions on us and on our liberties in return for our safety, so when I ask for a question of, you know, give me liberty or give me death, I mean, I don't really mean it, right.  I don't really want to die.  I want my liberty but I also want you to protect me from the evils of the world and I think that in the context of consumer protection there's often conflicts with straight‑up openness, if you will, on the internet and a desire to try and control goods and services that reach individuals within a country.

I mean, what we keep hearing about at the IGF is how the Westphalian notion of a nation state has become outdated in the era of the internet and yet I feel like that's a complicated argument in the context of a multi‑stakeholder process in which one of the stakeholders is government.

When I talk about marrying principles to practice, I think we have to make sure that we keep in mind, whether you want to call it implementation or not, the notion that we need to make evidence‑based incremental progress to solve specific problems, and be focused on the solution of specific problems, because Utopian proclamations take us to a place of impractical solutions that often do more harm than good.

THE SECRETARY:  Jonathan, thank you.  A very quick response from Marietje because I want to move on.

MARIETJE SCHAAKE:  Yes, sorry, it was a response to the notion that technology is agnostic, because I think we have to keep in mind technologies are designed by people and they are often designed and tweaked for a very specific purpose, sometimes for a single‑use purpose, which is to track and trace dissidents or to do mass surveillance, of which we can wonder whether it is ever in line with Human Rights.

JONATHAN CHARLES:  Marietje, thank you very much.  A very engaging first round.  In a moment I am going to call on some of our feeder workshops to report.  We have about 12 or 13 to report, so I am going to take them in small groups over the course of the next couple of hours, but first perhaps some of you have some questions.

If you have a question raise your hand and go to the microphone.  There is a microphone over there and a microphone over there (indicated), say who you are and where you are from and one gentleman is already at the microphone, yes.

BILL SMITH:  I am Bill Smith from Paypal from the US, and the point I would like to make is that on the internet the end points are actually the ones who are providing the bulk of the security.

Several of the panellists have spoken about protecting consumers, and at Paypal that's a big part of what we do in our security organisation: it is focused on protecting consumers, to protect their information.  But we can also look at this as a way of protecting free expression.

If we look at security, openness and privacy from the perspective of consumers, we will realise we need to protect the network and we need to protect the end points, and then we need to say, okay, how much are we willing to pay for this protection or this security, and how much is necessary.  That's a risk assessment, and I will point to Rousseau's social contract, where we give up certain rights as individuals in order to enjoy the benefits of the society.  We should look there and suggest that we minimise the loss of rights for maximal protection on a risk basis, and I'd like to hear the panellists' view on that.

JONATHAN CHARLES:  Kirsty Hughes, do you think the commercial sector is your friend in this fight for expression?

KIRSTY HUGHES:  I think to some extent it certainly is, and I don't think in terms of this discussion that Index or myself, and I imagine Eleonora, are talking about having a complete anarchic world.  Someone mentioned rights offline going online, laws offline also applying on‑line, but it requires some development; so offline and on‑line we will certainly disagree about where certain laws are set, or where they go too far, or about your risk assessment.

It is what I said in my introductory remarks: when you do restrict free speech in some ways it has to be, from my point of view, in an extremely limited way, but that is where we can have the sensible discussion.


CHRISTOPHER PAINTER:  Two things.  First of all, in terms of the private sector, I think one of the important things in developing policies here is that it is not just governments; it is governments working with the private sector and civil society.  We need to work together.  One of the themes of this entire IGF process is the multi‑stakeholder nature of the internet, and we need to do that here and really talk about these things.

On the topic that you mentioned, where do you make certain trade‑offs?  Well, I think you can be really targeted.  I will give you an example.  Someone mentioned earlier in the Panel this idea of anonymity on the internet, and some threats to anonymity on the internet.  I think we really do want to preserve some level where people can have discourse on the internet, criticise on the internet, have free expression on the internet, and have that ability to be anonymous, but there are other times when you want to have people be able to authenticate and identify themselves, so you can come up with a much more targeted solution.

We did this in the US with something called the National Strategy for Trusted Identities in Cyberspace.  It falls trippingly off the tongue, it is so well defined.  It is a structure that allows different levels of authentication for different things, like banking or other things, so you don't have to have a one‑size‑fits‑all or a binary solution.  You can protect privacy.  You can protect civil liberties, but at the same time you can allow security to function in a way that makes sense.

JONATHAN CHARLES:  You will be glad to hear your national strategy was inaudible according to the transcript.

Eleonora, do you think it is right, you give up in the end certain individual freedoms in order for some greater freedom?

ELEONORA RABINOVICH:  Individual and social freedom come at the same time, I think.  They are not separate things.  But I would like to highlight something that Marietje said before.

She was talking about how the principles are implemented in reality, and I think that ‑‑ I mean, we shouldn't be surprised about that.  The context and new technologies and new developments in society always define the contents and the boundaries of rights in practice.  We have had the internet now for the last years, and previously we had television and radio and other media of expression, and we have to think what the rights mean in practice.

I mean, what is, for example, the ban on censorship in concrete scenarios on the internet?  What does the need to provide for proportionate or necessary restrictions, which is one of the substantive standards of freedom of expression under the umbrella of international human rights treaties, mean when we are dealing with, for example, cyber crime legislation?  So we are all dealing with human rights in concrete experiences and in concrete environments, and there is nothing new about that.  We have to rethink and redefine the boundaries that are applied to make it possible in a concrete environment ‑‑ not to change it, but to make it possible.  I think that is the real goal that we have as human rights defenders.

JONATHAN CHARLES:  Eleonora, thank you.  A long list of people want to speak: Sherif first, then Zahid, and probably Marietje, I guess.

SHERIF HASHEM:  Thank you.  Actually, I also have a comment on the technology side, on what was referred to as digital arms and so forth.  In many cases ‑‑ actually, I would claim in most cases ‑‑ technology has different types of applications, and digital arms are used both to test networks and infrastructure and protect them, as well as to attack them.

It is a double‑edged sword.  When we talk about technologies we sometimes cannot predict the type of uses that will happen in the future.

Having said that, we also need, understanding the nature of the technology, to develop and innovate new approaches, and that is a continuous process when we are dealing with these challenges.

This is one of the (inaudible)  principles that we deal with.  In many cases the principles apply in cyberspace as they apply in real life.  However, their proportionality is different.  Something that you create over the internet will stay there forever.  I mean forever.  I've written certain things, certain articles that existed for 25 years or more in news groups over the internet.

I can say that I am still happy about them now, but it is something that crosses borders and crosses time limits.  So when we develop principles and deal with them, we deal with them as such: they will evolve, they will continue, and the technologies will have multiple uses.

JONATHAN CHARLES:  Thank you very much, I must say as a former broadcaster I am not that happy about it.  Some of my most embarrassing mistakes remain on‑line.


ZAHID JAMIL:  That's true for many of us, I guess.  Dual‑use technology, absolutely.  You know, it's humans who use the technology.  I do believe that technology is to some extent agnostic, and so the protection that you have to look out for is having the proper regulation and laws around it.  So at the end of the day you come back to the same old traditional responses.  You have got to make sure that humans and legislation play their role.  But here is a point I wanted to pick up on from Bill's intervention.

You mentioned Rousseau's social contract and, you know, we have looked at the constitutions of most countries over the last 200 years as saying that was the social contract we entered into.  We got rid of monarchy, et cetera.

That is interesting, because that is the first time I have thought about it and I am still developing my thinking, so forgive me if it is not fully thought through.  We now have a transnational society, a global society that is all interconnected as never before, and so there has to be a new governance model ‑‑ or there is a new governance model: you have multi‑stakeholderism.  If you want to be able to engage and do that trade‑off of the social contract, you are not going to do that within your little country.  As Jonathan said, the Westphalian system is over, so you are going to have to do that some place else, and not in a place that is monarchical or ‑‑ I'm sorry, I was going to use the word "dinosaurs" but I didn't mean that; everybody knows where I have been throughout the week ‑‑ but the monarchies are over and now it has to be multi‑stakeholderism, where business is at the table, where they can actually negotiate these things.  So the multi‑stakeholder model, whether ICANN, IETF, et cetera, gives you the ability to negotiate that social contract, and I think that is an interesting point, Bill, that you've raised, and hopefully it will be developed over time, thank you.

JONATHAN CHARLES:  Anybody else on this?  Marietje, did you want to make a quick point?

MARIETJE SCHAAKE:  Yes, I just briefly want to clarify something on the notion of context, because I think precisely the global nature of the internet has completely changed the dynamic of the impact that laws have: they are made in one country but they can have an impact on the other side of the world.  So it is not just about looking at the context of one country but also beyond borders, and thinking about how those decision‑makers can be held accountable.

Those are, I guess, the democratic principles, points which were also made by the last speaker.  On the one hand, of course, in this global reality, when dealing with internet‑related issues, we can work with the multi‑stakeholder model.

On the other hand, the points I am making relate more to people that are not here, that are not at the table for any multi‑stakeholder dialogue, but that may be at risk of their lives for what they are expressing, and who we need to include in our thinking and in our decision‑making, and in understanding the impact down the line of what may sound like a great technical innovation in a lab somewhere, let us say in Silicon Valley or in someone's basement, but which can actually have a fundamentally different impact on the life and well‑being of, let's say, a human rights defender in Iran if it ends up in the hands of those in power there.

This is what I tried to highlight in terms of context and the impact and how we should look at how it trickles down and how we can ensure accountability.

JONATHAN CHARLES:  Marietje, thank you.  Carlton and then Kirsty.

CARLTON SAMUELS:  Thank you very much.  I made the statement that the Westphalian model for internet governance was not fit for purpose and there was some push back on that but let me just give you a practical example of how laws made in one country impact countries outside of the national space.

The Government of Antigua and Barbuda has an interest in internet gambling.  The United States government is against internet gambling.  The United States government passes laws that have impacted the economy of Antigua and Barbuda negatively.  Antigua and Barbuda ‑‑ this is in the Caribbean so I am speaking for them ‑‑ they went to the World Trade Organisation and sought a judgment and that judgment was against the United States government and to this day Antigua and Barbuda suffers because of a law that was passed in the United States for the purposes of United States jurisdiction.

This is just one example and I could give you many more.  SOPA was another example.  This is the point I am reiterating: the Westphalian system is not fit for purpose with respect to internet governance.


KIRSTY HUGHES:  I think it's a fascinating and a crucial discussion about what Marietje called the changing dynamics of how laws in one country affect another and, yes, in some ways we are part of a transnational society courtesy of the internet.  In other ways let us not forget that we are facing potentially a lot of fragmentation and a lot of countries and governments have a lot of power and Iran may be one of the most extreme examples building its own internet cut off from the rest of the world wide web but it is not the only example and we should be very aware that if countries go for a certain sort of control and behaviour in their country it stands as a role model, perhaps, a really negative one, that other countries will use.

Like the UK's snooper's charter, the Comms Data Bill that I mentioned before: if it goes through, it will certainly be quoted by countries like Russia, I think.

So let's not be too naive about that either.  Or look at the Google and Twitter transparency reports and the take down requests, where India tops the Google one and the US the Twitter one.  So we are seeing this interaction that is partly causing a lot of fragmentation and a certain amount of censorship, and that is a big concern.

JONATHAN CHARLES:  Thank you.  We might come back to the fragmentation issue later on.  We do have our remote moderator Milan Vuckovic over there, because people are watching this on‑line around the world and are able to send in their questions.  I think, Milan, you have a question from someone?

MILAN VUCKOVIC:  We have a question from a remote participant, Radnesh Dividevy (sp), who is an assistant professor of journalism at the university, New Delhi, India.

His question is and I am going to read it as stated:

"How to ensure a smooth penetration of internet in rural area in country like India where government is also keen to implement new internet policies where you have got to bridge the gap between a clearly illiterate society which show lesser interest towards technology but once internet and web is made possible they will have tremendous opportunity to know various governmental policies which they can use for their better understanding of government."

JONATHAN CHARLES:  Thank you.  Of course, that is not just an issue for India, is it, Sherif, it is probably of interest as well to a country like yours?

SHERIF HASHEM:  Yes, I guess the issues of inclusion and making sure that our policies and practices are in line with including everybody are very important, and there are certain examples in Egypt where we expanded government programmes in partnership with the private sector and NGOs to make sure that our policies are aligned with practices that reflect inclusion of the society.

I don't know how much resemblance there is with India, but clearly you don't want policies that would result in the openness of the internet not being reached by, I mean, the different levels of education in the population.

Remember in the Egyptian revolution that happened earlier last year, inclusion was key and we relied not only on direct access to the internet but access through organisations and support groups and different communities.


CHRISTOPHER PAINTER:  I said this is a huge issue but it is also a huge opportunity.  Recently we did a capacity building seminar in partnership with the government of Kenya for several of the West African countries, and with the advent of broadband internet enabled by some of the new cable drops that have come in, it's incredible how much of a democratising effect that has around the world and how much of an economic effect it has.  So I really do think it is incumbent on all governments to have as much penetration of this technology as possible, and the opportunity that allows is the opportunity for new innovations.  Kenya has an on‑line payment system called Mpesa (sp), frankly more advanced than anything I know in my home town, that allows people, even in rural areas, to make payments.  It is just tremendous.  But it also allows governments like Kenya's and others around the world to think about these issues that a lot of us have been thinking about as afterthoughts: to build the technology in, to think about having a strategy, think about what security means, think about how they can really protect these networks from the get go rather than simply thinking about it after we have already built these things out.  So I think there are tremendous opportunities in the developing world, and that is one of the reasons we need to bring the developing world into this conversation on policy as well.

JONATHAN CHARLES:  Chris, thank you very much indeed.  Our Chairman would like to say a word.

THE CHAIRMAN:  Yes, I want to add some points on how government can efficiently promote and extend services through its activity.  In Azerbaijan, in 2001 the president issued a decree on some measures in the sphere of organising the rendering of electronic services by governmental bodies, and pursuant to this decree governmental bodies established electronic services in their areas of authority, which means that the government promotes and extends services applicable to all citizens, and according to this programme the government also extends the number of users.

Right now we have a penetration rate of 65 per cent, but the government supports and promotes the idea of extending fibre optic cable to each home, which will in the end increase users to approximately 85 per cent.  In the end the government will benefit from such services, which will create better conditions and a better partnership between citizens, governmental institutions and government.

JONATHAN CHARLES:  Thank you very much indeed.


KIRSTY HUGHES:  Just one thing that hasn't been said in response to the question from New Delhi so far, which is of course the issue of the spread of mobiles: both the increasing ability to access the internet on mobiles and the falling price of smart phones.  As infrastructure and prices allow remote or poorer areas in India or in other parts of the world to access the internet, we know that we stand in the next few years on the brink of a huge increase in access and a change in the digital divide.  On the one hand that is very good news, and on the other hand it actually means the conversations we're having here are even more important, because we don't want that mobile access to the internet to be censored, or to be even more filtered on mobile than on‑line access through computers.

JONATHAN CHARLES:  Thank you very much, Kirsty.  Chris Painter has had to leave us because he is on his way back to the United States, and so we thank him.  He is being replaced by Lisle France (sp).  Let us move on now to hear from some of our workshops, and I would like to call on a couple of them to report back and feed in some food for thought that we can then take up.

Perhaps I could ask someone from WS94, workshop 94, on social media, young people and freedom of expression to go to the microphone and make their point about what that workshop said.  So please a spokesperson for workshop 94.  They are heading to the microphone now.

MATTHEW JACKMAN:  Thank you very much.  We are representing workshop 94 and feeding back to you.  We are both delegates from Childnet, which is running a Youth IGF project.  This is our second year.  This workshop was aimed to discuss social media, freedom of expression and young people.  It was a multi‑stakeholder dialogue with representatives of youth from the UK, Denmark, Finland and Hong Kong; industry, hearing from Facebook, Club Penguin and the Google perspective, and finally the Civil Society and educators as we heard from Insafe, the UK Safer Internet Centre and the Internet Rights and Principles Coalition.

The workshop was structured in four parts, where we tried to understand freedom of expression from each of those groups' perspectives; so it opened with the youths' perspective, then the industry's perspective, and then finally the educators' perspective.

NICOLA DOUGLAS:  Throughout our discussion, we cited a survey which we developed over the summer; it ran over the course of a month and basically sought to find out what young people think of freedom of expression, how they use the internet and how they exercise this right on‑line.  We received 874 responses across 40 different countries and six continents.  In our workshop, we used the information we received to present how young people view freedom of expression from a global perspective and to bring as many young people's views as possible into the discussion.

MATTHEW JACKMAN:  We spoke about being anonymous on‑line.  We had a diverse range of responses from the survey and in the workshop including that being anonymous is a positive thing, how it gives you power and how it gives people a voice they couldn't otherwise have but it can offend.  We heard also that being anonymous on‑line contradicts the self‑exposure purpose of social networks.

Furthermore, we heard that freedom of expression is a right and the right to express yourself is important, but also that the right to express opinions has to be self‑regulated.  We saw that in our survey 41 per cent of respondents said they felt they could express themselves more freely if they were anonymous, which affirms and gives insight into the two aforementioned points.

We also, moving on to educators, heard about the role of social media in education.

NICOLA DOUGLAS:  We heard that currently there is a reluctance to embrace social media in education due to concerns surrounding the financial, technical and cultural barriers and we found that there was a fear of accountability in schools and education and that this accountability needs to be tackled.

We heard that users need to be more self‑aware when expressing themselves on‑line and that especially among youth there needs to be education about how to be considerate and a discerning user when you're both expressing and receiving information.

In the end, we came to a general consensus that everyone has a role in educating young people as well as educating adults to be more self‑aware on‑line and this is between educators, Civil Societies and internet service providers themselves.

MATTHEW JACKMAN:  Thus a three‑way dialogue is necessary to move forward.  The more these stakeholders discuss the issue of freedom of expression, the closer we can get to finding solutions to the problems they face.  Thank you.

JONATHAN CHARLES:  Thank you very much indeed.  Perhaps I could stay with the topic of young people and ask the speaker for workshop 110, young people combating hate speech on‑line, to make a few interventions if you would.

RUI GOMES:   Thank you.  Indeed the workshop on young people combating hate speech on‑line, which is a project of the Council of Europe, dealt with similar subjects, especially focussing on the prevalence and the ways to address hate speech, speech that is harmful to young people, but generally to society because it offends human dignity.

We heard from various parts of the world, starting with Yemen and the point of view of a young blogger on how hate speech is not only a European reality; it is a reality that touches the whole of the internet regardless of where people connect from, and the fact that it may not be defined as hate speech does not mean that it does not exist and is not harmful.

It is nonetheless a complex reality with various associated phenomena such as cyber bullying, cyber crime, extremism, et cetera.  The way to address it is also complex because of this.

One of the main concerns raised in the workshop is that hate speech, partly because of the anonymity of the internet and because people are more likely to say things on‑line that they would not say offline, is becoming culturally more acceptable, perhaps more tolerated.  We fear we end up accepting some types of expression, some ways of addressing people of other nationalities, religions, et cetera, that we would not otherwise tolerate.

For us, one of the most important conclusions is not so much about regulating or putting up barriers, because we don't believe that is the way to go about it, but about promoting better awareness, self‑awareness, of Human Rights and Human Rights on the internet, and related to that all the citizenship aspects that go with it, notably responsibility, so that communities can play a role in managing, in a way, hate speech content.

However, that is also not sufficient in some cases.  We also consider that it is important to make use of the legal provisions whenever hate speech content falls into the areas of crime, which in some of our countries it does ‑‑ not in all of them though.  So law enforcement measures are necessary.

Nonetheless ‑‑ and this is where, for example, the campaign of the Council of Europe against hate speech on‑line connects very much with this ‑‑ we believe that the accent ought to be on education.  Young people, and not only young people, need to be as aware of the implications of hate speech on‑line as they are, and should be, of the implications of hate speech or any other crime offline.

Here, we believe that all stakeholders have a role to play and that young people notably, or Civil Society in general but notably young people, have a particular role to play in defining what they consider hate speech, defining what is acceptable and not acceptable, knowing that it always depends also on the context and especially in being themselves part of the effort of promoting education, change and in a way responsible citizenship for securing the dignity of everyone.

It would not so much be looking into the possible clash or opposition between hate speech and freedom of speech, but really into responsible behaviour and, in fact, respecting the law whenever the law puts some restrictions on hate speech, as it does in many of our countries.  Thank you.

JONATHAN CHARLES:  Thank you very much.  We will take two or three more feed‑ins before we get some responses from our panel here.

Guy Berger of UNESCO, perhaps I could ask you to report on workshop 59, which is internet privacy and freedom of expression, and workshop 256, the safety of actors on‑line.  You are reporting on two workshops.

GUY BERGER:  Thank you and good afternoon everybody.

The first workshop I will report on is the forum on safety of on‑line journalists and on‑line actors.  The report is structured in three sections.  The first is the general stuff, then I come to what is happening right now and then some future actions.

On the general points, it was more or less agreed that freedom of expression is a delicate eco‑system and that when freedom of expression in the on‑line world suffers, this has a negative impact on all the rest, including the traditional media, traditional media being still important, particularly broadcasting which provides most of the news that humanity receives today.  Freedom of expression when it is disrupted on‑line impacts negatively on sources of news for offline media.

One of the participants in this workshop was Eynulla Fatullayev, an Azerbaijani editor who was the UNESCO 2012 winner of the Press Freedom Prize.  He made the point that when traditional media come under tight control and when there is corruption and bribery of traditional media, citizen journalism has to take the lead and that highlights the importance of a free internet for them to do that.

The point was also raised that freedom of expression on‑line requires that countries are sensitised to being tolerant of divergent voices, even those that are vulgar and tasteless.

A further general point that was made is that technology keeps on evolving and therefore there is a need to continuously revisit the assumptions that we have today about on‑line safety and people being free to express themselves on the internet.

Moving to the second section of this, the agenda of the next few months: the Council of Europe has adopted an internet governance strategy, and the main work will now commence and be implemented over the next few years until 2015, with special attention to aspects of freedom of expression.

The United Nations has adopted the UN plan of action on the safety of journalists, journalists conceived in the wide sense of the word.  This was presented and discussed as a rallying point for many stakeholders, not only the UN but other stakeholders, and especially at the national level.

The next step taking place there is developing an implementation strategy for this plan in Vienna on 22/23 November, and this UN plan and its implementation strategy will include on‑line safety as an important component.

UNESCO is busy rolling out partnerships with the International Federation of Journalists, the Open Society and International Media Support to help implement the plan, especially in terms of spreading awareness about the importance of safety on‑line and safety at large, and also for training, including safety training.

Finally, UNESCO next year will publish a handbook on safety for journalists (including on‑line safety) and one on gender and safety (including on‑line issues), plus a report on world trends and news coverage, including safety (which again will include on‑line safety issues).

Turning to the future actions, the European Commission Vice President, Neelie Kroes, described a strategy based on three main points.  She said we are (inaudible) and internet activists need new technological tools to avoid surveillance and to fight cyber censorship, and she spoke of funding for that.

She spoke about how the EU aims to intervene with European companies to make them aware of the Human Rights implications of the technology they sell, and about the issue of ICT export controls.  Finally, she mentioned that repressive restrictions on media freedom will be considered unacceptable by the European Commission, and this will be implemented in the relationships of the EU with other parts of the world.

Janis said there was a need to sensitise governments worldwide about the importance of the safety of journalists on‑line and offline.  The Dutch government said it will continue to strengthen networks of independent on‑line journalists and bloggers in Iraq and Iran, and the Swedish government said it will continue to support projects for protecting internet freedom through Sida, and also to use international trade and co‑operation agreements to promote freedom in particular countries.

The moderator of the session, who is the same moderator as we have now, stressed some of the demands when summing up: a need for more training to be extended to on‑line actors, and more implementation of the existing protections, which requires leverage that international institutions can wield.  He said that more co‑ordination is needed among all concerned actors on a country by country basis, so that the same message can be delivered by multiple voices, and finally that there is a need for new and (inaudible) of internet actors.

On this point of anonymity, Google promised to try to maintain it, but stressed that even though many regard anonymity as the best possible protection, at the same time people using their services need to assume a certain responsibility for what they say.

So that was the summary of the on‑line safety discussion.

I can turn now to the other group which I am reporting on, which was about internet privacy and freedom of expression.  This workshop was developed around a new global survey that UNESCO has produced by very credible experts, and the discussion centred around the launch of this publication.  The publication itself unpacks many different interpretations of the concept of privacy and the way this concept relates to the other concepts of data protection and anonymity, the three being distinct concepts but ones that are often mixed together.

The study looks at the current regulatory landscapes in many regions of the world and the problems arising from the fact that legislation about on‑line and offline are often different, and it unpacks the complexities of international and national (inaudible) and freedom of expression.

The book, as we discussed, cautions that sometimes attempts to safeguard privacy on‑line can undermine freedom of expression; the classic example is where corruption exposed by investigative journalism is met with the defence that privacy has been violated.

The book points out that very often poor regimes in protecting freedom of expression are accompanied by poor regimes in protecting privacy.

The study also recommends how governments can help legally protect both rights, and it goes into what other stakeholder groups could do as well.  In the discussions, which included presentations by two of the authors, Ben Wagner and Andrew Puddephatt, the following points emerged: our understanding of privacy continues to evolve, and in particular we need to see how the advent of big data and new business models based around data impact on it.  The point was made that law enforcement increasingly wants access to the giant pool of data that is being developed, which of course raises questions of privacy and freedom of expression.  It also raises additional concerns about digital footprints and the security of data against abuse or theft; reference was made to the huge theft of personal data in South Korea.

The internet was seen as presenting particular challenges to privacy because of the way in which information can be collected, stored, shared, analysed, used, commercialised and traced, and because of international data flows.

The point was made that the internet makes it difficult for users to control their personal information.  On the other hand, the point was made that privacy can limit data mining possibilities, such as the information that is needed to improve voice recognition services and simultaneous interpretation.

The point was made that when freedom of expression and privacy come into tension with each other, there should not be a hierarchy between the two rights, and neither is absolute; as was pointed out by various people, the public interest override needs to be brought in to decide which right is more important in any given instance.  Reference was made to the website necessaryandproportionate.net, which talks about various cases where these kinds of balances are being struck.

It was also discussed how difficult it is for businesses providing internet services to deal with the complex landscape constituted by different national privacy laws, and this ambiguity sometimes hinders privacy protection for users.

There was concern that there was a trend even amongst democracies to establish legal regimes that facilitate the use of personal information for law enforcement but without always giving due consideration to privacy.

The point was also made, I think by Article 19, that if users know or suspect that their data could be released to law enforcement officers indiscriminately, it could have a chilling effect on freedom of expression.

The point was made that many data protection regimes include a number of specific rules to protect the public interest but do not provide a general public interest override, which would be needed if one is talking about protecting freedom of expression.

The right to be forgotten was discussed.  It was not much supported in the workshop.  It was argued that there are already safeguards about the accuracy of information in data holdings, and these just need to be enforced.  There are also some measures about limited preservation periods.  Further, it is technically challenging to implement the right to be forgotten, and very labour intensive as well for individuals to indicate all the time, when they are contributing information, when that information should expire.  Finally, there was the concern that the right to be forgotten could lead to attempts to rewrite history.

One argument was presented in favour of pre‑emptive measures to protect privacy, saying that punitive measures after the violation were too late as the damage done could not easily be reversed.  However, it was pointed out, in strong opposition to this, that the European Court of Justice rejected the idea of pre‑notification concerning publication of private information on the basis that this would really chill freedom of expression.

Amongst the recommendations, and I am winding up now, is that states should establish strong constitutional protection of privacy and freedom of expression, including positive obligations on the state, only limited restrictions on these rights, and a public interest override.  Privacy should be protected from threats that come from both public and private actors.

The argument was also made that civil law should be the primary practical means to protect privacy, with criminal rules only being used for very limited and highly sensitive things such as banking.

States should set up strong data protection regimes allowing for exceptions to these regimes on the basis of freedom of expression and public interest.

Then (inaudible) better policies to protect privacy, conferring as much control over privacy as possible on the user; the Mozilla model was referred to (inaudible) in terms of devices.  Lastly, public awareness of privacy protection was strongly recommended, particularly media and information literacy, including tolerance of content on‑line.  PTP violations of privacy should be part of this, and users were seen as having some responsibility for encrypting and tracking their information and communications, although the present uptake of these kinds of facilities is very low among users.

However, it was argued that if the internet is to be a public plaza, as the UN special representative has urged, then privacy needs to be weighed against this societal benefit.  So finally, and I don't think everybody agreed with this, the point was made that privacy should not be the default right as regards the internet, and there should not be a prejudice against data per se.  Thank you.

JONATHAN CHARLES:  Thank you very much.  I will take one last feeder workshop report: is someone here from workshop 128, empowering internet users with which tools?  Perhaps they could step up.  Is there anyone here from that workshop, workshop 128, to make their report?  Maybe not.  I assume there is no‑one here from workshop 128.  Or maybe there is.  Are you from workshop 128?

HONG XUE:  No, I am from workshop 118.  I am sorry, I have another workshop where I have to give a presentation.

JONATHAN CHARLES:  We will take workshop 118, law enforcement via domain names: caveat to DNS neutrality.

HONG XUE:  I am very sorry to jump the queue.  We had a really (inaudible) diverse workshop; the panellists came from Brazil, Russia, India, China and France, so we called it a (inaudible) discussion.  The Panel talked about how internet domain names and IP addresses are increasingly used for law enforcement purposes such as anti‑piracy, anti‑cyber crime and (inaudible) protection.  Under law enforcement measures a domain may be seized, its resolution redirected to a new location, or it may be transferred to another party.  The Panel compared the respective laws, policies and practices in Brazil, Russia, India and China in respect of DNS filtering.  The Panel noted that the study of DNS filtering in (inaudible) is developing and much needed.  Although most ccTLDs in these countries do not make content filtering policy themselves, they will definitely enforce decisions from a domestic public authority, such as a court order or administrative decision.  Some ccTLDs would even (inaudible) from a foreign authority.  The whole Panel was strongly concerned about the negative impact of DNS filtering on free speech and the free flow of information on the internet, as well as on the stability of the system.  The Panel was against the use of the DNS, which is the logistics layer of the internet, as a control panel for content regulation.  The Panel addressed the danger of segregation and differentiation of internet traffic in different territories based on the default IP addressing system.  The Panel believed that states should exercise this authority carefully so as not to restrict the cyber (inaudible) network.  Thank you.

JONATHAN CHARLES:  Thank you very much indeed.  We have heard from a few of our feeder workshops.  Quite a few of those again were about freedom of expression and I think, am I right, Eleonora, you were quite keen to say something about that?

ELEONORA RABINOVICH:  I want to make some comments about the issue of hate speech.  I am glad that it was raised, because it is one of the most important issues regarding freedom of expression on‑line and offline, and I agree with the conclusions the gentleman stated about the need for more education and (inaudible) policies in terms of promoting tolerance and understanding between people, fighting against prejudices and promoting social change.  So I think that is a very important path that we have to pursue.

I think that the big question is when States are able to legitimately limit freedom of expression for the goal of fighting and preventing hate speech, and I would like to mention Frank La Rue, who unfortunately is not here but was going to be on this panel, so he could have talked about this issue.  His last annual report is specifically about hate speech, and he addresses the issue from a Human Rights standard.  He answered this question about possible restrictions on freedom of expression for the prevention of hate speech by saying that they are possible only when there is incitement to hatred or incitement to genocide, and under certain circumstances that are already defined in our international Human Rights standards, like, for example, the (inaudible) of Human Rights or the Universal Declaration of Human Rights.  So we have a framework there that we have to apply, and we have to be very cautious when we try to regulate content; even hate speech could be very problematic to address, but it is also very problematic if we end up censoring some expressions, although we have good intentions in terms of fighting hate speech.  So we have to follow those Human Rights standards when trying to regulate freedom of expression in regard to hate speech, and, as Kirsty said before, limitations are very specific and exceptional.  They are not the rule.  So I think we have to try to find other solutions and other tools for promoting understanding and preventing this type of discourse.

JONATHAN CHARLES:  Eleonora, thank you very much.  Jonathan, I will come to you in a minute, but first of all, Kirsty, you want to jump in.  Hate speech is a real test of freedom of expression, and it is a real issue as to where the boundaries lie.  It is the old Voltaire question: I may dislike what you say, but I defend your right to say it.

KIRSTY HUGHES:  I think that is right, and I think we should be honest that this can often throw up very complex questions of where you draw the line.  I welcome very much all those four workshops and the feedback, and Guy Berger's comments around (inaudible) journalism, but I think the feedback from both of the youth workshops addressed both hate speech and being considerate and responsible on‑line, and I think that is a very important part of the answer.  You don't always have to have laws where there is something very offensive.  If I get too offensive now on this Panel, you might ask me to leave the panel, but you probably won't say we need a law to stop Kirsty doing that next time.

So I think those comments from the workshops are really what in other language we might call taking editorial control, or even self‑editorial control, or communities regulating themselves.  As Eleonora says, I think the Frank La Rue report is a helpful contribution, and it is interesting that in that report he calls for the repeal of laws against blasphemy, because religion and hate speech in particular open up very tricky questions around identity and the freedom to discuss, debate and challenge different points of view, mind sets, belief sets and knowledge sets.

So freedom of expression is not the only right; the Universal Declaration of Human Rights covers a lot of rights, including the right not to be discriminated against and freedom of belief, and so we have to look at the way rights go together.  But we shouldn't rush too fast or too far to say, as I was saying earlier, that all offensive speech should be outlawed, or we will end up with no free speech.

JONATHAN ZUCK:  This is a complicated issue with hate speech, and a related issue, and I don't know the degree to which it has been addressed, is cyber bullying, which has become more and more of an issue, I know, in the United States and elsewhere.  That's another area where real questions about freedom of expression have been raised, and real questions about anonymity, because as a general rule people feel a lot freer to be more irresponsible and less gracious in their language when they are able to be anonymous.  So as we again ask our sovereigns to protect us from things like cyber bullying, we need to make sure we understand what it is we're asking.

JONATHAN CHARLES:  Jonathan, thank you.  Just to pick up on one other issue that was raised in those feeder workshops, and that's the role of education and whether it can really help to push where the boundaries lie of what can be said online and how people act online; education clearly may have some role to play here.  Does anyone here have thoughts on whether enough is being done on education and whether it is possible to make progress through education?

MARIETJE SCHAAKE:  Thank you very much.  I thought that the input from the youth panels was indeed very important, because if we want to build resilient societies for the future we need a lot of open debate, and we need people to be able to take their own responsibility when they go online instead of relying on too many laws and restrictions, because the law is, and should be, the last resort.  I think the aspiration should be to have a society open to the extent that even the most difficult issues can be discussed openly.

On education, I observe sometimes that it is actually younger people, who are much more tech savvy and more familiar with the internet, who can perhaps teach their educators something.  So to educate politicians, to educate educators, to educate young people about technologies and their implications is, I think, a cross-generational issue, and perhaps we should all be educated a little bit more.  Especially, I can say that the people who are making decisions about technologies often don't know exactly how the internet works.  I mean, a lot of my colleagues still have their assistants print out e-mails; they write on them with a pen, they give back the piece of paper, and then the assistant or staff e-mails back.

We have a lot to learn and to do across the board and of course that is very important especially if we aspire to having open societies.

JONATHAN CHARLES:  It is very good to know that laws are being made by people at the cutting edge of internet technology.  Liesyl, first of all, and then Zahid.

LIESYL FRANZ:  Thank you, Jonathan.  Clearly I am not Chris Painter, so thank you for the introduction.  I do express his apologies for having to leave early due to an unexpected circumstance.

I would just like to take the opportunity to say that certainly in the US we have a very strong desire to improve the education situation at all ages, and I think, Marietje, your point about educating all generations is particularly pertinent.  We have the same issue with our Congress as far as those who might be tech savvy and who might not.  So that is part of the agenda as well.  We have just come off of October, which is Cyber Security Awareness Month in the United States, and it takes the opportunity to highlight the need for educating young people, educating users, and providing training and resources for small businesses.  So it is a really good opportunity for all manner of educators, whether they be those providing services in the NGO community or in our education system at all ages.  I just wanted to highlight that as a need that we see, and we look forward to the opportunity to partner globally on those things as well.

ZAHID JAMIL:  I am going to make a point I made earlier in another workshop, (inaudible) of the Council of Europe, with respect to education in relation to hate speech and trying to explain to youth, for instance, how they can respond to and stop what they encounter online.  It does take two or three things: sometimes you fear for your own safety; sometimes you could be bullied, so you don't want to get into the fray; you may not know how to respond with argument.  So awareness and education on that front can be very important, and I think that's where civil society can play an extremely important role, to educate the youth to say, look, you need to get involved and try to discuss this.

Now there's an aspect connected with that relating to freedom of expression.  If someone says something bizarre, let us say that he loves Hitler, for instance, we have two options.  We can take it down, or we can leave it up there and make sure people start writing counter-narratives against it.  Then there is the whole dynamic of likes, dislikes and debate, and in this new era and environment that is much more effective in applying social sanction to those sorts of statements, because if you take that away then nobody will really know why that is a bad idea or what its demerits are.

So you lose that awareness, that information, that education; you lose the ability to apply that social sanction, and you then lose the ability to offer the counter-narrative.  Instead, what happens is that that kind of talk may no longer be visible, because it has been taken down, but it goes underground.  It goes to an environment where nobody will challenge it, and then it is much more difficult to counter.  So I think that in those areas, leaving it there to haunt the person who said it for the rest of his life may be more interesting.

JONATHAN CHARLES:  As long as they are haunted by it and don't regard it as a badge of pride.  Eleonora very quickly, and then Sherif and Carlton.

ELEONORA RABINOVICH:  I think that is a very important aspect of the issue that you raised: not only is blocking not effective, it is also sometimes wrong.  I mean, speech that (inaudible), for example, criticises established religions or beliefs or mindsets is allowed by freedom of expression, and it is very hard to always draw a line between what we consider wrong and good.  We also have to think about what kinds of really offensive speech we would like to suppress or not.  These types of policies can be like a boomerang.

JONATHAN CHARLES:  Yes, they can come back to hit you.

ELEONORA RABINOVICH:  They come back and hit you when you want to say something that is offensive, that is shocking, but that is important for questioning the status quo.  So free speech is important to promote social change, and social change sometimes needs even offensive, unpopular, minority ideas.

Of course, I believe in responsibility, and hate speech includes terrible things about minorities, and we have to fight against that type of discourse, but evidence has shown that blocking this content, online or offline, is not the best way to do that.


SHERIF HASHEM:  You mentioned education.  I think it is central to all the issues that we have raised that we keep education in mind, create a culture of tolerance, and revisit some of the curricula that students learn early on, and even in the higher education system.

The scale of the internet dictates that we revisit even all the traditions we have when we deal with issues like religion.  You mentioned somebody writing something about Hitler, but how about if somebody writes an article about how to assemble a bomb and blow up a certain minority group because they don't like their ideals?

To deal with this we have to reinvent or revisit the principle of proportionality.  Somebody who is criticising a writer is obviously allowed to do so, but somebody who is doing damage, or trying to induce damage to others because he or she doesn't like their ideas, needs to be dealt with separately.  Education is key to handling such issues, to making them relevant, and to making society aware of what exists on the internet and its scope and scale.  Sayings like "don't talk to strangers" that we tell our kids don't apply to the internet, so we have to revisit these principles and practices.

JONATHAN CHARLES:  I tell my kids not to talk to strangers on the internet as well.  Carlton Samuels, and then our Chairman would like a word.

CARLTON SAMUELS:  I have always been against hate speech legislation, the way that it is handled, for the simple reason that it is what I don't know that hurts me, not what I know, and to me, I might not have a right not to be offended.  When you legislate against hate speech, all you do is drive it underground, and at that point it becomes even more dangerous for me, because I now have no way of knowing whom I should defend myself against.  So this concept of legislating against hate speech: I have never seen it as useful to those of us who would be in the line of fire.

THE CHAIRMAN:  Thank you very much.  I would like to bring a few points on this matter.  I think education should cover not just education in high schools or kindergartens; it should be broader, mainly because we at the Ministry of Communications and Information Technology sometimes face arguments which are brought to the court, and the court issues a decision ordering the removal of certain offensive sayings from resources such as YouTube or other places, on the assumption that we as the Ministry of Communications and Information Technology have the power to get this information removed.  This means that education should include a broader range of people: lawyers, the courts and others.

Also, I think the community should think about a kind of Code of Ethics for behaviour on the internet, because some people think that the internet is free and everything may be expressed, no matter if it even violates the law.  I think a kind of standard should be applicable, and this standard or Code of Ethics should be applicable in all fields and across the educational system, and such standardisation will in the end help us mutually understand and solve problems.

JONATHAN CHARLES:  Thank you very much.  Zahid, quickly, you wanted to come back on something.

ZAHID JAMIL:  I want to respond to something, and I thought maybe I would make this a little more interesting.

JONATHAN CHARLES:  What are you suggesting?

ZAHID JAMIL:  I think it is an interesting point that Mr Sherif made, but I am sceptical about the analogy that since there is a bomb-making video or information on the internet, we have to do something about it and take it down.

Here is why I think that is going to be difficult.  First of all, you are never going to achieve it.  It's not going to happen.  It's viral; it will all still be there, and you are going to keep trying to take it down.  That is challenge one.

Second, if you did that and you achieved it, guess where it's going to go.  It's going to go into a private network, which the people who really, really want to do this are going to know about, and you have absolutely no idea what is happening within it; that is what al-Qaeda's networks did.

So what you do instead is have honey pots: you keep it available, and that is what YouTube does.  There are videos about how to make IEDs, and fanatical videos that are freedom of expression, still on YouTube.  I can get them.  It is the equivalent of saying there is a science book that tells you how to make an explosion, so we should take that book out of the library.  That book can be used as well; you don't have to go onto the internet.  So we would start deciding, maybe like Hitler did, that certain books need to be banned.  I think that's a challenge.

So I am not so convinced by the cyber security argument for preventing people from getting information.  I am in favour, however, of anybody who uses that information to incite hatred, or to ask somebody to go ahead and use that bomb, being tried in a court of law under a justice system that has due process.  That I am in favour of.  But to say we will protect you from yourselves, we will just take this thing away because you are children and don't know what is happening, and then to take on the responsibility of succeeding, which frankly speaking I don't think you will be able to do: that is in my view a challenge, and I am sceptical about it.

ELEONORA RABINOVICH:  One last remark.  I think we have a challenge here, because we have different standards of protection for free speech in international human rights systems.  For example, the South American human rights system has a very strong tradition of freedom of expression, and you can't ban any offensive or shocking type of discourse, only that which really incites violence.  But the European system has perhaps been more tolerant, and some national legal systems in Europe have been more tolerant towards prohibiting some kinds of discourse, for example denial of the Holocaust.  It would be impossible to prohibit that type of discourse under the umbrella of the South American system of human rights.  So when we deal with decisions we also have to take into account these different kinds of protection, which are very deeply rooted in the culture and history of how these human rights systems and these nations built their legal systems.  So just to mention that there is also a challenge there.

JONATHAN CHARLES:  Marietje and then Sherif.

MARIETJE SCHAAKE:  Briefly, looking around the hall I am not sure how many people here listen to hip hop music, but I think one lesson we can learn is that oftentimes when you say that something should not be done, or listened to, or watched, especially by younger people, it becomes all the more interesting.  So it can be argued, though this is a topic that may need some more research, that the parental advisory logo that has been put on CDs boosts the sales of those CDs with graphic language.

Another suggestion that is often made is to have hotlines so that people can report problems that they see, and I think it is important that people can share their concerns about some content.  But what we also know from the results of these hotlines is that what people report as being hate speech, or pornography, or inciting violence or hatred does not, in the majority of cases, match the way the law would be applied.  It also came back in some of the read-outs of the workshops that the public needs to be considered, but we have to be very careful, I think, that we don't create some kind of majority-rules mechanism whereby, as long as there are just a lot of reports against some content, it will be taken down.

The rule of law should govern, and the law should be the last resort, and this is even something that may be up for discussion in the Netherlands, for example, where denial of the Holocaust is illegal and Mein Kampf, the book by Hitler, is still not legally available.  This falls into its historic context, but it does on the other hand sometimes undermine our credibility.  I have experienced directly, in talking to representatives of repressive governments, that when I addressed free speech in their countries they immediately came back to me with the example of these two practices.  So I am willing to rediscuss this, because I think it is important that we address our differences in open debate, and that even very difficult, disturbing and even insulting ideas can be shared, because oftentimes the public response against those ideas is strong.  That is the beauty of the internet: we can have an open debate about what is and is not acceptable, if it is given that space, while of course ensuring that people's lives are not in danger.


SHERIF HASHEM:  I strongly agree with this; that is how we educate ourselves.  I also agree with the earlier comments from Zahid.  I didn't mean to imply that we take this video off the internet or the like; actually I am not an advocate of playing the Big Brother role over the internet, and I again strongly agree with his point.  What I was trying to say is that we need a culture of tolerance over the internet, communicated to our newer generations.  We know that there will be videos on how to assemble a bomb, videos even on how to commit suicide, and we need to educate our younger generation not to mimic everything they see over the internet; we have seen instances of children trying to do this.  Making sure that the culture of tolerance is there, and that is where I see the comment from Rita about this, is key.  And we need the principle of proportionality: if it is a life-threatening statement, we have counter-statements to deal with it, and we try, legally, to hold the people responsible for it in front of a court of law.

JONATHAN CHARLES:  Before I take questions, are there any more feeder workshop rapporteurs?  This gentleman at the front: come up and tell us which workshop you are reporting from.

[workshop feedback]

NEW SPEAKER:  [mike not working]

JONATHAN CHARLES:  We will get someone to help you.

Workshop 50.

ANDY SMITH:  This workshop follows on from workshops we did at IGF 6 in Nairobi.

JONATHAN CHARLES:  About aspects of identity.

ANDY SMITH:  The UK IGF and various other workshops.  On the panel we had yesterday, we had representatives from Saudi Arabia, the UK and Europe, and it was quite interesting: we actually got some surprising answers and changes in direction.  One of the fundamental findings we had from last year was that proportionality between security and privacy is culturally context sensitive, but it is also very hard to define and it is very emotive, and I think some of the feedback we have had even in this session shows how emotive a subject it is, with people coming down on both sides.

So the key issues for this year were to look at the governance of identity on the internet and its impact on security and privacy.

To look at the use of identity in the commercialisation of the internet, with particular regard to legal frameworks and economic development.

To look at the balance between privacy and openness in the context of user norms and behaviour, including how to protect the naive from themselves and how better to use identity in access to information, online resources and services.

The Workshop 50 questions we started off with were: is identity a legitimate currency to fund the internet?  Can you actually use identity and private information as a resource to fund the internet?  Is it a currency?  How context sensitive is identity?  And how do you protect the naive from themselves?

One of the fundamental questions is: will we ever be able to balance the need for security with the right to privacy?  And how can identity frameworks become an e-business enabler for the masses in the east?  Because the east, certainly the Middle East, has a very different view of the internet and what the internet offers than we do in the west.

So from the workshop there were a number of conclusions, and one has actually come up already in this session.  There is a significant difference between privacy and anonymity.  Anonymity is the ability to perform actions without them being traced to the person, which means both that they have the right to free speech without fear of repercussions, but also that they can't be held accountable for their actions.  And it is quite easy to misuse facilities if you have no fear of repercussions.  Privacy is the ability to provide personal information only to those entitled to it by law, or to those the person chooses to provide the information to of their own prerogative.  So privacy protects people's rights but doesn't damage the need for security and law enforcement, whereas anonymity can.  We really have got to try and deal with the two concepts separately, even though they are interrelated.  Anonymity is not necessary for privacy, but it is misrepresented as a requirement for privacy.

It is vital to have the right level of identity assurance for the context of transactions across the internet; that was another conclusion we came to.

Basing it on a liability model and using contractual frameworks will improve trust in and commercial use of identity, help fund it going forward, and help people have trust and reassurance in the identities being used.

Identity is used as a form of currency on the internet, with people providing personal information in order to gain free or low-cost services in return; this allows the payment for services that comes from things like targeted marketing.

A lot of this has to do with balances and understanding the different drivers for security and privacy and how they pull against each other.  Digital identity is an ongoing piece of work and is becoming a critical subject for the success and globalisation of the internet.

Most digital identities are still fundamentally based on physical identity and on physical identity credentials issued by single authoritative sources; the original documents tend to be the national passport or government-issued identity documents.

These may be used directly to set up digital identities, or indirectly, as they are used to get bank accounts and credit cards which are then used in turn to get digital identities.

People need real incentives to get online and perform commercial activities, such as the card for blue-collar workers that has been issued in Saudi Arabia.  They need help to secure their online profiles so they are not subject to identity theft and fraud, which is resulting in a fear of going on the internet across much of the Middle East at the moment, but in the west as well.

We should not be looking for grand schemes but rather small steps, and maybe the creation of compatible standards so that small schemes can interoperate effectively, although someone needs to set those standards.

So in the context of the IGF questions: in response to question one, what impact can security governance issues have on the internet and human rights?  Privacy is a fundamental right, but so is national security, and so is feeling, and having the right to feel, safe both in the real world and online, which is derived from enforcement of law.  Privacy gives people the ability to protect and control dissemination of their personal information.  However, even in the physical world it is not possible to retract or remove information from the collective consciousness, for example something published in a newspaper.  If Richard Branson decided he wanted all of his personal information redacted, that would never happen; otherwise you would have to go around the world and burn millions of newspapers.

We need to better manage and have better governance of the internet and internet usage, so that in general privacy is upheld, but where necessary someone can be held accountable for their actions.

In response to question 3, what risks can internet fragmentation pose to privacy and openness?  If identity use and governance becomes fragmented, it will destroy many of the benefits of the internet as a resource.  The market may deal with this through business and funding drivers that need effective globalisation; this will come about through contractual relationships and effective liability models, much as we have today in the real world with the passport.

And in answer to question 5, what risks do law enforcement, information suppression and surveillance pose to security provision and openness?  Anonymity is the biggest conceptual headache, not privacy.  Privacy is good and hard to misuse; anonymity can be misused.

Surveillance is often ineffective, and you often only surveil those who are law abiding when you capture internet logs and the huge masses of information that the British Government now wants to capture.  Law enforcement is fundamental to nation states; the balance is probably more between anonymity and security than between privacy and security.

In answer to question 6, what measures can be taken?  Anonymity will always be used by the bad guys.  We already have organised crime using data protection and human rights, especially Article 8, to protect themselves against being traced, prosecuted and against general law enforcement activities.

There are things like one-way trust models and zero-knowledge proofs of knowledge which can be used to provide pseudonymous services, which can give some of those rights to people without actually taking away the capabilities for national security and law enforcement.
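[As an editorial aside: the "zero-knowledge proof of knowledge" the rapporteur mentions can be illustrated with a toy Schnorr-style identification protocol.  Everything below is an assumption for illustration only, not anything presented in the workshop; the function names are invented, and the group parameters are far too small to be secure.]

```python
import random

# Toy public parameters (illustrative only; far too small to be secure).
# g generates the multiplicative group mod the prime p, which has order q = p - 1.
p, g = 23, 5
q = p - 1

def keygen():
    """Prover's long-term identity: secret x, public y = g^x mod p."""
    x = random.randrange(1, q)
    return x, pow(g, x, p)

def commit():
    """Prover picks a fresh nonce r and publishes the commitment t = g^r mod p."""
    r = random.randrange(1, q)
    return r, pow(g, r, p)

def respond(x, r, c):
    """Prover answers the verifier's challenge c without revealing x."""
    return (r + c * x) % q

def verify(y, t, c, s):
    """Verifier checks g^s == t * y^c (mod p), learning nothing about x."""
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# One round of the protocol.
x, y = keygen()              # identity established once
r, t = commit()              # 1. prover's commitment
c = random.randrange(1, q)   # 2. verifier's random challenge
s = respond(x, r, c)         # 3. prover's response
print(verify(y, t, c, s))    # prints True: knowledge of x is proven, x stays hidden
```

[This is the property alluded to above: a service can confirm it is talking to the holder of a credential, supporting pseudonymity and accountability, without the credential itself ever being disclosed.]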

However, this would also give suppressive regimes, and those that misuse information, dangerous tools they could misuse.

Until we have a fully democratic world we will have to find a balance between security and anonymity.  For now we need to conduct the balance of, to which...

JONATHAN CHARLES:  Thank you very much.  Is there another rapporteur?  Step up to the microphone.

TREVOR PHIPPS:  Good afternoon, I am [inaudible], a fellow from [inaudible].  The workshop that we are reporting on, workshop 96, the internet of humans, looked at the behaviour of humans on the internet, the social role of the internet.

The use of real names is desirable, but we need to be careful of what is posted online.  The law itself should be balanced in dispensing justice relating to speech online vis-a-vis speech that is conducted in the physical world.

Two... freedom of assembly and freedom of association: these rights are threatened by surveillance, censorship and the erosion of anonymity.

This group felt there was a need for greater accountability by all... online.

As well as the need for governments to understand the issues through a multidisciplinary and multi-stakeholder approach.

One approach that was suggested was that we need to study and understand human behaviour first, and then consider online behaviour.  Some of the questions that were raised were, for example: what steps are being taken by policy makers or regulators to understand behaviours taking place online, and how do regulators determine what policies need to be implemented to address online behaviour?

Also, given the need for users to have content available on demand, to have the content when they want it, where they want it, and in the form they need it: how does that change in behaviour affect other issues, such as copyright?

And we also felt there was a need to see privacy as completely related to security.

Consideration should also be given to how ICTs have changed our respective behaviours, for example... [inaudible] a reduction in communication skills, for example face-to-face communication: how are ICTs affecting that?

It was also felt that there is a large population of digital citizens who use the internet as a source of information but are unaware of the risks associated with its use.  There is a need for greater awareness and capacity building in the areas of privacy, security and ethics.

We also looked at the youth study by the childhood group and some of the key takeaways from that.  The youths felt they were more inclined to be open if they were more anonymous, and that anonymity allows them to be more expressive.  One [inaudible] of online communication was that the aspect of body language is lost; they found it difficult to identify emotions even with the use of emoticons, and what is intended as a joke in an online chat could easily be misinterpreted and run into trouble.

So the panel basically encouraged and called for greater collaboration on similar studies like the childhood group's, and hoped that the study could be expanded to include other groups, the disabled, policy makers and all end users, to understand how their behaviour on the internet and online is shaped and is being affected.  And finally, we felt that anonymity and freedom of expression online are good, but each of us must be responsible and accountable for our actions.

JONATHAN CHARLES:  Thank you very much.  Yes, another feeder workshop if there is one.  I will take the lady and the gentleman, yes.  Lady first, the lady in orange first of all.  How about that?


If you could make it swift, please; be as quick as you possibly can.

ANJA KOVACS:  I am Anja, I work with the Internet Democracy Project in Delhi, reporting on workshop 185, which looked at the tension between the application of criminal law and a free and open internet.

We divided our discussion into three parts.  In the first part we tried to set the broader context, and showed that even in democratic societies, which the workshop focused on, there is a great variety in understanding of what are appropriate ways of doing things, including protecting freedom of expression.  We started with a discussion of the Innocence of Muslims video, where the Google representative gave an explanation of how they handled that.  Very interestingly, we actually got into a debate where a staunchly pro-free-speech participant asked why Google didn't take down the video in Pakistan, and the representative of... that is linked to the Tunisian government said that Google shouldn't have taken it down where it was ordered to.

That was the setting of the scene, to see how messy these issues are: even a single actor doesn't consistently take the positions that one expects.

In the second part of the session we turned to the challenges underlying the tension in the field.  First of all, there was a comment that, increasingly, because of the desire on the part of governments to implement criminal law online as well, this leads to architectures of control, where criminal law becomes embedded in the internet at various layers, including increasingly in the code layer as well as the content layer.

It was also remarked that businesses play into this desire to regulate users through criminal law more and more, partly by creating terms of service that, as one speaker mentioned, can increasingly be seen as accidental constitutions, so that in a way they are a new layer of law and regulation that more and more of us have to abide by if we want to be able to express ourselves freely.

These two things together, the way criminal law is implemented in the architecture of the internet and in the terms of service of businesses, lead to new behaviours online, a forced acceptance of new kinds of behaviours; the way we have to give up our privacy on certain platforms is the best example of that.

We spoke of co-operation between businesses and law enforcement, whereby you see a privatisation of law enforcement, something to which citizens then often have little recourse.  It was pointed out that sometimes it is difficult to understand the principles that intermediaries in particular apply in those kinds of situations; as one participant in the workshop remarked, it seems they are more pragmatic than principled in their decisions, and this makes it difficult for users to understand what is going on.

From the point of view of states, we discussed the issue of cross-boundary harm quite a bit.  There have been instances, for example, in which the government of state X has seized a domain in state Y, even though the activities of that domain were legal in state Y.  In those kinds of cases, what you see is that a state, asserting its sovereignty online in the same way as it would offline, harms the rights of users in another state.

The remark that was made is that asserting sovereignty online in that sense kills sovereignty; that was an important remark to keep in mind.

Finally, in terms of the challenges, there was mention of how the patchwork of different applications of law in different states leads to a risk of forum shopping.  In particular, the example was mentioned of how defamation is criminalised in some states and not others, and that, according to where it is criminalised, journalists who publish online might be prosecuted in another state.

JONATHAN CHARLES:  I will have to ask you to be very quick; we are running out of time.

ANJA KOVACS:  Users have to understand better, and governments have to understand better, what this is all about.  Secondly, we talked about how we need far more transparency, far better processes and a better understanding of policy making.  There was a sense that a lot of what happens online at the moment is about doing politics and not actually about making policy.

In Tunisia there was an interesting case: the Tunisian internet agency tries to separate different aspects of power and staunchly advocates no censorship on the internet but allows for surveillance; the surveillance becomes the opportunity to keep the internet open.

Lastly, human rights instruments, like the inter‑American system that Eleonora has been mentioning, and identifying shared norms and standards to apply, were suggested as ways to take this forward.

JONATHAN CHARLES:  Anja, thank you very much.  For our other chairs, can you be very quick.

MALCOLM HUTTY:  My name is Malcolm Hutty, I am the... head of public affairs for the London Internet Exchange, here to report on workshop 111, protecting the rule of law by..., cosponsored by...  We had a multi‑stakeholder panel that involved all the sectors engaged with this broad debate and included representatives from... behaviour and content on the internet on the grounds that it may be legal.

We had NGOs representing citizens and those representing journalists, a criminal law specialist from the government sector, we had [inaudible], and we had a public official and an... [inaudible] from the European Parliament.  Even within the constraints of ensuring the wider..., we managed to ensure representation from four continents, including an NGO from Azerbaijan representing journalists here, who sometimes have difficulties regarding freedom of expression.

The context of our workshop really follows up on one aspect of the previous workshop report: the questions around the so-called privatisation of law enforcement.  The real context is that as the online world becomes ever more central to the economy and our everyday lives, where... [inaudible] and behaviour is mediated by intermediaries,... where the offline equivalent had no such intermediary, you raise questions about how to apply the rule of law.

This new creation of intermediaries creates a new route for those who wish to complain about content and behaviour on... the rights that these the... new routes to put pressure on the intermediary, so as to apply sanctions to those responsible for the content or behaviour.

Thus we arrived at the core question for the workshop: how do we ensure that the rule of law and the procedures of... due process that we expect in the offline world are equally and effectively applied in the online context?

Now as you can imagine.

JONATHAN CHARLES:  If I can ask you to be fairly swift, we are virtually out of time.

MALCOLM HUTTY:  ... large range of participants,... there were two key points at which we arrived, even though the Internet Governance Forum is not somewhere where you need or expect to come to consensus.  Two points of consensus were reached.  First, the rule of law and questions of procedural fairness,... [inaudible] content and activities on the grounds that they are illegal or infringe the rights of a third party; that was proposed by the Council of Europe and endorsed by those present.

While illegal material and behaviour should be addressed,... legal... should not be removed as a consequence of re...; this implies a need for mechanisms to distinguish between the two.  Now, given the first proposition that was universally endorsed, which means that questions of the rule of law and procedural fairness are invoked, it follows that those mechanisms to distinguish between the legal and the illegal must be ones that provide for and respect the interests of the person complained about as well as the interests of those making the allegations.

Thank you Mr Chairman.

JONATHAN CHARLES:  Thank you very much.  One other workshop, if there is one?  Okay, how many workshops are left to report?

One?  Two.

Okay, one minute each, and you really are going to have to summarise in one minute each.  Who would like to go first?  Bring us your main points in one minute.

FATIMA CAMBROMERO:  Workshop 97, affects states,... on the matter of states this is discussed in places like the UN.  We need a multi‑stakeholder voice to underpin the technology and end-user concerns, so we invited multi-stakeholders to have a debate.  It needs to be grounded in human rights principles; we weren't sure how to go about that, so we will start at the national and regional level, the other panelists will take that forward into their own regions, and hopefully we will have something to report back next year.

JONATHAN CHARLES:  That was swift, thank you.  The lady here.

IRINA TRUSHINA:  Thank you, my name is Irina.  I have one question or comment on what was said here maybe ten minutes earlier.  You were speaking about the offer to develop a code of ethics, am I right?  I just wish to get some details.  Who will be involved in the working group on the development of such a code?

I am a librarian, from the Russian Library Association and the national association of library ‑‑ I know from practice that the code of ethics is a very good instrument of [inaudible], and we have a lot of experience of its development.  I just want librarians to be involved.

JONATHAN CHARLES:  Thank you very much indeed, we really have run out of time.  I want to ask our panel one question which arises out of this, and it is a question also raised by our secretary here, who makes a good point.

On the panel, perhaps 30 seconds each, maximum, to answer: do you believe we will get to a position where we can have some international code of ethics or code of online behaviour which could be agreed by everyone?  And please note, I am talking about a code, not a law.

Do you think we could ever get to that position?  Your 30 seconds starts from now, Kirsty Hughes.

KIRSTY HUGHES:  I hope we won't get to that position.  Even if it is not a set of laws, we don't need a top-down set of rules.  We have a diversity of communities and individuals out there who are perfectly capable of having their own codes for their own particular communities, so I would really challenge back: why on earth would we go that route?

CARLTON SAMUELS:  There are too many things involved and too much at stake, but with people coming together with a common interest we can find a way to get along.

SHERIF HASHEM:  Thank you for raising this point at the end of the panel.  Clearly we need to find common ground, for the better, for the resources that we have and for the best interests of society.

I second your point that probably having a treaty is extremely out of reach at this point but starting with a code of ethics would be achievable I hope.

MARIETJE SCHAAKE:  I agree with Kirsty entirely.  Well, I ‑‑ in one of the read-outs, I think it was the gentleman from the United Kingdom who was talking about the fact that the right to privacy and the right to national security are the same.

I think there was a really dangerous sort of confusion going on there, or maybe, if that is not what was intended, I think we have to be careful in general not to confuse individual rights with collective rights, and not to confuse the responsibility of the state with the protection of the individual from the state.  An example deeply experienced in many European countries is that in World War II the Netherlands, and also Germany, had sophisticated archives of sensitive and personal information, and when these were compromised by the occupying Nazi authorities this put a lot of people at risk and put national security at risk, because the state could no longer protect its individual citizens.  So I think it is very important to distinguish between the two, particularly when, and this came up in a number of your workshops, there is going to be increased privatisation of policing and law enforcement, or at least privatisation of keeping data or archives or information in the context of, you know, security policy, etc.

And I agree with the last speaker on the danger of pushing law enforcement and policing into the hands of private actors who do not have the expertise or a mandate within the rule of law, and I think that is really important.

So, protecting the individual against the state is important.  While we have the representative from the Azerbaijani government here,... whether you would be willing to do something about that.

JONATHAN CHARLES:  Code of conduct?  Code of ethics?

MARIETJE SCHAAKE:  I will.  It is difficult to find a one-size-fits-all approach... of finding ways for common cause, communities of... where they don't exist, and mechanisms for ensuring a multi‑stakeholder approach towards a strategy or framework or expectations of behaviour.

JONATHAN CHARLES:  Eleonora, 29 seconds left.

ELEONORA RABINOVICH:  I agree with Kirsty.  There are lots of communities of different stakeholders discussing ethics, and a unique code approved by, I don't know ‑‑ I don't think it is achievable or desirable.  Also, a code of ethics implies some kind of sanctions, so we have to discuss that to see if we really want a code of ethics for controlling and monitoring the internet and its practices.  I don't find [inaudible]

JONATHAN ZUCK:  I think we fall into the same trap: we don't want to be imposed on by other people, unless we want to impose our will on other people.


As a representative of business, I will say that I think a great deal has happened just through the expansion of commerce, and it is important not to underestimate the importance of goods and services and trade etc. in bringing about social change in ways that are wholly unexpected, and that is a way in which influence can take place.

ZAHID JAMIL:  Sorry,...

JONATHAN CHARLES:  The security services.

ZAHID JAMIL:  I better not go Germany.

I would like to respond to what [inaudible] said.

... come to ICANN, please, that is my response.  I don't know if she is listening; please do come.

You can't have a code of ethics for life, which is what this would be.  You can have various codes of ethics on various things; that is brilliant, and we should try to work on that, I think, in different spaces, but you cannot have one size that fits all.  You know, who, what, which space matters; you can't have something that is all things to all people; it would in any case be too broad to be useful, and it might even have the impact of throttling innovation.

We need to be careful.

That is why it is very important that we work with multi‑stakeholder bodies, because they are working on various things and various codes and aspects, and I think that is the way to go.

JONATHAN CHARLES:  The law of unintended consequences.

My panel, thank you.  Our final word goes to the Chairman.

CHAIR:  Thank you very much, and thank you for wonderfully moderating this session.  Regarding my proposal, I am glad that all of you supported the establishment or development of a code of ethics for the internet.  WSIS has created a unique ‑‑ for the governance forum.  This is so different from any other institution or structure, and this participation of all stakeholders in discussing and solving the problems that exist in [inaudible] of the internet shows that this mechanism could also produce a code of ethics applicable and acceptable to all users of the internet.  Regarding the points of my colleague Marietje: as a representative of the republic, [inaudible] it is a unique place where there is no restriction on the internet, and if you look at the development of the internet over the last ten years, you can see what kind of achievements Azerbaijan has made in access for the largest part of the population.  [inaudible] As I mentioned, we support the idea of bringing fibre optics to each home, every possibility for its citizens to use and benefit from the internet, and I think that never has a person in Azerbaijan been punished or had their rights restricted for expressing their vision.  If any action has been taken which violated our laws, coming from any country, that is the subject of another consideration, but you cannot find any single article in the criminal or penal code of Azerbaijan punishing the expression of ideas.  I believe that as Azerbaijan, expressing its interest in open discussion, hosts this wonderful event, this wonderful Internet Governance Forum, for open discussion, for solving not just regional or country problems but global problems, this shows our country as a new democracy; as a young country, as it showed in 1918 and the 1920s, it was one of the first countries to give election rights to women, earlier than the United States or the United Kingdom.
Now we pursue new achievements, and I believe that together with the involvement of all institutions and all international expertise, we will gain better results in the use of the internet for the benefit of all humanity and for the development of humanity.

And thank you very much for all your active participation.  With the power vested in me, I declare this session closed.


[end of session]