Internet Governance Forum
 Welcome to the United Nations | Department of Economic and Social Affairs

Sixth Annual Meeting of the Internet Governance Forum
27-30 September 2011
United Nations Office at Nairobi, Nairobi, Kenya

September 29, 2011 - 16:30


The following is the output of the real-time captioning taken during the Sixth Meeting of the IGF, in Nairobi, Kenya. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.


  >> AMBASSADOR DAVID GROSS:  Well, welcome to this afternoon's session.  Let me assure you that you have chosen wisely.  It's well‑known that this is going to be the most interesting, exciting, interactive, if not totally fascinating, panel of the entire four days of the Internet Governance Forum.

So you have chosen very, very wisely.  And everyone who's not here will be jealous of the fact that you were here.

Let me suggest the following as our format, and this will come as a surprise to your panelists up here, because I like this to be a surprise.

What I'm going to ask our panelists to do is, I'm going to ask each of our panelists, and Sunil's going to start, because he sat at the end of the table.  So we're going to start with Sunil and work our way down.  And what I want each of the panelists to do is to give us the one thing that that panelist thinks is the most important fact, suggestion or idea that that panelist wants you to know when you leave here about our subject, about mobile and privacy, largely because we're going to assume that each one of you has thought about, worked in, and talked about the issues of privacy generally.  Who hasn't?

Mobile, in particular, because it's ubiquitous.

And so the object here for each of the panelists, the test that they are being given, is to give you at least one thing, but only one thing to start off with.

You can weave in other ideas over the course of the next hour or so in answer to questions, but one thing that you think that these people out here, paying all this good money to sit in these seats, should know when they leave here.

Once we are finished with that very brief, maybe even cryptic statement from each of our panelists, I am going to play a bit of a devil's advocate and make them slightly debate with each other over these things, if I can figure out how to do that.  But that will only last a few minutes, because it seems to me that you all are probably as knowledgeable and as interested in the subject as the people sitting up here, making you get a "crinked" neck because you are sitting at 90 degrees.  So we'll make this interactive.  I will call on whoever raises their hands and ask you to give a comment or a question.  The only thing I ask of you is to be provocative.  The object of this session, at least as I see it, is that you come away saying, I learned something.  I understand something.  I know something that I did not know when I came here.  Otherwise it's a wasted opportunity, and we could have been taking a nap.

So with that, Sunil, start us off with something provocative, short and important that everyone should know.

  >> SUNIL ABRAHAM:  I was born in a Christian family, but I was raised in a Hindu nation, so I don't really get this obsession with one God and one truth, but I will try.


  >> SUNIL ABRAHAM:  The privacy principle that I have recently found useful when thinking about these issues is that expectations of privacy protection and guarantees of privacy protection should be inversely proportional to power.  People at the bottom of the pyramid, with no power, should be subject to the least surveillance, and people at the top of the pyramid, with the most power, should be subject to the most surveillance.

And in India we have a case study which demonstrates how this is possible.

One of the surveillance organizations, the intelligence agencies in India, the NTRO, has imported over the last five years around 1,300 off‑the‑air interception devices.

Of these 1,300 interception devices, 400 have been deployed in New Delhi in and around the Parliament building so there is a lot of surveillance happening in India at the top of the pyramid.  I'm a bit conflicted about whether this works and if this is a principle that we should all adopt and advocate for.  And I hope it will provoke some thought and some debate.  Thank you.

  >> AMBASSADOR DAVID GROSS:  Oh, there it is.  Okay.  Good.  Juliana? One thing.

  >> JULIANA ROTICH:  I think privacy should be a user's choice.  It should be my choice as to what information I should share.  And the devices and the services that I use should respect that right. 

I'll give you one example.  I don't know if some of you got local numbers when you came here, but if you did, you may have gotten some promotional SMSs from Safaricom.  It should be a choice whether you want an advertisement or not.  It should default to no advertisement: even if it's directly from Safaricom, you shouldn't have to worry about saying stop.  You should be able to say start first, not say stop.

So I think it should be a choice that users make and not a choice that is made for them by mobile service providers, app developers or anybody else. 

  >> AMBASSADOR DAVID GROSS:  Okay.  Let me warn you.  That didn't sound that provocative to me, but we'll see, we'll see where we go here.  Patrick, pressure's on you.

  >> PATRICK RYAN:  All right, David, I'll give it a shot, I wanted to just sort of propose a hypothetical of a Kenyan cloud provider that sells cloud services to ‑‑ to Enterprise customers.  You know, a company that started out of a garage, not unlike Google did in ‑‑ you know, several years ago.

And think about this hypothetical service provider that delivers a product, let's just say it competes with a product like Google apps or something like a Microsoft Word and sells to enterprise customers.

Now, imagine that the product works very well, and, you know, customers, consumers start using it worldwide.  In the name of privacy, this application would probably not be legal for use in Europe because of the privacy rules that pertain to various different European countries; in the name of protecting consumers, they would not be able to use this product.

Because Kenya is not an adequate country under the European privacy Data Protection Directive.  And so I just wanted to throw that out there in the name of many other countries like Kenya and many other developers like this hypothetical cloud operator that I just described.  Sometimes these privacy norms are very important for protecting consumers and are certainly key to an open Internet, but they can, you know, certainly act as barriers for many new companies to deliver services to customers.

  >> AMBASSADOR DAVID GROSS:  All right.  I'm not sure how provocative that was, but we'll find out if there are some Kenyan cloud‑computing experts here who have found that to be a problem.  Jeff, more pressure for you, be really provocative and I'll even give you a microphone.

  >> JEFF BRUEGGMAN:  Yes, David, I guess we're not standing up to your standards.  A typical thing to say is that privacy may be dead.  I don't think that's true.  When we have asked our customers, they care deeply about privacy and they want control.  But I think our job is more difficult than that.  We can do a lot of work to think about how to provide and secure privacy in this complex mobile environment, but the reality is, that's not enough for our customers.  They want all of the security and privacy, but they also want all of the choice that comes from mobile services and all of the convenience, and they want to be able to share their data and manage their own data.

So I think we have an incredibly challenging task in front of us, because it's not just about privacy; it's about giving consumers what they want, which is both the facilitation of their information as well as the privacy.  So ...


  >> THE PANELIST:  Okay, provocative.  First of all, I struggle with two identities: I'm half Irish and half British, but I don't know which is which and I don't know which one to present to some social networking sites.  To be provocative, I believe it was really interesting of Google to say that European laws prevent innovation, et cetera.  I'd agree, but I'd also argue that operators are probably more restricted than Google in that, and I think it's time to look at functionally equivalent data, look at the privacy context, and regulate that context, not the infrastructure.  So if I give you an example: in Europe, the GPS cell ID data used by Google is not regulated, but it is when it's a mobile operator that uses that data.  So that's the real challenge.  How do we address that to ensure that the user has a consistent privacy experience? Because it is about the user, and I think often policymakers forget it is about the users, and we need to focus more on that.  That's my provocative statement.


  >> IAN BROWN:  I'll start unprovocative but move into more provocative.  I think we all agree users demand control.  That's very important for their trust in the technology, their willingness to buy products.

I've been thinking about how you can provide that to users, particularly with mobile apps, where, as several people have mentioned already, you might have micro‑businesses developing these apps from anywhere in the world, selling them on these open global marketplaces to customers in Europe and in countries with very different privacy standards.

How can ‑‑ how can we sort of give the benefits of that to consumers around the world without regulation getting in the way of innovation and all the good things we've heard about?

I think platforms will play a really key role in that.  I think mobile operators and companies like Google will have to help their users.  If I'm using my iPhone or Android phone, and I see an interesting app from a developer I don't know much about, how can I, using services from the platform, convince myself that I will stay in control, that it will use my data, my very sensitive data about where I am with my phone, who I've been talking to, meeting with and so on, in a way that meets my expectations? I know many technology companies outside Europe wish the Data Protection Directive would just go away.


  >> IAN BROWN:  Perhaps the provocative thing is, if you're following the review of the directive, which is quite far advanced now, the Commission is not saying this is getting in the way of innovation, upsetting our international trade partners, putting Europe at a disadvantage.  They're saying, whether you agree or not, and I won't tell you my perspective, this is the way forward for our partners, and Europe is not relaxing the directive one bit.  Actually, Europe wants to extend the reach of the directive, making it apply as far as possible to companies supplying services to Europeans, no matter where they are geographically.

  >> AMBASSADOR DAVID GROSS:  Okay.  I've got a little bit of a headache here, because what I heard from, I think, everybody, either directly or indirectly, is that users ought to be in control.  At the very least, we should maybe prioritize how users are in control.  The reason this gives me a headache is I don't have a clue how you figure out what users actually want.  Now, you could easily say:  Ask the user.  But my experience is that if you give me a survey, and you ask me:  Is privacy important? I don't think I've ever met anybody who said privacy is not important.  In fact, when you do surveys, privacy is almost always very, very important.  Almost uniformly.

But then I watch my 26‑year‑old son, from the time he was younger to now, and I look at myself, I look at my friends.  You have a privacy disclaimer that's written by 500 lawyers, that's incredibly long, that you have to agree to in order to get to the site or to have the application.  I'm exaggerating a little bit, but I don't think I've ever met anybody who's actually read and understood one of those things.  Because people want it, they do it.

So if you ‑‑ if the users are in control, what are you talking about? Are you talking about what they say in a survey they want? Or are you talking about what they really do?


  >> SUNIL ABRAHAM:  Um, to leave it all up to the user is something (off‑mic).

  >> AMBASSADOR DAVID GROSS:  That all sounds really, really Draconian to me.  Meat packers ‑‑ people died.  They ate things that were bad, people knew it.  People died.  Privacy is important, and maybe there's an occasional extraordinary case where somebody is harmed, physically or otherwise, but given that everyone just seems to click through these things and by and large the world has remained intact, is this really an area where you need governmental regulation?

  >> THE PANELIST:  Let's look at m‑commerce in India.  Indian banks are supposed to use digital signatures for encryption and also digitally sign the documents they give to their consumers, but for the last five years banks have been in violation of these policies.


  >> THE PANELIST:  And recently they've been part of a study group which has come out with a public report that says that this is really not necessary.  What is the impact when there is fraud? Instead of the banks being liable for fraud, ordinary citizens are liable for fraud.  It might be a small change when it comes to each individual consumer, but if you take it across the Indian nation, I'm sure we're talking millions of dollars.

  >> AMBASSADOR DAVID GROSS:  Okay.  Jeff? You deal with ‑‑ your company deals with millions and millions of customers.

  >> JEFF BRUEGGMAN:  Uh‑huh.

  >> AMBASSADOR DAVID GROSS:  Do you think what Sunil said is correct? Are people being defrauded?

  >> JEFF BRUEGGMAN:  Well, I guess a couple of points.  First, when we asked our customers we found they were more sophisticated than I think they're often given credit for.  They understand they're somewhat negotiating when they use the Internet.

On the other hand, I think it's a fair point that it's not clear to a lot of consumers what information is exactly being used, and there's an opportunity to use the controls.  Back to my original point, I don't think regulation can solve the problem of making it easier for consumers to navigate their privacy.  You can argue that regulation is one of the reasons we have 50 pages of privacy policy.  The thinking at the time was you had better lay out every detail.  So I think it may be some combination of a privacy framework, but I think ultimately the more interesting discussion that we're having with the regulators and the industry and civil society is: can we come up with agreements on ways to create better standardization of how you're being treated as you're navigating the Internet, so that you can trust it? And I think the growth of trustmarks is a great example of that.

If I see indications that tell me I know what practices are behind that, then I don't have to worry each time about navigating a 50‑page policy.

So, you know, to me, the discussion of regulation is missing a key part of how to make this useable for the user.

  >> AMBASSADOR DAVID GROSS:  Pat, you just got finished doing a huge report, a whole bunch of guidelines and principles, available online; I recommend it highly.  If you need some sleep, as Pat was saying, it's the way to do it.  Pat, is that the right way to go? What have you found through your studies and your work in this area?

  >> THE PANELIST:  Some stats first, to put some dimension to this: there are 7 billion people on the planet, 5.8 billion mobile phones, and of those 5.8 billion, 4 billion are in the developing world.  And where is policy on privacy generally set? In Europe and America.  And I think the thing for me that we must wake up to is that, you know, on the mobile Internet, people are enmeshed in a global, complex web of relationships with entities around the world.  They are immersed in multifaceted social contexts, and it's becoming increasingly difficult for them to understand who has access to their data, and for them to be aware of and manage their privacy and express their preference and their choice and their control.

It is not about pushing the burden back on the ‑‑ on the individual at all.

So if I give you some other stats, for example, in terms of mobile handsets: Android has 43% of the global market in terms of OS, and the OS players have a big role to play here in terms of ensuring that privacy is designed and built in.  Apple has 18%.  Symbian, which some have written off, has 22%, and RIM has 12%.  So to take Sunil's point about inverse power, I think there are power relationships here, and people can step up to the mark and do things.

So in terms of what are we doing? The thing that you can take away is to have confidence that the GSMA and its members, representing 800 mobile phone operators around the world, are doing things.  We've conducted research that shows that 92% of people care about their privacy and they want choice and control and preference.  We have an initiative and a way.  I think one of the challenges that we have is that privacy in the mobile space is determined by a patchwork of geographically bound laws, but data flows are global, data flows are immediate.  People's privacy interests, their expectations, their needs, their wants, they transcend these geographically bound laws.  And what about Kenya, with no data privacy protection and laws to preserve privacy? What do we do? We have established principles that our members and others have agreed to that will work.  We're trying to establish a global framework.  We have established privacy design guidelines for mobile application development.  We're working and talking to developers and others who say to us:  Help us, help us build privacy in and help us give people meaningful choice.  So there are a number of things that we're doing; I've thrown some things out there.  The law clearly isn't adequate.  And I think industry self‑regulation is a good way forward.

  >> AMBASSADOR DAVID GROSS:  All right, Patrick, then I'll go to Juliana.  Patrick, you heard about Android; Google is also big on the wireline side, broadband.

(Skype reconnecting.)
  >> THE PANELIST:  ... of Google's privacy procedures, so it's something that the company takes seriously, not just because of the things it has to do, but also because it's been involved, you know, in discussions with government about it.  The opt‑in model for location‑based privacy is fundamental to the Android design, and we believe that users should be in control; they should be able to choose and opt in, in every case, when they want to enable and use location‑based services.  There's a lot of value in location‑based services for mapping.  If you're out and about and you want to find a coffee shop, it's convenient to be able to put the search term in and have a coffee shop pointed out to you nearby, but that's an opt‑in feature, as are the other location‑based features for Google.  So, yes, we think it's very important and it's an ongoing, very important issue.

  >> AMBASSADOR DAVID GROSS:  Juliana, as our representative of Kenya on this panel: one of the things that I was really blown away to learn at the IGF, and I heard it from the Minister and from the Permanent Secretary both, so I'm sure that it's right, is that 90% of new mobile phones being sold in Kenya are Smartphones.

We are at the beginning of at least in North America.

(Skype reconnecting.) 

  >> JULIANA ROTICH:  As a platform maker, and one that is based in Kenya and was founded in Kenya, we really want to think about software that works for us, and in this emerging area of Smartphones we also want to think about privacy issues that matter to us.

Let me give you an example.

In terms of privacy issues, like I mentioned, you're not able to quickly opt out of ads from Safaricom, which, you know, I think is an egregious travesty of your rights, because you own the device, but you cannot control the service provider's behavior towards you.

I think, looking at these principles, mobile service providers need to respect people's rights, and it's okay for them to see themselves as just a conduit of information.

Specifically, about the Smartphone growth, we have a platform that is used for crowd‑sourcing information.  And I do want to make the point that we will see emerging issues around privacy.

If you look at the revolution in Egypt, you had people uploading video.  And there are questions now about the privacy of the people captured in those videos, because there could be retribution.

If I could just mention the work of Sam Gregory and the witness team, they've got an application that helps blur the images on video.  And it's an app that sits on the phone, but this is sort of ‑‑ these are concerns.

Egypt is in Africa, so these are African issues.  So I think we're really at that point, but (off‑mic).

(Skype reconnecting.) 

  >> THE PANELIST:  At the same time, there's been quite a lot of economics research done in the last few years, like in the U.S., looking at sort of fundamental ways that people make decisions about privacy and about other issues: when they will disclose data, when they won't, what upsets them about data use.  So I think that kind of more fundamental human research is generalizable.  I just wanted to respond to a couple of points people have made.  Looking at the mobile principles that the GSMA has been working on, I believe a really positive thing about that is the engagement you've had with civil society, with a whole range of stakeholders like the IGF, and that shows in the principles: these are not codes that are transparently just in the interests of the industry, proposed in an attempt to fend off regulation.  I know there are some very positive aspects of what you've got in there that show you're thinking about the broad social value of privacy as well as its value for individuals; it's important for democracies.  It's critical.

(Skype reconnecting.)
In terms of mandating data retention, in terms of telling companies you've got to store all of this data about all of your customers for several years in some countries: that, of course, encourages the telcos and the other companies building the systems to build in the surveillance functionality that the European and American governments are asking for.  Of course that is then sold to countries like Egypt and Bahrain, and I think governments should think much more carefully about the spillover effects of their own surveillance policies.

(Skype reconnecting.) 

  >> THE PANELIST:  So no one can understand those permissions, so it's very difficult for people to express meaningful choice.  I know Schmidt said that he was going to change that, and that's really a positive statement for him to make.  And, you know, we welcome that.

I think, going back to Juliana's point, one thing that concerns me is that this is global.  And we need to think about how we provide opportunities for people in Africa and India and elsewhere.  There are developers here who develop apps that go into an American app store and are downloaded by citizens all around the world, and yet there is no framework to guide how privacy is built in and designed into those apps, because there's an absence of privacy frameworks and privacy law here in Kenya and in India.  So who is determining the privacy experiences of the users of those apps? Should it just be left to the app store? Should it be left to the OS vendor? Is it the device manufacturer? No.  It's a complex, interdependent and interconnected ecosystem, and it's the responsibility of all those parties to work on coming together to provide that experience.

  >> AMBASSADOR DAVID GROSS:  All right.  I'm going to continue to be controversial with Sunil in a moment but let me give you all.

(Skype reconnecting.) 

  >> AMBASSADOR DAVID GROSS:  Is Africa different than other places like south Asia?

We know that India has been traditionally, and particularly recently, very concerned, including in the mobile environment, with regard to security, particularly with the terrible bombings in Mumbai and the use of cell phones to help precipitate that horrific event.

So, in looking at all of this, and hearing you start off with your one point about the difference between the least and the most powerful with regard to surveillance and the like.

How different is privacy in India? Between the common man, perhaps, and the entrepreneur and financier in Mumbai from the rest of the world?

  >> THE PANELIST:  I don't really want to play the role of native informant and give you evidence from India which will help strengthen debate globally, but I'll tell you two stories, one from India and one from the Philippines, because hopefully Indian think tanks will have global influence as well.

The story from India is this: if you get onto a train in India, as you board the train, next to the door you will get a computer printout that has the name, age and seat number of all the passengers on that train.  And once you board the train and sit next to somebody, questions that are considered completely normal in the Indian context are:  When are you going to get your daughter married? What is your salary? Who do you work for? Et cetera.

So this is the Indian idea of privacy.

My story from the Philippines has a little more to do with mobile phones.  There was research from a university, and they looked at people in the creamy layer, the upper classes, and also at the bottom of the pyramid.  And they looked at people's responses to a simple mobile phenomenon, which is (off‑mic).

And what they found is people at the bottom of the pyramid, they look more.

(Skype reconnecting.) 

  >> THE PANELIST:  We're not asking for bureaucratic regulation, we're not asking for overregulation; what we're asking for is appropriate regulation.  And hopefully a multistakeholder discussion like this will help us arrive at such appropriate regulation.  Thank you.

  >> AMBASSADOR DAVID GROSS:  Let me open it up.  We have a question right here.  Thank you, please identify who you are and maybe an affiliation as you make your question, thank you.

  >> THE PARTICIPANT:  Sure, my name is Chris, I'm a researcher at Indiana University, and until last year I was the first technologist at the Federal Trade Commission in the Division of Privacy.

So security is the flip side of the same coin as privacy; in fact, to protect consumers' privacy we need to make sure they have secure mobile platforms.  I have a question for the gentlemen from GSMA and Google.  I know, I know, it's tough.  Yeah.


  >> THE PARTICIPANT:  So the security situation on mobile, to be frank, is horrible.  Many of us used to desktop computing are used to Patch Tuesday from Microsoft: on the first or second Tuesday of every month, Microsoft rolls out security patches.  If you're a Mac user, you will routinely receive security updates.  And Google has set the gold standard in the web space: the Chrome browser automatically updates without users having to go out and find updates, so Chrome is always running the latest version of the browser.  When was the last time you were asked if you wanted an update on your mobile phone? Never.  It's very, very rare, and I'll tell you why: Google, except in rare circumstances, does not have the ability to push out updates to users.  You have the operating system vendor, you have the handset manufacturer, and then ultimately you have the carrier.  And in many countries it's the carriers that push out security updates.  So Google will discover a flaw in, say, version three, and Google's engineers will fix the flaw.

(Off‑mic) and modify it for whatever chips are in it.

(Skype reconnecting.) 

  >> THE PARTICIPANT:  And they all have business models to provide us with security updates, and they are not doing this, and so it's shameful that this Android phone is running software that's six months out of date when Pat's is running the most up‑to‑date version and my Chrome browser is running the most recent.

  >> AMBASSADOR DAVID GROSS:  We're going to go to Pat to defend the entire industry, and then we're going to go to Patrick.  I don't think a lot of defense is needed; you just need to defend the mobile part of Google.  And I heard a vicious attack on a carrier, so we're going to have Jeff from AT&T defend the carriers specifically, since I'm sure he was using AT&T back in DC.

  >> THE PANELIST:  Okay, well, Chris, hi.  You didn't answer my tweet, because you tweeted on this and I tweeted you back maybe about a month ago, and I think I used the iPhone as an example, as to whether you believe it's the carrier's responsibility.  Because actually I hook this up to my laptop here, my Mac, and I download the software from Apple directly.  So in that situation I begin to question what the role of the carrier is.  But I also used to work for a carrier in the UK, and we regularly pushed updates out, and I'm quite happy to talk to you offline if you have specific examples, because we do have, even in the GSMA, a security group that looks at these issues and tries to work with multiple stakeholders to fix these problems, particularly on some platforms where, you're correct, there are malware and data‑stealing apps.  I think in the UK you will see a change, because one of the regulators just got in on the act: there's an app that people were downloading that would help conserve battery power, but the app was sending premium‑rate SMS text messages.  So that's a challenge.  I'm happy to talk to you offline about that.  But what do you think about the Apple case?

  >> THE PARTICIPANT:  Apple has leverage that nobody else has.  When the iPhone launched in the U.S., they were able to extract a revenue‑sharing deal from AT&T that no other handset vendor had, and so Apple rolls out updates to its users.  But except for the Nexus series, where Google controls the updates, for the rest it's the wireless vendors that ultimately have the ability to push out updates, or lock down or block tethering, or do other things.

  >> AMBASSADOR DAVID GROSS:  That sounds like throwing down the gauntlet to you, Patrick. 

  >> THE PANELIST:  Right.  So, Chris, thank you for the relative softball; I know you throw some hardballs sometimes.


  >> THE PANELIST:  In this particular case you've really helped me out by asking and answering the question.  I'm going to add on a little bit to the answer.  I think this is an area that's ripe for discussion; there's room for improvement in many cases.  One of the interesting things about the iPhone is that there is just one iPhone at any given time, essentially, maybe two, but that simplifies the ecosystem considerably, because you have to deal with one piece of software, one device at any given time.  Although think back, you know, a couple of years ago, for those of you that maybe had the first iPhone and got a little bit frustrated with the updates.  There was what we sort of referred to informally in the industry as the iBrick update, right, because it would download the software and suddenly your iPhone would slow down and it would just seem to you like it operated very differently.

So, you know, even Apple is learning, and I'm sure improving as well.

But the Android ecosystem is a complicated one.  There are 170 different Android devices, right? Compare that to the iPhone, or the handful of iPhones that have been developed and put on the market.  That really does complicate things.  There are 27 different manufacturers of those devices.  Collaboration among all of these groups is a responsibility that Google certainly takes part in, as do the carriers, and we're working very closely with our carrier partners and with the device manufacturers in order to improve the user experience, but there's work to be done.

  >> THE PARTICIPANT:  (Off‑mic).

  >> AMBASSADOR DAVID GROSS:  All right, Jeff, what's the role of the carrier in this?

  >> JEFF BRUEGGEMAN:  The leverage in the relationship between us and the manufacturers -- it may be unique with Apple, but I think there are a range of dynamics here.

To your broader point about pushing out updates, it's interesting that you brought this up, because I've heard our chief security officer say he thinks the wireline network could be improved in this respect -- that it's actually a very clunky, slow process to push updates out on the wireline side -- and so he was endorsing your idea that we need to get better at pushing out updates.  Software flaws recur too often and are not being addressed enough.

And we've been trying to develop some network-based security for mobile phones -- things that are not really possible in the way the wireline world grew up -- to try to provide an extra layer of security in the network and help do what you're talking about: let's not create new issues, but let's also try to improve on what is certainly a challenging security situation on the wireless end.

  >> AMBASSADOR DAVID GROSS:  Anybody else up here want to comment before we go to the next question?

Next question.

  >> THE PARTICIPANT:  I just wanted to add one last comment -- this is Mark from Google.  It seems like the issue here -- and I'm not necessarily speaking for Google -- is decoupling the security updates that are provided to cell phones from the feature updates, or feature restrictions, that are imposed by carriers for their own business reasons.  Does that sound like a fair boiling down of the issues? Yeah.  Okay.  I was just wondering.

  >> THE PARTICIPANT:  Hi, my name's Sam Gregory and I work at WITNESS, a human rights group that uses video.  Thank you, Juliana, for the earlier reference.  I want to expand on the scenario that Juliana presented about the need to consider mobile video as we start to think about privacy issues.  We've been thinking a lot about that, because increasingly the people who create human rights video do it on mobile devices; they don't go to a desktop computer to edit it or share it, they upload directly.  And you're getting a set of issues -- in the human rights scenario -- of two sets of data: the data attached to a visual image, which can be very compromising, as well as literally the physical image of someone -- for example, someone who speaks out in Syria, and this is happening right now.  We've been working on some tool solutions, like the Secure Smart Cam on a phone, but that's a solution coming from a small nonprofit.  What we really want to see is how the OS, the hardware, the apps and potentially the carriers address this, as video increasingly becomes the dominant mode of creation shared on mobile networks.

  >> AMBASSADOR DAVID GROSS:  Anybody? Sunil, do you have a view?


  >> AMBASSADOR DAVID GROSS:  He was complimenting you before; you should at least return the compliment.

  >> JULIANA ROTICH:  Okay, I would actually like Pat to address this, if he could, because he's sort of our link to the many mobile operators.  And I'm just going to throw down the idea -- well, we saw what Vodafone did in Egypt, where they shut off the Internet, and I think mobile phone access was also interrupted at some point in time.

So it does bring out the issue of at what point ‑‑ when you have a mobile operator with all this information, location, data, EXIF information that's been uploaded from a phone, or all this data that's flowing through their network, what sort of responsibility do they have to protect the privacy of the people who are using their networks?

Or at least working with organizations to perhaps provide end‑to‑end protection because you can only do so much on the mobile device.

In fact, encryption and decryption technology is woefully lacking.  You have to be a thorough geek to get encryption working -- SMS encryption working -- on a mobile, on a Nokia phone.  So I'd just like to add to his question, and also to add something that stuck with me, and I'd like to leave that with the attendees here.  This is something that Chase talked about: he said that poor people -- the people at the bottom of the pyramid -- can least afford badly designed devices, and I would add services, because mobile services, the work that AT&T does, all these things, these are services that people depend on every single day. 
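To make Juliana's earlier point about location data "uploaded from a phone" concrete: a photo taken on a smartphone typically carries an EXIF block (stored in the JPEG's APP1 segment) that can include GPS coordinates.  As a rough, hypothetical sketch -- not a tool any panelist mentioned -- the segment can be removed from the raw byte stream before upload:

```python
def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Return a copy of a JPEG byte stream with APP1 (EXIF/XMP) segments removed."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg_bytes) - 1:
        if jpeg_bytes[i] != 0xFF:
            out += jpeg_bytes[i:]      # unexpected byte: copy the rest verbatim
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:             # SOS marker: entropy-coded image data
            out += jpeg_bytes[i:]      # follows; nothing after this carries EXIF
            break
        # each segment: 2 marker bytes + big-endian length (length includes itself)
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker != 0xE1:             # keep every segment except APP1
            out += jpeg_bytes[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

This is a minimal illustration of the file format, not production code; a real application would use a maintained image library and also consider the "physical image of someone" problem, which no metadata stripping can solve.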

  >> AMBASSADOR DAVID GROSS:  All right, Pat, over to you.

  >> THE PANELIST:  Okay, I'll try.  Chris, I understand the issues -- I've given evidence in terrorist trials and kidnappings and pedophile cases, and I've had to go to court, so I understand the need to protect witnesses and so on, and I'm happy to have a conversation with you and understand more about that.  I'd like to use the UK as an example, if I may, Juliana, in responding to you.  I think it's important that people understand that mobile operators not only have privacy laws to comply with; unlike other communications providers, which are not regulated as much as mobile operators, they also have license conditions imposed on them by the communications regulator.  So in the UK, for example, mobile operators have a legal obligation to maintain a capability to intercept communications.

They have a legal obligation to remove any encryption they put around communications.

It's the law.  They have no choice.  But actually, that law in the UK -- it's called the Regulation of Investigatory Powers Act -- is designed to be compliant and compatible with UK human rights law, and it came about because there was a real need to do that.

It also regulates the communications data that a law enforcement agency or the security services can request of them.

But to take one other point that Juliana raised, about cutting off networks: I think it's also important to understand that in some countries, governments do have the ability to ask communications service providers to do certain things.

And sometimes operators are under an obligation -- they're between a rock and a hard place -- but what I would say is that a very clear legal framework needs to be established in this area, so that people can be aware of their legitimate expectations of privacy, generally.

So if they commit a crime, you know, what's going to happen in relation to access to that data and the interception of communications.  That's the basis that RIPA is established on, and I think it works very well.  Operators are in a very difficult position, you know.

  >> THE PANELIST:  Just wanted to address your security point.  I'm not a technologist for the company, but when you're talking about applications riding on top of a network, we don't control those, and I think that's part of the issue.  There are good reasons for that, right -- there are also policy considerations about the openness of the networks and the ability to easily provide applications and devices.  Well, one of the trade-offs is that you don't have one place to go to say, why aren't you encrypting the facial recognition service? Because, you know, I'm not aware that we offer such a service, so that would be someone else, and I think that's one of the challenges here.  But it's also a challenge for law enforcement.  I think they would love to be able to just come to us and get all of it, and they're not happy that in this complicated world we live in now they are trying to keep up and figure out how to get access.  I mean, then there's the question of a legitimate government request versus an illegitimate government request.  But even assuming, say, going after a terrorist, it's a real challenge for law enforcement now, because there is no one-stop place to go.  

  >> AMBASSADOR DAVID GROSS:  Next question from David.  Make sure you say your name and affiliation, but if you can, I'm hoping your question will bring us more onto the privacy side than the security side, recognizing there are two sides to the same coin.  So I'm trying to keep true to the title of our panel.  (Off-mic) because we are being transcribed and billions of people around the world are listening carefully.

  >> THE PARTICIPANT:  No pressure there, then.  It's been really interesting -- there have been a number of panels at this event where youngsters are talking about privacy, and it became very clear to me in a number of sessions that their perception of what privacy is, or indeed what friends are, is changing.

And the provocative statement I would make to the panel is this: do you think the notion of privacy will need to change? Do you think their perception of privacy, of what friends are and of what privacy is in their lives, will change in a few generations' time?

  >> JULIANA ROTICH:  I guess -- all right.  I believe it will change.  However, it does bring up an issue -- you've actually had the pleasure of meeting Steven of the Family Online Safety Institute.  And the problem now is, I think there's this phrase that people forget: if you post something on the Internet, it's there forever.

Now, that is something that I think the young people have to understand, because there are law enforcement agencies on Facebook.  There are hiring managers on Facebook who will check out your Facebook profile before you get hired.

These are issues that they do not realize: that they have to be the guardians of their privacy, and that the choices they make may influence their future.

So it's definitely a valid point, and it will change.  And I think probably it's a generational issue.

Maybe as we progress there will be an HR manager who will see your drunken photos and think, hey, no big deal.  But for now, I think that is something to flag for the young people, because when the hiring managers see your drunken photos, they're going to say, well, I'm not really sure I want to hire that person.  They're probably going to drink on the job.

Yeah.  So I think it will shift, but for now, there are some things that we do absolutely have to flag for young people. 

  >> AMBASSADOR DAVID GROSS:  All right.  Ian.

  >> IAN BROWN:  There's some interesting research which looks at this.  You often hear people say -- not at this panel discussion -- oh, young people don't care about privacy, look at how they behave on Facebook, it will be dead in a generation, why are policymakers concerned about this thing.  Actually, talking to young people, they do see social networking sites as a private space, and a place to share information with their friends.  They're aware that Facebook and other social networking sites can access that data, but as Juliana said -- they're growing up, they're teenagers by and large, or even younger -- they're not quite so savvy as adults are about the consequences of the party photo.  Or maybe later in life: insurers are on Facebook as well, and they would be very interested to know things that might impact whether they will give you the assurance that's necessary for a loan or mortgage -- some other very significant potential harms.  This comes back to the question, where is the harm in a privacy violation? These can be things that really damage people's lives. 


  >> THE PANELIST:  David -- yes, the concept of privacy has to change.  It's not static; it's dynamic and contextual, ever more so in a mobile environment.  People's privacy expectations change at different points in their relationship with an entity and in their use of an app.

And I think young people -- I think their perceptions of privacy will change.  We need to work to educate young people more, as we do grown-ups as well.

Think, for example, about location privacy.  Would a young person, or an adult here in this room, understand what that means? Is location just your country? Location these days -- it's where you've been, it's where you are, and more importantly it's where you are not, which can be determined by analysis.  It's where you're heading.  Most mobile phones now have a compass in them.  They know the direction you're pointing, which can be used to determine, for example, how far away you are from an interactive billboard.

Soon, when the handsets are sensor-enabled and environments are sensor-enabled, they'll know which floor of the shopping center you're on: oh, he's heading this way, get the ad ready.  So I think it's a challenge, particularly given the limitations of a smartphone, with the small screen, et cetera, to help people become aware of the privacy implications.
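The "how far away you are" point above is simple arithmetic once a phone reports coordinates.  As a hedged illustration (the billboard scenario is hypothetical, and the spherical-Earth radius used here is an approximation), the great-circle distance between two latitude/longitude points is given by the haversine formula:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two lat/lon points given in degrees."""
    r = 6371000.0  # mean Earth radius in meters (approximation)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

Combined with a compass bearing, two such readings a few seconds apart reveal not just position but heading and speed -- which is the privacy point being made.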

And I believe, for example, we focus too much on the concept of privacy as identifiability.  You don't need to know that it's Pat Walshe in a particular situation for it to have privacy consequences and implications for me.  So how do you develop a framework -- the technology, the rules -- to allow people to control privacy in context? 

  >> AMBASSADOR DAVID GROSS:  We have a question down here. 

  >> THE PARTICIPANT:  Thanks.  My name is (name), from the German government, from the Ministry of the Interior, and I would like to make some distinctions first in talking about privacy.  I think we have to build some categories when we talk about it.  The first category is the relation of a private person to a state authority, like a law enforcement agency.

The second category is if we talk about a private person, the relation to a big company like Google or Apple or whoever.

And the third one is the relation of one private person, or user, to another user.  For example, if someone is writing or posting on Facebook about a third person, that might also be a privacy problem.  So I will focus on the second group: the relation of a private person to the bigger company.

And I wonder whether our privacy law, as we have it, especially in the European Union, is still able to deal with the problems we have there, because one of the big pillars of our privacy law is that we focus on each piece of data, or set of data.  That is one problem.

The second one is that one possibility for processing data is to get the prior consent of the user.

So if I download an app, what is that? Is it consent to the whole app, or consent to what, exactly? These are two problems, and I wonder whether we can find a framework in what we already have, especially for the mobile environment.  And I think it might be worthwhile to think about other solutions to ensure privacy.

One example might be to think about the competition law.

Why don't we think about ensuring that companies like Apple, Google and others design their apps to be privacy-friendly, and that they give them to the market and say, okay, one competitor can check whether the other's app or settings are really privacy-friendly or not.

So I think we have to think very creatively about new solutions for how we can protect privacy, instead of just talking about whether we need more or less privacy, because privacy law as we have it might work for the relation of the private person to state authorities, but I doubt whether it still works, even in the mobile environment, for these other categories. 

  >> THE PANELIST:  I think the competition idea is a good one, and it's certainly something the European Commission is talking about: mandating interoperability and certainly portability of services, which Google has done a lot on already.  The way to take that further -- to really bring competition into some of these areas -- would be to say that if you're a dominant player, say like Facebook, then you should have to interoperate with other social networking sites.  I can take not just my profile to a competitor if I prefer their policy, but if my friends are still on Facebook I need to be able to interoperate with them; otherwise it's a meaningless option -- I can take my profile to this service, but all my friends are over here.  That has been talked about in the directive, so I'll be very interested to see where it goes.


  >> PATRICK RYAN:  Here we go.  I'll comment briefly, because I've heard that proposal.  I think it's a very interesting one.  It strikes me that there would be some competition issues with it, about being able to influence competitors, although I certainly like the idea.  It's something I use with my two girls, for example, when it's time to cut a piece of cake, right? I implement the rule where one person cuts the cake and the other one gets to choose.  It certainly stimulates fair cutting.  At the same time, I would suggest that there is a very robust competitive market right now in privacy, and companies like Google and Facebook are competing on those fronts.

The Google+ model is an example of that.  The ability to choose Circles, and to have more granularity in that selection, is something that's influencing how Facebook is now responding to its customers and rolling out different features to its customers as well.  There is, I think, quite a bit of competition there, and I know from Google's perspective we look very closely at what competitors are doing, and they're very interested in new developments in privacy. 

  >> AMBASSADOR DAVID GROSS:  Pat first and then Jeff.

  >> THE PANELIST:  Well, Germany is a particularly challenging environment to do business in from a privacy perspective.  If I take location privacy, for example, I think the way that privacy laws have been implemented in Germany denies consumers services.  If you're a mobile operator, the rules are so tough regarding getting written consent: if someone wants to locate me, the operator would have to get consent.  What does that mean? Actually, that creates privacy problems, because you're creating a database on third parties.  So it's not worth a mobile operator getting into that market, because the barriers are so high.  But GPS is not regulated, so there aren't those same barriers, and those services flourish.  We need to be careful in privacy law: the challenge we have in Europe is that you have a directive or two that have been implemented in 27 different ways, and it's a nightmare for businesses in Europe.  I'm glad to see that Viviane Reding is considering all of this.  In relation to the review of that framework, she is intent on putting in a principle of accountability, and she's intent on putting in a principle of privacy by design -- but from what I gather they're not going to prescribe the detail of privacy by design, recognizing that the technology will outstrip the ability to provide a protective framework.  On the app stores, we now have a document out for public consultation -- privacy design guidelines for mobile application development -- that people in this room and others can still contribute to.  And when we talk to people around the world, they're very happy to see those, because it puts in place the kind of baseline standard that you are calling for. 

  >> THE PANELIST:  I was just going to respond that, to me, competition law seems like a somewhat indirect way to get at the core privacy issue, although data portability is in and of itself an important value.

But I guess I still go back to the point that what we really want to do is encourage the user to have good information and easier signals -- about Facebook versus Google and what's happening with my data and my privacy -- and, I'd say, horizontally as well as vertically: okay, now, what happens when I download an app on my Apple iPhone?

And to me, inherently, there can be a regulatory framework and a baseline for it, but it's going to have to be more about the positive development of standards and signals to the user than something I think regulation can deal with directly. 

  >> AMBASSADOR DAVID GROSS:  Thank you.  We have a question in the back and if you could get to a microphone.

  >> THE PARTICIPANT:  My name is Ahmed and I come from -- I have a Vodafone number from Cairo, and when I get in the queue to pay the bill or speak with the agent, I get an SMS telling me, welcome to downtown Cairo.  When we occupied Tahrir for a couple of hours, I was the only person who was capable of tweeting from Tahrir, and that's only because back then I had a Nokia N900 running Tor, so I could use it to circumvent the blocking of Twitter while there was still mobile Internet service.

In those days, 12 million of us chose to break the law; the telecoms did not.  And the question now is: why do telecom companies choose to go by the easier jurisdiction? These are international companies -- based, in the case of Vodafone, in Europe, and the other mobile operator is also based in Europe -- so why do they choose to side with the easiest jurisdiction when it comes to privacy?

  >> THE PANELIST:  I was just going to say I think there are difficult decisions we have to make as companies about whether to operate in a market.  But I know we've had situations where you have the government making requests and you have local employees in that market -- we actually had a call center in Egypt that we evacuated when this happened.  And so I hear what you're saying, but it's difficult for a company to make decisions about the lives of its employees in that location as well.  So I'm sympathetic, and I think it's a challenging issue, but I also think there are countervailing interests when you have employees on the ground in a country.

  >> AMBASSADOR DAVID GROSS:  Speaking about employees in the country, I'm going to turn to Patrick.  It seems to me that you had a problem in China -- having to do not so much with the mobile side, but you have a lot of employees in China who were at risk.  I know from comments that Larry and Eric made there was great concern about their safety as you had legal issues with the Chinese government.

You also have, I think, if I remember correctly, still a couple of executives who can't go to Italy for fear of arrest because of a problem with YouTube.  How do you deal with it from a Google perspective? Is this just a question of international companies being cowards?

  >> PATRICK RYAN:  Well, thank you for that softball.

  >> AMBASSADOR DAVID GROSS:  That's what I'm paid for.

  >> PATRICK RYAN:  So.  Right.  These are two obviously very, very important issues -- very sensitive ones, I have to say -- and I'm not just trying to get out of answering the question, but I am fairly new to the company.  I've been here for eight months, and a lot of those things predated me.  The issue in Italy is ongoing, and that's an issue that's very difficult for me to comment on.  But boy, I can tell you that Google takes these things very seriously, and we recognize the responsibility to our employees, and also to the users of YouTube and the people who are using these services.

That may sound like a little bit of a non‑answer and it intentionally is so. 

  >> AMBASSADOR DAVID GROSS:  All right, Pat, you thought you escaped this question, you represent the entire industry.

  >> THE PANELIST:  Well, I don't know that I can speak for Vodafone, but if I understood the question correctly, it was why a mobile operator that's international would adopt local rules.

I think the initiative that I spoke about -- the mobile privacy initiative that the GSMA has launched with its members -- seeks to put in place a global framework with consistent privacy standards for users, but operators will always have local law to comply with, whether that is data interception law or laws on the retention of data.  So it's very complex, but they are at least grappling with that situation, and they are coming together to put in place an overarching set of principles to give you baseline protections.  That's all I can say.

  >> AMBASSADOR DAVID GROSS:  Any of the other ‑‑ Juliana wants to get into the middle of all these legal issues.

  >> JULIANA ROTICH:  I would like to sort of ask -- again, sorry -- I don't know if you've heard of Barlow's Internet manifesto; he's a paragon and a legend in Internet matters.  I would just encourage the GSM Association to think of something like a mobile manifesto, and to think of these issues, because as he pointed out, not only was there an issue of Internet services being shut down at some point, but there was an issue of blockage.  Because he had Tor, which is a circumvention tool, he was able to still get to the channel that he needed for self-expression.  So I would just like to flag that for Pat: as you talk to the other mobile operators, take inspiration from him in a thing called a mobile manifesto that expresses the right of users to express themselves and to communicate, because we really, greatly depend on mobile services.  It's part of our lives.  And I think that needs to be taken very, very seriously. 

  >> AMBASSADOR DAVID GROSS:  Let me just comment, for those who may not know who she's referring to.  He was -- is -- one of the founders of the Electronic Frontier Foundation and, perhaps in my mind more importantly, an occasional lyricist for the Grateful Dead.  He is somebody who wrote a manifesto in 1996 -- which I believe is the one she's referring to -- that was a reaction to the telecoms law passed in the United States, in which he famously declared that no one, no government, has jurisdiction over people using the Internet, which is a quaint concept today in light of all the things we've been discussing.


  >> SUNIL ABRAHAM:  Some companies do push back, like Research In Motion in India.  For the last four years the Indian government has been asking them to place the NOC -- the network operations center -- within Indian jurisdiction, where it would come under the Indian ISP license and real-time surveillance equipment would have to be installed inside the NOC.  But they have refused for four years now.  So some companies do stand up for users.


  >> THE PANELIST:  Other companies make a conscious decision about how to interpret the situations that allow them to hand over their entire network for the government to administer in the case of national security -- which was not the case in Egypt.

So actually, even on a narrow definition of what national security is, those companies did not have an obligation to allow the authorities to administer their networks.

  >> AMBASSADOR DAVID GROSS:  I believe we have an online participant -- one of the billions who are listening in has decided to participate.


  >> YIANNIS THEODORU:  From Germany: a person thinks most of the issues discussed today are not privacy issues but data protection issues.  So she wants the panelists to share their views on the distinction between those two notions, particularly as in the EU they constitute separate fundamental rights. 

  >> AMBASSADOR DAVID GROSS:  Okay.  Who's brave up here?           (Laughter.)

  >> AMBASSADOR DAVID GROSS:  When in doubt always go to the academic.  Ian, you must understand that.

  >> IAN BROWN:  Yes.  It is an interesting question.  I don't want to bore you for hours and hours, but one distinction in the European framework between privacy and data protection is that the European Court of Human Rights will often go further, perhaps, than the EU's legal system in telling governments:  You just can't do that -- even for very serious national security or serious crime purposes.  For example, in the UK there was a case just a couple of years ago: previously the UK government had been keeping DNA samples from everybody who was arrested for anything except the most minor crimes.

So they were up to about 5 million people's samples on this database.  They had some very good reasons for doing so -- for catching rapists, for example; many very serious crimes have been solved using that technology.  At the same time, the Court said: this goes too far.  Keeping this data on everyone that's arrested, not just people who have been convicted -- a lot of people who are arrested are not convicted, and they should be treated as innocent until proven guilty.  The Court said that just goes too far.  And the relevance of this to mobile privacy and Internet policy is that it takes a long time for those kinds of cases to filter through the European legal framework, all the way up to the Court of Human Rights.  I think that when issues of data retention, of the Regulation of Investigatory Powers Act, and of the laws in a lot of countries that give significant power to law enforcement and intelligence agencies over people's Internet use reach these courts, I would not be surprised if they come to some of the same conclusions: look, okay, these laws do set out relatively clearly when these powers can be used, so people can understand what's going on, but in a democratic society they just go too far.

  >> AMBASSADOR DAVID GROSS:  Okay.  Anybody else?

All right.  Next question? Go ahead. 

  >> THE PARTICIPANT:  Hi.  (Name) from Pakistan.  Well, in the immediate short-term --


  >> THE PARTICIPANT:  In the immediate short-term -- and this goes to everyone, from the government perspective to the civil society perspective -- in the next 12 months, how does each of you see, and to what level or extent are you going to address, the issues that you've discussed? And how much of a success factor do you think there is? I come from a country where there's no consultation on this.  When they're taking over -- you know who I'm talking about -- they just come with axes and chop the optical fibers to the towers, and whatnot.  That is it for them.  No regulation, nothing applies there; communication just has to be shut down so that no one knows what's happening.  And the next day you find out what happened.  So the manifesto idea is also something -- but as we said, what are you going to do in the short term to address this? Obviously no one solution fits all.  But there should be a sort of global move towards it -- at least, you know, the GSMA becomes the central point for that information.  But still, what is the shortest-term thing you can achieve with regard to privacy and the issues we discussed? Thank you.

  >> AMBASSADOR DAVID GROSS:  All right.  I think we will turn first to Pat, because you've just spent a lot of time working on a very important set of principles, policies and suggestions.  How does that fit into it? How do you see this being implemented, what sort of time frames are we talking about, and how important is it?

  >> THE PANELIST:  Okay, so, yes, we have our initiative, which is looking at establishing baseline principles, and we have operators who operate in Pakistan who are committing to doing this as well.  In terms of time scales, I can't give you an absolute time scale at the moment; we're still working out some of the ideas.  In October, for example, we have a workshop where we're bringing people together to discuss whether it's possible to have privacy icons on the device, because in many places in the world people can't read and write, and we have to communicate with them simply.  Carnegie Mellon University in the U.S. is doing work on making people aware, in a particular social context, of the need to make a decision.  We're bringing them and other academics over to London in October to discuss how we can translate some of these words into meaningful actions.  So we are doing things. 

  >> AMBASSADOR DAVID GROSS:  Jeff, of course, wants to say something which means I need to move the microphone again. 

  >> THE PANELIST:  To the gentleman from Germany: I think the way you framed it is right ‑‑ they're very different issues, dealing with government versus big companies on privacy.  And I think Pat was talking about one aspect of it.  But I also think ‑‑ and this came up on the panel earlier today ‑‑ we were talking about the need for clear process rules that governments would follow when there is an emergency or, you know, a legitimate security situation, versus an overreaction.  And then obviously you have a challenge when you have governments that are just repressive.

And so that, to me, is translating overall principles down to more specific guidelines.  And, you know, as we were saying before this discussion, you have to have a discussion with governments before the emergency or the urgent situation happens.

And I think even in developed countries, as we've seen this summer, that discussion needs to happen, and we need to find ways to establish some better ‑‑ I think, some better processes for dealing with these issues.  And then, as you said, we have to have a global discussion with the countries where we really have challenges. 

  >> AMBASSADOR DAVID GROSS:  You have a question over here.

  >> THE PARTICIPANT:  My name is (name), a project manager from Kenya.  I'd ask the panel to allow me to make a brief comment.

I believe, ladies and gentlemen, we should give credit where it is deserved.  I'm a Kenyan originally.  I've seen what the Googles and the Twitters have done, and they've done a commendable job; you've done a really amazing job with the Internet.  What I'm saying is, coming from a part of the world where accessibility is so difficult, you have simplified innovation and made it credible.  I just want to say you've done a good job, and keep it up.  Thank you.

  >> AMBASSADOR DAVID GROSS:  Well, it seems to me we should end this panel immediately at that point.  How could you get any better than that? You guys have changed the world.


  >> AMBASSADOR DAVID GROSS:  Any last questions or comments?

I think we're going to end with the FTC.

  >> THE PARTICIPANT:  I have a question for Jeff from AT&T.  So yesterday Wired News had a story based on a document from the Department of Justice that the ACLU had obtained, which compared the data retention practices of the U.S. wireless carriers.  That document noted that AT&T keeps cell tower location data forever.  So starting in 2008, you have been keeping cell tower data indefinitely.  And this is not something that a consumer would have learned by reading your privacy policy or by navigating your website.  So this isn't a matter of, you know, whether the icon is sufficiently understandable, or whether the privacy policy has been written by lawyers, or whether it's 50 pages or one page.  You are not disclosing the privacy practices that matter.  And so, a two‑part question.  One: why are you retaining cell tower location data indefinitely? And two: why are you not telling your customers about this?

  >> THE PANELIST:  Chris, I saw the article, but I don't have the detailed explanation.  We obviously need to know why we're keeping the data, for how long, and whether this is really accurate, and I don't have that information right now.  I saw the article.  But to your broader point, I think we do feel like we need to be clear with our customers, both in terms of how we're using the data and what we're using it for, and we do try to do that in our privacy policy.

And I think the other point that goes to this issue is that we need clear standards for what law enforcement can get access to and what that threshold should be as well.

  >> THE PARTICIPANT:  Your privacy policy doesn't mention the word retention at all.  There's no number of months or days listed in there.  I mean, if the DOJ document is wrong and you're in fact only keeping it for a year, that's great.  But you don't disclose that you're keeping it for a year.  You disclose nothing. 

  >> THE PANELIST:  Right.  And, you know, as far as I know it is not standard practice for any company to disclose at that level of detail, and that's something maybe we need to have a further discussion about. 

  >> AMBASSADOR DAVID GROSS:  Okay.  Our time is up, so ‑‑ but in the tradition I tried to establish at the beginning of this fun workshop, I'm going to let each panelist give one last thing that you want people to know before they leave and rush for the doors, something important that they probably don't already know.  And I'm going to start with Ian ‑‑ oh, we have more questions, more and more questions.  All right, no final thoughts yet; we'll go with the audience.

  >> THE PARTICIPANT:  Good evening.  (Name), I'm President of Public Universities in Kenya.  I'd like to talk about what Juliana said about young people on social networks.  A young man growing up frustrated with life might want to vent that frustration on social networks without knowing that whatever you put there will come back to haunt you.  I think what we need to do is maybe narrow this down to the institutions, so that they can talk about the privacy issues, so that the people doing computer work get to know that on these networks it's not good to do certain things ‑‑ that whatever you upload on that site, even your kids or grandchildren may one day get to see it.  Otherwise it's like a drunkard telling us that we should not drink alcohol.  So we need to look at those issues.  Thank you.

  >> AMBASSADOR DAVID GROSS:  Pat, you look anxious to talk about drinking.

  >> THE PANELIST:  No ‑‑ I spoke with a really clever developer who proposed something in social networking about whether people really understand what they're doing when they post.  And I'm conscious of Facebook here, but this guy suggested, hey, you have a pop‑up saying: do you realize your mum can see this? People would make a very different decision about what they're going to post if they realized the context and the implications, and the question is how you help people like that.  My son is nine and a half.  When he was eight, he was on Club Penguin, which is a little social networking thing for kids to get together and pretend they're penguins, but you need parental consent.  So you have to sign up, the email goes to the parent, and you press "yes, I agree."  So he and his little friend, when they were eight years old, said to each other: let's go into Hotmail and set up an email, and we can just click okay.  So I think for young people, education is really important to help people understand this area. 

  >> AMBASSADOR DAVID GROSS:  I'm going to exercise the extraordinary powers given to the moderator at these sessions by noting the fact that our time is well up.  I invite everyone to approach any of the panelists ‑‑ perhaps with the exception of me ‑‑ with your further questions.  And with that, I think we should show our appreciation for our panelists in the standard and typical way.

(Session concluded at 10:02 AM CT.)