The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
>> Good morning, everyone. There was a change in the rooms. Today, now, here we're having Terms of Service as Cyber‑Regulation. If there is anyone interested in Cuba, or something else, that was the previous occupant of this room; this is not the Cuban workshop. It's about terms of service and cyber‑regulation. Thank you.
>> As we had this change of room, and we have the pleasure of having this roundtable, I would invite all the attendees to have a seat here with us so we can have a convivial discussion. Please don't be shy. Don't be shy, please. There are a lot of seats. Okay. Good morning to everyone, and welcome to this workshop on terms of service as cyber‑regulation. I'm a researcher at the Center for Technology and Society of the Getulio Vargas Foundation.
We will have, today, a roundtable discussion co‑moderated by me and Nicolas. We have a selection of very distinguished panelists. Starting from my left, we have Jerry from EFF and Sean from UNESCO. We have Allon Bar, a colleague at CTS, Nathan, who is a prosecutor in Rio, and Mattias, who works at the European Parliament. We are going to have an interactive roundtable discussion, and I would like to start by asking Sean to speak to us about the recent study on intermediaries that UNESCO has developed this year.
And before that, just to provide a little bit of context, I would like you all to think about how your behaviors are usually regulated by laws ‑‑ your behaviors in the offline environment, obviously ‑‑ and how your behaviors in the online environment are regulated. What are the instruments that regulate your behaviors in the online environment, and who defines those instruments? And how is the relation that you have with every content and application provider, with every internet access provider, shaped? Who shapes that relation? And having that in mind, I would like to ask him to provide us some initial remarks on the UNESCO study. Thank you.
>> Thank you. Actually, it's a research project ‑‑ we commissioned Allon here. I believe he can speak better than me about the research, but I believe a very valid question was raised. The intermediaries are, yes, to a large extent, impacted by the national legal framework. But this research is very interesting: it explores both the external legal environment and the internal self‑regulated framework, and their impact on human rights.
So the major finding is that intermediaries do have control over many aspects through their internal policies and practices regarding privacy. Actually, the study looked at three categories of companies ‑‑ ISPs, both mobile and fixed line, search engines, and also social network platforms ‑‑ across five continents.
Normally, in many countries, they are under very strict liability law. But they, too, have the possibility to protect user rights by informing users when their content is restricted, and when their data has to be disclosed, or shared, or retained. And the same goes for search engines: they are under governmental requests to remove content. But on the other hand, they are also using the user's personal data to personalize the search results for the user.
It's also a kind of control. So, we do think that ‑‑ I mean, we call on the states to create an enabling legal environment. But on the other hand, we also think that companies themselves, and their terms of service, play a very instrumental role in protecting users' rights and respecting privacy. And of recommendations, of course, we have quite many. Most importantly, I think, governmental policy should be made under a multistakeholder approach, and companies should get the users involved in creating their internal regulatory framework.
And also, we need more transparency. Yes, many companies are publishing transparency reports, but these are mostly about how they deal with governmental requests, or requests from other actors. Regarding how they deal with their internal policy, their internal regulation, we are not very much informed. So, more transparency will help. And maybe I stop here, because we can discuss more in detail.
>> MODERATOR: Thank you very much. It's actually a very good connecting thread with the coalition meeting that we had earlier, where we raised the point that terms of service can be a supplement to transparency reports, explaining in more detail what the company's policies are in addressing requests for removal of content. But your report is particularly focused on the social role of intermediaries. It's not necessarily very technical from a legal perspective, if I understand correctly.
And in this respect, I would like to go to a lawyer, Jeremy, because I know you have developed the Manila Principles initiative, focusing on what is out there with regard to international human rights principles that can be used to define the framework for intermediaries. So, I'm asking just a very simple question: what legal instruments are out there to regulate or assess terms of service, and are they binding?
>> PANELIST: The principles you referred to are, themselves, not very legalistic. They're really just a set of aspirational principles that we think intermediary liability regimes should comply with. From the side of government, intermediaries shouldn't be compelled to remove content without a court order. On the side of intermediaries, they should have transparency in their terms of service and in the application of those.
But ‑‑ given that this session is really about the legalization of terms of service ‑‑ there are some legal instruments that can be used to induce intermediaries to give a certain set of protections in their terms of service. They're not necessarily legally binding, but we have the OECD's work, for example, the e‑commerce recommendation from the consumer policy committee. They have a number of soft instruments that they produce which direct intermediaries to disclose all of the relevant aspects of their policies to users, and to give them certain avenues of redress, and so on.
Given that the intermediaries are often working across borders, short of having a treaty, these kinds of instruments are what we have to rely on to guide them along a consistent path. We don't want the laws of one country to be exported to all of the other environments in which these intermediaries operate. So I think the coalition on platform responsibility potentially has an important role here: to produce nonbinding norms to guide intermediaries along the correct path, even though they're not legal and don't have the force of an OECD recommendation, let alone a treaty or anything of that nature. The fact that they are developed using a multistakeholder process gives them a certain legitimacy of their own that maybe we don't have through laws alone, so I think that's a very useful role for the coalition.
>> MODERATOR: Thank you. Sorry if I offended you by calling it legal. It was intended to be, actually, a strengthening point. But I do recognize that the role of multistakeholder processes in this context is particularly valuable to increase the legitimacy of what we are trying to do here, which is to find a common framework.
>> PANELIST: Yeah. Of course the development of the principles was in the shadow of the law. We did analyze the legal regimes from different countries in which we wanted these principles to be taken into account. So, you're correct in saying that there is, certainly, the influence of law behind the principles, but they don't have a very legalistic framing themselves.
>> MODERATOR: I would like, now, to go to Allon, knowing that we were only starting to scratch the surface of this debate. You have done very interesting work with the ranking project. Aware of the fact that the behavior of users is shaped by terms of service, and that the content users can share and access online is defined by terms of service: what kind of policies or practices should these intermediaries, which provide content and can shape behaviors, have in place to protect human rights?
>> ALLON BAR: Thank you for the question. With the project, we released a ranking last week called the Corporate Accountability Index, in which we looked at 16 of the largest technology companies around the world ‑‑ internet companies, communications companies ‑‑ to see how they deal with privacy and free expression issues towards their users. Terms of service are part of that, and they're used to regulate the content that users can share, read, access, and communicate to each other, as well as their behavior.
So, we have various indicators in this index on which we assess companies that touch on terms of service specifically, and which we believe are important guidelines or standards as to how companies should deal with and construct terms of service, what they should be looking like, etc. I want to mention a few of those to give you some idea of things that we believe to be very important for companies to respect the right of free expression of users in a good way.
So, some of the things we were asking, for example: does the company make the terms of service available in the languages spoken by its users? That seems an easy question, and it's self‑evident: if your behavior, or the content that you communicate, is ruled by a company, you want to be able to know what those rules are. Another thing that we are looking at is: does the company communicate with users when it makes changes to these terms of service?
If these terms of service are so important, then we'd expect users to be informed when any changes are being made. Do they provide clear examples of those rules ‑‑ not just legal terms, but actually very concrete things that people can learn from: this is what is allowed, this is what is not allowed? Do they inform users when they enforce the terms of service, for example, when they take down content under them?
If you have any sort of due process in place, you would imagine that the user should know what the rules are, the user should know when the rules are being enforced, and the user should know what kind of process the company has in place to deal with that. And the user should also know, historically, examples or data about how often, and under what circumstances, companies have enforced these terms of service. So, these are the kinds of things that we were looking at.
And to give you some insight into the results: for some things, companies score very well. Do they make the terms of service available? Yes, in general. Do they inform users when they make changes to the terms of service? Companies scored much less well there; they received poor scores in general. Oftentimes, companies do give the rules ‑‑ what is allowed, what is not allowed ‑‑ but they don't really give examples of what these rules mean.
One specific indicator that I want to highlight, on which all companies in the index scored zero, is data about terms of service enforcement. If you recognize that terms of service are so critical to how users behave and communicate when they use these technologies, then at least we should know how the company has dealt with this. And across the board, from the companies that we looked at, we don't have any such information ‑‑ it is one big black box.
So when companies, for example, do a report, they say: we've taken down content at the request of this government. They give zero information about content that they've restricted based on their own terms of service. So, they're saying, we have these rules; and evidently ‑‑ because we know of all sorts of content being restricted on social networks and other platforms ‑‑ we know that content is being restricted. But we get no numbers from the company itself to help us understand what is actually taking place, what the circumstances are under which content has been taken down.
And as a consequence, for us as users who want to hold these companies accountable, we have fewer means to do so. So, that is something we hope companies will improve much upon, as well. Thank you.
>> MODERATOR: If I may ask you to expand a little bit further: can you provide some examples of what companies are doing in reality in regard to terms of service? What is their attitude, what are the practices that they are developing?
>> PANELIST: Yeah. So these are basically the things that I just mentioned. Their behavior toward these terms of service aspects is quite dispersed. They do make terms of service available, but they don't necessarily give good examples. If we want users to understand how their behavior is being guided by these terms of service, then companies should treat users as the primary audience. They shouldn't regard lawyers as the primary audience, and they shouldn't regard governments or regulators as the primary audience.
The key thing is that users are meant to understand how the company is ruling their behavior and how they communicate. And that, across the board, is not looking so good for companies. Oftentimes, as we also discussed in the previous meeting of the coalition, terms of service are not read, because it is way too challenging for anyone without legal knowledge to understand what they mean. They are very long, etc. Some companies go a bit further by, for example, creating things like community guidelines, where they try to give more specific examples.
Okay, these are the rules in plain language, what you're allowed and not allowed to do. And these are the kind of steps we hope companies will take much more to treat users as a primary audience to help them understand what they can and cannot do.
>> MODERATOR: Thank you very much. I believe this is the right way to go: saying that terms of service should really be for users, and not a legalistic way for the company to protect itself behind complex terms. And also, I think from a technical perspective, sometimes they can be very complicated to understand. And the examples that you mentioned are very useful.
So, with that, I have in mind what was done at the foundation in the Terms of Service project, in which I was partially involved. And I'm wondering if Jamila can explain to us, more or less, what the complexities were in assessing terms of service, and how there was a need for an interdisciplinary team, so to say ‑‑ lawyers together with technical persons who know how the service works from a technical perspective.
>> JAMILA VENTURINI: Yes, thank you, Nico. Just to highlight, building on what he was saying: the difficulty that we had while analyzing the terms of service of 50 platforms for this project was something we felt very strongly. Besides not having any information on how the terms of service are enforced, there is little information on how a user can report inappropriate content, for instance. So usually you can find rules on what type of content is allowed or not on some platforms, but it is very difficult to understand what you can do if you find some inappropriate content, or something that violates your privacy, or something like that.
So, it's very complicated for the user to understand what their rights and obligations are while interacting with the platform. Besides that, we found that a great challenge was to determine what was justifiable, what was reasonable, in collecting data, for instance, especially in regard to privacy practices. So, what is reasonable, what is "necessary for the operation of a platform," for instance? That's an expression we usually find when they are talking about tracking ‑‑ tracking users on sites that are not their website, which is a very common practice among the platforms that we analyzed.
So, what does that actually mean? How do you classify, how do you understand, what's necessary or not for the operation of a platform in terms of the data they collect? Another justification that we would find very commonly in the terms of service is that they collect data, or they perform certain activities, in order to provide better services for users, or to create new services. To what extent does that justify the collection of data from users? To what extent would that be something that the user should not consent to while using some type of platform ‑‑ especially considering that accepting the terms of service is usually the price that you pay to have access to some services nowadays?
So, these were some of the issues that we faced during the project. So, as Nico was saying, it was and is necessary, in order to improve the methodology that we use to assess how terms of service comply with international human rights standards, to have a multidisciplinary approach. And more than that, to have cooperation among different stakeholders to understand how these things really work. It would be very valuable, in order to improve this methodology, to hear from companies how they classify, how they analyze, these expressions that for a regular user wouldn't mean much: "necessary for the operation of the platform," "we use this to develop statistics and research, and to improve our services." It would be really interesting, and really important, to understand what this really means in order to make a clear assessment of how these terms of service work.
>> MODERATOR: And if I understand correctly, this is precisely what this Terms of Service project aims to do: to clarify in lay terms what can be very complex at first sight. And I think it is important, indeed, to engage with the companies to understand what they mean when something is generic, let's say, in the terms of service. So, this is just to explain, because we didn't introduce the project as such. And I'm wondering if you can also tell a little bit about how this project can be useful, particularly in Brazil now, as there is the reform of data protection law and the implementation of the Marco Civil ‑‑ if you think that the results that you have can be informative for the way forward.
On the other hand, it was also a very interesting way of understanding and mapping common practices developed by these platforms. And in this sense, especially in this first phase of the project, which was basically trying to develop, implement, and enhance the methodology to analyze the terms of service, we could build an interesting map of common practices by these companies. That was already useful in the public debates around the decree that will implement the Marco Civil, and, on the other hand, in the debates around the draft bill for data protection in Brazil. It is important to point out, just for context, for the ones that don't know, that Brazil has norms regulating the right to data protection, but it lacks a comprehensive law to deal with the processing of personal data. We have three bills pending or being discussed in different chambers of the Congress right now.
And these results were already useful to give inputs to the Ministry of Justice in these public debates. But something that I would like to leave open for further discussion is that Brazilian consumer protection law is already very strong in determining that abusive clauses of any contract will be declared null and void. However, without a data protection authority, it is very difficult to implement this in the online environment.
And considering the great number of services that we use, and the fact that we are regulated by terms of service even when visiting a website, and not just platforms, how to enforce these types of clauses is still a challenge that we have to face internationally. And I hope ‑‑ and we hope ‑‑ that this type of project can give input to this type of discussion.
>> MODERATOR: Thank you. So, staying within the Brazilian jurisdiction: you mentioned consumer protection law. But there is also another way in which terms of service ‑‑ in particular, the model terms of service that we can identify based on the data that we have from this project ‑‑ can be shaped. Model contractual clauses that serve to protect consumers can be incorporated into the law. This is a particular procedure that takes place in Brazil, whereby prosecutors can basically come to an agreement with companies with regard to the way in which they are going to interact with users.
And this is called a conduct adjustment term, so I would like someone to tell us a little bit more about how this works.
>> Good afternoon. As Luca said, I am a federal circuit prosecutor in Rio, and I'd like to start off by apologizing for my English. I've been practicing, but I'm not as fluent as I'd like to be. Anyway, I will do my best. (Chuckling.) The Federal Prosecution Service is a prosecution authority responsible for filing civil class actions aiming at the protection of civil and social rights, including the environment. The human rights protected by the Federal Prosecution Service on the internet concern crimes against children and hate speech crimes.
That is because Brazil has signed the United Nations Convention on the Rights of the Child and the Convention on the Elimination of All Forms of Discrimination. The Federal Prosecution Service created specialized groups to investigate and prosecute cyber crime in 2003, and in 2006 in Rio, due to increased hate speech crimes and child pornography. Those groups carry out investigation and prosecution of criminal cases, operations, judicial cooperation cases, the signing of compliance term agreements, recommendations and conduct adjustment terms, enforcement, and preventive actions such as workshops in schools about the safe and ethical use of the internet.
We will do a lecture about these workshops tomorrow. So, the terms of commitment or cooperation are agreements to address the law's shortcomings and seek the internet companies' assistance and voluntary collaboration, without any prescribed sanctions. The conduct adjustment term serves to end or prevent a public civil suit where there is a violation by the online provider regarding any law or right which can be settled with the TAC's compliance.
There are sanctions in case of breach of its terms. The TAC is equivalent to an extrajudicial enforcement order. On the other hand, we can say the role of online access and content providers in combating cyber crime consists of the preservation of crime evidence, as well as user identification. They must create filters in order to prevent the uploading of illicit material, and also disclose the hotline link on all their potentially dangerous services: blogs, online communities.
As for the operation integration compliance term between the Federal Prosecution Service and content providers, its purpose was the prevention and prosecution of child pornography. In fact, its main goals were prevention campaigns for users, notification to the Federal Prosecutor's Office about any occurrence, and the preservation and storage of data for secrecy removal. We can also condense a brief history of the terms of commitment and the conduct adjustment terms in Brazil.
In 2005, the first operation integration compliance term was signed between the Federal Prosecution Service and the Brazilian content providers located there. In 2008, a conduct adjustment term was signed between Google and the Federal Prosecution Service in São Paulo. We had a civil class action against Google, and we finished it with this conduct adjustment term. Also in 2008, a cooperation term was completed between the Federal Prosecution Service there and the internet access providers.
Also in 2009, the Federal Prosecution Service in Rio and the Brazilian content providers in Rio signed terms ‑‑ the same that São Paulo did in 2005. In 2014, we tried to sign a conduct adjustment term between Secret and the Federal Prosecution Service, which I will explain about shortly.
>> MODERATOR: I was going to go there. This is the famous case that took place in Brazil last year, whereby Secret was required to change its terms of service because of the provision in the Constitution of Brazil which forbids anonymity. There was an issue of changing the terms of service only for Brazil, and the company was American. So, how did this situation develop, how did it end up? If you could tell briefly what the story was.
>> PANELIST: Okay. Okay. The North American app Secret, which was used in closed groups and was supposedly anonymous, marketed by Apple for iPhones and Google for smartphones, quickly spread in Brazil in 2014, especially in the school environment. The company did not have representation in Brazil. And the app's diffusion was so intense, with the possibility of allowing people within the group not to be identified, that soon crimes emerged, such as cyberbullying among adolescents.
As well as the exchange of child pornography material. The investigating authorities, law enforcement, had no way to identify criminals due to the lack of contact with the company. It turned out that a public prosecutor in a Brazilian state obtained a court decision with national reach prohibiting the marketing of the app by Apple and Google. Although Apple complied with the decision, Google did not.
And so it continued to be used by students until then. In Rio, a police commissioner consulted the Federal Prosecution Service asking for support with this issue. Therefore, the federal prosecutor in Rio, Anna, managed to obtain the company's address in the United States and notify them in Portuguese to present themselves before her. The company indicated lawyers for the task, who were very pleasant in the contact. So, they began to outline the terms of the conduct adjustment, which provided for the immediate creation of an address to exchange information with the law enforcement authorities, and a link in the app explaining that the service would preserve the user's identity as long as he was not committing a crime, in which case the user's identification would be disclosed to the authorities, among other things.
However, during the time Apple was not marketing the app, while awaiting the completion of the conduct adjustment term to re‑offer it in its App Store, Secret fell into disuse. So the company chose to leave the country without signing the TAC. Not long ago, I learned that the company was closed in the United States. As a lesson to the Federal Prosecution Service, we shall conclude that companies, before launching their services in a country, must know its law and how its institutions work.
Certainly, the Secret company was completely unaware of the Federal Prosecution Service and of the protection of the public social interest in Brazil ‑‑ this social interest, which reaches an undetermined number of people, can easily be an issue ‑‑ as well as of the reach of judicial decisions in the country. In the case of the Secret app, which was so successful in Brazil, a country of complete (?), it was surely a feat, including from the consumer's point of view, as consumers would not use such a popular app like Secret anymore.
>> MODERATOR: Thanks for this concrete example. And I think now we could switch geographical perspective and go to Europe ‑‑ the European Union, more precisely. And we should consider that both the terms of service adopted by online platforms and those adopted by internet service providers can be shaped by regulation; indeed, they have to comply with regulation. So, we know that the European Parliament is considering some platform regulation, and I would like to ask Mattias to provide us some information on what could be the effect of such platform regulation on terms of service.
>> MATTIAS: Hi, yeah. I will have to disappoint you all and say that at this point, that would be speculation. Fortunately, I don't mind speculating. So, the European Commission launched a strategy in May this year which outlines their ambitions for the next years. One of their key priorities is platforms. And they have a growing concern about the market dominance of some platforms, and how to manage that.
And therefore, they are moving towards legislation. And this is where we currently are: they launched an online consultation on the regulatory environment for platforms, online intermediaries, data and cloud computing, and the collaborative economy ‑‑ that would be the short name. It's an open consultation. You can all find it online on the Commission web page, and you can also submit comments. And the insightful comments from Michael should probably be inserted into that consultation ‑‑ especially any data you have would be very useful.
One thing in particular to take into account when you look at the consultation is that we've mainly been talking about platform-to-end-user interaction in terms of service, but the Commission is equally interested ‑‑ or maybe more interested ‑‑ in business-to-business terms of service, as these platforms become ecosystems where a lot of other companies have their products. And a quick change in terms of service can be devastating for a growing business if the model they are working with no longer functions.
So, the consultation is ongoing; it ends in December. And if you ask the Commission, they will say that until the consultation is over, they don't know where they're going. And after the consultation, they will start formal discussions with European stakeholders. And then at some point next year, they will launch, hopefully, a proposal. That will take two years of the European legislative process, and then we might have either a regulation or a directive, the latter of which then has to be transposed into national law.
Other than that, it's very difficult to tell at this point. I would encourage everyone who cares about the topic ‑‑ which I think you all do ‑‑ to go to the Commission website and look at the consultation. It's super easy to put your input there. If you don't think the questions are good ‑‑ which you may well not ‑‑ then you can easily submit a position paper as well; you don't have to be confined to the questions. If I should criticize the questions: the first question is, this is our definition of platform, is it good or bad, do you agree with it?
And the problem is, of course, that the definition of platform would equally apply to a huge platform like Google or Facebook and to a little privately run blog that allows comment threads. They would all qualify as platforms. And clearly ‑‑ and I hope, and I think ‑‑ there isn't a one‑size‑fits‑all solution for this span of entities. But the problem when you have one definition for everything is that it gets very tricky to regulate.
But at this point it's very difficult to say what will come out of it ‑‑ though we will have some regulation. And any terms of service that don't comply with national legislation will be void in those countries. It will not restrict what they can write in their terms of service, but it will restrict what will be applicable in those terms of service.
>> MODERATOR: Thank you very much. Having in mind that platforms are not the only intermediaries using terms of service, of course: just a couple of weeks ago, the European Parliament approved the new telecom single market regulation, which also includes some provisions on this. And the adoption of these provisions was, at the very beginning, stimulated by a joint investigation of the European Commission and BEREC that found that almost 50% of mobile users and 20% of European fixed internet users were affected by restrictions in their internet access.
And part of that, vis‑à‑vis the internet service providers, would have been the right to get this speed ‑‑ the actual speed, and not the possible speed, of your internet connection. That entire part was taken out during the negotiations between the institutions in the EU. And generally, they referred to the universal service directive, which is another piece of legislation that will be reviewed at some point in the near future. And they said that users' rights should not be spread out; they should all be pushed there.
So, of course, with the regulation, that is not the case; rights will still be spread out. But we will have not only platform regulation, we will also have the universal service directive review, which will give us regulation on terms of service for all services: platforms, intermediaries, and offline services.
>> MODERATOR: Thanks for providing this perspective. Now, I would like to open the floor for questions and comments. So, if you have any comments or questions to the panelists, or any observation, please feel free to speak. Do we have ‑‑ let me introduce you, Nathalia, who is our kind remote moderator.
>> NATHALIA FODITSCH: We have a question coming from Thiago. His question is, "What kind of instruments, tools, policies, are companies developing to improve their data protection commitment?"
>> MODERATOR: Please be my guest. Mattias.
>> MATTIAS: Since no one seemed to have a quick answer, I will allow myself to speculate again. None, at the moment. Companies rarely do anything unless compelled. Currently, there is another legislative ‑‑ well, a regulation in Europe on data protection, and that is almost finalized. And since Europe is a rather big market, I'm pretty sure that most companies are waiting to see what comes out of that package before they develop any tools.
But this will be, hopefully, finalized before the end of the year. And then quickly thereafter, I think we will see companies hurrying to try to comply with it. So, right now, very little. In the next year, I think they will start to try to develop tools to ensure that they comply with the European standards on data protection.
>> MODERATOR: Okay, anyone else want to react? Maybe Jeremy?
>> PANELIST: Sure. I think that one of the things you see some companies developing in terms of data protection is giving some customizability of privacy settings to their end users. So, for example, users sometimes now have an option to say they do not want to see targeted advertising, or such things. The fact of the matter is that oftentimes, when any such customization of your privacy settings is possible, it is really rather limited. Oftentimes, there's no ability to, for example, opt out of data collection by the company. They say you won't see targeted advertising anymore, but you can't object to the data collection itself in the first place, or to any sharing that goes beyond sharing with advertisers and perhaps reaches other parties.
Some small steps are being made, but overall, users should have much greater control, as is sometimes established in law, to have access to their own information as the company has it, and be able to edit that information and modify it, withdraw it, etc.
>> PANELIST: I agree that we haven't seen a lot from the companies yet, but I do also think there is mounting pressure, particularly for the U.S.‑based platforms, to bring their data protection standards up to scratch. Particularly the rejection of the U.S. Safe Harbor agreement is going to bring increasing pressure to bear, depending on how that ends up being resolved. And also the general data protection regulation that was referred to, which will certainly have an impact, and we don't even know yet to what extent. But I think the writing is on the wall for the practices of the U.S.‑based platforms, and they're going to have to bring themselves up to a higher standard.
>> PANELIST: I'd like to share the global observation on this, because if we go beyond Europe and the U.S., and we see the internet companies ‑‑
>> MODERATOR: I'm sorry, can you speak closer to the mic?
>> PANELIST: I think that if you go beyond Europe and the U.S., you will see that intermediaries in the developing world operate in a much worse legal environment, because there is no data protection law, no privacy law, not even a free expression regime in place. So there's really a very big gap in this respect. That's another thing I want to draw our attention to: compared to those big global companies, Google, Facebook, there are many small, local, and regional companies. And they are really, you know, in a very weak position to practice effective self‑regulation, not to mention to comply with standards in any circumstance.
So I think that maybe with more research ‑‑ I don't know if the foundation's research covers this ‑‑ personally, I'd look at the human rights assessment of the 50 platforms' terms of service. I don't know how many of them are from the developing world. And if that shows a gap, it's huge. And that would be one aspect to look into further.
Most of them track users on other websites and allow third‑party tracking, meaning that they place ads or other types of mechanisms that allow companies to track users' activity on their websites. All of that is covered by the terms of service, meaning that they ask consent to do all of this. Most of them retain data for longer than necessary for their operation, or are unclear about how long they will keep that data. And most of them share user data for different reasons ‑‑ commercial reasons, processing, or other reasons.
All of that, by default. And I think that's something that could be further discussed: how users could have more choice on that. I mean, nowadays, they ask for consent to do everything by default, and if you want to opt out, some of them allow users to opt out. Some say nothing about that in their terms of service, or give unclear instructions. All that to say: with regard to what the binding documents say, this is the situation that we find.
And it's not very positive for data protection and the protection of privacy of users. But on the other hand, we didn't analyze the implementation of this. And we understand that there are several companies doing improvements in privacy options, privacy settings, and giving more options to the user. The only thing is that ‑‑ we find concerning is that they are not committed to give these type of options on their binding documents.
>> MODERATOR: And also, another element that I think is worth considering, from the Terms of Service and Human Rights project I was pointing out in the workshop we had this morning on civil rights, is that around 20% ‑‑ so one‑fifth of the platforms we analyzed ‑‑ contain in the contractual agreement a waiver of any class action. Meaning that, by accepting the contract that no one reads, one‑fifth of the platforms impose on users a waiver of their right to bring a class action if the contract provisions are not respected.
So, that is actually a very interesting element. And I think maybe Allon has a comment on this.
>> ALLON BAR: Thank you. So, one of the things we were also looking at is, indeed, what users can do to file a grievance when they feel their right to free expression has been violated by a company's conduct. If you step back a little bit and consider that companies are using these terms of service and policies to implement all sorts of measures that basically go beyond what they are required to do by law ‑‑ for example, imposing further restrictions, or collecting certain data, etc. ‑‑ then it seems sensible that these platforms should also provide some way, some access to remedy for their users.
And this is one of the key pillars of the United Nations protect, respect, and remedy framework that tries to outline what companies should be doing to respect the human rights of their users. And so that access to remedy is really important, that users have some way to protest, to complain, to seek remedy, find redress for what a company is doing. And when we analyzed how companies ‑‑ the 16 companies in our research ‑‑ and by the way, I forgot to mention, you can find all the data. All the results are published on the website rankingdigitalrights.org.
When we looked at how companies performed, I don't think any company scored higher than 50%. The two highest‑scoring companies were in India and South Korea, which, by law, have some obligation to provide some access to remedy. For example, in India there is a law that requires companies to have a privacy officer in place, so that if a user wants to complain ‑‑ if you have a concern about how your privacy is being treated by a company ‑‑ you can find some channel, some person to contact.
But across the board, companies are not doing enough to offer even such a path to remedy, let alone explain the process, what they will do when they receive a complaint, give insight into how they've dealt with such complaints, etc. And this touches on what Luca was mentioning: if companies are basically closing down avenues by ruling out class actions you could take to court, then we want to ensure that individuals will not only be able to continue to seek access to the courts if they need to, but also have a much easier and quicker way to find remedy with the company itself.
>> PANELIST: I fully agree with Allon. And this also reminds me of a current trend that I see, a very worrying discussion nowadays in internet governance. When people talk about the negative aspects of the internet, like hate speech, radicalization, and child protection, many people will suggest a very hard counterstrategy to tackle them. And this would risk compelling the companies to take the easy route of censoring and removing content on their own, without a process, without appeal, without remedies.
And so, I think this should really be addressed by the terms of service as well, because this kind of counterstrategy can never serve as a full solution; it's at best a half solution. And we need more process, remedy, and appeal to be in place to eventually solve the problem.
>> MODERATOR: Thank you. Given we still have some time ‑‑ we lost two panelists, by the way, because we changed rooms, so we have more time than we expected for questions. If there is nothing burning ‑‑ yes, there is a question. Otherwise, I have one that is a bit of a provocation. Maybe you can bear with me ‑‑
>> MODERATOR: You can wait for our audience.
>> AUDIENCE: First of all, my name is Micah, and I'm an Internet Society (ISOC) Ambassador. I don't have a question; I wanted to raise two resources I recently came across that are related to terms of service. One of them is called ‑‑ forgive me ‑‑ ToS;DR. It reads terms of service and boils them down into digestible summaries. I'm not sure if you're familiar with that. Another one I recently came across is called "iTunes Terms and Conditions: The Graphic Novel." Basically, a Tumblr artist went through and made a 47‑page graphic novel out of the iTunes terms and conditions. So, no question, just wanted to bring those to your attention, in case you're interested.
>> PANELIST: The ToS;DR site is a useful resource. There's an associated one, ToSBack, which is not maintained at the moment ‑‑ I feel bad about that; it's partly an EFF project ‑‑ that basically tracks changes in terms of service, so you can be notified whenever a terms of service document is changed. Because even though under some consumer laws the company is required to notify you of changes to the terms of service, under other consumer laws, such as in the United States, they're not necessarily required to do so.
They're entitled to have a condition in there that says you have to come back to read the terms of service and to find any changes for yourself, which, of course, is ridiculous. So, tosback.org is a useful resource, and I think there is a beta version, too, which you can find, I think, at tosback.eff.org, and if that doesn't work, you can come see me later and I'll direct you to it.
>> PANELIST: With regards to informing users of changes of terms of service, this is something we were looking at. And these are things we often encounter in doing research on terms of service. The company says we will inform you on the terms of service site if we make any changes. Well, if almost no one reads terms of service in the first place when they have to consent, who's going to come back to the website to identify what changes have been made, etc.?
This is something we are often asked. So, if you look at the results of our research, of the index, and you try to identify what companies can easily improve on, this is one of those things. There's no way ‑‑ there's no reason, there's no strong argument for companies holding back on simply informing users of the changes, and also on providing an archive of previous terms of service.
Some companies are doing a good job of this. They provide a redline archive, with line‑by‑line insight into how the terms of service have changed over time. Others simply put up the new version, and if you're lucky, you can find an older version somehow that allows you to compare, with difficulty, what changes have been made. But oftentimes, they don't provide such information. This is something that companies could very easily do.
>> JAMILA VENTURINI: Just a quick note, since I have the data with me: this study from 2008 found that in the U.S., users would need to reserve eight hours a day, for 76 days of the year, just to read the privacy policies of the roughly 1,500 pages they visit every year. And considering that every page you visit might be collecting data about you and tracking you, from 2008 to today that amount might have grown even bigger, taking even longer. So, it's very difficult to believe that users will track the changes in the terms of service. That's why this type of project is very relevant and important.
>> PANELIST: And just to build on what was said about users: they don't read the ToS, and actually, they couldn't even understand them if they did read them. In the traditional media world, there are accountability mechanisms in place. UNESCO has a mandate to promote press freedom. And we have observed that we cannot just work on the law, nationally, but we need to encourage self‑regulatory mechanisms ‑‑ independent press councils ‑‑ to be created, so that the newspapers in a country have an independent body to oversee, to monitor, to supervise the work of journalists.
The stronger this self‑regulation is ‑‑ the better it works and functions ‑‑ the easier it is to refrain from having a regulator imposed by the government. So, I think that maybe for the intermediaries, global, regional, and national accountability mechanisms should be considered and put in place, like the GNI. They can help the whole industry share some common terms of service. They have independent expert help to follow up on the changes. And that may be even better.
>> MODERATOR: Do we have any other questions or comments from the floor? From the online world? Okay. So, please.
>> AUDIENCE: I never heard that they had class action provisions. Have you ever come across any action challenging that particular class action waiver? Because that's really abusive, right?
>> MODERATOR: Her question is, if you had any example of some organization or some user challenging the condition imposing the waiver of class action.
>> PANELIST: We have had two class actions, actually, against Facebook and Yahoo, because they didn't comply with the decisions of judges in Brazil. They refused to comply with the decisions because they have only a representation in Brazil; they are located in the United States. But we didn't have any legislation about the internet until our Marco Civil, which is all about the internet. And the Marco Civil states that companies that are located in Brazil, or provide services in Brazil, have to comply with the law, the judges, and the decisions from Brazil.
>> MODERATOR: And I think now is the moment of Nico's provocation.
>> MODERATOR: Thank you. No, it was just ‑‑ in the absence of other questions, I wanted to continue along the line of the question posed by the remote participant. So, what can we do, what kind of tools are companies developing to ensure better data protection? I wanted to tell you about the fact that, recently, there are many services which promise users more control over their personal data. In particular, these are called personal data stores, where people can entrust their personal data to a company, which then offers that data, essentially, to any company that is willing to buy it.
And in exchange, the users receive a token. This could be either of monetary value, or it could be a coupon for a service of the company, for example. However, there is a problem there, because we don't know exactly what the value of our data is. We all say that personal data is the currency with which we pay for many of the services we access, but we don't actually know the conditions under which those data are going to be used and shared with third parties, how long they are retained, etc.
So, to that effect, I mean, I think it's almost like entering a contract in the real world where you are giving someone the ability to take your money, you know, or enter your house, under very vague circumstances. So, from a data protection perspective, there are, I think, five elements that are problematic. And I'm wondering, what's the feeling in the room? What is the most problematic aspect of having vague terms of service, from a data protection perspective?
So, how much data the company is collecting. Data retention, so for how long the company is collecting the data. Data aggregation, so how much it's combining those with other databases to which they have access. Then, security of the data. So, protection vis‑a‑vis third parties. So, what's the security standard the company is using. And then, further use of the data. So, how the company can change, you know, the purpose for which it's processing the data.
So, yeah. From that perspective, if you were given the power, you know, to judge as a regulator, which of these five aspects would you like to crack down on, essentially, to ensure that there is a fair transaction? Which one do you think is most important? Yeah?
>> MATTIAS: I'm happy to answer this. The collection. Minimizing collection is a way to solve a lot of problems. If you don't collect data, you can't misplace it or accidentally give it away. You can't aggregate it, and you can't sell it to third parties, with or without consent. Data minimization is the best way of stopping misuse of data. It's the same principle for companies as well: not bulking up on data just because they might need it in the future, but rather ensuring they collect only the data they really need for their business model. So it's not an impossible sell.
>> MODERATOR: I just want to point out that this is what the other side of the story says, according to companies: with big data, you cannot know how the data are going to be useful in the future. So, just to mention, there is this other side. And the challenge is, of course, to try to limit how much they can repurpose the data that they collected for one specific purpose. Yes.
>> MATTIAS: I can go outside the box and say transparency in all of these fields, rather than any single one of them. So, actual information about all of these things ‑‑ how the company handles data, what they collect, why they collect it, how long they retain it ‑‑ that is the most important thing. If we don't know that, we don't know what to criticize. If we know exactly how they handle the data, then we can also discuss how we want them to handle the data. One more thing: on the security of the data, liability for the data controller if there's a data breach would be useful.
And a requirement to report any data breach ‑‑ which is currently quite weak in most countries ‑‑ would be an excellent first step. If there is liability when they misplace your data or have a security breach, then companies will be less inclined to collect data they don't think they need, because it's a risk. But none of this will be achieved by one method alone. We can't just legislate; it will never be enough. Legislation will be watered down by corporate interests.
There is no place in the world where laws are made solely for the interests of the individuals. So, law can only take us that far. Then we need best practices between companies. We need self‑regulatory measures that are an even higher standard than the law, and we need public debate about a topic that constantly puts pressure on companies, so they realize they gain goodwill and consumers by having a sound approach to data.
>> Are you equally confident about self‑regulatory measures?
>> PANELIST: I didn't say it was enough, it's one part that's necessary. It's not enough. We need a high standard set by regulation, and self‑regulatory measures on top of that. A basic thing is, stop talking about personal data as property. It's a contractual arrangement where you allow companies to interact with some of your personal data. It's always yours. It's never theirs. Don't see it as property. See it as a contractual arrangement. That's a good frame of mind to discuss personal data when it comes to companies.
>> PANELIST: I definitely agree companies should do a lot more to offer transparency on these things. There's no company that I've seen that gives you any sort of insight, in the form of a profile or dossier, into what the company holds about you ‑‑ so that we have a clear understanding, with practical examples, of what a company holds about you. That should be offered to individuals, to consumers, so that they can then make their choice: okay, this is what the company holds about me, this is what they do with it.
And on that basis, they can make a choice about whether they want to interact with a company or not. But, to piggyback on the question, let me put this to the audience we have here. I assume many of you are interested in privacy, perhaps concerned about your privacy, and likely want to make sure your privacy is respected. What worries you about your data? When a company collects, shares, or keeps your data, what is it that troubles you or that you are worried about?
>> AUDIENCE: Hi, I'm from Brazil. And I think there is one very important point about all of this ‑‑ well, actually, why are we discussing all these things that we've been discussing here? One very important point is: are the end users ready to understand those terms, right? If people don't understand how the internet works, how can they decide? And I think this is one of the main things that I see, because I don't come from a technical background.
And not coming from a technical background, going to lots of discussions about cybersecurity and privacy, what I see is a strong tendency to try to decide what people should or should not have. Why should people be told that, right? And also, how can they speak up if they don't understand the basics, right? For example, when you talk about cars: think about 30 years ago, when people didn't wear seatbelts, because they didn't think it was important. They didn't see the risk in that.
And with awareness campaigns, people started to realize: hey, wait a second, we should be doing this. But they understood the point. And I think that this is the point that we are lacking, you know. The internet developed very fast towards usability, but not towards security. So now we have everybody using the internet and not understanding how it works, so they cannot decide what is safe, what they want or not. For example, I am one of the guys that uses Facebook.
And I know what they do with my data, but I have contacts all over the world, and this is the only way I can keep in touch with them. But I chose that. That was my choice. And I decide what I put. And I know that what I put there is going to be used by them, but most people don't. And I think if all of a sudden we start changing the rules of the game, how many people would say, no, wait a second, I'm not going to use that anymore? It's just convenient.
But that's the point. People need to understand ‑‑ they need to understand how their information travels along the whole path through the internet, how the data is being used. And I think this is the main principle of the discussion.
>> MODERATOR: We have a comment?
>> AUDIENCE: Thank you. As a segue to my point, I would use what the previous speaker talked about: wearing seatbelts. I still wear a seatbelt largely because the law mandates it ‑‑ the world works like that. Most people I know, when they see the highway and suspect police may be there, then they put on the seatbelt. The question I'm trying to come to is that terms of service are issues of private contract. In a world where governance is increasingly based on private contracts, the relationship between the social contract with governments and private contract‑based governance is shifting: a lot of areas which were earlier based on social contracts are now being taken over by private contracts.
And we have to reclaim that public space. And I see a lot of people talking about law, etc., which supports better data rights. But we should also realize that Brazil, at least, is a large country and has some leeway to even talk about it. In 80% of countries, the new economic structures are so strong that it's either take it or leave it, and they simply have no choice at all. Brazil has limited choice, India has limited choice. What I'm trying to come to is that this is a global phenomenon, and we need to start talking about norms: what is ownership of data, what is the economic value of data, versus data as a human right?
What are those balances? And these kind of normative discussions and principles should take place at a global level. It's not taking place. Right now there is a world summit review taking place in New York. I was in New York last month. You sit in that room, it looks like the world is going on very, very well. And there are issues to be talked about. We tried to at least put words like data governance and platform governance into the text, and people say, what is this? Let's not talk about this.
So, if we produce a space globally where these norms can be set ‑‑ the way WHO talks about health norms and UNESCO talks about education norms ‑‑ instead of shrinking away from creating those spaces, we will make a lot of progress. This is even more true of the 80% of countries which are not economically powerful. They're given an option, either take it or leave it, and that's how the system works.
>> MODERATOR: I think someone had a reaction.
>> PANELIST: Yes, a very quick comment. I do think that we need a strong self‑regulatory framework, and we do need global, regional, and national accountability mechanisms in place to help users defend their rights. But, on the other hand, particularly in the developing world, citizens are not aware of their rights, and do not understand the technology or the implications a service can bring to their daily life. That's why so many people, including children, just give up their privacy without thinking about it.
So we are promoting media information literacy, digital literacy to help everybody to understand how internet media works, how it can impact their civil liberties. This would be a very crucial issue.
>> MODERATOR: I think someone else had a reaction on this?
>> JAMILA VENTURINI: Just a reaction and a provocation ‑‑ I don't disagree with anything that was said here. Just, first, to try to think about these things: is this really new? Is the fact that people do not understand the type of contract they are entering into really new? Is it something from the online world, or is there just a multiplication of these situations? Because people do not understand technology, and do not understand how other services are offered to them, either.
And regulation ‑‑ consumer regulation ‑‑ somehow tries to address that. So, it's not as if terms of service that just give people more information on how these things work will solve the problem. And people's ability to absorb all this is also limited. For us, doing this analysis of 50 platforms, it was super difficult to understand all the technical challenges and the technical issues that are behind collecting data, tracking, and placing an ad.
Because none of these things are explicit in the terms of service. They are not saying, "we place this ad and we will track you"; they give that information in other words. So I think there is a mixture of things. And the other problem that I would like to point out is: if we change from this by‑default model to an opt‑in model, what options, in the end, will users have not to opt in, if the offer of some types of services is so limited and the trend is for this market to be concentrated?
So, it's a take it or leave it model. And if we change from the default to opting in, it will continue to be a take it or leave it model. So, I think all these discussions are really connected to several other discussions that we are having. I mean, there are some initiatives talking about the internet, and some other things that were discussed in the previous session of the dynamic coalition on platform responsibility. I think these should all be taken in a more general perspective. And just to conclude, I really agree with Mattias' comments on how all these layers can complement each other. And there is also the technical layer, which should also be taken into account.
>> MODERATOR: Thank you very much, Jamila. I think, also, we have seen from a comment that there is a need for this kind of forum to discuss how to approach, from a governance perspective, especially the use of data. I mean, to date, we don't have such a forum. And the dynamic coalition on platform responsibility, perhaps, is one initiative that goes in that direction. It would be nice, you know, if you are interested in these issues, if you could join the mailing list. That's only the first step.
We heard in the first session we had this afternoon that the U.N. Special Rapporteur on the right to privacy and the Council of Europe are also interested in having an ongoing discussion, and they possibly might co‑organize some event over the course of the year. And that's probably when we would have more results from the Terms of Service and Human Rights project as well. So, we would be happy to share that on the list. But on a more positive note, I think there are many possibilities in existing laws that we can look at.
So, both the European regulatory framework and the Brazilian regulatory framework offer the possibility to create codes of conduct, and regulators have a role in promoting the codes of conduct that they recognize as valuable in this respect. And so, maybe in the future, some technical solution could automatically recognize the companies that comply with these codes of conduct, giving users the ability to see immediately, maybe through a browser plug‑in, which companies are the good ones.
That's a good way to go. Then there is much work to be done on how to make terms of service readable and easily understandable. But I think we can also, to that extent, have someone with the expertise to do this, and adopt a technical solution that makes it visually apparent to consumers every time they access a website. So, yeah. Thank you for all the input. I think it has been a valuable discussion, and hopefully we can continue over the course of the next year, leading to the next IGF. Thank you very much.