The following report is a response to the call for comments, issued by the European Commission's Directorates-General for the Information Society and for Justice and Home Affairs, on a proposed data retention regime across Europe. We welcome the Commission's intervention in a process that has to date been led by some Member States and conducted in a closed manner. We hope that the Commission will re-invigorate this debate and provide some deliberation on this pressing issue before it is too late.
In this response we argue that any regime for the indiscriminate retention of personal data is hazardous. At a time when technologies are becoming more invasive and laws increasingly fail to protect individual rights, the European Union should be fulfilling its role of upholding the rights of individuals. Data retention is an invasive and illegal practice with illusory benefits. And to date, the paths to data retention in Europe have involved illegitimate policy processes.
The retention of personal data resulting from communications, or of traffic data, is necessarily an invasive act. With the progress of technology, this data is well beyond being simple logs of who we've called and when we called them. Traffic data can now be used to create a map of human associations and, more importantly, a map of human activity and intention. It is beyond our understanding why the EU Presidency and select EU Member States insist on increasing the surveillance of traffic data even as this data becomes ever more sensitive, and even as regard for civil liberties declines.
Claims that the retention of this information is necessary for investigations are not entirely accurate. Security gained from retention may be illusory. Traffic data attributed to one individual may in fact reflect activity by another person, or by a process unrelated to that user's activities. Linking an individual to a set of actions by checking logs is tenuous at best. In an investigation of 'who' visited a website with controversial content, even if the logs are well maintained, there is little certainty that they will lead back to the individuals being sought. We may instead attribute actions and intentions to innocent individuals, which is itself an invasion of privacy.
Retention is illegal. Article 8 of the European Convention on Human Rights protects the right to a private life. The indiscriminate collection of traffic data offends a core principle of the rule of law: that citizens should have notice of the circumstances in which the State may conduct surveillance, so that they can regulate their behaviour to avoid unwanted intrusions. Retention would be so extensive as to be out of all proportion to the law enforcement objectives served. Under the case law of the European Court of Human Rights, such a disproportionate interference in the private lives of individuals cannot be said to be necessary in a democratic society.
The process through which EU Member States are attempting to establish retention is illegitimate. Some believe that the mandate for retention was established through Directive 2002/58/EC on Privacy and Electronic Communications. In fact this Directive was passed under problematic circumstances, as were many of the national laws on data retention. Many of these laws were passed in response to terrorism, only for legislators to be shocked to discover that data retention has little to do with investigating terrorism and is more commonly used for ordinary investigations and surveillance. We agree that the policy situation across the EU is at present quite fragmented, with some countries requiring retention and others not. This is because some countries concealed the policy under the guise of action against terrorism. Others pursued it under veils of silence and ignorance of its ramifications. Some Parliaments have actively rejected the policy. If the Commission calls for the adoption of data retention out of concern for the single market and harmonisation, it will be rewarding the undemocratic actions of some Member States whilst ignoring the democratic responses of others.
The process through which the EU is insisting on ensuring that all countries weaken privacy protections to support generalised surveillance is contrary to the principles of an open society.
About Privacy International. Privacy International (PI) is a human rights group formed in 1990 as a watchdog on surveillance and privacy invasions by governments and corporations. PI is based in London, England, and has an office in Washington, D.C. PI has conducted campaigns and research throughout the world on issues ranging from wiretapping and national security, to ID cards, video surveillance, data matching, police information systems, medical privacy, and freedom of information and expression. We have worked on many occasions with the European Commission and the European Parliament on issues including travel surveillance, privacy and internet and electronic services, and trans-border data flows. For more information, please see http://www.privacyinternational.org/.
About EDRi. European Digital Rights was founded in June 2002, and currently has as members 15 privacy and civil rights organisations from 11 different countries in Europe. Members of European Digital Rights have joined forces to defend civil rights in the information society. To date, EDRi has worked on policy areas including data retention, spam, telecommunications interception, copyright and fair use restrictions, the cyber-crime treaty, rating, filtering and blocking of internet content and notice-and-takedown procedures of websites. European Digital Rights has an active interest in developments regarding these subjects in the EU accession countries.
In August 2000 the United Kingdom law enforcement agencies wrote a report on the retention of communications data calling for the retention of traffic data for seven years. At the same time the Group of 8 industrialised countries was meeting with industry discussing a similar set of proposals. At one such meeting the Italian government proposed a central European repository for all such traffic data for the purpose of investigating cyber-crime. Even in the U.S. in the late-1990s, law enforcement agencies called for data retention. We are not surprised that law enforcement agencies propose such measures.
We are surprised and alarmed by those leaders in the executive and legislative arms of government who support these initiatives, sometimes with unquestioning vigour. In the late 1990s when the policy was first considered in the U.S., the Congress resisted it; in August 2000, leaders and legislators did not follow through on the recommendations of UK law enforcement agencies due in part to adverse public reaction; the G8 ceased to pursue data retention due in part to industry resistance. For any number of reasons, many have since abdicated their role to scrutinize and question demands. And in some cases, when resistance to the policy has arisen, other political strategies are sought to circumvent opposition and deliberation. If the European Commission supports data retention policy, it is rewarding this form of governance. If the Commission calls for harmonisation of data retention to support the single market, it is penalising us all.
The indiscriminate retention of traffic data gives rise to a number of challenges. Technologically, the practice of retention is invasive, as it involves indiscriminately collecting and retaining information of a highly personal nature. This is no longer just a log of telephone calls made and received; it is a register of everything that is read, received, and sought, across places, over time, and with varying people, all to be held for some unforeseen later analysis. This information can be used to interpret and map human relationships, to understand and extrapolate human intention, and to track every movement of an individual throughout her daily life. We will discuss this in section 1.
It is often claimed that traffic data retention will combat terrorism. Officials argue with fervour that retention is a key technique in the struggle for global security. The perceived security gains may be illusory, however, as retention introduces many additional risks. Innocent individuals will be surveilled, with intimate details of their lives available to any and all agencies of government. Linking specific traffic to a specific individual is a process fraught with error. Technologically, the linkage and storage of vast amounts of information is also difficult to achieve and costly to maintain. As others have said, and are likely to say during this consultation process, retention is a costly and burdensome practice for communications service providers. Taken together, these factors will inevitably have knock-on effects on consumer trust and citizens' privacy. This will be discussed in section 2.
In section 3 we will argue that data retention is illegal. Indiscriminate retention of personal information for the purpose of investigating, preventing, and detecting crime is in contravention of Article 8 of the European Convention on Human Rights. Finally, we contend in section 4 that the policy process pursued by Member States, and currently being pursued in various European policy fora, is illegitimate.
Policies on data retention regularly conceal how sensitive this data is. It is often assumed that this is merely logging of telephone calls made and received. With the change in technologies, 'traffic data' ends up being a remarkable source of information, peering into the deepest details of an individual's personal life. According to the Article 29 Working Party,
A feature of telecommunications networks and of the Internet in particular is their potential to generate a huge quantity of transactional data (the data generated in order to ensure the correct connections). The possibilities for interactive use of the networks (a defining characteristic of many Internet services) increases the amount of transactional data yet further. When consulting an on-line newspaper, the user 'interacts' by choosing the pages he wishes to read. These choices create a 'click stream' of transactional data. By contrast more traditional news and information services are consumed much more passively (television for example), with interactivity being limited to the off-line world of newspaper shops and libraries. Although transactional data may in some jurisdictions receive a degree of protection under rules protecting the confidentiality of correspondence, the massive growth in the amount of such data is nevertheless a cause of legitimate concern. [FN_A29WP]
This is often ignored, however.
In the consultation document, the Commission states that
"law enforcement authorities are concerned that some data may not always be stored by all electronic communications operators to the same extent they were in recent years."
The Commission is stating that changes in technologies and markets are affecting law enforcement agencies' abilities to access information. In the final words of the consultation document, the Commission makes a further note regarding changes in technology.
"As much as possible, responses should not only look at current, well established technologies but should also take into account technology developments in e.g. VoIP, broadband."
The Commission is noting the changes in technologies that have reduced the ability of law enforcement agencies to access data that they previously had access to, but at the same time the Commission is calling for new powers to be applied to all new possible technologies.
In turn, Article 2(2) of the Framework Decision defines traffic data expansively as data necessary to trace and identify the source of a telecommunication, its routing, destination, time and date, the communication device, and the location at the start and throughout the duration of the communication. This is a vast amount of information, well beyond what was available previously in 'recent years'.
What the Commission needs to recognise is that the changes in markets and technologies have also changed the types of data that are qualified as 'traffic data'. Current traffic data is substantially different from telephone traffic data of old. Traffic data now may disclose intimate details of the lives, choices, and preferences of individuals.
In the days of plain old telephone systems (POTS), traffic data was simple: numbers called, calling numbers, etc. This data was not considered overly sensitive or invasive of the private life of the individual, and therefore attracted only minimal constraint. Judicial warrants were not required, oversight was in fact minimal, and reporting on the use of such powers was sparse. An additional factor was that traffic data was stored by telephone companies and in turn was available for access by law enforcement agencies, while content was not: traffic data was available, legally less sensitive, and so, lawfully accessible.
The traffic data records collected by telephone companies are generally of a form similar to:
|19991003070824 178 165 0187611205 46732112106 ------001----003sth 46 4673000---0013 14 10260|
Figure 1. Telephone Traffic Data, taken from Communications of the ACM. [FN_CACM]
The most significant fields are: date and time of the start of the call, duration of the call, and the phone numbers of the caller and receiver.
Since then, however, there have been a number of advances in technologies and markets. In the telephone system alone there are now many additional services, such as three-way calling, call waiting, and call forwarding. All of these services generate more traffic data. For example, if you have your phone forward your calls to another part of the country while on vacation, this is recorded as traffic data. The market is also more diffuse: now that the era of state-run monopolies is over, it is more difficult to identify which telephone service provider an individual subscriber is using.
The greatest changes can be seen in digital communications, such as through mobile telephony, internet access over telephone lines, wireless communications, and internet transactions. The constitution of 'traffic data' differs for each of these technologies. These changes in technology make it increasingly difficult to differentiate legally between what is communications 'traffic data' and what is actual communications content. We expect this to be challenged in the future in the courts. For example, in the United Kingdom intercepted communications content is not admissible in a court of law. As traffic data may be used as evidence, difficulties may arise where intercepted content will have to be disclosed if there is to be a fair hearing as to what is 'traffic' and what is not.
As individuals connect to their Internet Service Providers (ISPs), they usually authenticate with the service provider who verifies that they are indeed customers, and then they are assigned an IP address. This is usually done using the Remote Authentication Dial-In User Service (RADIUS), where the service starts a session by assigning an IP address once confirmed as a customer, and stops the service when the user hangs up. Part of the Start and Stop RADIUS records may look like:
Fri March 19 11:30:40 2004
Fri March 19 11:31:00 2004

Figure 2. RADIUS Start and Stop records (start and stop timestamps shown; the full records also carry the user identifier, calling and called numbers, assigned IP address, session duration, and byte and packet counts). See reference [FN_CACM].
From this log we can extract a limited amount of information regarding the content of the communications transactions that took place. The user has been identified (firstname.lastname@example.org), the number of the caller (01223555111, which is a Cambridge number) and the place being called (02075551000, London), IP address assigned (220.127.116.11), the duration (131s), number of bytes and packets sent and received, type of connection, date and time. This data may be collected for billing purposes, but only in situations where individuals are billed for such access.
Though this information seems unproblematic, it constitutes more than just telephone 'traffic data'. Such traffic data over time identifies the change in location of a user. As users roam globally, calling from different telephone numbers, the user identification remains static. In this sense, the collected traffic data is more sensitive than traditional telephone data: where telephone traffic data pivots around a given telephone number, RADIUS data pivots around a user ID regardless of location, and thus discloses shifts in location. The ISP knows everywhere its customers connect from, information that may be useful to other parties, particularly in the case of business travellers.
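To make the pivot concrete, here is a minimal sketch, not taken from any real provider's systems: the record layout, user names, and the second dialling number are invented for illustration (the Cambridge and London numbers echo those mentioned above).

```python
from collections import defaultdict

# Invented sample records: (user ID, dialling-in number, session start).
sessions = [
    ("alice", "01223555111", "2004-03-19T11:30"),  # a Cambridge number
    ("bob",   "01315550199", "2004-03-19T12:00"),
    ("alice", "02075551000", "2004-03-20T09:05"),  # a London number
]

def location_trail(records):
    """Group retained sessions by the static user ID, reconstructing
    from which number (and so roughly where) each subscriber connected."""
    trails = defaultdict(list)
    for user, caller_id, start in sorted(records, key=lambda r: r[2]):
        trails[user].append((start, caller_id))
    return dict(trails)

trails = location_trail(sessions)
# alice's trail pivots on her user ID, not on any single telephone line,
# and so records her move from a Cambridge number to a London one.
```

Even this toy version shows why such data is more sensitive than per-line telephone records: the subscriber, not the line, becomes the unit of surveillance.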
Some service providers do not require authentication for billing purposes, however. It is now common for wireless networks to provide open access to any user in an area. It is also possible to charge individuals for this access, as some companies allow individuals to roam in cities and even countries, connecting to wireless network hubs wherever they are. These wireless networks give rise to interesting traffic data as well.
The significant fields of a centralized association system log are: the device or user identifier, the Cell_ID of the access point, and the registration timestamps.

Figure 3. Wireless LAN data logs. See reference [FN_CACM].
Using these logs it is possible to identify moments where two individuals are alone within a cell, and whether they arrived together. Cell_IDs represent places (conference rooms, coffee shops, restaurants, airports) and the registration timestamps can reveal if two communications devices are together, and if they are moving together. Data mining can provide sufficient information to draw a map of human relationships and movements.
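The kind of mining described above can be sketched in a few lines. The log layout, device names, cell names, and times below are all hypothetical, chosen only to illustrate the technique.

```python
# Invented association records: (device ID, Cell_ID, start, end),
# with times given as minutes since some epoch for brevity.
associations = [
    ("phone-A", "cafe-12",   600, 660),
    ("phone-B", "cafe-12",   610, 655),
    ("phone-A", "airport-3", 900, 930),
]

def co_located(log, dev1, dev2):
    """Return (cell, overlap_start, overlap_end) for every interval in
    which both devices were registered to the same cell at once."""
    out = []
    for d1, c1, s1, e1 in log:
        if d1 != dev1:
            continue
        for d2, c2, s2, e2 in log:
            # same cell, and the two registration intervals overlap
            if d2 == dev2 and c2 == c1 and s1 < e2 and s2 < e1:
                out.append((c1, max(s1, s2), min(e1, e2)))
    return out

# phone-A and phone-B share cell 'cafe-12' for 45 minutes.
meetings = co_located(associations, "phone-A", "phone-B")
```

Repeated over months of retained logs, the same join yields exactly the map of human relationships and movements described above.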
Traffic data generated when authenticating with communications providers is therefore growing ever richer. The same can be said for mobile telephones: services are increasingly offered to track employees' locations and to identify the locations of family members, friends, and others. Some telephone providers have chosen not to implement such systems for consumers because they were considered too intrusive.[FN_TIMES] Yet the logs exist and are considered 'traffic data' by law.
Another form of traffic data is that which appears at the application level of interactions on the Internet. This type of data includes the names of servers to which the user tried to connect, possibly limited to IP addresses but easily resolvable to servers such as aids.helpline.org, and in some cases, unless carefully delineated by law, URLs such as http://www.usdoj.gov/ag/trainingmanual.htm may be collected through web proxies and treated as traffic data. Monitoring the DNS traffic from a home connection will reveal much of what the people inside may be doing.
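To illustrate how revealing resolver logs can be, the sketch below tallies queried hostnames from invented log lines in a dnsmasq-like style. The log format and the client IP address are assumptions for illustration, not any real provider's records; the hostnames echo those mentioned above.

```python
from collections import Counter

# Invented resolver log lines; real formats vary by software.
dns_log = [
    "Mar  5 19:31:02 query[A] aids.helpline.org from 192.0.2.10",
    "Mar  5 19:31:40 query[A] www.usdoj.gov from 192.0.2.10",
    "Mar  5 19:32:15 query[A] aids.helpline.org from 192.0.2.10",
]

def domains_queried(lines):
    """Tally the hostnames a household's resolver asked about."""
    names = Counter()
    for line in lines:
        parts = line.split()
        for i, tok in enumerate(parts):
            # the queried name follows the 'query[...]' token
            if tok.startswith("query["):
                names[parts[i + 1]] += 1
    return names

tally = domains_queried(dns_log)
# A repeated lookup of a helpline's hostname is already a sensitive
# fact about a household, before any content is seen at all.
```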
When we keep track of all the activities of a given individual while she is on-line, we are able to see every resource with which she came into contact. When this information is collected over a period of time, we are able to identify not only common habits but also 'suspicious' activity. Asking for all of an individual's traffic data for a four-week period in order to investigate a crime is the equivalent of having had an investigator track every movement of a suspect for that month, watching which bookstores she entered, what documents she looked at, what homes she visited, and who she spoke with. And consider the non-suspect individual: she must conduct her on-line affairs knowing full well that in the event that the State or some other entity takes an interest in her, all of her activities have been recorded for future analysis.
We can even go a step further and possibly see intentions. Search engine logs contain the IP address of the client, the connection time, and the object requested and its size.
18.104.22.168 - - [05/Mar/2004:19:33:05 +0000] "GET /cgi-bin/htsearch?config=htdig&words=pornography+child HTTP/1.0" 200 2225
Figure 4. Sample search engine traffic data. Adapted from [FN_CACM].
Observing the logs we can see, for example, that 22.214.171.124 has requested (in a short period of time) information about "railway+info+London" and "union+strike". We may identify not only the patterns of an individual's movements on-line; we may also interpret an individual's intentions and plans. Or, more dangerously, one could derive false intentions. A search such as "child+pornography" may be a search for studies on the effects of pornography on children, and a search for "google+playboy" may be an attempt to find an interview in Playboy about Google. Much more can be assumed with some data mining, even if IP addresses are assigned dynamically; and compounded with location data, previous traffic data, and the like, a comprehensive profile can be developed.
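The extraction involved is trivial, which is part of the danger. A sketch follows, using a log line in the style of Figure 4; the IP address here is a documentation placeholder, and the query is one of the examples discussed above.

```python
from urllib.parse import urlparse, parse_qs

# A retained log line in the style of Figure 4 (IP is a placeholder).
line = ('192.0.2.77 - - [05/Mar/2004:19:33:05 +0000] '
        '"GET /cgi-bin/htsearch?config=htdig&words=railway+info+London'
        ' HTTP/1.0" 200 2225')

def search_terms(log_line):
    """Pull the search words out of a retained web-server log entry."""
    request = log_line.split('"')[1]          # GET /path?query HTTP/1.0
    path = request.split()[1]
    query = parse_qs(urlparse(path).query)    # '+' decodes to spaces
    return query.get("words", [""])[0].split()

terms = search_terms(line)
# A few lines of parsing turn a billing-style record into a statement
# about what one household wanted to know, and arguably why.
```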
Even the Council of Europe recognized this in its (problematic) Cybercrime Convention.
"The collection of this data may, in some situations, permit the compilation of a profile of a person's interests, associates and social context. Accordingly Parties should bear such considerations in mind when establishing the appropriate safeguards and legal prerequisites for undertaking such measures."[FN_COE]
Access to this traffic data is a new power, and it is an invasive act. Requiring its retention for an extended period of time makes this data available to others through creating security weaknesses, or through other legal processes.
Yet the Framework Decision states that
It is essential to retain data existing on public communications networks, generated in consequence of a communication, hereafter referred to as data, for the prevention, investigation, detection and prosecution of crimes and criminal offences involving the use of electronic communications systems. (paragraph 5)
The Framework Decision fails to note that all human conduct in an information society makes use of electronic communications systems. This power necessarily applies to all human conduct. With the increasing emphasis on 'trusted' and ubiquitous computing, the generation of personal information can only increase. Retaining this data is a new power, an extended power, an invasive power.
Increasing the amount of information available to parties in an investigation of any type does not necessarily lead to more certainty. In fact, the gains may be illusory. Communications service providers, especially broadband providers, are dealing with immense amounts of data through their pipelines. They have no billing purpose that requires attributing each incoming and outgoing bit to an individual user, and their registration and authentication systems therefore inherently lack accuracy and reliability. This lack of accuracy may lead to in-depth investigations of the behaviour of innocent users simply because some bits are missing from the ISP's records.
For reasons that are rarely articulated, a number of governments have sought the power to order retention as though, once achieved, many problems would be resolved. It is as though wanting access to traffic data is inevitable, because it is so informative and useful in the prevention, detection, and investigation of an act.
It is fair to say that all human conduct in an information society necessarily generates traffic data. The corollary is that traffic data can be used to piece together all human conduct in an information society.
The Framework Decision states that traffic data is useful for forensic purposes, in future investigations or judicial proceedings, thus justifying why preservation of specific traffic data for a specific individual is insufficient. The Framework Decision claims that this data may be needed months or years after the original communications. We question whether the forensic value of this information is the driving concern, in contrast to the perceived intelligence value of mapping all conduct in an individual's life.
We often presume that traffic data is immediately useful, and that retention will have only positive effects on the conduct of society, civil liberties concerns aside. The illusion of benefits to security must be offset against some realities: traffic data does not link easily to individual conduct, this policy is tied to increased identification requirements, and it carries significant technological and financial ramifications.
Linking an individual to a set of actions recorded in logs is increasingly difficult. In the days of telephones, one could presume that someone would have had to break into your home to place a telephone call to a known terrorist, so it was likely that the individual who owned the home and subscribed to the telephone service made that call. It was also safe to assume that the telephone company was the monopoly provider, which recorded the numbers dialled for billing purposes.
Now the situation is different. Article 6(d) and (e) of the Framework Decision calls for the retention of data (traffic, user, and subscriber) and orders communications service providers to ensure that the data is accurate and that its integrity and confidentiality are maintained. There are many problems with this, beginning with the fact that some providers may not collect this information in the first place, owing to differing market conditions, costs, and risks. Second, there are problems in linking this data to any given individual, partly because of varying market conditions but also because of the types of services used. Third, the failures of the second point have knock-on effects that may lead to increased identification requirements, inhibiting market development and the freedom to use communications devices.
The costs of retaining data can be prohibitive, and the amount of data retained therefore differs with market structure and the form of services provided, amongst other considerations. Free ISPs may collect caller-ID information, but they lack the credit card details that other providers use to verify subscriber information. Individuals who run wireless routers share their internet connections with others in their neighbourhood; the ISP providing the fixed-line service is unlikely to know, nor will the individual keep a log of such activity. The market structure is sufficiently complex that it is hard to imagine a one-size-fits-all retention policy could ever apply, even within a single sector, e.g. ISPs, mobile telephony, or VoIP providers.
Prohibitive cost is itself sometimes a limiting factor in retention. Currently, communications service providers (CSPs) may retain data until it is no longer necessary for billing and engineering purposes. Under privacy law they must then delete or anonymize it. We understand that this is not always the case, and that some firms retained data for longer periods even before the legislative initiatives that mandated retention in some Member States. This does not mean they retain all forms of data, however, because doing so would involve massive stores of data and would incur greater costs than any return on storing it and mining it for commercial use.
To retain this data solely for law enforcement purposes is a significant cost that will be incurred by all service providers, and this burden will be shifted to the consumer. The cost is not just storage-related, however; it is also about granting access to all this data upon request. According to a representative from America On-Line (AOL), speaking to UK Parliamentarians on the idea of 1 year voluntary retention:
The norm is for IP addresses for AOL is that we would keep IP addresses for around three months. This would be something which suits our business and the security of our customers, but also law enforcement because we have been working with them for quite a long time. Adding on nine months to it is adding enormous cost to us. In our submission, we have given you some ideas of rough estimates of what we have done, which was $40 million just to set up the system and then around $14 million to run it. [...] As an example, AOL has, on average, per day 392 million sessions. We send -- not receive -- 597 million e-mails. We are just one ISP. I appreciate that we are a big one, but we are still only one ISP. ... [T]hat is about 100 CDs a day.
It makes it extremely expensive for us. Any unit cost, because you are going to multiply it by a number of days, is going to have an enormous impact on the business. [FN_ISPA]
On trying to actually access this information when the police require a disclosure, a representative from Thus PLC, another large ISP, based in the UK, responded,
If I can get slightly technical for a moment, we have this figure of 36,000 CDs. That is one years' data. You would not just store the data in raw form so you would have to search for it. You would organise it so that you could find stuff on an individual customer relatively simply. In effect, you would alphabetise it or whatever, but it takes time and effort to do that and computing power to do that that would have to be paid for. There is a trade-off here between the ease of storage and the ease of retrieval.
On the resulting costs,
I could probably justify £5 million or £6 million for my company alone. We understood that that was the amount of money that was available rather than what it would actually cost. If I could go back to your original question on this, what industry stores varies very much from company to company. As an example, e-mail transactions, like who sent them out to whom, we store for a couple of days, and it is stored on spare space on the e-mail systems because there are very very few requests to search. In fact, I cannot think that we have had any.[FN_ISPA]
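The figures quoted in this testimony can be cross-checked with simple arithmetic. The nominal 700 MB CD capacity is our assumption; the daily CD count is AOL's own estimate.

```python
# Rough cross-check of the volumes quoted in the testimony above.
cds_per_day = 100          # AOL's estimate: "about 100 CDs a day"
cd_capacity_mb = 700       # nominal CD-R capacity (our assumption)

cds_per_year = cds_per_day * 365
raw_terabytes = cds_per_year * cd_capacity_mb / 1_000_000

# ~36,500 CDs per year, consistent with Thus PLC's "36,000 CDs" figure,
# and roughly 25 TB of raw data per year, before any indexing overhead.
```

That the two independent estimates agree suggests the industry figures are not exaggerated, and the indexing needed for retrieval, as the Thus representative notes, only adds to this.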
All these resources are expended to provide information that may have very little benefit, or integrity, despite Article 6(d). We are therefore quite surprised that preamble paragraph 16 of the Framework Decision states that "Member States shall ensure that implementation of the Framework Decision involves appropriate consultation with the Industry." In the past, industry has repeatedly been clear on the costs and burdens of retention. In fact, generalised consultation has in almost all cases led to the rejection of mandatory data retention.
We are also concerned that this policy, once implemented, can easily be changed to compel service providers to collect and retain data that they otherwise do not need or use under traditional privacy law. The current proposal is worded in extremely broad terms as to what exactly should be logged. We have seen previous detailed proposals from Europol[FN_EURPL], Interpol[FN_INTPOL] and the G8[FN_G8] that would oblige ISPs to store not only access records, but also mail headers (including the subject line), surfing data, IRC behaviour (including nicknames and channels), and USENET behaviour (reading and posting).
Once we start requiring retention, it would be a simple shift in policy, long envisioned by the G8, Europol, and Interpol, to compel the collection of additional data. It is no leap of imagination that for 'standardisation purposes', just as we are considering the Framework Decision for 'harmonisation purposes', all service providers would be compelled to collect similar types of data. Most ISPs store access records for billing purposes, but there is no business incentive to store mail headers or usage of the other internet protocols mentioned, since no billing is involved. Similarly, to monitor surfing behaviour, providers that do not operate a transparent proxy would essentially have to wiretap all the communications of their users and filter out individual port 80 requests. This would amount to a complete violation of communications secrecy.
If retained data is used as evidence in a court of law, the defence will ask for and expect disclosure of the means of collection so that it can be assessed for reliability. This will place further burdens on communications service providers, and may also lead to some standardisation in the data that is collected. This process, however, is fraught with problems.
Whether the policy entails the retention of already collected data or the compelled collection of additional data, the data is not necessarily useful. Identifying an individual or a service provider based on some strand of traffic data is not always possible. We are therefore left to ask whether there is any forensic value to traffic data.
Even in existing logs, frequent errors arise, such as the mismanagement of time-zone information, which can link a subscriber account to the wrong IP allocation for a given moment.[FN_ICFLAMB] Or the data management system may itself be flawed, causing investigations to finger the wrong individuals. According to the London Internet Exchange,
it is usual to employ the "unreliable" UDP protocol for transferring the information, so packets may have been dropped on the floor in the face of congestion. This may mean that an action is ascribed to a previous user of the same NAS port.[FN_LINX]
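The time-zone pitfall described above can be sketched in a few lines. The subscriber names, lease table, and timestamps below are invented for illustration, assuming a dynamically reassigned IP address whose lease log is kept in UTC while a remote web server logs timestamps in local time with no zone marker:

```python
from datetime import datetime, timedelta, timezone

# All names, addresses and times are invented for illustration.
CET = timezone(timedelta(hours=1))

# The ISP's lease table for one dynamic IP address, kept in UTC.
leases = [
    ("alice", datetime(2004, 3, 1, 0, 0, tzinfo=timezone.utc),
              datetime(2004, 3, 1, 1, 0, tzinfo=timezone.utc)),
    ("bob",   datetime(2004, 3, 1, 1, 0, tzinfo=timezone.utc),
              datetime(2004, 3, 1, 2, 0, tzinfo=timezone.utc)),
]

def subscriber_at(moment):
    """Return whoever held the address at the given aware datetime."""
    for name, start, end in leases:
        if start <= moment < end:
            return name
    return None

# A web server log records "2004-03-01 01:30" with no zone marker.
naive = datetime(2004, 3, 1, 1, 30)

as_utc = naive.replace(tzinfo=timezone.utc)  # investigator assumes UTC
as_cet = naive.replace(tzinfo=CET)           # but the server logged CET

print(subscriber_at(as_utc))  # bob
print(subscriber_at(as_cet))  # alice (01:30 CET is 00:30 UTC)
```

A single unstated offset is enough to attribute the action to the wrong subscriber, which is precisely the forensic weakness described above.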
Where previously innocent individuals had nothing to fear, now their personal data will be analysed time and again.
Accounts are often shared, or sometimes leaked.[FN_CLAYTON] Billing systems are known to be insecure and can be used without permission. This all presumes that billing data is actually verified and that subscriber details are accurate, which is not always the case.[FN_ICFLAMB] Quite often caller data is not available, sometimes due to international calls or the use of generic telephone numbers through local switches. Open networks rarely record much information about individual users, and cash-based services are always possible, particularly if pre-paid.
If retention is intended to enable international co-operation in investigations and for use in forensics, and if this data is in turn unreliable, then one must ask whether the retention of traffic data is purely for intelligence purposes. That is, retention may help in pointing to people worthy of further attention.[FN_CLAYTON] Such evidence will not be useful in courts of law, as no one can ever fully vouch for its integrity, nor be certain that the individual suspected of an activity was indeed the individual using the mobile phone at that given moment. We therefore question the intentions lurking behind this policy of retention, and how, once established, the legal provisions for accessing this data may change or expand. Innocent people's data is thus likely to be accessed for any number of purposes.
If the intentions are not necessarily to promote retention for intelligence purposes, then it is possible that the side effect of this policy is to enforce 'reasonable' conduct. According to the London Internet Exchange, "the ability to trace actions back to their source will, in itself, discourage unreasonable behaviour." If the by-product of retention is that it discourages unreasonable behaviour because of the fear of the recording of all of our conduct within the Information Society, then it will have significant effects on the ways in which we conduct our lives. If a mobile phone company is required to record all phone transactions for three years, individuals may be less likely to use the phone for making 'private' yet completely legal calls. If all transactions with services on-line are to be recorded regularly for the purpose of ensuring traceability in case of crimes, the purpose may be to promote 'responsible' behaviour and to minimize transactions with pornographic or other 'controversial' content. This is when retention starts to interfere with our general conduct, and other civil liberties apart from privacy alone.
As soon as one wishes to identify individuals, one is reduced to deduction and inference rather than being able to be "sure".[FN_CLAYTON] To trace the individual behind the transactions will require binding the individual to the device at a given moment. Solutions along these lines could include requiring further identification, mandatory practices for recording subscriber details, disallowing the sharing of resources by end users, limiting pay-as-you-go mobile phones, requiring identification cards and CCTV coverage of cybercafes, universities, and libraries, and banning open Wi-Fi networks. All these measures have been considered and implemented in a variety of countries, most of which we would not associate with favourable human rights practices, often with the intent to curb freedom of expression and to perform mass surveillance of communications.[FN_SILENCED] Even if we put civil liberties concerns aside, the economic implications of limiting the freedom of action of private actors remain significant.
All the contentions raised above show how data retention strikes at the very notion of an open society. Despite these difficulties, these costs, and the problematic integrity of this data, policies are still being established to create legal liabilities, burdens, and obligations. A former senior official from the Dutch Ministry of Justice pointed to this when he said:
"Suppose there will be an obligation to retain all traffic data for 36 months, while an evaluation shows that only 2% of these data are being demanded for inquiries in criminal cases. Of that 2%, it turns out, only 10% proves to be really necessary as proof in the case, be it as direct evidence, or as a trace to such evidence. In that case, only 0.2% of all stored data are necessary for law enforcement. In that case, 99.8% of all these data would be stored on behalf of the useful 0.2%. Let us, for the sake of this example, continue to suppose that half of the 2% of data would be requested within the first week, and 9/10 within the first month. In that case during 35 months data would be stored on behalf of the 0.02% that would be useful in a criminal court case." [FN_CMPTRCHR]
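The arithmetic of this illustration is easy to check; the sketch below simply reproduces the official's hypothetical percentages (they are his suppositions, not measured figures):

```python
# Hypothetical figures from the quoted illustration, not measured values.
total = 1.0                # all retained traffic data
demanded = 0.02 * total    # 2% is demanded for criminal inquiries
useful = 0.10 * demanded   # 10% of that proves necessary as proof

print(f"useful share of all stored data: {useful:.1%}")
print(f"stored solely on behalf of that share: {1 - useful:.1%}")

# Of the demanded data, 9/10 is requested within the first month;
# the remaining 1/10 of the useful share is all that justifies
# retention during the other 35 months.
late_useful = useful * (1 - 9 / 10)
print(f"useful data requested after the first month: {late_useful:.2%}")
```

Running the sketch reproduces the 0.2%, 99.8% and 0.02% figures in the quotation.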
The retention of data is disproportionate: it imposes substantial costs and burdens on industry and consumers in exchange for minimal gains and the problematic identification of innocent individuals and parties.
According to the preamble of the Framework Decision,
"Such a priori retention of data and access to this data may constitute an interference in the private life of the individual. However, such an interference does not violate the international rules applicable with regard to the right to respect to privacy and the handling of personal data" ... "where such interference is provided for by law and where it is appropriate, strictly proportionate to the intended purpose and necessary within a democratic society, and subject to adequate safeguards for the prevention, investigation, detection and prosecution of crime and criminal offences including terrorism." (paragraph 10)
It goes on to say that
"The retention periods shall be proportionate in view of the needs for such data for the purposes of preventing, investigating, detecting and prosecuting crime and criminal offences as against the intrusion into privacy that such retention will entail from disclosure of that retained data." (Paragraph 12)
The authors of the Framework Decision assume that proportionality tests need only be applied to the disclosure of traffic data, not retention; we disagree with this assessment. And so does the European Court of Human Rights, as we will expand upon below.
We contend that any retention of traffic data should be specific, just as the Framework Decision states in Article 6(a) that access should be specific. Personal data should never be collected 'just in case', but only for a specific reason: in law enforcement, for example, where there is reasonable suspicion against a specific individual. We believe that the retention of this information under the policy envisioned by the Framework Decision is surveillance through state action, and that it thus exceeds the limits on state action under European law. Therefore, based on legal advice that we will summarise below (available in full in [FN_PICOV]), we believe that data retention is contrary to privacy and human rights law.
The data retention regime envisaged by the Framework Decision, and now appearing in various forms at the Member State level, is unlawful. Article 8 of the European Convention on Human Rights (ECHR) guarantees every individual the right to respect for his or her private life, subject only to narrow exceptions where government action is imperative. The Framework Decision and national laws similar to it would interfere with this right, by requiring the accumulation of large amounts of information bearing on individuals' private activities. This interference with the privacy rights of every user of European-based communications services cannot be justified under the limited exceptions envisaged by Article 8 because it is neither consistent with the rule of law nor necessary in a democratic society. The indiscriminate collection of traffic data offends a core principle of the rule of law: that citizens should have notice of the circumstances in which the State may conduct surveillance, so that they can regulate their behaviour to avoid unwanted intrusions. Moreover, the data retention requirement would be so extensive as to be out of all proportion to the law enforcement objectives served. Under the case law of the European Court of Human Rights, such a disproportionate interference in the private lives of individuals cannot be said to be necessary in a democratic society. We would also like to note that the European Court of Justice has adopted the same approach to data protection and privacy as the European Court of Human Rights.[FN_FIPR] The Commission risks another collision with the Courts if it pursues this policy without adequate safeguards.[FN_PNR]
Article 8 of the ECHR guarantees the individual's right to respect for his private and family life. The Article specifies that public authorities may only interfere with this right in narrowly defined circumstances. In particular, any interference must be in accordance with law and necessary in a democratic society, in view of such public interests as national security and the prevention of crime.
These provisions have been interpreted in a series of decisions by the European Court of Human Rights. In these cases, the Court adopts a three-part test for assessing the legality under the Convention of a governmental measure affecting individual privacy: whether the measure interferes with private life; whether any such interference is in accordance with law; and whether it is necessary in a democratic society.
The Court has on numerous occasions decided cases involving analogous governmental surveillance of its citizens, frequently finding such regulation to be in violation of Article 8. Analysis of those cases shows that the data retention regime proposed by the Framework Decision and now reflected in certain national laws would interfere with the Article 8 right to privacy. Moreover, indiscriminate retention of personal data is not in accordance with law because it fails to distinguish between different classes of people and therefore denies citizens a foreseeable basis on which to regulate their conduct. Finally, such laws are not necessary in a democratic society because blanket retention of data is wildly disproportionate to the law enforcement aims that it seeks to advance.
The European Court of Human Rights has interpreted Article 8's reference to respect for private life expansively. Private life does not consist only of an individual's innermost thoughts -- those that he chooses not to share with the outside world. It extends to the right to establish and develop relationships with other human beings. Intrusions into an individual's personal or business affairs that interfere with this right therefore fall within the protection of Article 8.
An individual's use of communications services falls squarely within this zone of privacy. The telephone, the Internet and other communications services are quintessentially about bringing people together, in a personal or a business capacity. Government regulation that chills use of these services is accordingly an interference with the right to respect for private life protected by Article 8. Thus, in Klass v. Germany, the Court reasoned that because a law permitting interception of mail created a "menace of surveillance" for all users of the postal service, and because that menace struck at freedom of communication, the law therefore constituted an interference with the right to respect for private life. As we have pointed out above, the indiscriminate retention of traffic data strikes at freedom of communication in the same way as the law at issue in Klass. By ensuring that use of communications services will generate a record of one's private activities, data retention requirements threaten all users of those services with the menace that this record will be abused, either by public or private actors. That menace is no less an interference with the right to private life than the generalised threat in Klass that one's mail may be intercepted by the authorities.
Retention of data by the authorities is an interference in private life, whether or not the State subsequently uses that data against the individual. In Amann v. Switzerland, the European Court of Human Rights found Article 8 applicable when State security services kept a record indicating that the applicant was a contact of the Soviet Embassy, after intercepting a telephone call from the Embassy to the applicant. The Court specifically noted that storage of the information on an index card alone was sufficient to constitute an interference in private life and that the subsequent use of the stored information had no bearing on that finding. Similarly, in Rotaru v. Romania, the Court found that the storing of information by the security services on the applicant's past activities as a university student constituted an interference with his Article 8 rights. The data retention envisaged by the Framework Decision and now seen in some Member State laws is of a far greater magnitude than that at issue in either of these cases. Under the Framework Decision, for instance, at any given time a record would be in existence recording each and every person or entity with which an individual had communicated electronically over a one to three year period, as well as the time of the communication and the location from which it was made.
Data retention is no less an interference in private life when it is limited to traffic data, rather than recording the content of individual communications. The European Court of Human Rights has repeatedly found the recording of numbers dialled from conventional telephones to constitute an interference with private life. In an earlier technological era, the Court pointed out that the records of such metering contain information which is an integral element in the communications made by telephone. Indeed, the information at issue in Amann -- that the applicant was a contact of the Soviet Embassy -- could have been inferred just as easily from traffic data as from interception of the content of the communication. Given technological advances and the volume of data that may now be stored about an individual's interactions, blanket retention represents, as the case law of the European Court of Human Rights makes amply clear, an interference of unprecedented proportions in the private life of every user of European-based communications services.
Of course, not all interferences with the right to private life violate Article 8 of the European Convention on Human Rights. Article 8(2) acknowledges that there are certain situations in which interference by the State is justified. But the Court has been clear that this paragraph, since it provides for an exception to a right guaranteed by the Convention, is to be read narrowly. The Court has accordingly interpreted Article 8(2)'s requirement that such interferences be in accordance with law, as meaning not only that there must be a law in place authorising the interference, but that it should meet the standards of accessibility and foreseeability inherent in the concept of rule of law. The data retention regime envisaged by the Framework Decision fails to meet these standards. Even if we assume that it was implemented by national laws that could be accessed by all citizens, the very idea of blanket data retention offends the standard of foreseeability as it has been developed by the Court.
The principle behind the foreseeability requirement is the simple notion that the State should give citizens an adequate indication of the circumstances in which the public authorities are empowered to interfere in their private lives. When laws are foreseeable in this way, individuals can regulate their conduct accordingly, so as to avoid invoking unwelcome intrusions by the State. Laws that offer citizens no reasonable means of avoiding surveillance of their private affairs by the State are the hallmark of the police state.
The requirement of foreseeability is not satisfied by blanket regulations, such as those envisaged in the Framework Decision, that allow everyone to foresee that the State will interfere with their right to a private life. As the Court said in respect of secret surveillance in Malone v. United Kingdom, it would be "contrary to the rule of law for the legal discretion granted to the executive to be expressed in terms of an unfettered power." Rather, what makes a law foreseeable is the extent to which it distinguishes between different classes of people, thereby placing a limit on arbitrary enforcement by the authorities. Thus, in Kruslin v. France, the Court found that a law authorising telephone tapping lacked the requisite foreseeability because it nowhere defined the categories of people liable to have their telephones tapped or the nature of the offences which might justify such surveillance. In Amann v. Switzerland, the Court reached the same conclusion with regard to a decree permitting the police to conduct surveillance, because the decree gave no indication of the persons subject to surveillance or the circumstances in which it could be ordered. Data retention laws that fail to distinguish between different classes of people would have a more pernicious impact on individual privacy than the vague laws at issue in Kruslin and Amann. Whereas the latter left every citizen vulnerable to a risk of surveillance, blanket data retention would subject every citizen to the certainty of ongoing and unremitting interference in his or her private life.
Blanket data retention laws also offend the principle of foreseeability because they make no distinction for relationships that the State already recognises as sufficiently special to warrant a degree of protection. In Kopp v. Switzerland, the Court observed that a law authorising interception of telephone calls would in certain circumstances contradict other provisions of Swiss law according protection to confidential attorney-client communications. The Court found that the telephone tapping law failed to meet the standard of foreseeability, because it provided no guidance on how authorities should distinguish between protected and unprotected attorney-client communications. The Framework Decision and laws like it suffer from the same flaw. Confidential attorney-client communications, to take one example, enjoy a protected status throughout the EU. Yet the proposed data retention schemes make no effort to distinguish between such communications (and others like them) and "normal" communications.
Blanket data retention is the antithesis of a regime designed to achieve the minimum necessary impairment of rights. In order to retain information bearing on the very small fraction of the population involved in criminal activity or threatening national security, mandatory data retention gives rise to an indefinite and ongoing interference with the privacy rights of every individual who uses European-based communications systems. Such a broad interference with an established right exceeds the bounds of permissible interferences as set forth in the European Convention and enunciated by the European Court of Human Rights.
Article 8(2)'s limited exception to the right to respect for private life requires that any interference be no greater than is necessary in a democratic society. This condition is subject to the same narrow reading that the European Court of Human Rights applies to the rest of Article 8(2). The Court has explained the principle underlying this requirement in terms of the need for any interference in Article 8 rights to correspond to a pressing social need and to be proportionate to the legitimate aim pursued. Mandatory data retention laws fail on this score as well. The distinguishing feature of a blanket data retention requirement is the absence of any reasonable relationship between the intrusion on individual privacy rights and the law enforcement objectives served.
For a measure impairing individual rights to be proportional, the State must put in place safeguards ensuring that interference with those rights is no greater than necessary. In Foxley v. United Kingdom, for example, the Court found that interception of a bankrupt's mail violated Article 8 because of the absence of adequate and effective safeguards ensuring minimum impairment of the right to respect for his correspondence.
European legislators can make no showing that such large-scale impairment of individual rights arising from mandatory data retention laws is the only feasible option for combating crime or protecting national security. Indeed, international practice points strongly in the opposite direction. For example, in the U.S. there is no mandatory data retention policy. And, as recently as 2001, the Member States of the European Union signed a Council of Europe Convention providing for data to be retained on a selective basis, where the authorities have reason to believe that the information may be relevant to a criminal investigation. Law enforcement requirements can be met without widespread interference with individual rights. In short, blanket data retention is unnecessary. The interference in individual privacy rights required by mandatory data retention laws cannot therefore be necessary in a democratic society.
Proportionality also requires that interferences in private life take account of the specially protected nature of certain communications. Thus the Court has on occasion analysed the impact of State surveillance on the attorney-client relationship as part of its inquiry into whether a given regulation was necessary in a democratic society. In finding that the interception of a bankrupt's mail was not necessary in a democratic society, the Foxley decision, for example, accorded particular weight to the authorities' failure to distinguish between privileged communications from the applicant's lawyer and other items. As already noted, blanket data retention falls short on this measure too. The Framework Decision, for instance, fails to take even the minimum steps necessary to ensure respect for attorney-client and other specially-protected communications.
The requirement that communications providers retain traffic data for up to three years (and even longer under some national legislation) would effectively create a massive database reaching indiscriminately into the personal and business affairs of each and every user of EU-based communications services. Whatever national rules were developed to regulate access to traffic data by law enforcement agencies, the very existence of this database would put at the disposal of the State an unprecedented amount of information about the everyday activities of its citizens. This would be a significant departure from the traditional approach in societies based on the rule of law, where the State's ability to monitor individuals is strictly limited and regulated by such requirements as probable cause and a duly-authorised warrant.
The retention of traffic data by communications providers would also greatly enhance the risk that personal information could be stolen and exploited by third parties. Stored traffic data would present an attractive target for malicious hackers, who would be able to access multiple personal details about individuals in one place. Moreover, because the information would be stored, malicious hackers would be able to sort through stolen data at their leisure, rather than trying to intercept valuable personal details in real time, as at present. Thus, in the name of facilitating the investigation and prosecution of crimes, mandatory data retention laws would in fact make the job of the cybercriminal considerably easier.
Concern about the misuse of sensitive personal information could undermine public confidence in electronic communications systems. A blanket requirement on communications providers to retain traffic data would give all users of electronic services reason to fear that stored data relating to their personal lives might be improperly accessed. As the 2002 EU legislation recognised, "the successful cross-border development of these services is partly dependent on the confidence of users that their privacy will not be at risk." A loss of public confidence could, in particular, retard the role of the Internet as a channel of social intercourse and a vehicle for electronic commerce.
Privacy laws and practices were developed in order to increase the public's confidence. Retention policy contradicts every intention of data protection and privacy law. Sensitive information about the lives of all individuals is collected and treated indiscriminately. The Framework Decision reduces this all to a simple balance test, stating that
"while maintaining a balance between the protection of personal data and the needs of the law and order authorities to have access to data for criminal investigation purposes." (paragraph 3).
The balance between privacy and the needs of law enforcement is already built into privacy law. Such balances exist in all protections of civil liberties, and they allow the state to contravene rights in specific and proportionate ways. The Framework Decision discards those balances and redraws the lines between individual rights and law enforcement interests. This is why the proposed policy is illegal.
The stated purposes for retention are many. We often assume that it is in order to combat terrorism, probably because it is often used as a trump card in debates. The Framework Decision claims that
In particular, it is necessary to retain data in order to trace the source of illegal content such as child pornography and racist and xenophobic material; the source of attacks against information systems; and to identify those involved in using electronic communications networks for the purpose of organised crime and terrorism. (paragraph 5)
In the United Kingdom, for example, voluntary data retention is provided for in the Anti-Terrorism, Crime and Security Act 2001, developed in response to the terrorist attacks in the United States. Yet, as was later discovered, data retained for combating terrorism can be used in the prevention and detection of any crime, as well as for a myriad of other purposes under UK law. Only after much convincing did deliberation and debate take place, and this access was eventually regulated to some degree.
We remain unconvinced that the EU has any initiative or interest in regulating access similarly. For instance, in article 5 the Framework Decision states that, when co-operating with international requests for traffic data,
The requested Member State may make its consent to such a request for access to data subject to any conditions which would have to be observed in a similar national case.
We are concerned that this provision is permissive rather than mandatory. If national access regimes can be circumvented in international co-operation, we see little purpose in national consultation and deliberation on those regimes. We also take this as an indication that the authors of the Framework Decision do not take rights seriously.
In turn, we find it ironic that the Commission states in its call for consultation that
"From a European single market point of view, a proportionate and consistent approach in all Member States is desirable. Consistency would avoid the situation where the providers of electronic communications services are confronted with a patchwork of diverse technical and legal environments. From this perspective, it is desirable that any data retention measures taken by Member States differ as little as possible, in particular in terms of the types of data concerned, the periods of data retention, the technical feasibility of any requirements and the sharing of costs."
This is not helped when in Article 7(d), the Framework Decision states that
"the process to be followed in order to get access to retained data and to preserve accessed data shall be defined by each Member State in national law."
This is an unacceptable turn of logic. The proposed plan is as follows: to expand retention requirements across the board in the name of international co-operation, regulatory harmonization, and combating terrorism, and then to leave access to national law and potentially arbitrary processes. If the EU insists on creating this massive regime for surveillance, it must devise ways to curtail the monster's powers at the same time.
In establishing privacy laws and directives, and even in the process of developing the Information Society, the Commission often acted in a manner that promoted the rights of the individuals. Now the Commission risks heading in the opposite direction.
The list of stated purposes for this policy is growing. Starting with the maintenance of the single market, the Framework Decision goes on to say that
"To ensure effective police and judicial co-operation in criminal matters, it is therefore necessary to ensure that all Member States take the necessary steps to retain certain types of data for a length of time within set parameters for the purposes of preventing, investigating, detecting and prosecuting crime and criminal offences including terrorism." (paragraph 9)
Another purpose is to ensure harmonization amongst all the varying implementations of data retention within Europe (paragraph 8).
The grounds of international co-operation are exactly the reasons why the U.S. called upon the EU to implement data retention. In October 2001 President George W. Bush wrote a letter to the President of the European Commission recommending changes in European policy, to "[c]onsider data protection issues in the context of law enforcement and counterterrorism imperatives," and as a result to "[r]evise draft privacy directives that call for mandatory destruction to permit the retention of critical data for a reasonable period." [FN_BUSH]
This built on recommendations from the U.S. Department of Justice to the European Commission that "[d]ata protection procedures in the sharing of law enforcement information must be formulated in ways that do not undercut international cooperation." Very similar language later appeared in the G8 documents from the May 2002 summit regarding data retention:
Ensure data protection legislation, as implemented, takes into account public safety and other social values, in particular by allowing retention and preservation of data important for network security requirements or law enforcement investigations or prosecutions, and particularly with respect to the Internet and other emerging technologies.
Using the argument that the retention of traffic data was critical to the war on terror, a number of countries adopted retention policies. What is remarkable, however, is that the United States itself has not adopted a policy of data retention, nor has it ever stated an intent to do so. Either the causes of international cooperation and combating terrorism are not that important to the Americans (and the Canadians, for that matter), or they consider this policy too invasive and problematic for their own systems of government.
While it is an interesting question to ask why the U.S. is pushing a policy in the EU when it has no similar policy at home, it is even more interesting to ask the following question: why are Ireland and the United Kingdom seeking this policy at the EU when neither country has an open mandatory data retention regime at home? In fact, when these policies have been deliberated within national Parliaments, debated amongst industry and civil society, and monitored by the media, opposition within these Member States has been remarkably high. Yet these governments continue to pursue the policy through the EU. Using the European Union as a forum for policy laundering is unacceptable. Such conduct must not be rewarded by the Commission.
Meanwhile, the Framework Decision does allow for national consultation, and accepts that this form of deliberation may lead to a rejection of retention. That is, Article 4(2) states that
A Member State may decide to derogate from paragraph 1 of this Article, with regard to data types covered by paragraph 2 of Article 2 in relation to the methods of communication identified in paragraph 3(b) and 3(c) of Article 2, should the Member State not find acceptable, following national procedural or consultative processes, the retention periods set out in paragraph 1 of this Article. A Member State deciding to make use of this derogation at any time must give notice to the Council and to the Commission stating the alternative time scales being adopted for the data types affected.
This sounds reasonable, except that the Framework Decision then says that
"Any such derogation must be reviewed annually."
This is disingenuous. Even if data retention is rejected nationally, the authors of the Framework Decision insist that Member States revisit this rejection every year. This is akin to forcing an annual vote on the European Constitution merely because a Member State once had the audacity to vote against it. We consider this an insult to democratic procedure.
On the grounds that the purposes for retention are ever expanding, that the purposes for access extend well beyond them, and that this is an act of policy laundering intended to circumvent and ignore national deliberative processes, we consider this Framework Decision to be an illegitimate act. We hope that the Commission will take note of this, and that it will not permit its role as protector of the single market to be abused.
The retention scheme being deployed in a number of EU Member States is already problematic for the many reasons covered above; the Framework Decision is problematic for even more.
Thus far we have explained the various risks and challenges of deploying data retention from technological, regulatory, and legal perspectives. We would like to conclude on a moral note.
Many of our understandings of human rights and civil liberties hinge upon a sense of morality and dignity. These may be natural laws, or they may be based on public opinion and the voice of the silent majority. We do not permit torture or capital punishment not only because they are against the law, but because they offend us. We regulate many activities not only because they are illegal, but because we feel they are wrong, and unnecessary in the kind of society in which we choose to live. Political processes such as those establishing data retention, through opaque laws devised in sensitive times on the grounds of protecting society from great fears, have effects more insidious than merely introducing new laws. They also introduce new norms to an unsuspecting society, changing our sense of morality and dignity.
Changing norms will change our regard for what counts as a proportionate and necessary measure in a democratic society. Once it becomes accepted that all information derived from our interactions in modern society is collected by default, in the eventuality that we do wrong to someone or to the State, there are few grounds for people to feel offended by the forced collection of DNA from all newborns, or the default fingerprinting of all individuals. After all, the logic goes: 'collection is not problematic, and don't we all want to combat crime and terrorism?' and 'unless you have something to hide or fear, this data will never be used against you'. Future policy deliberation will be reduced to acquiescence, as what we expect of ourselves and our governments quietly changes.
We should all be working to maintain a sense of liberty and freedom in this era, not working ever harder to ensure that every source of information about one's life is subject to surveillance by default, and indiscriminately so. The EU has long been perceived as a promoter of the right to privacy, among other rights. It would be a grave disappointment if it were to embark on a path that turns these protections into regimes of surveillance. Five years ago we would never have pursued such a set of policies. We now worry most about what will happen five years from now, looking back and looking forward: what will we think is reasonable, proportionate, and necessary in a democratic society when all activities and intentions are recordable, accessible, and required?
This response was co-ordinated by Gus Hosein of Privacy International. It draws on previously published material: the section on invasiveness comes from an article he co-wrote with Alberto Escudero-Pascual, published in the Communications of the ACM (see list of references), and the section on illusory gains contains ideas that appear in a report he wrote for UNESCO. The section on illegality was written mostly by Dan Cooper of Covington and Burling, in a legal memorandum for Privacy International, with additional material from Douwe Korff and Peter Sommer. The section on illegitimacy builds on prior work with David Banisar, Tony Bunyan (Statewatch), and Barry Steinhardt (ACLU). Sjoera Nas of Bits of Freedom also contributed to this report, as did Andreas Dietl of EDRi. We would like to thank Ross Anderson, Nicholas Bohm, Richard Clayton, Martyn Thomas, and Edgar Whitley for their valuable comments and recommendations.
This submission makes use of a number of resources, some written by the authors of this report. They include: