The Right to be Forgotten: Legislating for Individuals to Regain Control of their Personal Information on Social Networks
Kathryn Smith[1], Faculty of Law, Monash University
Abstract
Facebook has over a billion users and, along with other forms of social media, has become a part of the everyday lives of many people. However, the benefits of increased connectivity bring with them risks to privacy. Individuals can be at the mercy of others when it comes to the personal information shared about them on social networks. The European Union (EU)'s proposed 'right to be forgotten' could give EU citizens the right to erase unwanted data from social networks and take back control of their personal information.
However, issues of interpretation of the right to be forgotten need to be addressed, particularly in relation to the categories of data that could be subject to this right. Debates are raging over the relative weight that should be given to privacy and freedom of expression, and critics of the right to be forgotten argue that embodying it in legislation could amount to censorship of the internet. Against this backdrop, this article argues for a broad construction of the right to be forgotten, and highlights the importance of being able to 'forget' in the age of social networking.
Keywords: Social networking; right to be forgotten; privacy; data protection; Article 17.
Introduction
Social networking is a useful tool that has become part of the everyday lives of many people (Henrikson, 2011). Facebook, Twitter, Instagram, LinkedIn, YouTube, and Pinterest are growing rapidly. There are 750 tweets per second on Twitter (Bullas, 2012); Facebook collects over 500 terabytes of data every day (Kern, 2012) and LinkedIn signs up two new members every second (Bullas, 2012: 184). Five million images are uploaded to Instagram every day and there are 575 'likes' per second on the network (Bullas, 2012). Users are sharing personal information through social media at an increasingly high rate: this is how they connect in the digital age.
However, along with the benefits of increased connectivity that social media brings, there are significant risks to privacy. In particular, individuals should be concerned about the level of control others have over their personal information on social networks. While many of the comments in this article about data protection are applicable to all social networks, Facebook – the social network with over a billion users worldwide (Lee, 2012) – will be the focus.
Facebook not only collects data directly (from individuals about themselves), it also collects data indirectly when other users post information about their friends to the social network. Social networks can share information about individuals that those individuals would never choose to share about themselves. This information becomes instantly accessible to a vast audience, and individuals currently have no right to require the deletion of this information. The problem arising from indirect collection of data on social networks is illustrated by the hypothetical scenario below.
Amy is a Facebook user. She has high privacy settings on her account, and she has chosen to interact with a narrow audience on the social network. However, despite taking all possible technical measures to protect her privacy, Amy cannot guard against the way in which others control her personal information on the site. Ben takes photographs on his mobile phone of Amy drinking and partying, and uploads them immediately to Facebook. Ben chooses the audience to which the photographs are visible: he can even make the photographs publicly accessible, allowing them to be indexed by search engines such as Google. Ben chooses whether or not he will notify Amy of the photographs through a 'tag', and he chooses how to caption them. Although Amy asks Ben to delete the photographs because she considers that they portray her in an unflattering light, he decides not to delete them. The photographs are copied and shared by Ben's friends. Amy has no control over these pieces of information relating to her that have entered the online world. This article contends that users should have control over their personal information on social networks, including indirectly collected information.
Only a quarter of social network users feel they are in control of their personal information (European Commission Eurobarometer, 2011). Current privacy protection mechanisms in legal systems around the world are inadequate to deal with the problems arising from indirect collection of data on social networks. Although in many common law jurisdictions, such as New Zealand and the United Kingdom, a 'tort of privacy' has been developed, these causes of action are most commonly used to protect the privacy of high-profile individuals (Douglas v Hello [2005] EWCA Civ 595; Campbell v Mirror Group Newspapers Limited [2002] All ER (D) 177; Hosking v Runting [2005] 1 NZLR 1). Court processes are also not financially viable for the average social networker seeking to remove unwanted information from his or her network, nor is a remedy offered by a court timely. Existing statutes such as the Australian Privacy Act 1988 (Cth) contain gaps from a data protection framework perspective, by allowing the operation of implied consent for the collection of personal information and, most notably, by failing to offer a right to require the deletion of personal information. However, the EU is actively seeking to meet the challenges posed by social networking platforms. The European Union Draft Data Protection Regulation (General Data Protection Regulation, 2012/0010 COD), if passed, would impose a complete data protection framework with strong consent, notification and erasure measures, giving users much greater control over their information, even if the information has been posted by others on the social network. The ambit of the Regulation is broad, and this article will focus on one specific aspect of it: Article 17, which is commonly referred to as the 'right to be forgotten'.
Article 17 of the EU Regulation states that data subjects 'shall have the right to obtain from the controller the erasure of personal data relating to them and the abstention from further dissemination of such data.' A data subject is a person who can reasonably be identified in a piece of data (General Data Protection Regulation, 2012/0010 COD, art 4(1)). This right to erasure is available if:
- the data is no longer necessary in relation to the purpose for which it was collected;
- the data subject withdraws the consent on which the processing is based or objects to the processing of personal data; or
- the processing of the data does not comply with the Regulation for other reasons.
(General Data Protection Regulation, 2012/0010 COD, art 17). This right is qualified by exceptions, including an exception for freedom of expression.
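To make the operation of these grounds concrete, the following is a minimal sketch (in Python) of how a controller's systems might record and test whether one of the Article 17 grounds is engaged before any exception, such as freedom of expression, is weighed. The field and function names are illustrative assumptions of mine, not terms drawn from the Regulation.

```python
# A minimal sketch, not a statement of the law: hypothetical field names are used
# to illustrate how a controller might test whether an Article 17 ground is engaged.
from dataclasses import dataclass


@dataclass
class ErasureRequest:
    purpose_expired: bool                # data no longer necessary for its original purpose
    consent_withdrawn: bool              # data subject has withdrawn consent or objected
    processing_non_compliant: bool       # processing otherwise breaches the Regulation
    freedom_of_expression_claimed: bool  # a possible exception to erasure


def erasure_ground_exists(request: ErasureRequest) -> bool:
    """Return True if any of the Article 17 grounds for erasure is engaged."""
    return (request.purpose_expired
            or request.consent_withdrawn
            or request.processing_non_compliant)


# Example: a subject withdraws consent, and no expression exception is claimed.
req = ErasureRequest(False, True, False, False)
if erasure_ground_exists(req) and not req.freedom_of_expression_claimed:
    print("Erase the data and abstain from further dissemination.")
```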
The right to be forgotten brings with it significant issues of interpretation. This article highlights the importance of forgetting in the digital age, addresses the issue of whether Facebook or its users would be bound by the right to be forgotten, discusses the categories of data to which the right would apply, argues that the right should be given a broad application and provides an example of how the right to be forgotten could be implemented practically.
Why is 'forgetting' important?
According to Professor of Internet Governance Viktor Mayer-Schönberger, the ability to 'forget' is a virtue. He notes that 'since the early days of humankind we have tried to remember, to preserve our knowledge […] and we have devised a number of mechanisms to aid us […] yet through the millennia, forgetting has remained just a bit easier and cheaper than remembering' (Mayer-Schönberger, 2009: 49).
However, new technology is making it easier for people to remember. Social media assist us to record and share thoughts, likes, dislikes, life events, photographs and messages, and to view data from many years ago. Our profiles demonstrate how we have grown over time and how our relationships with others have evolved. A Facebook profile, for example, can provide a 'snapshot' of an individual at any particular point in time, allowing the individual to look back on this much like a diary or a photo album. Unlike a diary, however, the data is generated by multiple authors and is not stored privately in a desk drawer or a dusty attic: it remains indexed and searchable by a large number of people, and indirectly collected information currently cannot be destroyed.
Mayer-Schönberger gives an example of the damage that can be done by sharing a memory online through the story of a Vancouver-based psychotherapist named Andrew Feldmar. Feldmar was crossing the United States-Canadian border, as he had done many times before; however, on this particular occasion the border guard searched online and found an article written by the psychotherapist in which he noted that he had taken LSD in the 1960s. On the basis of information pertaining to events that had happened decades earlier, Feldmar was detained for four hours, fingerprinted and barred from present and future entry to the United States (Mayer-Schönberger, 2009: 4). Our pasts can come back to haunt us much more readily in the digital age.
Individuals who have grown up with social media are more accustomed to self-censoring on social networks (Mayer-Schönberger, 2009: 5). Through the choices users make as to what information they share about themselves and what information they reserve for the offline world, users create a personal brand, a profile of how they want to be perceived. The term 'personal brand' is often used to describe the way in which some people use social media to market their unique characteristics for professional gain. However, I would argue that all users create a personal brand in a broader sense, through the choices they make on social media. It is beyond the scope of this article to discuss whether this self-editing is psychologically and socially beneficial, or whether the brands many users create for themselves are positive. What is central to this article is the notion of control of information, personal brand and memory.
While individuals can censor the direct information they provide to social networks, indirectly collected information – the information about a person that is posted to a social network by others – threatens an individual's ability to control his or her personal brand. The EU's proposed right to be forgotten is a crucial step in the age of social networking which will allow individuals to regain control of their personal information.
Who clicks delete? Facebook or its users?
Only 'data controllers' are bound by the right to be forgotten, according to the EU Regulation. A data controller is the party who determines the purposes and means of the processing of personal data – processing includes collecting, deleting, using, storing, disseminating or otherwise making available the data (General Data Protection Regulation, 2012/0010 COD, article 4(5)). A preliminary issue in the interpretation of this right is determining who the 'data controller' is: Facebook or its users. In most cases, the data controller will be Facebook, due to the existence of a 'household exemption' which would excuse most individuals from the obligations contained in the Regulation.
The household exemption is contained in Article 2 of the EU Regulation, which states that the Regulation 'does not apply to processing of personal data by a natural person, without any gainful interest, in the course of his or her own exclusively personal or household activity' (General Data Protection Regulation, 2012/0010 COD, art 2(2)(d)). On the face of this provision, it would seem that since most Facebook users interact with the social network for personal recreation and correspondence, they fall within the exemption and are not regulated as data controllers. However, due to the very nature of social networks in facilitating the sharing of information with a large and sometimes indeterminate audience, the underlying issue is whether the use of social media is ever a purely personal or household activity (Wong, 2009: 142-49).
The 2003 European Court of Justice case of Bodil Lindqvist demonstrates how individuals may unwittingly take on the role of a data controller when publishing information online. In this case a community-minded parishioner shared personal information, such as telephone numbers and family circumstances, of her parish colleagues on a public website to assist churchgoers. Due to the public nature of the webpage, and the fact that it was available to an indefinite audience beyond her 'personal' or 'domestic' sphere, she was regulated as a data controller (Sweden v Bodil Lindqvist (C-101/01) [2003] ECR I-12971).
The Article 29 Data Protection Working Party has addressed the application of the exemption in the specific context of social networks in its opinion on social networking (Article 29 Data Protection Working Party, 2009). According to the Working Party, users are ordinarily considered data subjects, rather than data controllers (Article 29 Data Protection Working Party, 2009: 5-6). However, in some circumstances the activities of a user may not be covered by the household exemption. These circumstances include: where the user acts on behalf of a company, or promotes a commercial goal; where the user has a high number of third-party contacts whom he or she does not actually know; and where access to profile information extends beyond self-selected contacts, to all members of the social network, or where the profile is indexable by a search engine (Article 29 Data Protection Working Party, 2009: 6).
This interpretation of the household exemption recognises the fact that social network users are capable of publishing information to the public at large. The policy rationale is that where social media are used to achieve a commercial aim or to publish to an indefinite audience, users must take on the responsibilities of a 'data controller'. In any other circumstances it would be too onerous to impose the obligations contained in the Regulation on ordinary users.
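The Working Party's indicators can be read as a simple decision rule. The sketch below is illustrative only: the attribute names and the numeric threshold are assumptions of mine, since the Opinion describes the indicators qualitatively rather than setting thresholds.

```python
# A sketch of the Article 29 Working Party's audience-based criteria, using
# hypothetical attribute names; the Opinion itself sets no numeric thresholds.
from dataclasses import dataclass


@dataclass
class UserProfile:
    acts_for_company_or_commercial_goal: bool
    contacts_not_personally_known: int   # third-party contacts the user does not actually know
    profile_public_or_indexable: bool    # visible beyond self-selected contacts


def likely_outside_household_exemption(profile: UserProfile,
                                       unknown_contact_threshold: int = 100) -> bool:
    """Apply the Working Party's indicators; the threshold is illustrative only."""
    return (profile.acts_for_company_or_commercial_goal
            or profile.contacts_not_personally_known > unknown_contact_threshold
            or profile.profile_public_or_indexable)
```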
Due to this 'audience-based' interpretation of the household exemption it is crucial that social networking sites introduce the principles of privacy by default, to ensure that user profiles are not automatically publicly available (General Data Protection Regulation, 2012/0010 COD, art 23(2)). The privacy by default and design provisions in the EU Regulation require controllers to implement technical mechanisms to ensure that the privacy of individuals is protected automatically (General Data Protection Regulation, 2012/0010 COD, art 23). Due to the onerous responsibilities imposed on a data controller, particularly under the right to be forgotten, users should be required to make an informed decision to extend access to their profiles beyond a 'personal' or 'household' sphere, before they assume the legal responsibilities of data controllers.
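To illustrate what 'privacy by default' might look like in practice, the following is a minimal sketch, assuming hypothetical setting names; the Regulation prescribes the outcome (restrictive defaults and an informed choice to widen access), not any particular code.

```python
# A minimal sketch of 'privacy by default', using hypothetical setting names;
# the Regulation requires the outcome (restrictive defaults), not this code.
from dataclasses import dataclass


@dataclass
class ProfileVisibility:
    audience: str = "self-selected contacts"   # default: not public
    indexable_by_search_engines: bool = False  # default: not indexable
    user_confirmed_public_sharing: bool = False


def make_public(profile: ProfileVisibility, informed_consent_given: bool) -> None:
    """Widen visibility only after an informed, explicit decision by the user."""
    if not informed_consent_given:
        raise ValueError("Profile stays private until the user makes an informed choice.")
    profile.audience = "everyone"
    profile.indexable_by_search_engines = True
    profile.user_confirmed_public_sharing = True
```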
A secondary issue under the EU Regulation is that Facebook users may be regulated as data controllers if they receive a 'gainful interest' (General Data Protection Regulation, 2012/0010 COD, art 2(d)). Such an interest could be obtained when individuals use the social network to sell private possessions to friends (Lindsay, 2012: 439). The 2012 Albrecht Report (Albrecht, 2012/0011(COD)) has recommended deleting 'without any gainful interest' from the exemption to deal with this problem (Burton et al., 2013: 99). Although in some cases individual users, through the way they use social media, may attract the legal obligations of a data controller, ordinarily Facebook itself rather than its users will be regulated as the controller (General Data Protection Regulation, 2012/0010 COD, art 4(5)). Thus ordinarily it is Facebook, rather than individual users, who would have the obligation to erase data, subject to an individual request under the right to be forgotten.
What must be deleted? Categories of data subject to erasure
It is not yet clear how the right to be forgotten will be interpreted. However, commentators have broken the potential obligations of the data controller into three separate conceptual categories of data (Fleischer, 2011). Category one allows the data subject to require the deletion only of directly collected information that he or she has posted. Category two extends to copies of directly collected information. Category three is the broadest: it allows the data subject to require the deletion of all information relating to him or her, even information that has been collected indirectly. The categories are discussed in more detail in the following section.
Category one
The first and narrowest category is that the right to be forgotten only allows the data subject to require information he or she has posted on Facebook to be deleted (Rosen, 2012: 88, 90). This means that if an individual posts a photo on Facebook and later thinks better of it, the individual can remove the photo. This is the least controversial construction of the right to be forgotten, as users can already delete photographs that they have posted through technical measures provided by the site (Facebook, 2012b). If the right to be forgotten were confined to category one data, it would not significantly improve the privacy rights that Facebook users currently enjoy. The only additional benefits would be providing users with confirmation that data has been erased from Facebook's archives (Rosen, 2012), and providing a legal basis to support and preserve the technical measures currently in place.
Category two
The second and slightly wider category of the right to be forgotten is that in addition to the user having the right to delete information he or she has posted, the user also has the right to delete copies of personal information he or she has posted (Rosen, 2012). Under this construction, if a data subject uploads a photo of himself or herself, and other users copy or repost the photo, the data subject would have the right to demand the deletion of the copies (Rosen, 2012). This is a more controversial construction, as the copies of information may be located in the account of another user, perhaps copied to their personal album with a caption of their own, and thus deletion may involve an interference with freedom of expression (Rosen, 2012). Facebook could be placed in the difficult position of arbitrating between the merits of a privacy claim and a claim for freedom of expression under this category of data (Fleischer, 2011).
If Facebook elects not to delete the photograph, it would essentially bear the burden of demonstrating to a supervisory authority that retention of the copy is necessary for exercising the right of freedom of expression (Rosen, 2012). The stakes are high for Facebook, as failure to comply with the right to be forgotten can result in a fine of up to 500,000 Euros, or up to 1% of its annual worldwide turnover (General Data Protection Regulation, 2012/0010 COD, art 79(5)(c)). This is a significant penalty, and it may leave data controllers with no choice but to delete content that is subject to a request under the right to be forgotten (Sartor, 2013). However, this pro-privacy approach is appropriate when considering how much damage can be done to a person's reputation and personal brand in a short period of time, due to the publication of information on social networks that would not normally be available in the public realm. The current freedom of expression limitation on a data controller's deletion power undermines the technical implementation of the right to be forgotten, and should be reconsidered in relation to social networks.
The implementation of category two also raises technical problems where the original data has been copied to locations outside the social network, such as to a public blog. If a Facebook user's profile is set to 'public', the content automatically links to search engines like Google, and can also be copied and shared on the internet at large. The task of locating all copies of a person's data once it has been made public after being posted to Facebook is nearly impossible (Druschel et al., 2011). In addition, even if all copies can be located, Facebook will not have the authority to delete a particular copy if it is stored outside the social network.
Under the EU Regulation, where content on Facebook has been made 'public', Facebook does not have the responsibility to secure the removal of all copies (Pastukhov, 2013). However, Facebook does have to take all 'reasonable steps', including technical measures, to inform third parties that are processing the data of the data subject's request that they erase any links to, or copies or replications of, that personal data (General Data Protection Regulation, 2012/0010 COD, art 17(2)). The reasonable steps to be taken have not been defined by the Regulation. Facebook has expressed concern about its level of responsibility for data that has been copied from the social network to another location on the internet (Facebook Ireland, 2012), and the Commission must clarify its role in this respect. The Commission has the power to adopt delegated acts detailing the conditions for deleting links, copies or republications of personal data from publicly available communications services (General Data Protection Regulation, 2012/0010 COD, art 17(9)(b)). In order for the right to be forgotten to be workable, in adopting these acts the Commission should consider the technical requirements that may be imposed on data controllers by this category of deletion, and provide industry-specific guidance as to what technical steps are 'reasonable' to inform third parties of data that is subject to the right to be forgotten. The guidelines should consider the following issues.
One problem is that contact with the server processing the data may not be possible if the server is not linked to a real-world person or organisation, and even if contact is possible, a server outside the jurisdiction of the EU Regulation cannot be compelled to erase the data (Druschel et al., 2011: 146).
Another issue is the cost of tracking down copies of a piece of data: someone has to pay for the detection and deletion of data that can end up in a myriad of places on the internet (Determann, 2008). If Facebook is overly burdened with detection, it may need either to charge all users, or to charge the data subject exercising the right to be forgotten – which it is permitted to do if the request is 'manifestly excessive' (General Data Protection Regulation, 2012/0010 COD, art 12(4)) – to make the continued provision of its services viable. Such charges could seriously affect the usage and utility of social networks and hamper innovation (Determann, 2012: 20, 53).
The Commission's guidelines should also set out how far social networking sites are required to go in searching for copies of a piece of data, including how to deal with the problem of locating all copies of data on the internet, and what types of copies derived from the data item must be traced (Druschel et al., 2011: 146, 148-54). The guidelines must allow the liability of a data controller to end if the server cannot be reliably linked to a natural person or organisation, or is outside the jurisdiction of the EU Regulation (Druschel et al., 2011: 146, 148-54). Finally, the guidelines should acknowledge that there are some copies a data controller simply cannot be required to delete, such as downloads by individual users, screenshots and hard copies made from online data; requiring their deletion would be administratively unworkable for the data controller.
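By way of illustration, a notification routine reflecting these suggested guidelines might look something like the sketch below. It is purely illustrative: the ThirdParty fields and the notion of a contact endpoint are hypothetical assumptions, and nothing here reflects an actual Facebook system.

```python
# An illustrative sketch of how the 'reasonable steps' in Article 17(2) might be
# operationalised; the ThirdParty fields and the notify mechanism are hypothetical.
from dataclasses import dataclass
from typing import List


@dataclass
class ThirdParty:
    name: str
    identifiable: bool          # can the server be linked to a real person or organisation?
    within_eu_jurisdiction: bool
    contact_endpoint: str       # hypothetical address for erasure notices


def notify_third_parties(copies_held_by: List[ThirdParty], data_id: str) -> List[str]:
    """Send erasure notices where contact is feasible; record where liability ends."""
    outcomes = []
    for party in copies_held_by:
        if not party.identifiable:
            outcomes.append(f"{party.name}: cannot be linked to a person - liability ends")
        elif not party.within_eu_jurisdiction:
            outcomes.append(f"{party.name}: outside EU jurisdiction - notice sent, no compulsion")
        else:
            # In practice this would be an API call or e-mail to the endpoint.
            outcomes.append(f"{party.name}: erasure notice for {data_id} sent to {party.contact_endpoint}")
    return outcomes


# Example with two hypothetical third parties holding copies of the same photo.
parties = [ThirdParty("blog-host.example", True, True, "privacy@blog-host.example"),
           ThirdParty("anonymous mirror", False, False, "")]
for line in notify_third_parties(parties, "photo-123"):
    print(line)
```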
Category three
The third category, and broadest construction of the right to be forgotten, is that data subjects also have the right to demand the deletion of any information relating to them posted by other users (Rosen, 2012). This construction of the right is viable because of the broad definition of 'personal data' (Pastukhov, 2013). Personal data is defined as 'any information that relates to a data subject', rather than merely information that the data subject has given out about himself or herself (General Data Protection Regulation, 2012/0010, art 4(2)). The fact that category three data is not excluded by the wording of the Article has caused commentators to be concerned about the 'chilling effect' (Sartor, 2013: 12, 133) the right to be forgotten could have on freedom of expression, and about the potential for censorship of the internet (Determann, 2012).
In favour of forgetting: arguments for broad interpretation of Article 17
The current consensus of commentators is that, in its current form, the right to be forgotten applies to all three categories of data (Druschel et al., 2011). However, in response to concerns over freedom of expression, commentators have suggested limitations on the right.
The Albrecht report has suggested that where the individual has consented to the initial publication of his or her personal data, controllers should not have an obligation to take reasonable steps to contact third parties and request the deletion of copies of the data (Burton et al., 2013). Although this consent-based approach may be appropriate when users choose to share their own personal information, like Andrew Feldmar did, I would argue that it is not appropriate where the data subject's personal information has been shared by others, like Ben's photographs of Amy. Users give the requisite consent to publication of their data by third parties at the time of signing up to the site, rather than in relation to the sharing of an individual piece of information. If a piece of information is published about a data subject, and the data subject objects to its publication and obtains erasure of the original, the right to be forgotten is meaningless if copies of the information are not also erased.
Other commentators have suggested implementing automatic expiry technology, under which content such as photographs becomes inaccessible after a certain period of time (Commission of the European Communities, 2007). This is not a practical solution because of the indiscriminate nature of the deletion. Automatic deletion reduces users' control over important information that may not be retained anywhere else. Information such as messages from loved ones, congratulations on life milestones, and photographs that have been deleted from their original source should not be removed without a request from the user. The right to be forgotten is about giving users the control to erase selected pieces of unwanted information from social networks. Automatic expiry technology contradicts this aim by deleting information indiscriminately after a certain period of time.
These suggested solutions do not achieve a meaningful implementation of the right to be forgotten for social networks. The right will impact differently on different industries, and industry-specific consideration must be given as to how the right will operate technically. For social networks, I argue that the right should apply broadly, to all three categories of data.
The right to be forgotten should be applied to the first two categories as far as it is technologically reasonable, subject to industry-specific guidelines on the 'reasonable steps' required to contact third parties processing copies of the data. In principle, data subjects should have the right to require the deletion of information that they have posted about themselves, and copies of that information. While this will involve a degree of interference with freedom of expression when a user copies the data subject's personal information and re-posts it, this interference is acceptable because according to Facebook's terms and conditions a data subject is entitled to control how information he or she posted is shared. Users cannot properly control how their information is shared without being provided with the right to demand deletion of unwanted copies of their information.
In relation to the third category of data, data subjects should be able to require the deletion of any unwanted information that relates to them, where the data has been shared on social networks, despite the potential interference with freedom of expression. It is administratively unworkable for Facebook to arbitrate between a privacy claim and a freedom of expression claim every time a user requests that data be deleted. To require data controllers to do so would be the undoing of the right to be forgotten, due to the sheer volume of data shared on the network each day. For the right to be practically implemented a technical solution needs to be universally applied, and it is argued that for social networks a default position of deletion should be preferred.

The interference with freedom of expression is justifiable due to the permanence, accessibility and searchability of information available on social networks. Ordinary people are now subject to a higher degree of scrutiny than ever before. Information once posted to a social network can be used against an individual, out of context, at a later point in his or her life (Lindsay, 2012a). Over half of employers research potential job candidates on social networks (Henrikson, 2011). It is one thing for users unthinkingly to post unflattering information about themselves and never think better of it, but it is another thing for indirectly collected information that is damaging to remain available when the data subject objects to its availability and accessibility. This tips the balance between remembering and forgetting in favour of forgetting, where it has abided for millennia (Mayer-Schönberger, 2009), but in a way that affords social network users a greater degree of control than ever before over what will be remembered by them and about them. The principle is simple: if a user no longer wants data relating to them processed by a social networking site, the data should be removed (De Terwangne, 2012: 116). The technology that is available should be wielded to allow individuals to take advantage of the benefits of social networks while maintaining control over their personal information.
How would the right to be forgotten work in practice? An example
Technical implementation of the right to be forgotten in these terms is achievable. Facebook already has mechanisms in place that allow data subjects to request that other users remove photographs. Currently, a data subject can click on a photo and request that it be removed. Facebook then requires the data subject to select the reason he or she wants the photo removed from the social networking site and provides an automatically generated message requesting that the photo be taken down. Ultimately, however, whatever the reason for the data subject's request for deletion, it is currently left to the user who posted the photograph to decide whether or not he or she will delete it.
If the right to be forgotten were to be introduced, this system could be adapted to satisfy Facebook's Article 17 obligations. Instead of allowing a user to decide whether or not he or she will delete the photograph, Facebook would delete the photograph automatically. This function should also extend to other types of data posted to the network. In addition, a message should automatically be sent to the user who posted the photo, notifying him or her that the data has been deleted. This gives the user an opportunity to consider the information posted and whether it could be re-posted in a way that does not interfere with another person's privacy, but still allows the user to express himself or herself. Facial recognition software and tags could also be used to ensure that the deletion request does in fact relate to the data subject making the request. Deterrents for abuse of the deletion function, such as site bans, could be introduced if necessary. This system would give proper effect to the right to be forgotten, allowing it to be a meaningful right that facilitates greater control over personal information shared on social networks.
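A simplified sketch of this adapted flow is set out below. It is purely illustrative: the data structures and functions are my own assumptions and are not drawn from Facebook's actual systems or API.

```python
# A sketch of the adapted removal flow described above; all functions and data
# structures are hypothetical, not Facebook's actual API.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Photo:
    photo_id: str
    posted_by: str
    tagged_users: List[str] = field(default_factory=list)


def handle_erasure_request(photo: Photo, requester: str, reason: str,
                           store: dict, notifications: List[str]) -> bool:
    """Delete automatically if the requester is depicted, then notify the poster."""
    # Tags (or facial recognition) confirm the request relates to the requester.
    if requester not in photo.tagged_users:
        return False  # request does not relate to this data subject
    store.pop(photo.photo_id, None)  # erase the photo from the network
    notifications.append(
        f"To {photo.posted_by}: photo {photo.photo_id} was removed at {requester}'s "
        f"request ({reason}); consider re-posting without the personal data."
    )
    return True


# Example: Ben's photo of Amy is removed at Amy's request.
photos = {"p1": Photo("p1", posted_by="Ben", tagged_users=["Amy"])}
messages: List[str] = []
handle_erasure_request(photos["p1"], "Amy", "unflattering portrayal", photos, messages)
print(messages[0])
```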
Conclusion
Social media is not a fad; it has become a part of the everyday lives of many individuals. The EU is tackling the challenge to privacy posed by the age of social networking. The right to be forgotten, if broadly construed and appropriately implemented, would allow Europeans to take greater control of their personal information. Although the problem of public exposure of indirectly collected information on social networks is only partly a legal one, technical solutions will not develop without legal incentives and sanctions. Human memory fades, but without a right of erasure, social networks will never forget.
Notes
[1] Kathryn Smith graduated from Monash University this year with a BA LLB (Hons), receiving first class honours in her law degree. During her time at Monash, Kathryn completed a semester of law at the Prato Centre in Italy, and studied conflict resolution in Israel and Palestine, for which she was awarded the War and Peace prize from the Australian Centre for Jewish Civilisation. This article is derived from her undergraduate honours thesis which she was selected to present at the inaugural International Conference of Undergraduate Research. Kathryn is currently working as a graduate lawyer in Melbourne.
References
Albrecht, J.P. (2012), Draft Report on the proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) (COM(2012)0011 – C7-0025/2012 – 2012/0011(COD)), Committee on Civil Liberties, Justice and Home Affairs, 2012/0011(COD)
Arnold, B. (2012), 'Privacy, Confidentiality and Data Security', Lexis Nexis database
Article 29 Data Protection Working Party (2009), Opinion 5/2009 on Social Networking, available at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2009/wp163_en.pdf, accessed 9 April 2014
Australian Law Reform Commission (2011), 'For Your Information: Privacy Law and Practice Report 108', available at http://www.alrc.gov.au/publications/report-108, accessed 9 April 2014
Banisar, D. (2008), 'Privacy and Human Rights 2000: An International Survey of Privacy Law and Developments', available at www.privacyinternational.org/survey/phr2000/overview.html, accessed 4 April 2013
Bullas, J. (2012), 'The Latest 27 Social Media Facts, Figures and Statistics', available at http://www.jeffbullas.com/2012/11/28/the-latest-27-social-media-facts-figures-and-statistics-for-2012-infographic/, accessed 4 April 2014
Burton, C., C. Kuner and A. Pateraki (2013), 'The Proposed EU Data Protection Regulation One Year Later; The Albrecht Report', Privacy and Security Law Report (12), 99
Clarke, R. (2004), 'What's 'Privacy'?', available at www.anu.edu.au/people/Roger.Clarke/DV/Privacy.html, accessed 9 April 2014
Commission of the European Communities (2007), 'Communication from the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETS)', available at http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52007DC0228, accessed 9 April 2014
Datainspektionen (2008), 'Every other young person has been offended on the Internet', available at http://www.datainspektionen.se/in-english/every-other-young-person-has-been-offended-on-the-internet/, accessed 9 April 2014
Determann, L. (2012), 'Social Media Privacy: A Dozen Myths and Facts', Stanford Technology Law Review, 7, 1-8
De Terwangne, C. (2012), 'Internet Privacy and the Right to be Forgotten/ Right to Oblivion', IDP Revista de Internet, Derecho Política, 13, 110-121
Druschel, P., M. Backes and R. Tirtea (2011), 'The right to be forgotten – between expectations and practice: report by the European Network and Information Security Agency (ENISA)', available at http://www.enisa.europa.eu/activities/identity-and-trust/library/deliverables/the-right-to-be-forgotten, accessed 4 April 2014
European Commission (2012a), 'How will the data protection reform affect social networks?', available at http://ec.europa.eu/justice/data-protection/document/review2012/factsheets/3_en.pdf, accessed 19 April 2013
European Commission (2012b), 'How does the data protection reform strengthen citizens' rights?', available at http://ec.europa.eu/justice/data-protection/document/review2012/factsheets/2_en.pdf, accessed 19 April 2013
European Economic and Social Committee (2012), 'Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)', available at http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf, accessed 9 April 2014
European Commission Eurobarometer (2011), 'Special Barometer 359 – Attitudes on Data Protection and Electronic Identity in the European Union', available at http://ec.europa.eu/public_opinion/archives/ebs/ebs_359_en.pdf, accessed 9 April 2014
European Data Protection Supervisor (2010), 'Opinion of the European Data Protection Supervisor on Promoting Trust in the Information Society by Fostering Data Protection and Privacy', available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:C:2010:280:0001:0015:en:PDF, accessed 9 April 2014
Facebook (2004), 'Eleventh Amended and Restated Certificate of Incorporation of Facebook, Inc.', available at http://c114872.r72.cf0.rackcdn.com/FACEBOOK%2010012010.pdf, accessed 9 April 2014
Facebook (2012a), 'Data Use Policy', available at https://www.facebook.com/about/privacy/, accessed 17 April 2014
Facebook (2012c), 'Statement of Rights and Responsibilities', available at https://www.facebook.com/legal/terms, accessed 17 April 2014
Facebook (2013a), 'Key Facts', available at http://newsroom.fb.com/content/default.aspx?NewsAreaId=22, accessed 5 April 2013
Facebook (2013b), 'Tagging Photos', available at https://www.facebook.com/help/tag-suggestions, accessed 5 April 2013
Facebook (2013c), 'Accessing your Facebook Data', available at https://www.facebook.com/help/405183566203254/, accessed 17 April 2014
Facebook Ireland (2012), 'Facebook's views on the EU Data Protection Regulation', available at http://www.europe-v-facebook.org/FoI_Facebook.pdf, accessed 9 April 2014
Fleischer, P. (2010), 'Which photographs reveal 'sensitive' personal data?', available at http://peterfleischer.blogspot.com.au/2010/04/which-photos-reveal-sensitive-personal.html, accessed 9 April 2014
Fleischer, P. (2011), 'Foggy thinking about the right to oblivion', available at http://peterfleischer.blogspot.com.au/2011/03/foggy-thinking-about-right-to-oblivion.html, accessed 9 April 2014
Greenleaf, G. (2010), 'Country Studies B.2 – Australia', in Korff, D. (ed.), Comparative Study on Different Approaches to New Privacy Challenges in Particular in Light of New Technological Developments, Country Studies, Report to the European Commission Directorate-General Justice
Henrikson, J. (2011), 'The Growth of Social Media: an Infographic', available at http://www.searchenginejournal.com/the-growth-of-social-media-an-infographic/32788/, accessed 4 April 2014
Johnstone, M. (2007), 'Should Australia force the square peg of privacy into the round hole of confidence or look to a new tort?', Media and Arts Law Review, 12, 441-480
Kern, E. (2012), 'Facebook is collecting your data - 500 terabytes a day', available at http://gigaom.com/2012/08/22/facebook-is-collecting-your-data-500-terabytes-a-day/, accessed 4 April 2014
Lee, D. (2012), 'Facebook surpasses 1 billion users as it tempts new markets', available at http://www.bbc.co.uk/news/technology-19816709, accessed 4 April 2014
Lindsay, D. (2012a), 'The "right to be forgotten" is not censorship', available at http://www.monash.edu.au/news/show/the-right-to-be-forgotten-is-not-censorship, accessed 9 April 2014
Lindsay, D. (2012b), 'The Emerging Right to be Forgotten in Data Protection Law: Some Conceptual and Legal Problems', presentation at the 8th International Conference of Internet, Law and Politics, Barcelona, 9-10 July 2012
Mayer-Schönberger, V. (2009), Delete: The Value of Forgetting in the Digital Age, Princeton, NJ: Princeton University Press
Miller, R. (2009), 'Facebook now has 30,000 servers', available at http://www.datacenterknowledge.com/archives/2009/10/13/facebook-now-has-30000-servers/, accessed 9 April 2014
Office of the Australian Information Commissioner (undated), 'What does it mean to get the consent of all the individuals?', available at http://www.privacy.gov.au/faq/smallbusiness/q3, accessed 9 April 2014
Office of the Australian Information Commissioner (undated), 'What is privacy?', available at http://www.privacy.gov.au/aboutprivacy/what, accessed 9 April 2014
Office of the Data Protection Commissioner of Ireland (2011), 'Facebook Ireland Ltd.: Report of Audit', available at http://dataprotection.ie/documents/facebook%20report/final%20report/report.pdf, accessed 9 April 2014
Owen, T. (2012), 'Facebook users upload 300 million images a day', available at http://www.businessinsider.com/facebook-images-a-day-instagram-acquisition-2012-7, accessed 17 April 2014
Pastukhov, O. (2013), 'The right to oblivion: what's in the name?', Computer and Telecommunications Law Review, 19 (1), 17-23
Reding, V. (2012), 'The EU Data Protection Reform 2012: Making Europe the Standard Setter for Modern Data Protection Rules in the Digital Age', presentation at the Innovation Conference Digital, Life, Design, Munich, 22 January, 2012
Rolph, D., M. Vitins and J. Bannister (2010), Media Law: Cases, Materials and Commentary, Oxford: Oxford University Press
Rosen, J. (2012), 'Symposium Issue: The Right to be Forgotten', Stanford Law Review, 64, 88-92
Sartor, G. (2013), 'Providers' liabilities in the new EU Data Protection Regulation: A threat to Internet freedoms', International Data Privacy, 3, 3-12
South Australian Government, 2011, 'Photographic Images and Privacy – Information Sheet', available at http://www.archives.sa.gov.au/files/privacy_infosheet_photoimages.pdf, accessed 9 April 2014
Svantesson, D. (2007), 'Protecting Privacy on the Borderless Internet – Some Thoughts on Extraterritoriality and Transborder Data Flow', Bond Law Review, 19 (1), 168-187
Tsukayama, H. (2012), 'Your Facebook friends have more friends than you', available at http://articles.washingtonpost.com/2012-02-03/business/35444265_1_facebook-users-photo-tags-friend-requests, accessed 9 April 2014
Witzleb, N. (2009), 'Giller v Procopets: Australia's privacy protection shows signs of improvement', Torts Law Journal, 17, 121-129
Wong, R. (2009), 'Social networking: A Conceptual Analysis of a Data Controller', Communications Law Journal, 14 (5), 142-49
Legislation and related materials
Privacy Amendment (Enhancing Privacy Protection) Act 2012 (Cth)
Explanatory Memorandum, Privacy Amendment (Enhancing Privacy Protection) Bill 2012 (Cth)
Privacy Act 1988 (Cth)
Australian Privacy Foundation, (2012) Submission No 49 to House of Representatives Standing Committee on Social Policy and Legal Affairs, Privacy Amendment (Enhancing Privacy Protection) Bill 2012 (Cth) Submission 030
Facebook, Google, IAB Australia and Yahoo7!, (2012) Submission No 39 to Senate Standing Committee on Legal and Constitutional Affairs, Inquiry into the Privacy Amendment (Enhancing Privacy Protection) Bill 2012
Charter of Fundamental Rights of the European Union 2010 OJ C 83/02
Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data CETS 108 [1981]
Council Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L 281/31.
Council of Europe Committee of Ministers Resolution (73) 22 on the protection of the privacy of individuals vis-à-vis electronic data banks in the private sector ETS 108 [1973]
Council of Europe, Committee of Ministers Resolution (74) 29 on the protection of the privacy of individuals vis-à-vis electronic data banks in the public sector ETS 108 [1974]
Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) 2012/0010(COD)
Cases
Australian Broadcasting Corporation v Lenah Game Meats Pty Ltd [2001] 208 CLR 199
Campbell v Mirror Group Newspapers Limited [2002] All ER (D) 177
Collins v Wilcock [1984] 3 All ER 374
Doe v Australian Broadcasting Corp [2007] VCC 281
Douglas v Hello [2005] EWCA Civ 595
Grosse v Purvis [2003] QDC 151
Hosking v Runting [2005] 1 NZLR 1
Sweden v Bodil Lindqvist (C-101/01) [2003] ECR I-12971
To cite this paper please use the following details: Smith, K. (2014), 'The Right to be Forgotten: Legislating for Individuals to Regain Control of their Personal Information on Social Networks', Reinvention: an International Journal of Undergraduate Research, Volume 7, Issue 1, http://www.warwick.ac.uk/reinventionjournal/archive/volume7issue1/smith Date accessed [insert date].