
OBSCURITY BY DESIGN

Frederic Stutzman
Postdoctoral Fellow, H. John Heinz III College, Carnegie Mellon University
fred@fredstutzman.com

Woodrow Hartzog
Assistant Professor of Law, Cumberland School of Law at Samford University
Affiliate Scholar, Center for Internet and Society at Stanford Law School
whartzog@samford.edu

Design-based solutions to confront technological privacy threats are becoming popular with regulators. But these promising solutions have left the full potential of design untapped. With respect to online communication technologies, design-based solutions for privacy remain incomplete because they have yet to tackle the trickiest aspect of the Internet—social interaction. This essay posits that privacy-protection strategies such as "privacy by design" are hobbled with respect to social interaction due to a lack of focus on collaborative aspects of social software use.

This essay proposes that design solutions for social technologies require increased attention to user interfaces, with a focus on "obscurity" rather than the expansive and vague concept of "privacy." The main thesis of this essay is that obscurity is the natural state for most online social interaction and, as such, should be the locus for design-based privacy solutions for social technologies. The purpose of this essay is to develop a model for "obscurity by design" as a means to address the privacy problems inherent in social technologies.


INTRODUCTION

Privacy by design, that is, “the philosophy and approach of embedding privacy into the design specifications of various technologies,” can help change the law’s largely reactive approach to privacy threats.1

Government and industry are gradually embracing privacy by design and other design-based strategies to protect Internet users.2 But these solutions have thus far not embraced the full potential of design. These design-based strategies for online communication technologies have yet to articulate principles for the design of the most complex aspect of the Internet—social interaction. Simply put, privacy by design has yet to address the social aspect of an inherently social medium.

Currently, privacy by design focuses on the collection and use of data in compliance with fair information practices.3 There are numerous and significant problems with "big data," but the organizations that deal in this data are not the only threat to privacy on the Internet. The ascent of the social web has made it clear that online relationships present their own privacy challenges, as millions regularly disclose vast amounts of personal information for the purpose of socializing.4

1 Ann Cavoukian, Privacy by Design, Info. & Privacy Comm'r, 1 (2009), http://www.ipc.on.ca/images/Resources/privacybydesign.pdf; Ann Cavoukian, Privacy by Design: The Seven Foundational Principles, Info. & Privacy Comm'r (2009), http://www.privacybydesign.ca/content/uploads/2009/08/7foundationalprinciples.pdf.

2 Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers, FTC Report (March 2012), http://ftc.gov/os/2012/03/120326privacyreport.pdf; Directive 95/46/EC, 1995 O.J. (L 281) 31 (Nov. 23, 1995); Art. 29 Data Protection Working Party, 02356/09/EN, WP 168, The Future of Privacy (2009).

4 James Grimmelmann, Saving Facebook, 94 Iowa L. Rev. 1137 (2009); Lauren Gelman, Privacy, Free Speech and Blurry-Edged Social Networks, 50 B.C. L. Rev. 1315 (2009).


Addressing the vexing privacy problems of the social web is a challenging task. Few can agree on a conceptualization of privacy,5 much less how to protect privacy in our social interactions by design.6 There are a number of practical reasons why privacy by design has avoided the social side of the user interface. The translation of regulation to implementation is a complex process and may be more efficient when applied to formal technologies (e.g., databases).7 Additionally, there is little guidance regarding how designers should approach the implementation of privacy by design in a contextually variant, interactional space. Many substantive protections entailed in privacy by design are effectuated on the "back end" of technologies, such as data security through encryption, data minimization techniques, anonymity, and structural protection through organizational prioritization of privacy. However, the design of social technologies also involves "front facing" concerns such as privacy settings, search visibility, password protections, and the ability to use pseudonyms.

The answer to these challenges might lie in refining the goal for the design of social systems. The current goal of design solutions is "privacy," which is too broad and opaque to provide meaningful guidance in designing social technologies. Indeed, one conceptualization of privacy, secrecy, can be seen as antithetical to the notion of social interaction. This essay recommends looking to the related concept of obscurity. Empirical evidence demonstrates that Internet users aim to produce and rely upon obscurity to protect their social interaction.8 The concept of online obscurity, defined here as a context missing one or more key factors that are essential to discovery or comprehension, is a much more defined and attainable goal for social technology designers. Obscurity is more flexible than some conceptualizations of privacy and also more feasible to implement.

5 See, e.g., Daniel Solove, Understanding Privacy (2008); Alan Westin, Privacy and Freedom (1967); Stephen Margulis, On the Status and Contribution of Westin's and Altman's Theories of Privacy, 59(2) Jour. of Social Issues 411 (2003).

6 See Ira Rubinstein, 26 Berkeley Tech. L.J. 1409, 1421 (2011) ("Privacy by design is an amorphous concept.").

7 Seda Gürses, Carmela Troncoso & Claudia Diaz, Engineering Privacy by Design, CPDP 2011, Belgium (2011).

8 See, e.g., Woodrow Hartzog & Frederic Stutzman, The Case for Online Obscurity, 101 Calif. L. Rev. (forthcoming 2013).


Moreover, obscurity involves more than prohibitions on conduct; obscurity can be actively produced by Internet users themselves. The main thesis of this essay is that obscurity is the natural state for most online social communications and, as such, should be the locus for the front-end of design-based privacy solutions for social technologies. The purpose of this essay is to develop the concept of "obscurity by design" as a model for design-based privacy solutions.

Part I of this paper reviews the broader concept of privacy by design, including its strengths, the challenges to its implementation, and its missed opportunity in failing to account for the front-end design of social technologies. Part II sets forth the authors' conceptualization of obscurity, including the four major factors of online obscurity: 1) search visibility, 2) unprotected access, 3) identification, and 4) clarity. This article proposes that the four factors of online obscurity constitute a set of principles that designers should consider when building privacy into social technologies. Finally, Part III proposes a model to implement obscurity by design. This model suggests that obscurity by design can be effectuated through the combination of technologies, policies, and behavioral interventions.

I. PRIVACY BY DESIGN

In recent years, consumer technologies have embraced the broad collection and storage of personal information. Applications such as behavioral advertising, consumer forecasting, and geolocational systems have pushed – and created new – boundaries for the collection of data about users. While industry argues that increased data will lead to better products and predictions, the collection and storage of this data potentially opens consumers, and companies, to novel risks.

Early approaches to protecting information and privacy rights were to punish violators by utilizing torts, statutes, and regulations to levy fines and injunctions. These "reactive" approaches remain in use, but the challenges of web-scale technologies, and the scale of risks such as breach or hacking, require a proactive approach to privacy protection. These modern "design-based" solutions to privacy focus on concepts such as data minimization, security, information policy, and disclosure of information practices. This proactive approach to privacy has crystallized in the privacy by design movement, which seeks to build "the principles of Fair Information Practices (FIPs) into the design, operation and management of information processing technologies and systems."9

1. The History of Privacy by Design

Privacy by design can best be thought of as a technological design framework; when this framework is embraced in the design phase, the resultant technology should embody privacy protection. In this sense, "privacy" is not an afterthought, or a security treatment, but an essential value in the design and construction process.

The modern privacy by design movement can be traced back to Dr. Ann Cavoukian, the Information & Privacy Commissioner of Ontario, Canada. Cavoukian's approach to privacy by design is illustrated in numerous white papers,10 as well as an edited volume of the journal Identity in the Information Society.11 Her approach argues for the inclusion of Fair Information Principles into the design of technologies; these principles include:

1. Recognition that privacy interests and concerns must be addressed proactively;

2. Application of core principles expressing universal spheres of privacy protection;

3. Early mitigation of privacy concerns when developing information technologies and systems, throughout the entire information life cycle—end to end;

4. Need for qualified privacy leadership and/or professional input;

5. Adoption and integration of privacy-enhancing technologies (PETs);

6. Embedding privacy in a positive-sum (not zero-sum) manner so as to enhance both privacy and system functionality; and

7. Respect for users' privacy.12

9 Ann Cavoukian, Privacy by Design, Info. & Privacy Comm'r, 1 (2009), http://www.ipc.on.ca/images/Resources/privacybydesign.pdf; Ann Cavoukian, Privacy by Design: The Seven Foundational Principles, Info. & Privacy Comm'r (2009), http://www.privacybydesign.ca/content/uploads/2009/08/7foundationalprinciples.pdf.

The privacy by design approach has proven to be novel within the privacy community, where much emphasis is placed on privacy-enhancing technologies (PETs) or ex post remedies. Using a process lens, privacy by design argues that privacy is a critical part of the technical infrastructure of technologies, and that privacy is both a value and a tangible component that must be included in technologies. To accomplish this goal, Cavoukian argues that privacy by design should be valued through the organizational hierarchy (e.g., qualified leadership) and that the privacy outcomes should be positive for the user. In a sense, privacy by design provides both process and infrastructure for the inclusion of privacy as both a value and a tangible good in the design of socio-technical systems (as well as organizational practices and physical design, notes Cavoukian).

In reaction to failures of privacy-enhancing technologies or ex post measures as a robust privacy strategy, privacy organizations, government regulators, and industry groups are moving toward privacy by design as a potential information-age remedy to privacy threats. In 2010, the FTC draft framework "Protecting Consumer Privacy in an Era of Rapid Change" strongly encouraged companies to adopt privacy by design approaches to their business and technical operations.13 Later that year, the European Data Protection Supervisor also strongly recommended privacy by design as a legislative requirement – potentially requiring firms to follow privacy by design under threat of fines or other legal action.14 The adoption of privacy by design by regulatory agencies as a guideline or requirement would require organizations to change the way privacy is treated in the design process.

12 See Cavoukian, supra note 9, at 1.

13 Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers, FTC Report (March 2012), http://ftc.gov/os/2012/03/120326privacyreport.pdf.


Such a regulatory move would be noteworthy, as there are a number of challenges to the implementation of privacy by design.

2. Challenges to Privacy by Design

The adoption of privacy by design as a universal approach to privacy has drawn sharp criticism in a range of communities. The general criticisms lie with the incentives and enforcement of privacy by design, the challenges of adopting and applying privacy by design, and the technical hurdles of a privacy by design model of development. While these criticisms are sharp, there is near consensus that privacy by design is a useful way of thinking about the challenges faced by designers of technologies; that is, proactively building privacy into technology is an interdisciplinary problem that involves the coordination of product developers, engineers, legal and policy teams, and executives within an organization. The privacy by design approach helps address these challenges by setting forth values that disparate parts of the organization can embody in the design process. As critics will note, this is often easier said than done.

As outlined by Ira Rubinstein, two of the primary challenges facing privacy by design are a weak specification of the approach and a lack of incentives for firms to adopt it. As we address specification later in this section, here we concentrate on Rubinstein's question of incentives. Rubinstein considers why firms would adopt privacy by design (as well as PETs), exploring endogenous (to the firm) motivation, market demand, and regulatory potential. On the question of endogenous motivation, firms are differentially motivated towards privacy based on the data collected, tolerance of risk, and economic impact of privacy breaches. Therefore, motivation as an endogenous trait is not uniformly distributed across firms. Rubinstein then questions consumer valuation of privacy and PETs, arguing that there is little market demand for privacy goods (even non-zero-sum goods). Finally, Rubinstein explores the potential for regulatory enforcement, finding the capability to enforce privacy by design to be premature due to challenges in establishing consent orders based on privacy by design language.

As Cavoukian notes, the premise of privacy by design is to construct technologies that embody the principles of Fair Information Practices.


The roadmap to the creation of these technologies is not one that can be directly specified, in the sense that there is a linear set of steps to follow. This is the specification problem described by Rubinstein.15 The design of a product (specifically, software) requires the translation of requirements (e.g., semantic descriptions of functionality) into code that can be compiled and executed. In the context of a software product team, such a translation can be facilitated when requirements are precise, and product managers know the limits and capabilities of designers. However, even in the context of highly skilled teams, the requirements engineering phase of product design is non-trivial. When there is regulatory oversight of a process or design, new requirements engineering challenges emerge.16 Regulatory requirements are often vague, describing a generic process that can apply to many different types of systems. Ensuring compliance with such a process is highly challenging, as evidenced by Breaux and Anton.17 As the privacy by design specifications are inherently generic (which makes them flexible), the translation of these requirements into design is a significant challenge for adoption.

Finally, we call on Rubinstein's taxonomy of front-end and back-end technologies when describing the components of a system. Rubinstein's point is clear and important – systems are multi-faceted and the user experience has many different components. Systems are commonly not built as a cohesive whole, but as parts that are placed together to accomplish a goal. It is important to think about how the privacy risk model varies for different parts of the system. For example, a website might have a front end (the website itself) as well as a back end (the data store). The risk model for these two components differs in that privacy attacks or problems can vary substantially. A formal system, such as a database, has a known universe of threats that can be guarded against systematically. A front end, on the other hand, may invoke a range of threats, from the social to the technical. The heterogeneity of these threats makes it harder to apply formal privacy logics, leading to a potentially greater propensity to design privacy for formal systems.18

15 See Rubinstein, supra note 6.

16 Travis D. Breaux & Annie I. Anton, Analyzing Regulatory Rules for Privacy and Security Requirements, 34 IEEE Transactions on Software Engineering 5 (2008).

17 Id.

18 S. Gürses, C. Troncoso & C. Diaz, Engineering Privacy by Design, International Conference on Privacy and Data Protection (CPDP), Belgium (2011); S. Spiekermann & L.F. Cranor, Engineering Privacy, 35(1) IEEE Transactions on Software Engineering 67 (2009).

Thus, the many challenges to a large-scale adoption of privacy by design are significant, encompassing challenges to the demand, feasibility, and technical capacity to adopt these processes. As Dourish and Anderson note, however, privacy is a unique challenge as it encompasses challenges both endogenous and exogenous to the technology.19 This is certainly the case in social media, where individuals interact equally with systems, and with others mediated through systems. This dense intermingling raises privacy challenges that have not been seen before in other interactive technologies. For this reason, we use social media as the case we examine in the remainder of this paper.

Externally, conceptualizing privacy within the context of social technologies in a way that is workable for design-based solutions has proven elusive.20 As previously mentioned, there is no general agreement on what the term "privacy" means in a social context, much less how Internet design can protect it.21 While many scholars and regulators have agreed that "back end" protections, such as those provided for in the fair information practices,22 are critical design-based protections, these background safeguards fail to address the "front end" or social aspect of the Internet.

Social interaction is messy, unpredictable, and contextual with a vengeance. Consequently, any design rules or guidelines seem destined either to be inconsistently effective or to miss the mark entirely. But the social web is now too large to exclude from the realm of design-based solutions.

19 P. Dourish & K. Anderson, Collective Information Practice: Exploring Privacy and Security as Social and Cultural Phenomena, 21(3) Human-Computer Interaction 319 (2006).

20 See, e.g., Lauren Gelman, Privacy, Free Speech and Blurry-Edged Social Networks, 50 B.C. L. Rev. 1315 (2009); Hartzog & Stutzman, supra note 8.

21 See supra note 20.

22 The FTC has identified the major substantive principles of privacy by design as data security, reasonable collection limits, sound retention practices, and data accuracy. Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers, FTC Report (March 2012), http://ftc.gov/os/2012/03/120326privacyreport.pdf.


Social network sites like Facebook have hundreds of millions of users.23 Even commercial and news websites are incorporating social aspects into their user experience.24 Thus, the time has come for design guidelines to protect privacy in this social medium.

II. BETTER LIVING THROUGH OBSCURITY

Most conceptualizations of privacy on the Internet seem to break down at the social level.25 The concept of privacy is simply too contextual and vague to meaningfully direct the relevant stakeholders in design-based decisions to protect Internet users. Instead, this essay proposes that general design principles to protect users of social technologies should be based on the concept of obscurity. This part explores the concept of online obscurity and why it is the ideal front-end design principle for online communication technologies like social network sites.

Obscurity is easier to refine and implement than the broader concept of privacy. Where the pursuit of "privacy" in design often seems like a quest for near-perfect protection, the goal of designing for obscurity is that it be good enough for most contexts or a user's specific needs. Protection is achieved via obscurity not necessarily through the strength of the "armor," but rather through a significant reduction in the probability of discovering or understanding information. Obscurity is a more nuanced and accurate reflection of the expectations of users of social technologies than the broader, and potentially misleading, concept of privacy.26

23 It is anticipated that Facebook will have 1 billion users by August 2012. Todd Wasserman, Facebook to Hit 1 Billion User Mark in August, Mashable (Jan. 12, 2012), http://mashable.com/2012/01/12/facebook-1-billion-users/.

24 See Community Forums, Amazon.com, http://www.amazon.com/forum/community?_encoding=UTF8&cdOpenPostBox=1 (last accessed May 1, 2012); Community Guidelines, The Nation, http://www.thenation.com/community-guidelines (last accessed May 2, 2012).

25 See, e.g., Lior Strahilevitz, A Social Networks Theory of Privacy, 72 U. Chi. L. Rev. 919 (2005); James Grimmelmann, Saving Facebook, 94 Iowa L. Rev. 1137 (2009); Lauren Gelman, Privacy, Free Speech and Blurry-Edged Social Networks, 50 B.C. L. Rev. 1315 (2009).

26 Hartzog & Stutzman, supra note 8.


1. The Concept of Obscurity

Obscurity is a simple concept, reflecting a state of unknowing. For an individual to be obscure, an observer must not possess critical information that allows one to make sense of the individual. Such information includes identity, social connections, and other personal information. Without this information, observers are limited in their ability to make sense of the observed person's actions and utterances. If a passenger on an airplane gossips over the phone in the presence of other passengers, the gossip is obscure unless the other passengers know the subject of the gossip; the passengers must be able to draw on unspoken contextual information to make sense of the utterance. This unspoken contextual information enables what some have referred to as "presupposition."27 In everyday interaction, we often find ourselves in zones of obscurity, where our identity and personal context are unknown to those with whom we interact or share common space.

We argue that obscurity is the natural state for most online communication for two reasons. First, in everyday interaction, being in a zone of obscurity is commonplace; we are therefore conditioned to expect obscurity in other domains where we spend time, both physical and virtual.28 Second, obscurity is a protective, privacy-enhancing state where we are guarded by an observer's inability to completely comprehend our actions; the concept is both flexible and useful.

The choice to disclose online is the product of a complex and highly contextual decision-making process, where risks are weighed against the potential reward of disclosure.29 It is normal to expect obscurity in everyday life. When we stroll down the street, we do not expect to be identified by all passers-by; indeed, we expect to be obscure in the eyes of these observers. With the rise of peer-produced online content, it is now just as clear that our expectation of obscurity transfers online.

27 Erving Goffman, Felicity's Condition, 89 Am. Jour. of Soc. 1 (1983).

28 See, e.g., Sandra Petronio, Boundaries of Privacy: Dialectics of Disclosure (2002); Irwin Altman, The Environment and Social Behavior (1975).

29 See, e.g., Joseph B. Walther, Selective Self-Presentation in Computer-Mediated Communication: Hyperpersonal Dimensions of Technology, Language, and Cognition, 23 Computers in Human Behavior 2538 (2007).


Empirical research demonstrates that individuals exert control over the information disclosed online by limiting the audience of the disclosure, by bounding the meaning of the disclosure, and by adapting the disclosure to a particular website.30 In social network sites, where the use of anonymity would violate norms and limit benefits attained from site use, individuals strategically develop techniques that effectively produce obscurity in disclosure.31 Interacting with both rules and norms, obscurity is flexibly – and reflexively – created in sites that we would consider highly identified. Even in remarkably nonymous contexts such as Facebook, individuals rely on obscurity as an important aspect of managing both identity and privacy.32

Contrary to the powerful popular discourse that argues that individuals online have essentially different privacy and notoriety goals, our previous work demonstrated that online obscurity is a crucial aspect of privacy for Internet users.33 Through obfuscation techniques and other normative practices, it is clear that obscurity is both desired and expected online. Internet users routinely hide information by making it invisible to search engines, using pseudonyms and multiple profiles, and taking advantage of privacy settings.34

30 See, e.g., Frederic Stutzman & Woodrow Hartzog, Boundary Regulation in Social Media, Paper presented at AOIR 10: Association of Internet Researchers Annual Meeting, Milwaukee, WI (2009), http://ssrn.com/abstract=1566904 (last accessed May 1, 2012); Martin Tanis & Tom Postmes, Social Cues and Impression Formation in CMC, 53 Jour. of Comm. 676 (2003); Amanda Lenhart & Mary Madden, Teens, Privacy and Online Social Networks: How Teens Manage Their Online Identities and Personal Information in the Age of Myspace, Pew Internet & American Life Project (Apr. 18, 2007), http://www.pewinternet.org/PPF/r/211/report_display.asp; Amanda Lenhart, Adults and Social Network Websites, Pew Internet & American Life Project (Jan. 14, 2009), http://www.pewinternet.org/PPF/r/272/report_display.asp; Amanda Lenhart, Kristen Purcell, Aaron Smith & Kathryn Zickuhr, Social Media and Young Adults, Pew Internet & American Life Project (Feb. 3, 2010), http://pewinternet.org/Reports/2010/Social-Media-and-Young-Adults.aspx.

31 Joan M. DiMicco & David R. Millen, Identity Management: Multiple Presentations of Self in Facebook, GROUP '07: Proceedings of the 2007 International ACM Conference on Supporting Group Work 383 (2007); Frederic Stutzman & Woodrow Hartzog, Boundary Regulation in Social Media, in Proceedings of the ACM Conference on Computer Supported Cooperative Work.


In short, users produce obscurity online. Thus, obscurity is the ideal locus for design-based solutions that empower users to produce and exist in their own privacy-protective contexts.

2. The Four Principles of Online Obscurity

Previous research has offered a clear definition of online obscurity: information is obscure online if it exists in a context missing one or more key factors that are essential to discovery or comprehension.35 We have identified four of these factors as part of a non-exhaustive and flexible list: 1) search visibility, 2) unprotected access, 3) identification, and 4) clarity. The presence of these factors diminishes obscurity, and their absence enhances it.

This essay conceives of online obscurity as a form of everyday obfuscation. Thus, the proper metaphor is the key and lock; to understand encountered information (i.e., release the lock), one must possess context (the key or keys) that renders the information un-obscure, or obvious. This metaphor is likely well suited to technical design given the existing reliance on the digital version of the key: the password. In essence, there is more than one key that can lock information. Indeed, many kinds of keys and locks, each with varying strengths, exist and, considered cumulatively, fall along a spectrum that will allow regulators and organizational stakeholders to adopt guiding principles regarding the protection of online information based on the production of obscurity. Obscurity could provide organizational stakeholders a set of "starting points" which could be applied flexibly via design-based solutions across culture and context. Consider how each of the four factors of obscurity can be approached through design.
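As a rough illustration of how the four-factor model might be operationalized in code, consider the following minimal sketch in Python. The class, field, and method names are hypothetical, and the boolean treatment of each factor simplifies what would in practice be continuous, contextual judgments.

from dataclasses import dataclass

@dataclass
class ObscurityContext:
    """Presence or absence of the four obscurity factors for one piece of content."""
    search_visible: bool      # locatable through internal or external search
    unprotected_access: bool  # no password, privacy setting, or other gate
    identified: bool          # content linkable to the discloser's person
    clear: bool               # comprehensible without shared context

    def obscurity_score(self) -> int:
        """Count absent factors; the more keys an observer lacks, the more obscure."""
        factors = [self.search_visible, self.unprotected_access,
                   self.identified, self.clear]
        return sum(1 for present in factors if not present)

# A pseudonymous post behind privacy settings and hidden from search:
post = ObscurityContext(search_visible=False, unprotected_access=False,
                        identified=False, clear=True)
print(post.obscurity_score())  # 3 of 4 factors absent -> relatively obscure

On this reading, designing for obscurity means giving users levers over each factor rather than a single on/off "privacy" switch.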

a. Search Visibility

Search visibility is the degree to which individuals, and the content they produce, are locatable and accessible through search. Search invisibility is one of the most significant factors in online obscurity because it is the primary method for discovering online information.36 Without search, information can only be discovered in a chain-hyperlink fashion via other websites, messages, and manual URL entry.

34 Id.

35 Id.

In many ways, search invisibility is already the default for most online information.37 Search invisibility can be achieved by intentionally shielding websites from search engines using the robots.txt file, as well as by using privacy settings or other access restrictions such as passwords, which are another factor in online obscurity.38

36 See, e.g., Gary Marchionini, Exploratory Search: From Finding to Understanding, 49 Communications of the ACM 41 (2006); Jamie Teevan, Susan T. Dumais & Eric Horvitz, Potential for Personalization, 17 ACM Trans. Computer-Human Interact. 1 (2010); Deborah Fallows, Search Engine Use, Pew Internet & American Life Project (Aug. 6, 2008), http://www.pewinternet.org/Reports/2008/Search-Engine-Use/Data-Memo.aspx; Lee Rainie, Big Jump in Search Engine Use, Pew Internet & American Life Project (Nov. 20, 2005), http://www.pewinternet.org/Reports/2005/Big-jump-in-search-engine-use/Data-Memo.aspx; Susannah Fox, Search Engines, Pew Internet & American Life Project (July 3, 2002), http://www.pewinternet.org/Reports/2002/Search-Engines/Data-Memo.aspx.

37 This information, collectively known as "the dark Web," "the deep Web," or "the invisible Web," accounts for 80-99% of the World Wide Web. See, e.g., Michael Bergman, The Deep Web: Surfacing Hidden Value, 7 Jour. of Elec. Publ'g (2001), http://quod.lib.umich.edu/cgi/t/text/text-idx?c=jep;view=text;rgn=main;idno=3336451.0007.104 ("Since they are missing the deep Web when they use such search engines, Internet searchers are therefore searching only 0.03% — or one in 3,000 — of the pages available to them today."); Andy Beckett, The Dark Side of the Internet, The Guardian (Nov. 26, 2009), http://www.guardian.co.uk/technology/2009/nov/26/dark-side-internet-freenet; Russell Kay, Deep Web, Computerworld (Dec. 15, 2005), http://www.computerworld.com/s/article/107097/Deep_Web ("[M]ore than 500 times as much information as traditional search engines 'know about' is available in the deep Web."); Danny Devriendt, Data is Gold – 91,000 Terabytes of Uncharted Web: Welcome to the Dark Side, Porter Novelli Blog (Apr. 4, 2011), http://blog.porternovelli.com/2011/04/11/data-is-gold-%E2%80%93-91000-terabytes-of-uncharted-web-welcome-to-the-dark-side/ ("The dark Web, or hidden Web is approximately 550 times bigger than the Web you experience daily."); Norm Medeiros, Reap What You Sow: Harvesting the Deep Web, 18 OCLC Sys. & Serv. 18 (2002); Yanbo Ru & Ellis Horowitz, Indexing the Invisible Web: A Survey, 29 Online Info. Rev. 249 (2005); see also Chris Sherman & Gary Price, The Invisible Web: Uncovering Information Sources Search Engines Can't See (2001); Paul Pedley, The Invisible Web: Searching the Hidden Parts of the Internet (2001).

38 For example, the popular blogging service Blogger allows users to make their blog invisible to Google. What do the 'listing' and 'let search engines find your blog' settings do?, Blogger Help, http://www.google.com/support/blogger/bin/answer.py?hl=en&answer=41373 (last accessed April 27, 2011). Facebook profiles that utilize privacy settings are also not found by search engines. How Do I Prevent Search Engines (e.g., Google) From Showing My Public Search Listing?, Facebook, https://www.facebook.com/help/?page=764#!/help/?faq=12043 (last accessed May 6, 2011); see also Jonathan Zittrain, Privacy 2.0, 2008 U. Chi. Legal F. 65, 102 (2008) ("Today, nearly all Web programmers know robots.txt is the way in which sites can signal their intentions to robots, and these intentions are voluntarily respected by every major search engine across differing cultures and legal jurisdictions.").

Search is a primary and common vector for the discovery of individual content; designers should consider offering controls over inclusion in both internal and external search services. For example, some people may want their profile to appear in Google, while others would prefer to be "searchable" only one or two network degrees out (e.g., by friends-of-friends). Designers may also consider offering various levels of search engine obfuscation, where only certain aspects of the profile are placed into search, or where search placement is manipulated to raise or lower the placement of results. Design options are discussed in greater detail in Part III below.
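One hedged sketch of how a designer might honor these preferences follows: a per-user visibility setting drives both an HTML robots meta directive (the voluntary signal search engines respect, like the robots.txt convention noted above) and a network-degree filter on internal search. The level names and helper functions are hypothetical illustrations, not an existing platform API.

# Hypothetical search-visibility levels a profile owner could choose among.
PUBLIC, FRIENDS_OF_FRIENDS, FRIENDS_ONLY, HIDDEN = range(4)

def robots_meta_tag(level: int) -> str:
    """External search: emit a noindex directive unless the profile is public."""
    if level == PUBLIC:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, nofollow">'

def visible_in_internal_search(level: int, degrees_from_searcher: int) -> bool:
    """Internal search: scope results by the searcher's network distance."""
    if level == PUBLIC:
        return True
    if level == FRIENDS_OF_FRIENDS:
        return degrees_from_searcher <= 2
    if level == FRIENDS_ONLY:
        return degrees_from_searcher <= 1
    return False  # HIDDEN: never surfaced in search results

# A friends-of-friends profile is findable by a searcher two hops away,
# but it signals external engines not to index it.
assert visible_in_internal_search(FRIENDS_OF_FRIENDS, degrees_from_searcher=2)
print(robots_meta_tag(FRIENDS_OF_FRIENDS))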

b. Unprotected Access

Access restrictions such as passwords and privacy settings limit who can view information, preserving the obscurity of the information. Conversely, unfettered access to information, particularly when technologies like privacy settings are available but unused, can have the opposite effect on obscurity, leaving the information exposed and, in some instances, subject to being indexed and aggregated.

There are many different kinds of access controls, including biometrics, encryption, privacy settings, and passwords. These controls can provide for user control over several variables, including the content shared, the specifics of the potential audience, or both. As ubiquitous computing systems change and adoption increases, dynamically generated access controls are likely to evolve – controls that are reactive to the environment and network configurations within the environment.39
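A minimal sketch of such audience scoping follows, assuming a simple three-level privacy setting; the names are illustrative, and a production system would also need to handle groups, blocking, and revocation.

from enum import Enum

class Audience(Enum):
    PUBLIC = "public"      # anyone may view
    FRIENDS = "friends"    # only confirmed connections may view
    ONLY_ME = "only_me"    # only the owner may view

def may_view(audience: Audience, viewer_is_friend: bool,
             viewer_is_owner: bool) -> bool:
    """Gate access to content by the audience its author selected."""
    if viewer_is_owner or audience is Audience.PUBLIC:
        return True
    if audience is Audience.FRIENDS:
        return viewer_is_friend
    return False  # ONLY_ME: everyone but the owner is refused

assert may_view(Audience.FRIENDS, viewer_is_friend=True, viewer_is_owner=False)
assert not may_view(Audience.ONLY_ME, viewer_is_friend=True, viewer_is_owner=False)

For obscurity, the point is less the strength of the gate than that a gate exists at all: even a modest access restriction removes the "unprotected access" factor and signals that the information is not for general consumption.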


39 See, e.g., Maomao Wu, Adaptive Privacy Management for Distributed Applications (June 2007) (Ph.D. dissertation, Lancaster University), http://eprints.lancs.ac.uk/12984/1/PhdThesis-MaomaoWu.pdf; Giovanni Iachello & Jason Hong, End-User Privacy in Human-Computer Interaction, 1(1) Foundations & Trends in Human-Computer Interaction 137 (2007).


Along with search visibility, access controls are one of the most important factors for the creation of online obscurity. Consequently, they should be considered bedrock tools for designers embracing the principles of obscurity.

c. Identification

Identification refers to the degree to which individuals are identified through personal and interpersonal disclosures in online settings. Identification is defined here as the existence of an irrefutable piece of information that links online content to the individual's person. Of course, identification is a concept with an impact far beyond obscurity. It is one of the major components of general information privacy law.40 Information that cannot be linked to a person poses a reduced threat to that person's privacy. The relationship between anonymization and identity is nuanced.41 The concept of identification and its relationship to obscurity is similarly complex.

While many PETs and other design strategies focus on anonymity, obscurity is much more concerned with the use of pseudonyms and ID variants, given their utility in socialization.42 Like passwords, ID variants and pseudonyms can serve two functions: 1) they can somewhat de-link content and identity to protect the discloser or subject of information, and 2) readily apparent ID variants and pseudonyms can serve as signals to the recipient of information that the identity of the discloser is sensitive or private.
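The first, de-linking function can be sketched as a display rule that keeps whatever verified identity the service holds separate from the handle rendered alongside content; the class and field names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Profile:
    registered_name: str  # held by the service, never rendered with content
    pseudonym: str        # the identity other users see

    def byline(self) -> str:
        """Attribute content to the pseudonym only, weakening the link
        between the post and the account holder's offline identity."""
        return f"posted by {self.pseudonym}"

author = Profile(registered_name="Jane Doe", pseudonym="quiet_heron")
print(author.byline())  # posted by quiet_heron

The second, signaling function falls out of the same design: a visibly invented handle like "quiet_heron" tells recipients that the discloser treats her identity as sensitive.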

Social technologies present multiple challenges to identity management. For example, on social network sites, where the articulation of the social network is a key feature, identification can occur through both direct and indirect disclosures.43

40 See, e.g., Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. Rev. 1701 (2010); Paul M. Schwartz & Daniel J. Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 N.Y.U. L. Rev. 1814 (2011).

41 Id.

42 See, e.g., Rubinstein, supra note 6.

Users maintaining a pseudonymous profile may become publicly identifiable based on whom the individual connects to, or what a friend writes on the individual's wall.44 Therefore, designers should be aware that the intention of the individual in protecting her or his identity extends beyond disclosure, to the management of disclosures about the individual, and the selective crafting of the online persona.

d. Clarity

Removing clarifying information from a disclosure renders it more obscure, and thus, more protected.45 If information is too vague or incomplete to understand, it lacks clarity.46

Unlike identification, which focuses on the link between identity and information, clarity focuses on the link between content and some other external factor. Many kinds of information can be removed from online disclosures to create obscurity. Consider everyday communication, where shared interpersonal knowledge and linguistic styles facilitate interpersonal communication. It is because of the sharing of knowledge within groups that we can "presuppose" in conversation, as Goffman argues.47 For the purposes of the argument, information that provides clarity can be conceptualized as the range of shared social, cultural, and linguistic factors that enable presupposition. The previously mentioned airplane passengers overhearing another passenger's phone conversation

43 Judith S. Donath & danah m. boyd, Public Displays of Connection, 22(4) BT Tech. Jour. 71 (2004).

44 See, e.g., supra note 43.

45 See, e.g., Frederic Stutzman & Woodrow Hartzog, Boundary Regulation in Social Media, Paper presented at AOIR 10: Association of Internet Researchers Annual Meeting, Milwaukee, WI (2009), http://ssrn.com/abstract=1566904 (last accessed April 23, 2011).

46 Clarity, Macmillan Dictionary, http://www.macmillandictionary.com/dictionary/american/clarity#clarity_3 (last accessed April 17, 2011).

47 Erving Goffman, Felicity's Condition, 89 Am. Jour. of Soc. 1 (1983).
