
Designing a Privacy-Enhancing Smart Camera System that Empowers Underserved Users

Current smart home devices prioritize the needs of a primary user/owner. Yet these devices inevitably impact the privacy of people nearby, such as family, friends, neighbors, and domestic workers. These vulnerable and underserved users interact with smart home devices with relatively little or no direct awareness, consent, access, or benefits. This project responds to this pressing problem through an in-depth research and design inquiry. 

EMPLOYER

University of Washington's Interaction Design Department, on a part-time contract basis 

 

TEAM

Asst. Prof. James Pierce, PI and Design Lead

Robyn Anderson, Research Lead

Jackson Jiang, Interaction Design

Wang Jiang, Industrial Design 

Claire Weizenegger, Research Support

TIMELINE

Design: 4 weeks

Testing: 2 weeks

Analysis & Reporting: 1 week

MY ROLE

I was hired by the University of Washington's Interaction Design department to lead a series of studies exploring issues of privacy, trust, and ethical challenges related to the use of interactive smart home cameras. This case study summarizes the first phase of an ongoing, multi-part research initiative funded by the National Science Foundation. My responsibilities include defining objectives, developing research protocols, running recruitment, writing interview discussion guides, preparing user interface prototypes in Figma, moderating interviews, analyzing interview data, summarizing findings, and contributing to academic paper submissions.

EXPERTISE

  • Evaluative research

  • Concept evaluation

  • Usability testing 

  • Prototype refinement

  • Discussion guide design

  • Semi-structured interviews

  • Analysis and reporting 

  • Project management

Background

Current smart home devices prioritize the needs of a primary owner/user, yet these devices impact the privacy of people nearby, such as family, friends, neighbors, and domestic workers. These adjacent subjects/users interact with smart home devices with relatively little or no direct awareness, consent, access, or benefits. While research on bystander [2,3], non-primary [4], and adjacent user privacy [5] has emerged in response to these issues, interaction designers in academia and industry have yet to propose concrete design solutions aimed at encouraging primary users to extend control, transparency, trust, consent, cooperation, autonomy, and respect to adjacent users.

Digital privacy discourse tends to focus on relationships between individuals (users, customers, subjects) and powerful, surveillant organizations such as companies, third-party advertisers, governments, and cybercriminals. This dimension of privacy has been referred to as vertical privacy [1]. Vertically, there is a significant power imbalance, owing in part to a structural relation in which the surveilling organization is either hidden “behind the scenes” (e.g., a third-party advertiser or cybercriminal) or in a position of extreme authority (e.g., a corporate service provider or government agency).

Horizontal privacy, on the other hand, involves interpersonal, often face-to-face relations between peers such as family members, friends, neighbors, landlords, tenants, and domestic workers. Horizontal relations often entail power imbalances, such as between a parent and child, employer and nanny, or landlord and tenant. However, these relationships often involve some level of pre-existing trust, cooperation, and social exchange. When a horizontal relation is especially skewed, it is sometimes referred to as a diagonal relation. Diagonal relations are not as distant and authoritative as vertical relations, yet they share elements of power imbalance and impersonal social relation.


Our research finds that smart home cameras pose significant privacy harms along the horizontal dimension, particularly between primary users who own and operate devices and adjacent users who have little or no direct awareness, consent, control, or benefit from smart devices that may nonetheless affect and harm them. Given this, we ask ourselves and the broader design community the following questions.

How might we design a smart home camera system that enables and encourages primary users to extend control, transparency, trust, consent, cooperation, autonomy, and respect to adjacent users?

What if companies and primary users prioritized the needs of non-paying and not legally protected adjacent users: neighbors, tenants, guests, children, domestic workers, passersby, and others?

What if new products and services, and their interfaces, encouraged and helped shape new social norms and expectations in more inclusive, cooperative, and privacy-enhancing ways?

What if smart cameras and other sensing devices become so ubiquitous and invasive that such designs are demanded?

Meet Arca: A Smart Camera for You and Your Extended Household


Goals and Objectives

The research featured in this case study is part of an ongoing series of user studies led by James Pierce, PhD, on adjacent user privacy and smart devices. Prior to this study, James' team evaluated early-stage Arca prototypes with 10 participants. Learnings from that initial round of testing informed the latest Arca prototype explored in this case study. The goals listed below describe the overarching aims of this ongoing research, while the objectives describe the aims of this particular study.

Goals

Deeply understand issues connected to adjacent user privacy, everyday surveillance, and smart devices

Design and test plausible solutions that improve privacy, trust, transparency, control, and inclusion

Generalize design principles, patterns, and problem-framings, and identify limitations of technology and interfaces to address these issues

Create prototypes that challenge dominant social norms and primary-user-centric design philosophies that prioritize the needs of individuals and paying customers over adjacent users




Objectives

Concept evaluation: assess whether our design features are useful, applicable, or otherwise valuable to primary and adjacent users

Exploratory study: more generally understand participants’ preferences, experiences, and concerns around privacy, trust, control, inclusion, and social relationships with regard to smart camera devices

Preliminary usability testing: assess whether Arca's specific design features are intuitive and, where necessary, learnable

Testing Metrics


Stakeholders and Scenarios


Primary Owners/Users 

The primary owner/user is typically the purchaser of the smart device and corresponding subscription plan. As these devices become increasingly affordable, smart home cameras are being used by people across the gender, race, and socioeconomic spectrum.


Adjacent Subjects/Users 

Adjacent subjects/users may interact with smart devices but do so with relatively little or no direct awareness, consent, access, control, or benefits. They typically include the family, friends, neighbors, and hired help of primary owners/users.  


Use Case Scenarios

To inform our design ideation, we iteratively developed many use cases and scenarios. These scenarios allowed us to imagine a diversity of contexts in which better privacy features might be desired. Download the complete booklet to learn about use cases for household members, guests, neighbors, and domestic workers. 

Prototypes

The Arca prototypes featured in this case study were designed by James Pierce, Jackson Jiang, and Wang Jiang of UW's Interaction Design department. These prototypes are the latest in a series of designs the team has been iterating on since summer 2022. Click through the slide deck below for a look at Arca's features, including true on/off, privacy modes, shared access, and others.


Data Collection

Participants

6 participants

All primary users, 3 of whom also had adjacent user experiences: a nanny, an Airbnb guest, and a surveilled neighbor

Methods

Concept evaluations conducted via 60-minute remote semi-structured interviews over Zoom


Tools

Zoom

Figma 

Google Docs

Otter.ai

 

Recruitment Criteria

To participate in our study, respondents to our intake survey needed to meet either or both of the following criteria:

  • (Adjacent User) People who have been monitored by a smart camera in a domestic setting

  • (Primary User) People who currently use, or have used, a smart camera to monitor a domestic setting

Recruitment Channels

  • Existing pipeline of participants from previous smart device studies conducted by this team


  • Social media channels: Nextdoor and Facebook

  • Paid ads on Craigslist and similar pages

  • Flyer placement around town


Findings

Comprehension

Do participants understand the intended use and functionality of the prototype? Are they able to correctly identify and describe key features without help?

All participants grasped the basic use and functionality of Arca’s main control panel, including the multiple camera feeds, separate video and audio controls, and display of shared users and active events. This granular level of visibility and control was exciting and empowering to many participants, prompting favorable comparisons to other less customizable smart camera applications. For example, one participant highlighted the value of the in-app and on-device audio indicators we introduced: “I find it helpful to have more control over whether you want to hear the audio or not, because in my case I almost never wanna hear the audio” (P4).

Across all participants, the design intent of the LED indicators was strongly validated. Participants found true off to be very intuitive, even without any instruction. They were also able to correctly guess and immediately understand which LED indicator referred to the mic, which referred to the camera, and which states indicated true on and true off. As expected, participants could not intuit the meaning of the partial and strong privacy indicators from the LED lights alone. But as we hoped, once we introduced these features, the indicators were easily learned and regarded as intuitive.

Some participants wondered where Arca's recordings “lived” and desired more information about storage, access, and duration of recordings: “Where's this stored? Is it being deleted, or is it permanently being stored somewhere?” (P3). This and related responses encourage us to continue developing other features we have begun to explore, such as better “data switching” controls for managing the storage and duration of content.
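One direction for such “data switching” controls can be sketched in code. The snippet below is a purely hypothetical illustration (the `RETENTION` settings and `is_expired` helper are our own names, not part of the tested prototype) of a per-camera retention policy that would answer P3's questions about where footage lives and when it is deleted:

```python
from datetime import datetime, timedelta

# Hypothetical per-camera "data switching" settings: where recordings
# are stored, and how long they are kept before auto-deletion.
RETENTION = {
    "storage": "local",            # "local" or "cloud"
    "keep_for": timedelta(days=7), # recordings older than this are deleted
}

def is_expired(recorded_at: datetime, now: datetime) -> bool:
    """A recording past its retention window is scheduled for deletion."""
    return now - recorded_at > RETENTION["keep_for"]
```

Surfacing settings like these directly in the control panel would make storage and duration visible rather than leaving them implicit.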

Value

Can the participant envision using Arca's novel features?  In what ways do Arca's features and functionality add value to the user's experience (i.e., use cases and scenarios)?  

Of Arca’s features, Partial Privacy and Unmasking were the most intriguing to participants. Most participants had never before encountered or considered this type of functionality. Some speculated that Arca’s privacy settings might encourage them to extend their camera usage to areas of the home typically considered too private for surveillance, like the living room: “I could see [partial privacy] being intriguing, or like enticing to get people more comfortable, having cameras in their homes like how I was saying, like, we’re not comfortable having a camera in the living room [currently]” (P5). Privacy modes and unmasking offered the most fertile ground for discussion.

We found that strong privacy seemed to hold the least value and presented the most uncertainty. As one participant explained, “I’m going to want to see what happened!” (P1). This suggests extreme reluctance to give up the ability to watch an event, particularly when the device sends an abstract notification that something has just happened. We continue to suspect that strong privacy can be highly applicable in contexts such as Airbnbs, workplaces, and domestic settings when properly configured with Rules and Exceptions. However, we have not yet tested Rules and Exceptions with participants to gauge whether they indeed improve the usefulness of this feature.

Overall, unmasking sparked significant interest, and all participants identified at least one use case where they could imagine using it. One participant, for instance, identified value in the “double accountability of unmasking that prompts conversations between users” (P4). However, we also found evidence that the unmasking feature could lead to a false sense of privacy and security. One participant stated that the feature only truly works if all actors are informed about its capabilities; otherwise, they felt it was deceptive. “If you were saying to people like, ‘Oh, you're not being recorded right now’ ... and then, later on, you could uncover that and actually see what was said or what was happening ... that makes me a little uncomfortable .... unless everybody knew that that could be a possibility ... it's a little deceptive” (P5). On the one hand, this confirms our intent that partial privacy has limited utility unless Shared Access is in use. On the other hand, this finding highlights that features like Partial Privacy have notable limitations for protecting the privacy of users excluded from Shared Access.


Attitudes

How do participants feel about Arca? In what ways does the Arca prototype impact users' attitudes about privacy, trust, control, and social tensions?

Many participants said that Arca’s interface design prompted them to consider the privacy needs of adjacent users when they might not have otherwise done so. The shared access and partial privacy features in particular inspired participants to anticipate scenarios wherein they would willingly forfeit a degree of visibility or control to benefit adjacent users. Some participants articulated scenarios that incorporated the needs of domestic workers, while others only felt comfortable with scenarios that included closer relationships such as a spouse, friend, or roommate. Participants with personal experience as both a primary owner/user and an adjacent subject/user were better able to grapple with the nuanced tradeoffs inherent in these features. This dual perspective allowed them to evaluate the concepts with both stakeholders' needs in mind and made them more likely to consider and incorporate the privacy needs of adjacent users.

One participant loved unmasking because it discourages surveillance with an extra step, offering positive evidence for our “speed bump” design pattern for inhibiting viewing. Another participant expressed that the transparent viewing histories of Partial Privacy could facilitate better consideration and communication between users. “It makes them pause to say: Do you really wanna see this? And I think that has good privacy implications” (P4).

 

Several participants expressed that our features hold potential to empower and encourage primary users to disclose devices and initiate better communication with adjacent users. One participant who had worked as a nanny recalled an instance when she discovered that her employer had been surveilling her without her knowledge for months. Her employers did not go out of their way to hide the camera, but neither did they directly notify her of its presence. She hesitated to broach the topic with her employers in part because she assumed a compromise would be impossible given the all-or-nothing on/off nature of smart cameras. Our design prompted this participant to imagine a scenario in which she and her employers could engage in an open dialogue about their privacy and security needs. “I think it would start changing the conversation ... we can discuss and come up with what's comfortable for all of us” (P6).

As anticipated, some primary owners/users are still reluctant to consider the needs of adjacent users unless there is an established relationship (friend, family, etc.). Many participants could not imagine scenarios where Partial Privacy and Strong Privacy would be helpful to them as primary users: “Why would I want to see a blurry version versus just a full on version?” (P1). Not surprisingly, our study underscores that for many users and use contexts, people are simply unwilling to compromise and give up functionality for the benefit of others’ privacy.

Implications

What future opportunities, challenges, and solutions do these concepts suggest might be available to designers? 

In what ways might the design patterns presented in this prototype extend to future applications?

In an age of digital, networked, and “always on” technology, the concept of digital “on” and “off” states can be virtually meaningless. We argue that smart product designers need to create better controls and indicators for complex states “between” traditional “on” and “off.” To enhance privacy, Arca employs a more general design pattern we call attenuation controls. Partial and Strong Privacy modes allow users to attenuate, or diminish, the sensitivity of sensors or the display of data; the underlying metaphor is a dimmer switch for sensors. Arca offers two primary states between True On and True Off: Partial Privacy and Strong Privacy.
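As a rough illustration of the attenuation-control pattern, the dimmer metaphor can be modeled as an ordered set of sensor states. This Python sketch is purely illustrative; the `PrivacyMode` and `attenuate` names are ours, not Arca's actual implementation:

```python
from enum import Enum

class PrivacyMode(Enum):
    """Arca's sensor states, ordered from most to least revealing."""
    TRUE_ON = 3          # full video and audio, recording enabled
    PARTIAL_PRIVACY = 2  # video masked/blurred; can later be unmasked
    STRONG_PRIVACY = 1   # abstract event notifications only, no reviewable footage
    TRUE_OFF = 0         # sensors fully disabled; nothing captured

def attenuate(mode: PrivacyMode) -> PrivacyMode:
    """Step the sensor 'dimmer' one notch toward True Off."""
    return PrivacyMode(max(mode.value - 1, PrivacyMode.TRUE_OFF.value))
```

Treating the modes as an ordered scale, rather than a binary toggle, is what lets the interface expose meaningful states between “on” and “off.”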

In most domestic contexts, smart home cameras are consumer products that are largely under the control of primary users. While laws, policies, and norms [12,34,38] guide their behavior, and design and engineering implementations dictate parameters of use, primary users ultimately have both the power and the responsibility to consider the needs and experiences of adjacent users. Features such as Partial Privacy and Shared Access are not designed merely as neutral options; they aim to invite, encourage, nudge, and persuade primary users to, in a sense, voluntarily limit and inhibit their surveillant uses of their smart cameras. We employ three basic design patterns to achieve this. Partial Privacy and unmasking employ a speed bump pattern to discourage, but not outright prevent, unmasking interactions by increasing physical and cognitive effort. Partial Privacy and the unmasking option further employ a social accountability pattern by notifying users when video is unmasked and watermarking the video. Partial Privacy additionally involves a more subtle latent privacy safeguard pattern [36]: by discouraging users from unmasking, video and audio data is more likely to be auto-deleted, preventing users from accumulating saved video clips.
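The three patterns can be sketched together in code. The following Python sketch is a hypothetical model rather than Arca's implementation, and all names are illustrative: `unmask` combines the speed bump (an explicit confirmation step) with social accountability (watermarking plus notifications to shared users), while `purge_expired` captures the latent safeguard of auto-deleting clips that were never unmasked.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Clip:
    clip_id: str
    unmasked: bool = False
    watermark: Optional[str] = None

@dataclass
class SharedCamera:
    shared_users: List[str]
    notifications: List[str] = field(default_factory=list)

    def unmask(self, clip: Clip, viewer: str, confirmed: bool) -> bool:
        # Speed bump: the extra confirmation step adds friction to viewing.
        if not confirmed:
            return False
        clip.unmasked = True
        # Social accountability: watermark the clip and notify the other
        # shared users that it was unmasked, and by whom.
        clip.watermark = f"Unmasked by {viewer}"
        self.notifications += [
            f"To {u}: clip {clip.clip_id} was unmasked by {viewer}"
            for u in self.shared_users if u != viewer
        ]
        return True

def purge_expired(clips: List[Clip]) -> List[Clip]:
    # Latent safeguard: clips no one unmasked are auto-deleted at expiry.
    return [c for c in clips if c.unmasked]
```

Note that none of these mechanisms outright prevents surveillance; they add friction and visibility, which is the point of the patterns.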

While traditional privacy/security research often focuses on the vertical dimension and harms from improper disclosure of personally sensitive information, our research suggests that many (though certainly not all) adjacent users do not necessarily mind being recorded, but they do mind the lack of “communication,” “respect,” and “professionalism” from primary users. This suggests a very broad category of privacy design patterns we are currently calling peace offerings and conversation starters. Our studies reveal that even if our specific privacy modes and access-sharing features are not regularly used, they may nonetheless function as mechanisms to facilitate better, more open conversation between primary and adjacent users. They might also play a role in shaping more inclusive and open social norms around everyday surveillance and horizontal privacy relations.

References

1. Mark Andrejevic. 2004. The work of watching one another: Lateral surveillance, risk, and governance. Surveillance & Society 2, 4 (2004).

2. Julia Bernd, Ruba Abu-Salma, Junghyun Choy, and Alisa Frik. 2022. Balancing Power Dynamics in Smart Homes: Nannies' Perspectives on How Cameras Reflect and Affect Relationships. In Proceedings of the 18th Symposium on Usable Privacy and Security (SOUPS '22). To appear.

3. Yu-Ting Cheng, Mathias Funk, Wenn-Chieh Tsai, and Lin-Lin Chen. 2019. Peekaboo Cam: Designing an Observational Camera for Home Ecologies Concerning Privacy. In Proceedings of the 2019 on Designing Interactive Systems Conference (DIS '19). Association for Computing Machinery, New York, NY, USA, 823–836. https://doi.org/10.1145/3322276.3323699

4. Ohad Inbar and Noam Tractinsky. 2009. The incidental user. Interactions 16, 4 (2009), 56–59.

5. James Pierce, Claire Weizenegger, Parag Nandi, Isha Agarwal, Gwenna Gram, Jade Hurrle, Hannah Liao, Betty Lo, Aaron Park, Aivy Phan, Mark Shumskiy, and Grace Sturlaugson. 2022. Addressing Adjacent Actor Privacy: Designing for Bystanders, Co-Users, and Surveilled Subjects of Smart Home Cameras. In Designing Interactive Systems Conference (DIS '22). Association for Computing Machinery, New York, NY, USA, 26–40. https://doi.org/10.1145/3532106.3535195
