Sensing or Watching?: Balancing Utility and Privacy in Sensing Systems via Collection and Enforcement Mechanisms
Published in ACM Symposium on Access Control Models and Technologies (SACMAT), 2018
Recommended citation: Lee, A.J., Biehl, J.T. & Curry, C. 2018. "Sensing or Watching?: Balancing Utility and Privacy in Sensing Systems via Collection and Enforcement Mechanisms." In Proceedings of the 23rd ACM Symposium on Access Control Models and Technologies (SACMAT '18). ACM, New York, NY, USA, pp. 105-116.
Devices with embedded sensors are permeating the computing landscape, allowing the collection and analysis of rich data about individuals, smart spaces, and their interactions. This class of devices enables a useful array of home automation and connected workplace functionality for individuals within instrumented spaces. Unfortunately, the increasing pervasiveness of sensors can also lead to perceptions of privacy loss among the occupants of these spaces. Given that many instrumented spaces exist as platforms outside of a user’s control—e.g., IoT sensors in the home that rely on cloud infrastructure or connected workplaces managed by one’s employer—enforcing access controls via a trusted reference monitor may do little to assuage individuals’ privacy concerns. This calls for novel enforcement mechanisms for controlling access to sensed data. In this paper, we investigate the interplay between sensor fidelity and individual comfort, with the goal of understanding the design space for effective, yet palatable, sensors for the workplace. In the context of a common space contextualization task, we survey and interview individuals about their comfort with three common sensing modalities: video, audio, and passive infrared. This allows us to explore the extent to which discomfort with sensor platforms is a function of detected states or sensed data. Our findings uncover interesting interplays between content, context, fidelity, history, and privacy. This, in turn, leads to design recommendations on how to increase comfort with sensing technologies by revisiting the mechanisms by which user preferences and policies are enforced in situations where the infrastructure itself is not trusted.