A new paper published by the Future of Privacy Forum examines the appropriate privacy paradigm for the world of the Internet of Things. The paper was co-authored by Hogan Lovells Privacy and Information Management practice leader Christopher Wolf, who is also the founder and co-chair of the Future of Privacy Forum (with co-author Jules Polonetsky). The paper was released in conjunction with the FTC workshop on the Internet of Things.
The whitepaper posits that current implementations of the Fair Information Practice Principles (FIPPs) are not easily adapted to the world of the Internet of Things, where nearly every device or appliance will be connected to the internet and collecting data about consumers. Providing meaningful “notice” across billions of connected devices is not feasible when many devices lack meaningful user interfaces or screens, and relying on consumers to read thousands of privacy policies will lead many to simply “give up” on their privacy. Similarly, the FIPPs’ strict use limitations may thwart technological progress, because many socially valuable uses of data are not discovered until the data has already been collected. The challenge, then, is to allow practices that will support progress, while providing appropriate controls over those practices that should be forestalled or constrained by appropriate consent.
To that end, the paper proposes the following principles:
Use anonymized data when practical. Anonymizing personal information decreases the risk that personally identifiable information will be used for unauthorized, malicious, or otherwise harmful purposes. Although there is always some risk of re-identification, when data sets are properly anonymized and stored, re-identification is no easy task.
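The whitepaper does not prescribe a particular anonymization technique, but two common building blocks are pseudonymization (replacing a direct identifier with an irreversible token) and generalization (coarsening a quasi-identifier such as an exact age). The sketch below is a minimal, hypothetical illustration of both; the field names and record shape are invented for this example and are not drawn from the paper.

```python
import hashlib
import secrets

# Salt kept separate from the released data set, so hashes cannot be
# trivially recomputed from known identifiers.
SALT = secrets.token_hex(16)

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()

def generalize_age(age: int) -> str:
    """Coarsen an exact age into a ten-year band to reduce re-identification risk."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def anonymize(record: dict) -> dict:
    """Return a copy of the record with identifiers removed or coarsened."""
    return {
        "user": pseudonymize(record["email"]),      # direct identifier -> token
        "age_band": generalize_age(record["age"]),  # quasi-identifier -> band
        "reading": record["reading"],               # non-identifying sensor data kept as-is
    }

record = {"email": "alice@example.com", "age": 34, "reading": 72.5}
print(anonymize(record)["age_band"])  # prints "30-39"
```

Note that techniques like these reduce, rather than eliminate, re-identification risk, which is consistent with the paper's caveat that anonymized data sets must also be stored properly.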
Respect the context in which personally identifiable information is collected. Managing consumer expectations is a good first step; however, respect for context should not focus solely on what individuals “reasonably” expect. There may be unexpected new uses that turn out to be valuable societal advances or important new ways to use a product or service. Rigidly and narrowly specifying context could trap knowledge that is available and critical to progress. Finding a balance may require more sophisticated privacy impact assessments that can analyze the impact of risks or harms and assess the potential benefits for individuals and society.
Be transparent about data use. Organizations making decisions that affect individuals should, whenever feasible, disclose the high-level criteria used when making those decisions. This will help ensure that factors such as a user’s ethnicity, sexual orientation, and political preferences do not enter into a company’s determinations when they would be irrelevant or unduly discriminatory.
Automate accountability mechanisms. Automated accountability mechanisms could monitor data usage and determine whether the uses comply with machine readable policies.
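One way to picture such a mechanism is a machine-readable policy consulted automatically each time data is used, with every check written to an audit log. The sketch below is a hypothetical, simplified illustration of that idea; the data categories, purposes, and function names are invented for this example, not taken from the whitepaper.

```python
# A machine-readable policy: each data category maps to its permitted purposes.
# Categories and purposes here are illustrative only.
POLICY = {
    "location": {"navigation", "emergency"},
    "heart_rate": {"health_monitoring"},
}

# Audit trail of every attempted use, allowed or not.
audit_log = []

def use_permitted(category: str, purpose: str) -> bool:
    """Check the policy: is this category allowed for this purpose?"""
    return purpose in POLICY.get(category, set())

def record_use(category: str, purpose: str) -> bool:
    """Evaluate a proposed data use against the policy and log the outcome."""
    allowed = use_permitted(category, purpose)
    audit_log.append({"category": category, "purpose": purpose, "allowed": allowed})
    return allowed

print(record_use("location", "navigation"))     # prints True
print(record_use("heart_rate", "advertising"))  # prints False
```

In a real deployment the policy document would be standardized and the log tamper-evident, but even this toy version shows how compliance checking can happen at the moment of use rather than after the fact.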
Develop Codes of Conduct. Self-regulatory codes of conduct will be the most effective means to honor these preferences and others in the rapidly evolving landscape of the Internet of Things. Codes of conduct could establish frameworks that enable individuals to associate usage preferences with their connected devices.
Provide individuals with reasonable access to personally identifiable information. This will likely enhance consumer engagement with and support of the Internet of Things.