The Internet of Things and privacy – Part three: Protections beyond consent
This article is part of a series on the Internet of Things and privacy:
- Part one: Issues with consent
- Part two: Solutions for consent
- Part three: Protections beyond consent
Limitations of consent
Adopting the solutions proposed in Part two of this series, Solutions for consent, would be a positive step for Internet of Things (IoT) vendors. However, we recognise it may be difficult or impractical for vendors to implement those ideas in their products or services. Consent requires interaction with a human, but many IoT devices do not have screens and are designed to operate with little or no human interaction.
In light of this, we should consider ways for vendors to improve the privacy of IoT devices that do not involve consent or human interaction.
Privacy as the default
The privacy by design (PbD) principle of privacy as the default suggests that the default settings of IoT devices should deliver the maximum degree of privacy protection. While the implementation of this principle is highly context dependent, it can, in theory, be applied to any IoT device regardless of whether it has an interface.
While privacy as the default is not a substitute for consent, the principle has a range of privacy benefits. Primarily, it means that users are not required to take any action to protect their privacy, and inaction does not make individuals vulnerable to having their privacy compromised.
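To illustrate, a vendor could bake this principle directly into a device's configuration schema, so that a device that is never configured is also the most private one. The following is a minimal sketch in Python; the setting names and default values are hypothetical, not any real product's API.

```python
from dataclasses import dataclass

@dataclass
class DeviceSettings:
    # Hypothetical settings for an illustrative IoT device. Every
    # default is the most privacy-protective value, so a user who
    # never opens a settings screen is still protected.
    usage_analytics: bool = False        # no telemetry unless opted in
    cloud_processing: bool = False       # process data on-device by default
    share_with_partners: bool = False    # no third-party disclosure by default
    recording_retention_days: int = 0    # retain nothing unless opted in

# An unconfigured device still defaults to maximum privacy;
# inaction costs the user nothing.
settings = DeviceSettings()
assert settings.cloud_processing is False
```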
Decentralised data processing
Apart from PbD, we can look at reducing the amount of personal information IoT vendors collect. One way to do this could be to make a distinction between a device collecting personal information and the vendor of a device collecting personal information. For example, an IoT doorbell that performs facial recognition locally would, from a privacy perspective, be preferable to a doorbell that sends facial biometric data to a server for processing.
This could form a principle of decentralised data processing – personal information collected by a device should not be disclosed to the vendor unless disclosure is necessary. This would be one way to preserve privacy without impairing the functionality of a device.
Of course, in this scenario it may be entirely unnecessary to collect facial data to begin with. Similar functionality might instead be achieved by, for example, distinguishing residents from strangers through the proximity of residents' smartphones or watches.
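To make the distinction concrete, here is a minimal sketch of decentralised processing for the doorbell scenario. It is a hypothetical example, not any vendor's actual pipeline: the embedding function is a placeholder for an on-device model, and the 0.9 similarity threshold is an illustrative assumption.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Compare two embeddings; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def embed_face(frame: bytes) -> list[float]:
    """Placeholder for an on-device face-embedding model."""
    raise NotImplementedError  # all biometric processing stays on the device

def handle_visitor(frame: bytes, resident_embeddings: list[list[float]]) -> dict:
    # The raw frame is converted to a biometric embedding locally.
    embedding = embed_face(frame)
    is_resident = any(
        cosine_similarity(embedding, e) > 0.9 for e in resident_embeddings
    )
    # Only this coarse, non-biometric event is disclosed to the vendor;
    # the frame and the embedding never leave the device.
    return {"event": "doorbell_pressed", "known_visitor": is_resident}
```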
Granularity minimisation
Another way to minimise the personal information collected and used by IoT vendors could be granularity minimisation – whenever personal information is collected, used or disclosed, it should be made as coarse as possible (such as by aggregating it) while still serving its purpose. For example, precise location information collected for analytics could be reduced in granularity by storing only the suburb the individual is in. It could then be made even coarser by aggregating it with information from other users in the same or nearby suburbs.
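A minimal sketch of what this could look like for the location example, assuming a hypothetical suburb_of() reverse-geocoding lookup; the minimum cell size of 10 is illustrative, not a recommended threshold.

```python
from collections import Counter

def suburb_of(lat: float, lon: float) -> str:
    """Placeholder: map precise coordinates to a suburb name."""
    raise NotImplementedError  # e.g. an offline reverse-geocoding table

def coarsen_and_aggregate(
    points: list[tuple[float, float]], min_cell: int = 10
) -> dict[str, int]:
    # Step 1 - reduce granularity: keep only the suburb and discard
    # the precise coordinates immediately.
    counts = Counter(suburb_of(lat, lon) for lat, lon in points)
    # Step 2 - aggregate: store per-suburb counts rather than
    # per-individual records, and suppress very small counts that
    # could still single out an individual (an assumed extra safeguard).
    return {suburb: n for suburb, n in counts.items() if n >= min_cell}
```

Analytics built only on the output of a function like this can never be more granular than a suburb, no matter how precise the data collected at the device was.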
Highly granular data is generally associated with higher privacy risks, as there is simply more data, and data that is easier to link to specific individuals. But high granularity can also increase risk in other ways. For example, as discussed in Part two, data can be used to make inferences; when data is highly granular, as raw sensor data is, those inferences can be so profoundly personal that they become unexpected and invasive. A practice of reducing the granularity of personal information wherever possible could go a long way in protecting privacy without requiring human interaction.
OVIC is researching the privacy implications of the IoT and will be publishing an issues paper in early 2020.
This article was written by Asher Gibson, Policy Officer, OVIC. The views expressed in this post are the author’s own and do not necessarily reflect the views of OVIC.