Maybe we got off on the wrong foot. When Internet of Things (IoT) devices first became popular on the market, most of us knew them through smart home devices. Our Alexas, Roombas, and other household brand names were the start (and end) of the public's understanding of IoT. So I sympathise with anyone whose main takeaway is: "why would anyone need a smart fridge?"
Casting all IoT devices as overpriced light bulbs and voice-controlled TVs is like putting computers, phones/tablets and industrial control systems in the same box. At this point, the most common IoT device is probably Bluetooth headphones, and environmental monitoring, medical devices, and fault diagnosis in industrial equipment are only a few of the many other uses. With such a wide variety of uses, I am interested in unpicking the relationship that people have with these devices and the ideas behind them.
A moment that stuck out to me, in a previous job role as an IoT engineer, was when a client said they didn't want any lights to flash on their CO2 sensors (indicating high levels of CO2) because it made people feel as though they were being watched. The devices in this particular case were all used within a LoRaWAN network, a specification that by its nature means they must be low-power and send only a tiny amount of data every couple of minutes. Being battery-operated, the devices would struggle to 'watch' anyone even if for some reason we had wanted them to.
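To give a sense of scale, here is a minimal sketch (not the actual firmware from that project, and the payload layout is made up for illustration) of what a LoRaWAN-style CO2 sensor uplink might look like: one 16-bit CO2 value and one 16-bit battery voltage, packed into four bytes and sent every couple of minutes.

```python
import struct

def encode_uplink(co2_ppm: int, battery_mv: int) -> bytes:
    """Pack one sensor reading into a 4-byte LoRaWAN-style uplink payload.

    Hypothetical layout (big-endian):
      bytes 0-1: CO2 concentration in ppm (unsigned 16-bit)
      bytes 2-3: battery voltage in millivolts (unsigned 16-bit)
    """
    return struct.pack(">HH", co2_ppm, battery_mv)

# One reading, transmitted every couple of minutes.
payload = encode_uplink(co2_ppm=850, battery_mv=3600)
print(payload.hex(), f"- {len(payload)} bytes")  # 03520e10 - 4 bytes
```

A few bytes every couple of minutes leaves no room for audio, video, or anything else you could meaningfully 'watch' someone with.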
Where does distrust come from?
It almost doesn't matter that it would be technologically impractical for some IoT tech to spy on you. Addressing where the distrust of technology comes from is a larger question, and one not so easily answered. Considering how to design devices to look and feel trustworthy is part of it. What about our perceptions of big technology companies, and how we view the figures behind them?
It can be hard to change perceptions, so it is important to get it right the first time. Trust from an information security perspective is separate from trust from a general consumer perspective. The concerns do bleed into each other in some ways: a shared worry for both groups is "what if others could see my sensitive information?". These worries come from different places, though: the infosec professional sees the technical vulnerabilities in many IoT devices, while the non-technologist's fears may come from a sense that tech companies don't have their best interests at heart, from device designs that cause unease, or from general paranoia.
With distrust in technology seemingly growing, I don't think the answer is only educating people about technology, or writing articles with titles like "No, You Aren't Being Watched". Many folks genuinely are very nervous about their privacy, and about being spied on. What they don't need is to be talked down to by people they already don't entirely trust.
Before distrust begins
What should come before educating people is considering the varied ways that technology makes people feel, and ensuring anyone around these products has a choice in how they engage with them. Learning can come after first impressions are already established. The sociological view on this is often overlooked. If there is an inherent distrust running through the population, whether the fears are founded or not, it won't make a difference how innovative IoT (or the next emerging technology) is. We need to make sure the social and psychological component of interacting with technology is addressed from the very beginning of creating it. With paranoia playing a key role in the spread of misinformation, we need to think about prevention before the distrust begins.