Should We Worry About the IoT Being Used as a Weapon of Mass Control?
Technologies like the IoT put a particularly alarming set of weapons into the hands of violent and controlling people. They can extend the reach of an abuser many times over, amplifying the harm to which domestic abuse victims, in particular, can be subjected. Only by combining the social with the technological can we hope to ensure our technology isn’t misused for evil intent.
A WHO report shows that 35 percent of women worldwide have experienced violence, and 38 percent of murdered women were killed by a male intimate partner. These figures are frightening. As a woman who has experienced this type of violence, I know how easily it can happen. However, what I want to explore here is the role that technology, and specifically Internet of Things (IoT) systems, plays in domestic violence and abuse.
Before I begin, we need to look at what the word “violence” means. Violence can be both physical and non-physical. It’s an aggressive act, performed using methods that can be psychological, sexual, emotional, or economic. I gave figures above on violence against women, but of course men experience plenty of violence too, including domestic violence. However, technology may add a particular slant to violent and controlling acts by offering the abuser an extended reach into the life of their victim that they might otherwise not have had.
The IoT Mantra Should Be “Do No Harm”
What if you had a one-night stand? While at your place, you let this person use your WiFi. You have, quite rightly, password-protected your network. So, you hand over your password, which gives them full access to your home network. They get up at 6 am and walk out the door, still holding your password. You forget to change it, which is easily done. What’s the worst that can happen?
Well, if this person turns out to be malicious and wants to hurt you, they can potentially use this access to perform a “man-in-the-middle” (MITM) attack, stealing your personal data, inserting malware, and downloading porn images via your IP address. They could also hijack your IoT/smart home devices. All that’s needed is a readily available WiFi Pineapple and an unsecured IoT device.
This doesn’t have to be the case, of course, but preventing it comes down to design, education, and policymakers (more on this later).
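On the user side, even the simple habit of checking what is actually on your home network helps. Below is a minimal, illustrative sketch, assuming a Linux or macOS machine where the standard arp command is available; it parses the local ARP cache so you can list the devices your computer has recently seen and spot anything unfamiliar. It’s not a security tool, just a way to make the idea concrete.

```python
# Illustrative sketch only: list devices recently seen on the local network
# by parsing the ARP cache, so unfamiliar hardware stands out.
# Assumes a Linux or macOS host where the "arp -a" command is available.
import re
import subprocess

def recently_seen_devices():
    """Return (hostname, ip, mac) tuples parsed from the ARP cache."""
    output = subprocess.run(["arp", "-a"], capture_output=True, text=True).stdout
    devices = []
    for line in output.splitlines():
        # Typical line: "router.lan (192.168.1.1) at aa:bb:cc:dd:ee:ff ..."
        match = re.search(r"^(\S+) \(([\d.]+)\) at ([0-9a-fA-F:]+)", line)
        if match:
            devices.append(match.groups())
    return devices

if __name__ == "__main__":
    for name, ip, mac in recently_seen_devices():
        print(f"{name:<30} {ip:<16} {mac}")
```

If something shows up that you don’t recognize, that’s your cue to change the WiFi password and review what’s connected via your router’s admin page.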
According to research by Metova, around 90 percent of U.S. consumers now have a smart home device, but who’s controlling those devices?
Dr. Leonie Tanczer, a researcher and lecturer at University College London (UCL), has, along with her team members Dr. Simon Parkin, Dr. Trupti Patel, Professor George Danezis, and Isabel Lopez, looked at how the IoT can (and likely will) contribute to gender-based domestic violence. Their interdisciplinary project is named “Gender and IoT” (#GIoT), and it’s part of the PETRAS IoT Research Hub.
I spoke with Leonie, who has extensive knowledge in this area. She pointed me to some of their GIoT resources, including a tech abuse guide, a dedicated resource list for support services and survivors, and a new report that features some of the research group’s pressing findings.
Leonie also shared some relevant insights, some of which I would like to highlight below.
Is IoT an Issue for Domestic Violence Now?
Leonie and her team have spent the past year examining how internet-enabled devices can be used as a controlling mechanism within a domestic abuse context. The team has concluded thus far that the threat is imminent but not yet fully realized. As of now, spyware and other malicious code on laptops and phones are the primary culprits featured in tech abuse cases. However, with the expected expansion of IoT systems, as well as their often intrusive data collection and sharing features, IoT-facilitated abuse cases are not a question of if, but when.
What Unique Issues Exist Between IoT and Domestic Violence?
One may try to secure an IoT device by putting some of the best and most robust protection measures in place. However, in a domestic abuse situation, coercion can override any of these measures. If one partner is responsible for purchasing and maintaining the devices, and has full control over, as well as knowledge of, their functionality, the resulting power imbalance can leave one person able to monitor and restrain the other. The team’s policy leaflet illustrates this dynamic quite vividly.
Can You Map the IoT Device Use Pattern in an Abusive Relationship?
When it comes to the safety and security of victims and survivors with regard to tech, we have to consider the three phases of abuse; each phase shapes the security practices a person has to adopt.
For instance, while still cohabiting, the abusive partner may use devices in the home to spy on the other: their online usage can be monitored and their conversations recorded or filmed. Once a woman has extracted herself from this situation, she may seek help at a local shelter. At this point, many are advised simply to stop using technology to prevent a perpetrator from contacting or locating them. However, the phase in which a person has to effectively “reset” their life is equally central, and it becomes extremely hard in our interdependent society. During this period, women have to change their passwords, regain control over their accounts, and try their best to identify any devices and platforms (including IoT systems) that they may have shared with the perpetrator. This can become tricky, and I don’t think we’ve properly thought about how to ease this process for them.
Leonie stressed in our conversation that industry stakeholders must take heed of the research being performed in this area and consider how their systems might lend themselves to forms of abuse. In the identity space, in which I work, there’s a saying: “The internet was designed without an identity layer.” This has led to a kind of retrofit, with identity overlaid on the internet after the fact. It’s complicated and messy and has yet to be fully resolved; retroactively adding missing functionality isn’t the best way to design anything.
I agree with Leonie; we need to ensure the IoT is designed so that misuse is minimized at the technology level. However, Leonie also points out that social problems won’t be solved by technical means alone. In addition, statutory services such as law enforcement, policymakers, and educational establishments, as well as women’s organizations and refuges, need to be incorporated into the design of these systems and made aware of the risks. Proactivity is needed and will help to ensure we don’t repeat the technical mistakes we’ve made in the past.
Advice on Designing Anti-Abusive IoT Systems
During our discussion, Leonie gave me some ideas about design issues her team has found during lab sessions while testing IoT devices. These include the following:
- Remove any unintentional bias in the service. This can be helped by including multidisciplinary people from diverse backgrounds on design teams.
- Enable relevant prompts. Send out alerts on who is connected to what. The GIoT team suggests that prompts could be used to inform users about essential details, including which devices are trying to connect to their IoT systems (see the sketch after this list).
- Offer more transparency and support. Provide clear and unambiguous manuals; prepare policies on what helpline staff can do to advise victims, should they ask for guidance; and make it easy to switch on the features users need.
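To make the prompt idea concrete, here is a minimal, hypothetical sketch of how a hub could keep a registry of approved devices and alert every registered household member, not just the account owner, when an unknown device tries to connect. It isn’t based on any vendor’s real API; names such as DeviceRegistry, notify_all, and on_connection_attempt are my own illustrative assumptions.

```python
# Hypothetical sketch of a hub-side "who just connected?" prompt.
# The device and notification APIs here are illustrative, not a real vendor SDK.
from dataclasses import dataclass, field

@dataclass
class DeviceRegistry:
    """Tracks devices the household has explicitly approved."""
    approved: set[str] = field(default_factory=set)               # device IDs, e.g. MAC addresses
    household_contacts: list[str] = field(default_factory=list)   # everyone gets alerts, not just the "owner"

    def notify_all(self, message: str) -> None:
        # Placeholder: a real hub would push to an app, SMS, or email.
        for contact in self.household_contacts:
            print(f"[alert -> {contact}] {message}")

    def on_connection_attempt(self, device_id: str, device_name: str) -> None:
        """Called whenever a device tries to join the hub."""
        if device_id in self.approved:
            return  # already approved, no prompt needed
        # Unknown device: tell every household member and wait for explicit approval.
        self.notify_all(f"New device '{device_name}' ({device_id}) is trying to connect. Approve?")

    def approve(self, device_id: str) -> None:
        self.approved.add(device_id)
        self.notify_all(f"Device {device_id} was approved.")

if __name__ == "__main__":
    hub = DeviceRegistry(household_contacts=["partner_a@example.com", "partner_b@example.com"])
    hub.on_connection_attempt("aa:bb:cc:dd:ee:ff", "Living-room camera")
```

The design choice that matters here is that alerts go to every member of the household, which works against the power imbalance described above, where a single person controls and understands all of the devices.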
Overall, technology-enabled abuse is a human-centric issue, and technology alone cannot fix it. We need to work towards a socio-technical solution to the problem.
Conclusion
IoT devices are becoming an extension of our daily lives. Unfortunately, it’s inevitable that they will become weaponized by people who wish to do us harm. This will include abuse within a domestic context.
Design is a crucial first step in helping to minimize the use of internet-enabled devices as weapons of control. Privacy issues, like the recent Apple iPhone FaceTime bug, demonstrate this well. The bug was a design flaw: under certain circumstances, it allowed a caller to listen to people in the vicinity of the phone without the call being picked up. It has been downplayed because those circumstances were uncommon, but that doesn’t negate the point that testing the design of user interfaces (UI) and the user experience (UX) needs to be holistic, and privacy must be a key requirement for signing off on a UI and a UX.
IoT devices should be designed by multidisciplinary teams. We must remove unintentional bias and draw on different viewpoints and experiences. Only by combining the social with the technological can we hope to ensure our technology isn’t misused for nefarious purposes.