Let’s say that tomorrow they invented the flying car and anyone could afford one. What would you see in your town by this weekend? A portion of your neighbors–the technology-driven–would already have one and be bumping into trees and power lines while they tried to park their new rides. A larger portion, let’s call them the technology-mindful, might be considering how such a car would fit into their lifestyle, perhaps while taking advantage of public transportation that already flies. Finally, there would be the technology-humbugs, who simply would not have anything to do with these newfangled flying machines.
Imagine the chaos of all these new flying cars in the air. What would be the new rules of the road? Could cars fly over each other? Would they need to follow existing streets? Where are flying cars allowed to park? How do we decide who is at fault when accidents occur? How long would it take legislators to battle out a new set of laws and policies regarding the use of flying cars? How long would it take security folks to develop defenses to protect lives and property from so many objects whizzing around in the sky?
What’s happening now with the Internet of Things feels a little bit like flying cars. We are seeing a multitude of new and generally affordable technologies that people are learning how to use–and misuse. We have technology-driven consumers buying the latest gadgets without fully understanding how to protect themselves while using them. We have companies developing these technologies without fully understanding how to protect the devices or the consumers that use them.
With International Privacy Day just behind us, I’ve been reflecting on how much convenience the technologies we’ve come to rely on provide, and weighing that against the risks to our privacy and security we take on by enabling them.
Security is concerned with protecting data: access controls, encryption…basically making sure data’s confidentiality, integrity and availability cannot be compromised. Privacy is concerned with these things too. However, privacy also examines transparency–giving consumers more information about an organization’s practices regarding the collection and use of their data–and choice: empowering users to make decisions about what data is collected about them, with whom it is shared, what is done with it…and even how long it is stored.
I was reminded of one of my favorite movies, based loosely on a story by one of my favorite authors: I, Robot by Isaac Asimov. In the story, society is introduced to a fantastic new technology that promises to benefit all.
Aware that people tend to be afraid of the unknown and that this new technology may be viewed with suspicion, the manufacturer derives a set of Three Laws with the intent of assuring a nervous public that they have nothing to worry about.
The Three Laws of Robotics went something like this:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Quite predictably, when the Three Laws are violated, panic ensues.
In a way, the Internet of Things is much like the robots of the story. We have thousands of fantastic new technologies, all promising great benefits, each carrying the risk of making consumers afraid about how these technologies will affect them.
So I got to thinking: what if the Internet of Things had its own version of the Three Laws…the Three Privacy Laws of the IoT…rooted in Privacy By Design? They might read a little something like this:
- An organization shall not collect data in a manner that causes injury to a person, or through unintended consequences, allows a person to come to harm.
- An organization shall not collect more data than it needs to service a person and shall protect the data it collects at all times.
- An organization collecting data about a person shall give that person visibility and control regarding the collection, retention and sharing of their data so long as it does not conflict with the Second Law.
A maker of IoT devices could certify that it has met all the requirements of the Laws, and once the certification process was complete, it could let its customers know it was “Three Laws Ready,” just like the company in the movie.
No matter what new article I read about privacy or the IoT, it always comes back to one thing: trust. Trust is about more than good intentions, as we discovered in 2014, the year of the mega breach. As the people in Asimov’s story discovered, it’s also about executing on transparency, choice and damn good security. Miss any one of these and don’t be surprised if panic ensues.
As a privacy advocate, I’m passionate about making sure we get it right. Whether it’s robots or flying cars, the technologies are advancing faster than some people’s ability to keep up. That doesn’t give the makers of those technologies any more right to violate our privacy. Assuming ignorance on the part of consumers may be a time-honored way of making money, but public or private organizations that operate in such a manner will eventually discover who actually holds the power. And in this age of social media connections and instant communications, Hell hath no fury like a consumer burned.
And this technology-mindful really wants his flying car, so they’d better get it right the first time.