The Three Laws of Privacy in the Internet of Things

Let’s say that tomorrow they invented the flying car and anyone could afford one. What would you see in your town by this weekend? A portion of your neighbors, the technology-driven, would already have one and be bumping into trees and power lines while they tried to park their new rides. A larger portion, let’s call them the technology-mindful, might be considering how such a car would fit into their lifestyle, and might be taking advantage of public transportation that already flies. Finally, there would be the technology-humbugs, who simply would not have anything to do with these newfangled flying machines.

Imagine the chaos of all these new flying cars in the air. What would be the new rules of the road? Could cars fly over each other? Would they need to follow existing streets? Where are flying cars allowed to park? How do we decide who is at fault when accidents occur? How long would it take legislators to battle out a new set of laws and policies regarding the use of flying cars? How long would it take security folks to develop defenses to protect lives and property from so many objects whizzing around in the sky?

What’s happening now with the Internet of Things feels a little bit like flying cars. We are seeing a multitude of new and generally affordable technologies that people are learning how to use–and misuse. We have technology-driven consumers buying the latest gadgets without fully understanding how to protect themselves while using them. We have companies developing these technologies without fully understanding how to protect the devices or the consumers that use them.

With International Privacy Day just behind us, I find myself reflecting on how much convenience the technologies we’ve come to rely on give us, and weighing that convenience against the risks to our privacy and security that come with enabling them.

Security is concerned with protecting data: access controls, encryption…basically making sure data’s confidentiality, integrity and availability cannot be compromised. Privacy is concerned with these things too. However, privacy also examines transparency (giving consumers more information about an organization’s practices regarding the collection and use of their data) and choice (empowering users to make decisions about what data is collected about them, with whom it is shared, what is done with it…and even how long it is stored).

I was reminded of one of my favorite movies, based loosely on a story by one of my favorite authors: I, Robot by Isaac Asimov. In the story, society is introduced to a fantastic new technology that promises to benefit all.

Aware that people tend to be afraid of the unknown and that this new technology may be viewed with suspicion, the manufacturer devises a set of Three Laws with the intent of assuring a nervous public that it has nothing to worry about.

The Three Laws of Robotics went something like this:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

  2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.

  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Quite predictably, when the Three Laws are violated, panic ensues.

In a way, the Internet of Things is much like the robots of the story. We have thousands of fantastic new technologies, all promising great benefits, each carrying the risk of making consumers afraid about how these technologies will affect them.

So I got to thinking: what if the Internet of Things had its own version of the Three Laws…the Three Privacy Laws of the IoT…rooted in Privacy By Design? They might read a little something like this:

  • An organization shall not collect data in a manner that causes injury to a person or, through unintended consequences, allows a person to come to harm.
  • An organization shall not collect more data than it needs to service a person and shall protect the data it collects at all times.
  • An organization collecting data about a person shall give that person visibility and control regarding the collection, retention and sharing of their data so long as it does not conflict with the Second Law.

A maker of IoT devices could certify that it had met all the requirements of the Laws, and once the certification process was complete, it could let its customers know it was “Three Laws Ready,” just like the company in the movie.

No matter what new article I read about privacy or the IoT, it always comes back to one thing: trust. Trust is about more than good intentions, as we discovered in 2014, the year of the mega breach. As the people in Asimov’s story discovered, it’s also about executing on transparency, choice and damn good security. Miss any one of these and don’t be surprised if panic ensues.

As a privacy advocate, I’m passionate about making sure we get it right. Whether it’s robots or flying cars, the technologies are advancing faster than some people’s ability to keep up. That doesn’t mean the makers of those technologies have any right to violate our privacy. Assuming ignorance on the part of consumers may be a time-honored way of making money, but public or private organizations that operate in such a manner will eventually discover who actually holds the power. And in this age of social media connections and instant communications, Hell hath no fury like a consumer burned.

And this technology-mindful really wants his flying car, so they had better get it right the first time.

Data Privacy – My Privacy Pet Peeve

I hear it every day. Someone, even a privacy professional, will use the phrase “data privacy.”

There’s a serious problem with this phrase no matter how you interpret it. Reading it one way, one might think it was referring to data’s privacy (pardon my improper use of the apostrophe here, but I use it to make a point). Of course, data has no privacy. It couldn’t care less whether it is revealed or kept confidential. It’s just data, and it’s not shy at all.

Another way one could interpret this could be that the phrase speaks to the act of keeping data confidential. Again, this is a common but inadequate way of looking at what privacy is about.

The problem with the phrase is that when we speak of “data privacy,” the conversation inevitably leads to discussions about security controls: safeguards for protecting the confidentiality and integrity of the data. We talk about encryption, access privileges, and (if we’re on our game) how to properly dispose of information when it’s no longer required.

Certainly one cannot have good privacy without good security, but talking about privacy exclusively in terms of security controls ignores the privacy controls that are so critical to good privacy: things like individual choice and notice. Why does this matter? Because it completely ignores what privacy is REALLY all about: protecting the freedom of the individual to make decisions without unwanted influence.

Privacy is not and will never be about data except that data can be used as a tool to damage privacy. We should always seek to describe privacy in terms of how it affects the individual.

I’m willing to make one exception to this rule. When we want to describe various aspects of what affects our privacy, adding a modifier to “privacy” makes sense. For example, if I use the term “energy privacy,” you know I am speaking about the privacy risks associated with one’s energy usage data and not one’s personally identifiable information. It’s a good shortcut to start a conversation quickly.

But perhaps we can agree to be more purposeful around how we use these terms and recognize that they are not synonymous with the privacy of the individual?

The Concept of Energy Privacy

My comments are my own and do not necessarily reflect the opinion of my company.

For the last several years, Personally Identifiable Information, or PII, has been the buzz in privacy circles. That’s old school now. By itself, PII is fairly useless for violating one’s privacy, except as it pertains to identity fraud, or when coupled with other sensitive information that ties our behavior to our identity.

Lately, I’ve been tossing around a new phrase in my new role (and really with anyone who will listen): “Energy Privacy.” That is, privacy issues having to do with an energy utility customer’s detailed energy usage information, generally obtained through “smart meters” or “advanced metering infrastructure.” The concept of energy privacy is nothing new to utilities. They’ve been analyzing coarse-grained usage data for years and have generally been very good at protecting customer privacy while doing it.

The difference now is how fine the granularity is becoming. Forget monthly reads. Smart meters are reading our energy usage in near-real time (even though many utilities only collect reads every 15 minutes or every hour). Privacy professionals typically fear that this means third parties will be able to tell when customers are home and when they are not based on their usage.

Please. That doesn’t begin to scratch the surface of what we can expect.

Don’t get me wrong. I believe smart meters and the smart grid in general can provide some great benefits to everyone: customers, utilities and third parties wanting to sell awesome products and services that will improve our lives and perhaps help preserve the environment. Energy usage information will help utilities build grids that are more reliable and less susceptible to power outages while accommodating more unpredictable renewable energy sources like wind and solar, and a flood of new energy-soaking devices like electric cars. I get it, and I embrace it as long as my privacy is respected.

But consider this analogy. Today’s smart meters are akin to binoculars on the sides of our homes. The algorithms used to analyze usage information look for patterns that describe how the energy is being used, allowing anyone with access to the data to see inside our homes and identify the types of devices we plug in: refrigerators, air conditioners, or electric vehicles, for example. Analysts can see when we’re using these devices, how often, and how many we have.

Tomorrow’s algorithms will be more like microscopes. Not only will they be able to see that a consumer has a refrigerator, but what brand and model it is, what condition it is in, and even how much food is in it (full refrigerators use less energy than empty ones; if I know the expected consumption of your brand and model, I can determine this). Analysts will be able to tell what you’re watching on television. Tomorrow’s algorithms will be able not only to detect devices, but to predict behavior.

Of course, early algorithms will be used to determine how we can save energy; that’s a primary reason the smart grid exists. But what if an algorithm could be written to determine whether a single parent was neglecting their kids? Not enough food in the fridge, too much time on the game console? Must be bad parenting. What if usage data could be used to detect criminal activity or “unwanted” behavior? I don’t just mean pot growers; I mean anything that society deems unacceptable at the moment. Maybe someone has too many water features running in their backyard, or watches TV too much (shouldn’t you be looking for a job?). All that is needed to see an average person’s behavior inside their home is to examine their usage data.
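
To make the shift from binoculars to microscopes concrete, here is a minimal sketch of the kind of load-disaggregation logic involved. The appliance signatures, wattages and tolerance below are illustrative guesses of mine, not any utility’s or vendor’s actual model; real disaggregation algorithms are far more sophisticated, which is exactly the point.

```python
# A toy sketch (not any utility's actual algorithm) of how load
# disaggregation can infer appliances from interval meter data.
# The appliance "signatures" and tolerance below are illustrative guesses.

APPLIANCE_SIGNATURES = {        # watts drawn when the device switches on
    "refrigerator compressor": 150,
    "air conditioner": 3500,
    "EV charger (Level 2)": 7200,
}
TOLERANCE = 0.15                # accept step changes within +/-15%

def detect_appliances(readings_watts):
    """Guess which appliances turned on by looking at step changes
    between consecutive meter reads (e.g. 15-minute intervals)."""
    events = []
    for i in range(1, len(readings_watts)):
        step = readings_watts[i] - readings_watts[i - 1]
        if step <= 0:
            continue  # this toy example only looks at load increases
        for name, watts in APPLIANCE_SIGNATURES.items():
            if abs(step - watts) <= watts * TOLERANCE:
                events.append((i, name))
    return events

# Example: a quiet house, then an EV charger and an AC kick in.
meter = [400, 420, 7650, 7600, 11100, 11150]
print(detect_appliances(meter))
# [(2, 'EV charger (Level 2)'), (4, 'air conditioner')]
```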

Now couple that with California’s consideration of plans to build an energy data center to house and analyze all this energy usage data. Their intentions are good. They want to help plan future infrastructure needs, especially for local governments. They want to help us reduce energy use. But when the government wants to peer inside our homes with a microscope, regardless of its stated intentions, what privacy do we really have left?

Some say that as long as the data is anonymous or aggregated, it should be fine to share. Does anyone recall the privacy breach at AOL, in which hundreds of thousands of “anonymous” customers were at risk of having their personal searches tied to them? How long will it be before an algorithm is developed that can determine who we are simply from our energy use, coupled with the treasure trove of free information available on the Internet, such as Google Maps? How difficult will it be for smart mathematicians to de-aggregate information we thought was aggregated? I don’t know, except that it will happen sooner than we think.
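
For a sense of how re-identification can work, here is a toy linkage example with entirely made-up data (the addresses and attributes are hypothetical, not drawn from any real dataset): an “anonymous” meter record becomes identifying the moment it can be joined against facts anyone can observe from the street or the Internet.

```python
# A toy illustration of a linkage attack on "anonymous" usage data:
# all records below are made up for the example.

anonymous_usage = [
    {"meter": "A17", "ev_charger": True,  "solar": False},
    {"meter": "B42", "ev_charger": False, "solar": True},
]

# Publicly observable facts (street view, building permits, social media)
public_facts = [
    {"address": "12 Oak St",  "ev_in_driveway": True,  "solar_panels": False},
    {"address": "98 Elm Ave", "ev_in_driveway": False, "solar_panels": True},
]

def reidentify(usage, facts):
    """Match each 'anonymous' meter to the only address consistent with it."""
    matches = {}
    for u in usage:
        candidates = [f["address"] for f in facts
                      if f["ev_in_driveway"] == u["ev_charger"]
                      and f["solar_panels"] == u["solar"]]
        if len(candidates) == 1:      # a unique match recovers the identity
            matches[u["meter"]] = candidates[0]
    return matches

print(reidentify(anonymous_usage, public_facts))
# {'A17': '12 Oak St', 'B42': '98 Elm Ave'}
```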

Enter the importance of energy privacy. Our energy usage data will say more about us than whether we are home or not. A lot more. This by itself is not a bad thing IF we as consumers have control over whom the data is shared with and how it is used. Give consumers control and confidence builds.

My goal is to raise awareness of the importance–and value–of your energy usage data. So informed, you can begin to participate in the discussion about how your usage information will be used and with whom it will be shared. I believe that as long as consumers have knowledge of the risks of sharing this information, the ability to decide with whom they would like to share it (referred to as “opt-in”), and the ability to review and terminate any such sharing in the future, then the consumer retains control of this information. Control equals power.

At the vanguard of protecting our energy privacy are utilities (who often get a bad rap when it comes to protecting such information) and privacy advocates who understand the potential risks and are fighting to preserve this last bastion of personal privacy. Why should utilities care about your privacy? It’s quite simple: they don’t want you to remove the smart meter from your house. Even if you don’t fully trust your own utility, you can absolutely trust that they have an intrinsic, business-minded reason to passionately protect your privacy. They want you to participate.

Now is the time for us all to consider how important our energy privacy is inside our own homes and how much intrusion we are willing to tolerate. Ask your utility and your government about your energy privacy and what they’re doing to protect it. Let’s have a conversation and ensure consumers retain the power they have every right to expect.

Microsoft Adheres to Privacy Principles

Microsoft took a bold step last month in announcing that IE 10 will ship with “do not track” enabled by default. Advertisers are up in arms about it. They claim it will “harm consumers.” Really? When we believe that protecting an individual’s privacy somehow harms them, we have entered a very Orwellian world of doublespeak.

Microsoft has adhered to a fundamental principle of Privacy By Design: make privacy the default setting. All of us who ever hated Microsoft for shipping products with security and privacy features turned off (and every other feature turned on!) should be shouting for joy and leaping to defend this embattled company.

Microsoft made the right call. I hope they stick to it.

Will it hurt marketers and advertisers? Doubtful under this voluntary system (see related articles). But let’s say they played by the rules and did not bypass the setting. If anything, it means marketers will have to try harder to convince consumers to overcome their inertia and disable the privacy protection. Or here’s a novel idea, advertisers: convince consumers to give you the information you want willingly instead of sneaking it from cookies and other deceptive tools.
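
For the curious, the mechanism itself is simple: a browser with the setting enabled sends a DNT: 1 header with each request, and a site that plays by the rules checks it before tracking. The sketch below is a hypothetical, minimal server-side example I wrote for illustration, not any ad network’s actual code; the cookie name and handler function are assumptions.

```python
# A minimal sketch of a site honoring the Do Not Track request header.
# Hypothetical handler and cookie name; not any real ad network's code.

def handle_request(headers: dict) -> dict:
    """Return response headers, honoring Do Not Track if present."""
    response = {"Content-Type": "text/html"}
    if headers.get("DNT") == "1":
        # The browser (e.g. IE 10 by default) asked not to be tracked:
        # skip the tracking cookie and any third-party ad beacons.
        return response
    response["Set-Cookie"] = "ad_tracking_id=abc123; Max-Age=31536000"
    return response

# With IE 10's default, every request arrives with DNT: 1.
print(handle_request({"DNT": "1"}))   # no tracking cookie set
print(handle_request({}))             # tracking cookie set
```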

What is true is that the direction we are headed is generally the wrong one. Everyone from big companies to political campaigns is recognizing the power of “big data,” and they all want more of it. And let’s be very honest about why they want it: to manipulate you and me. We can go back and forth all day about how it helps get the right ads in front of the right people, but remove all the doublespeak and what you have left is manipulation.

I for one don’t want to be manipulated. Catered to, perhaps. Pampered, for sure. But go manipulate someone else as far as I’m concerned. Why not give every person browsing the web that same opportunity for privacy without having to take extra steps to protect themselves? The bold and the foolhardy can always undo the settings at their convenience.

In the end, Microsoft’s choice will not undo the millions of dollars spent on Internet advertising. Nor, unfortunately, does the cynic in me believe it will technically protect our privacy. But it has started a conversation, which for now is good enough for me. I hope more people wake up to the importance of protecting their fragile privacy.

Privacy? It’s by the Men’s Room

While visiting the local Neiman Marcus in San Francisco, I happened to go down to the basement floor to visit the restrooms and found an interesting notice, posted on a small sign, about a product called Euclid.

It reads, “To enhance our customer’s experience, we use Euclid to identify mobile devices in and around our stores. Only the information that your device publicly broadcasts will be collected. If you do not want this information collected, or want to learn more information about Euclid, visit euclidelements.com/consumer.”

On the website, the company swears they care about privacy and they do:

  • Limited data collection
  • Only share aggregated and anonymous information
  • Easy opt-out and delete

To opt out, you have to share your MAC address with this company. It seems odd to have to share identifying information with a company in order to enable them not to identify you, especially since they will then have your phone’s MAC address along with the IP address of the computer you used to visit their website! Apparently the company’s sensors listen for the Wi-Fi signals phones broadcast as they search for access points, which is how they learn your MAC address (the identifier that uniquely identifies your phone on a Wi-Fi network).
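
To illustrate the irony, here is a minimal sketch of how such an opt-out could work under the hood. This is not Euclid’s actual implementation; the hashing scheme and function names are my assumptions. But it shows why you must hand over your MAC address (or its hash) just to be left alone, and why a leaked database of hashed MACs would still be revealing.

```python
# A minimal sketch, assuming a tracker that pseudonymizes MAC addresses
# by hashing them. NOT Euclid's actual implementation; hypothetical only.
import hashlib

OPT_OUT_HASHES = set()  # hashes of MACs whose owners opted out

def pseudonym(mac: str) -> str:
    """Hash a MAC address. Note: unsalted hashes of MACs can often be
    reversed because the address space is small and vendor prefixes are known."""
    return hashlib.sha256(mac.lower().encode()).hexdigest()

def opt_out(mac: str) -> None:
    OPT_OUT_HASHES.add(pseudonym(mac))   # you must disclose the MAC to opt out

def record_sighting(mac: str, store_id: str, log: list) -> None:
    """Called when a phone's Wi-Fi broadcast is overheard in a store."""
    h = pseudonym(mac)
    if h in OPT_OUT_HASHES:
        return                           # honor the opt-out
    log.append((store_id, h))

log = []
opt_out("AA:BB:CC:DD:EE:01")
record_sighting("AA:BB:CC:DD:EE:01", "store-42", log)  # suppressed
record_sighting("AA:BB:CC:DD:EE:02", "store-42", log)  # logged
print(log)
```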

As soon as a hacker worth her salt breaks open their database, the movements of thousands of mobile phones through malls will become public information. How much does Euclid invest in information security? No idea.

Of course, the average person entering the store, or merely walking by, will never see this sign. Consumers are advised to kiss their privacy good-bye, and perhaps to turn off Wi-Fi and Bluetooth on their phones when not in use.

Privacy? But I have nothing to hide!

Daniel Solove wrote a great article on why privacy matters even if one thinks they have nothing to hide. It is high time to dispel the myth that if we’re “innocent” we have nothing to hide.

Let me say it this way: Everyone has some information they do not want to fall into the wrong hands at the wrong time.

You may not have information you think needs to be hidden right now. But in a year, you may decide to run for office. You may have information you are fine with your local bank seeing, but would be embarrassed if your co-workers had access to it. Otherwise, why not wear your Social Security number, date of birth and bank account numbers on a tee-shirt?

Even if you think you have nothing personal to hide, what about those you love? Parents and grandparents, let me ask you some questions and tell me if they start to make you uncomfortable:

  • What time do your children get out of school?
  • What route do they walk home?
  • How long are they home alone?

There is a strong relationship between privacy and security. Each of the answers to these questions is technically “public” information and could in theory be learned legally by a third party who was very interested in the answers. But that doesn’t mean it’s something we’d want to share with a stranger who suddenly began asking these questions. Let’s face it. Even if we don’t care about our own privacy, surely there is someone’s privacy we do care about.

Everyone has some information they do not want to fall into the wrong hands at the wrong time.