Privacy 2.0: An Uncomfortable Compliance

If the GDPR formalized our natural rights to digital privacy, why does so much of the work around data feel so unnatural when it comes to privacy?

This new state of discomfort manifests in both the consumer experience and corporate usage of data.  In a future where everything is data-driven, companies have to move beyond compliance to resolve this awkward state.

Privacy Policy 2.0 – An Awkward Customer Experience

Is anyone else tired of consent popups and emails asking for consent to receive emails?  How about being redirected to a Privacy Policy to be told how cookies really work?

Privacy Policies are now written in clear and understandable language instead of legalese, but you can’t help feeling they were inspired by a lawyer protecting the company rather than your rights.

Most privacy policies are not being read.  They weren’t being read in Privacy 1.0, and even with clear language, they are still not being read in Privacy 2.0.  So where is the value?  Is this really what was meant by transparent and informed choice?

Privacy Policy 2.0 – Transparency Alone Falls Short

In fairness, it is no easy task to be understandable and transparent, protect your company from fines, and do it all in a compelling and interesting manner.  My own Privacy Policy fell short of what I wanted it to be, and I am not even collecting data for secondary usage.

It shouldn’t be this hard to respect privacy and use data appropriately.  So why is it?  What are we missing here?

It has been said that data is the new oil.

If data is the new oil, then Privacy is the new dollar

Companies are coming up short on Privacy Capital, and this uncomfortable compliance can’t pay the bill for the data they want to use.

Where Are We on the Path to Privacy?

We are at the necessary but temporary state of Privacy 2.0 – Privacy as an Afterthought or Compliance.  The initial emphasis of GDPR enforcement on transparency is resulting in attempts to do the right thing the wrong way: patching up 1.0 systems that were never designed for privacy.  Again, this stage is necessary, but it is awkward, and the awkwardness is manifested in Privacy Policies 2.0.

We went from Privacy 1.0 – no privacy, “get over it” – to hefty-monetary-penalty avoidance on May 25, 2018.  This change created a scramble to compliance, Privacy 2.0, in which most companies tried to remediate their 1.0 systems instead of redesigning them.

The volume of data usage should go down from the Wild West days of Privacy 1.0, and this decrease is to be expected.  The goal, however, ought to be an increase in the legitimate use of data, and Privacy 2.0 won’t get us there.

Facebook’s Privacy Capital Deficit Only Grew with 2.0 Transparency

Facebook’s privacy impact stats show the usage effect of exposing Privacy 1.0 practices.  The Cambridge Analytica revelation resulted in roughly 25% of Facebook users removing the Facebook app from their phones.  Among users who exercised their GDPR right to download the information collected on them, 47% removed the app from their phones.

Facebook has incurred a growing Privacy Capital deficit, which has impacted its stock through decreased use of data, increased security costs, and, as one stock analysis article cites, impending U.S. privacy regulation.

The breach of trust has to be repaired, and privacy as an afterthought won’t do it.  Privacy 2.0 raises the question: how do we get to Privacy 3.0 – Privacy by Design?

Identify Self-Defeating Organizational Factors

What’s preventing you from moving to Privacy 3.0 today?  Your legal team may be too busy worrying about fines to think about privacy that enables data usage.  Your business wants to hide anything that would reduce the amount of data it can collect and use.  IT has to redesign its end-to-end data flows with privacy as the default while dealing with unclear guidelines, competing interests, and a lack of will or priority to invest in privacy.  Most organizations have not aligned all three of these groups to rationalize how they use data.

Privacy 2.0 is at best a transition phase.  Trying to duct-tape privacy on as an afterthought may get you compliant on primary data (whether or not that consent is even needed), but it won’t enable you to use data for secondary purposes.  On the contrary, the longer a company puts off Privacy 3.0, the less data it will have to use and the more likely it will be to have consent leakage.  Consent leakage is when a company unwittingly violates a customer’s consent choices because it never designed for privacy.  This approach is Lawsuit by Design, and it is the inevitable result of Privacy 2.0 mindsets.
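To make the alternative concrete, here is a minimal sketch, in Python, of the kind of purpose-aware consent gate a Privacy 3.0 system would put in front of every data use.  The consent store, customer IDs, and purpose names are all hypothetical; the point is that the check lives at the point of use, so an un-consented secondary use fails loudly instead of leaking silently.

```python
# Minimal sketch of a purpose-aware consent gate (all names hypothetical).

class ConsentError(Exception):
    """Raised when data is about to be used without a matching consent."""

# Consent store: customer_id -> the set of purposes that customer opted in to.
CONSENTS: dict[str, set[str]] = {
    "cust-001": {"billing", "service_quality"},          # primary purposes only
    "cust-002": {"billing", "service_quality", "ads"},   # also opted in to ads
}

def use_data(customer_id: str, purpose: str, record: dict) -> dict:
    """Gate every data use on an explicitly recorded purpose."""
    if purpose not in CONSENTS.get(customer_id, set()):
        # Without this designed-in refusal, the processing below would run
        # silently anyway -- that silent run is consent leakage.
        raise ConsentError(f"No consent from {customer_id} for '{purpose}'")
    return {"customer": customer_id, "purpose": purpose, "data": record}

# cust-001 never opted in to ads, so this raises instead of leaking:
# use_data("cust-001", "ads", {"browsing_history": ["example.com"]})
```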

So what does Privacy 3.0 look like?

  1. It understands and respects data privacy.  This is basic Golden Rule stuff, as Senator Durbin pointed out in the Facebook hearing: “Mr. Zuckerberg, would you be comfortable sharing with us the name of the hotel you stayed in last night?”  “Senator, no.”  Well, then, if you don’t want to be tracked, don’t track others.  If you do want to be tracked, fine, but give others the choice just like you have.
  2. When privacy is respected and there is a Golden Rule commitment not to use data for secondary purposes without explicit and narrow consent, then companies must build systems that are designed to respect privacy programmatically and procedurally – Privacy by Design.

Where to Start? Separate Primary Data from Secondary Data

Don’t ask for consent to cover your assets when you already have a legitimate reason to process the data.  I am sure your lawyers have you covered in the EULA and TOS for primary data usage (check with them).  This justification requires that you fully know your DEN and have established a legitimate, legal basis for all collection, processing, and usage of data.  Your documentation and justifications should be sufficient for an audit.

Where your business has tried to sneak secondary-purpose usage of the data into the EULA/TOS, remove it and properly, visibly, and transparently ask for explicit consent.  This includes sharing data with “third parties” where that sharing is not required to fulfil the primary service.  Be prepared to demonstrate how you value consumers’ privacy and what you have put in place to keep their data secure and private.

You will want to offer some value in your secondary-data opt-in program in exchange for people letting you use their data.  Remember that digital privacy means consumers maintain control over their data: consent can be withdrawn at any time, and the data goes with it.
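As a rough illustration of that lifecycle, here is a small sketch of an opt-in program in which each consent records both the narrow purpose and the value offered in exchange, and in which withdrawal deletes the derived data along with the consent.  The class and field names are hypothetical, not a reference implementation.

```python
from datetime import datetime, timezone

class OptInProgram:
    """Sketch of a secondary-data opt-in program with revocable consent."""

    def __init__(self) -> None:
        self.consents: dict[str, dict] = {}  # customer_id -> consent record
        self.profiles: dict[str, dict] = {}  # data derived under that consent

    def grant(self, customer_id: str, purpose: str, value_offered: str) -> None:
        """Record a narrow, explicit opt-in and what the customer gets back."""
        self.consents[customer_id] = {
            "purpose": purpose,                # e.g. "interest_profiling"
            "value_offered": value_offered,    # e.g. a discount or feature
            "granted_at": datetime.now(timezone.utc),
        }

    def withdraw(self, customer_id: str) -> None:
        """Consumer control: revoking consent also removes the derived data."""
        self.consents.pop(customer_id, None)
        self.profiles.pop(customer_id, None)   # the data leaves with the consent
```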

If your company’s business model doesn’t ever need to use secondary data, then maybe Privacy 2.0 is sufficient since you don’t need consent.  Compliance in security and privacy (CPNI, PII, SPI) may be sufficient in those cases.

Privacy Pays

It is time to re-imagine the Privacy Policy in a way that raises Privacy Capital instead of chasing people away.

Privacy Policy 3.0 could change this dead space into the most-read, most-trafficked page on your site.  This page should be the digital bedrock for the two-way, dynamic relationship companies will have when their audience has a reason to trust them with their data.  In this oil rush, it is privacy or bust.


Real World Consent Translated to Digital

Tsvi Lev recently commented on the Facebook data violations and the impending regulation by saying,

“This is NOT the end of analytics – it is the dawn of properly used analytics.”

New regulations requiring explicit consent have the potential to significantly change business models in advertising.  Consent may disrupt or consolidate many of the Ad Tech players in the market today that can’t or won’t change their business operations.  It is no secret that since the dawn of the internet cookie, Ad Tech hasn’t really been concerned about privacy.  For those companies, new regulation may be a day of reckoning, but it doesn’t have to be.

Back to the Future

We need to look back at the door-to-door salesman to understand what digital advertising has ignored as it assembled your shadow digital profile.  In a lot of ways, those salesmen were like internet opportunities: when a salesman approached a potential customer, he had seconds to connect before the door was shut in his face, just like the ten seconds or so users will give you before moving on to another site.

The movie Secondhand Lions (SHL) nails the process of the door-to-door salesman getting consent before making a pitch and ultimately winning the sale.  Think of this salesman’s approach as a parallel model for digital advertising as we walk through the wrong and right approaches in three acts.

Act 1 – The Wrong Approach

The SHL scene opens on a rural Texas farm with a long drive that ends in front of a dilapidated wraparound porch.  Two old bachelors are sitting on the porch with their soon-to-be-adopted nephew, drinking iced tea.  Word has spread through the small town that these two have more money than they can spend, which puts them in the demographic salesmen only dream about.

Each salesman drives onto the brothers’ private property, gets out of the car, and immediately begins to pitch his goods.  Shortly after he begins to speak, he is greeted with multiple gunshots fired over his head as a warning.  The old bachelors are defending their private property from uninvited solicitations, and no sales are made.

The pitch was uninvited, initiated as a trespass on private property, and showed no respect for the two brothers.  The brothers responded by enforcing their privacy rights.

Act 2 – Personalization Done the Right Way

The next sales scene opens with the brothers and nephew on the porch again, drinking tea with their shotguns, when one brave salesman comes out for a second attempt at approaching them.  He has a completely new approach.  Now he respects their privacy rights: he comes out waving a white flag and asking them not to shoot.  The reaction of Michael Caine’s character is, “He’s been here before.  This is no ordinary salesman.  This guy is good.”

He addresses them by name and asks if they can talk (respect for their privacy and asking for consent).  Now, instead of shots being fired, a two-way discussion begins to take place.

The conditions for the continued conversation are laid down.  Robert Duvall’s character, Hub, tells the salesman to come out where they can see him (transparency).  The salesman asks them to put their guns down so he can show them what he has brought for their consideration (clear intent).  The salesman tells them to trust him (he treated them like people, not demographics, and showed respect for their privacy).  Hub agrees to hear what he has to show them, but then says “afterwards we’ll shoot him” (consent is not permanent and can be withdrawn at any time).

The salesman’s opening pitch is a changed approach.  He goes on to say, “Due to the unsettling nature of our previous encounter, I searched the world over for the perfect item that would be just right for two exuberant sportsmen such as yourselves.”  Wow – quite a change.  Let’s look at what he did:

  1. He admitted that his first approach was wrong.
  2. He searched for an item that would be of personal value to them.
  3. He segmented them into a group, exuberant sportsmen, that they want to identify with (more appealing than rich guys with money to burn).

The salesman then shows them something that they didn’t know existed – a sporting clay launcher that even a kid can operate.  He brought them something he knew they would want.  This guy put some thought into analyzing the brothers as people, not as customers to be fleeced.  His pitch starts with the statement that, until now, only heads of state could own such a product, and that he is bringing them the most powerful model at a reasonable price.  Needless to say, they buy the product.

This is personalization done right.  The salesman asked permission to speak to them.  He came with something he knew they would want because he analyzed their first encounter.  He democratized something previously out of reach for the common person.  He gave them quality at a fair price.  They didn’t even have to leave their house or exert much effort to get it.

Act 3 – The Final Sale

The movie ends with the nephew returning to the house after the brothers have died.  Anchored in a small pond at the front of their property is a massive yacht barely floating in the shallow water.  When asked about the yacht in the ridiculously small pond, the nephew responds, “There was this traveling salesman…”

This one salesman succeeded where others failed by following a new model of sales that we should imitate in the digital world.  He demonstrated 7 key factors that serve as a template for digital interactions by:

  1. Respecting privacy
  2. Being transparent
  3. Asking for consent, realizing that consent can be withdrawn at any time
  4. Being clear about what he was asking consent for
  5. Earning trust
  6. Offering value – consent and trust got the brothers “off the porch,” and value determined further interaction
  7. Personalizing value and tailoring convenience

So, by asking for consent in a way that gave the potential customers control, choice, and transparency, the salesman gained trust.  He was then able to present them with something of personal value to them.

These 7 key factors should be translated into the digital advertising world so that the door to personalization is opened by consent.  What was really gained was more than a customer.  These factors started a two-way relationship that lasted a lifetime and exponentially multiplied the investment made in securing consent through trust and value.

Where does the Digital Advertising world need to go from here?

How do Facebook and digital advertisers move forward in the face of impending regulation?  A good start would be to admit to the public, like our salesman, that “due to the unsettling nature of our previous encounter,” we will respect your privacy, be transparent about how we collect, process, and use your data, ask before using it, and provide you with something you value.

Each new data breach exposes to the general public how their privacy is being violated, and, armed with that knowledge, they are starting to fire warning shots.  When companies realize that privacy by design is a requirement, the change will be “the dawn of properly used analytics,” as Tsvi Lev stated.

Let the right personalization begin.


For Your Entertainment

Below are links to the Secondhand Lions scenes referenced in this post.  You will probably find additional advertising insights of your own and have a good laugh.

The wrong approach:

https://www.youtube.com/watch?v=fTaVWamH5YY

The right approach:

https://www.youtube.com/watch?v=LjOgreRmbBQ

Disclaimers:

By clicking on the YouTube links, you will leave the DataEDEN blog site.  The Ad Tech world will be sending your personal viewing information back to Google to serve you relevant ads at some point in your digital journey.

I found Secondhand Lions to be entertaining and funny, with some mild language.  It illustrates enforcing consent choices through an intersection of the Second Amendment being used to enforce the Fourth Amendment, which some might find offensive.  I do not endorse certain implied philosophical assertions in the movie concerning the foundations of belief, the importance of history, and the nature of truth.

 

Modifying the GDPR

I generally like the General Data Protection Regulation (GDPR).  It acknowledges that privacy rights exist in the digital world.  It defines the following three key areas of a comprehensive data protection law:

  1. Data Security
  2. Data Privacy
  3. Data Consent

Security and privacy were already regulated, and all companies that control personal data should already have been taking measures to secure that data internally and externally.  The sad fact is that even big businesses have failed at the most basic data security protections – witness Equifax.  Self-regulation has failed, and like an old car that costs more to maintain than to replace, the cost of self-regulation failures is too high not to replace it with well-thought-out legislation.  The Digital Revolution has matured, and like many disruptive industries in transition from new to normal, it needs the right legislation to continue to grow.  We can talk about personalization, automation, and AI all we want, but until we solve the privacy issue, new business models and innovations will be stunted, as they are entirely dependent on using data.

Americanizing the GDPR

Since security and privacy have enjoyed a long and healthy discussion, I will focus more on consent-related issues.

For all its glory, there is something in the EU’s GDPR that rubs some Americans the wrong way.  I see two structural differences between the U.S. and the EU that may elicit this reaction:

  1. What it means to Anonymize and Aggregate data
  2. The difference between Primary Data and Secondary Data

In general, the American view differs from the European approach to these two topics.  How we define the first difference (anonymization) determines where we categorize data under the second (primary vs. secondary).

Anonymization or Pseudonymisation

When Americans think about anonymization, they think about removing the identity from data so a specific individual cannot be identified.  This may also include aggregation to further obfuscate identifying an individual.  When the EU speaks of anonymizing data, it not only means that an individual can’t be identified, but also that no two pieces of data can be correlated back to the same individual.  This extreme obfuscation prevents anything from being known about anyone.  While some may see this as a highly desirable state for data privacy, American pragmatism doesn’t dwell on this state, as it renders data useless for analytics.  We don’t talk about it in these terms because there is no point in even storing such data.  Whereas the European usage is technically correct, the American usage is pragmatically correct.  Think Standard vs. Metric, or Fahrenheit vs. Celsius.  They each have their merits.

The GDPR labels useful anonymized data as pseudonymous data.  What this means is that there is enough useful data to correlate data element A created by unknown person #1 with data element B created by the same unknown person #1.  This correlation yields useful information that allows data processors to analyze it and gain insights of value while shielding the identity of the person.  So, the popular American idea of Anonymization is equal to the European Pseudonymisation.  This is not a big deal as long as we understand the equivalency of the two terms.  It becomes relevant in understanding the U.S. Federal Communications Commission’s (FCC) existing regulation (Title 47, Section 222), which requires telecommunications providers to “aggregate customer information” and practically defines the process of Anonymizing (removing identity) and Aggregating (grouping unidentified individuals) data before using it for Secondary Purposes.
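A short sketch may help contrast the two meanings.  Pseudonymisation keeps records from the same person correlatable through an opaque token (here a keyed hash), while EU-style anonymisation drops the identifier entirely so that nothing can ever be linked back.  The key and field names below are hypothetical.

```python
import hashlib
import hmac

# Keep this key out of the analytics environment; whoever holds it could
# re-link tokens to identities, which is why this is pseudonymous, not anonymous.
SECRET_KEY = b"hypothetical-secret-key"

def pseudonymize(identity: str) -> str:
    """Same person always maps to the same opaque token, so records correlate."""
    return hmac.new(SECRET_KEY, identity.encode(), hashlib.sha256).hexdigest()

def anonymize(record: dict) -> dict:
    """EU-style anonymisation: strip the identifier entirely; no correlation."""
    return {k: v for k, v in record.items() if k != "identity"}

# Two events from the same subscriber still correlate under pseudonymisation:
assert pseudonymize("alice@example.com") == pseudonymize("alice@example.com")
```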

The anonymization discussion is really a case of overlapping terms with different definitions, and it can be resolved functionally by defining what each side means by anonymization.  The catch, however, is that the difference in definition also causes Americans to classify data into the Primary or Secondary Data class, which in turn merits a different level of explicit regulation than the GDPR provides.  I want to explore these distinctions now and arrive at an understanding that will serve as a starting point for determining how data should be regulated.

Primary Data

Let’s define Primary Data as data that is consumer-provided or consumer-generated in the course of using a discrete service.  For example, I want to consume a postpaid mobile phone service from a communications provider.  To use the service, I have to supply personal information for a credit check and a billing address.  I will need a device, which will be addressable by a phone number, a device number, and several other telco IDs that will all be associated with me as a user of the service.  In using the service, I will generate Call Detail Records (CDRs) that the provider will use to calculate my bill, monitor network service, process internal accounting, and understand internal business metrics.

All of these processing events use the data collected from the CDRs, which were generated by me, the data subject.  The data is personal, and it contains Sensitive Personal Information (SPI), Customer Proprietary Network Information (CPNI), and Personally Identifiable Information (PII).  It includes the history of where I have been (geo-location), who I have talked to, and what I have browsed on my smartphone.

The key point is that all of this data is required to provide me mobile phone service.  As long as the data collector (the communications provider) uses this data (as the data processor) for the fulfilment of the primary service, it is Primary Data.  Primary Data does not need additional data subject consent beyond an informed and comprehensible End User License Agreement (EULA) or Terms of Service (TOS).  Since the data is required and must be processed to provide the primary service, consent is implied upon accepting the EULA or TOS.  If you don’t accept those terms, you can’t use the service.

Secondary Data

Secondary Data is data generated for a contracted service (via EULA or TOS) that is used for a purpose other than providing that primary service.  For example, a communications company uses my Primary Data to build a profile of my interests based on my browsing history and matches my interests with ads that are sent to my phone as I walk by a business catering to those interests.  The same data generated as Primary Data became Secondary Data when it was processed (to profile my interests) and used for a purpose (generating additional money from advertising) outside of the primary service it was collected for.
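Note that the distinction hangs on the purpose of the processing, not on the data itself.  As a rough sketch (the purpose names are hypothetical), the same CDR can be classified Primary under one purpose and Secondary under another:

```python
from enum import Enum

class DataClass(Enum):
    PRIMARY = "implied consent via EULA/TOS"
    SECONDARY = "requires explicit opt-in consent"

# Purposes a telco might register, each tied (or not) to the primary service.
PURPOSES = {
    "billing":          DataClass.PRIMARY,    # needed to deliver the service
    "network_quality":  DataClass.PRIMARY,
    "ad_profiling":     DataClass.SECONDARY,  # same CDRs, different purpose
    "third_party_sale": DataClass.SECONDARY,
}

def classify(purpose: str) -> DataClass:
    """The same CDR is Primary under 'billing' and Secondary under 'ad_profiling'."""
    return PURPOSES[purpose]
```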

Primary or Secondary Data

What about services like Netflix?  They have Primary Data on what we watch.  They also profile us, minimally at the household level, for our interests.  Optionally, each member can register a separate identity and be profiled at the individual level.  They process Primary Data and track my interests just like the communications provider does when it wants to send me relevant advertising.  Should the use of this Primary Data be classified as Secondary, since processing the data has the same result of producing a profile of my interests?  No.  Netflix is processing that Primary Data as part of the primary service, to recommend video content I might be interested in.  These recommendations aid the primary service, as I do not have 45 minutes to waste searching for content I haven’t seen every time I want to use the service.  If Netflix were to turn around and use that profile of my interests to start showing me advertisements, then they would be using the data for a secondary purpose other than what I am paying them for.

Regulation that doesn’t kill Business

The GDPR doesn’t explicitly delineate between Primary Data usage and Secondary Data usage.  By not distinguishing between the two types of data, it leaves data collectors/processors and their lawyers with some confusion about what exactly they need explicit consent (opt-in) for.  If you read the entire regulation, I think you will see that most collection and usage of what I have described as Primary Data has been given an exemption from requiring explicit consent.  I believe many EU companies are wasting a lot of money securing consent for Primary Data that should be handled as implied consent in the EULA/TOS.  U.S. legislation can avoid these unnecessary expenses and the burdens of consent gathering and enforcement on Primary Data by allowing implied consent to be granted for Primary Data usage in the EULA/TOS.  All Secondary Data usage should be optional and require consent management to be implemented.


The Battle for Digital You

It’s About Rights

The internet and the Digital Revolution are producing a representation of you based on the data that you and your devices produce.  This revolution needs to acknowledge the creation of a virtual Digital You and yield to it the same privacy rights that the real, physical you has.

The American Revolution and the subsequent founding of the United States crystallized the social contract of the new country.  The protections of life, liberty, and the pursuit of happiness were laid down, and the boundaries of the federal government and the unalienable rights of the individual were codified.  These unalienable rights in the physical world were legally protected by the government, and individuals had the right to protection against unlawful searches and seizures.

This foundation is the basis for our privacy laws today, specifically the Fourth Amendment, which gave people the right to be secure in their persons, houses, papers, and effects.  Court interpretations later expanded the Fourth Amendment to electronic intrusions such as wiretapping, and the FCC extended its application to commercial intrusions.

We are now several years into the Digital Revolution, but our ethics and rights have not kept pace with the technology.  There is a battle being fought over Digital You, and without a translation of privacy rights protection to the digital realm, we will lose this war.

The Current State

I will leave out the discussion of how government handles our data and focus on Technology and Commerce, which are waging war on Digital You’s privacy rights.

How did we arrive at this point in time that effectively has allowed Technology and Commerce to violate Digital You’s privacy with no real recourse to date?  The following quotes should have been a warning sign:

1999 – “You already have zero privacy.  Get over it.” – Scott McNealy, CEO of Sun Microsystems Inc.

2009 – “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” – Eric Schmidt, CEO of Google

Given these views, it is unsurprising that Technology and Commerce are building Digital You without your consent or control, and in most cases without your knowledge.  How do we take back unalienable rights in the digital world that we never consented to give up?

“No Harm, No Foul” vs. Rights

Today, the United States handles oversight between Commerce and consumers via the FTC.  The FTC was commissioned to ensure adequate competition among businesses and fair treatment of consumers.  This mandate worked in the physical world, prior to Technology’s communications advances, because invasions of privacy had little to do with commerce.  If someone violated your physical privacy, the trespass was seen.  There was a clear boundary between the areas of commerce and private property.  The Bill of Rights addressed private property, and the FTC governed commerce.

The Commerce-driven Digital Revolution now bleeds into privacy issues, and the FTC is not equipped with adequate legislation to enforce privacy based on rights.  To seek redress through the FTC, courts require proof that you have been harmed by a commercial practice.  This approach works in the physical world, where an aggressive salesman might receive a warning shot if he trespassed on private property.  Violations were seen by the individual, and individuals were allowed to protect their private property.

This arrangement doesn’t translate into the digital realm, where tech companies routinely peer into our private lives uninvited and undetected.  We have no way of defending Digital You from privacy violations when we are unarmed and unaware.  The FTC is of zero use in this case because it is not equipped to enforce rights.

Rights vs. “No Harm, No Foul” Illustrated

Let’s look at a physical example of a privacy issue to illustrate how something is approached as either a rights issue or a matter of fair play.  You have a home and are away for the day.  While you and your family are out, someone walks up to your home, opens the door, steps inside, and looks around your house.  From this view, the intruder surmises that you are of moderate wealth, married, and have 2.5 kids.  He learns that you like a variety of music styles, what political party you belong to, and your religious preference.  This person then walks out of the house and records his insights into your lives.  You come home and discover that someone has entered your house and looked around at all of your pictures and rooms.  You learn from your neighbor who it was, and you are upset that your privacy has been violated by someone trespassing on your private property and private effects.

Has a crime occurred in this illustration?  In real life, the answer is yes: trespassing without consent, or unlawful entry.  The basis of this charge is the Fourth Amendment and other clarifying laws.  Now let’s say that the person who improperly walked onto your property was a businessman conducting his own research to see what kind of goods you would really need and want so that he could bring them to you to buy.  Would this change your verdict?  In the eyes of the law, it would not.  The man entered the property without consent or permission and was trespassing, regardless of intent or harm.  It is a rights issue and a law issue.

If the FTC were in charge of enforcing the privacy of this home, there would be no way to bring a charge against the trespasser unless the property owner could prove harm from this unconsented act.  That would be ridiculous, yet it is exactly the state of how Technology and Commerce are being regulated in areas of Digital You.  Fair play says that as long as tech companies don’t cause demonstrable harm, they can look at whatever they want wherever they can.  If privacy is a right, however, you shouldn’t have to prove that harm was done to Digital You; it is enough that there was a trespass without consent to bring charges against the trespasser.

Taking Back Our Rights

Letting the FTC determine what can and can’t be done via lawsuits will not ensure our rights.  Enforcement by fair-play lawsuits is inadequate to protect rights that, in the physical world, are constitutional guarantees.  We need real legislation defining the rights of Digital You so that they are not left in the hands of commerce regulators who do not deal with defining and enforcing unalienable rights.

Privacy rights are one of the few truly non-partisan issues, and yet the public perception and debate are hobbled by half-informed rhetoric meant to score points over the opposing party rather than the debate of statesmen.  Given this inappropriate polarization and the modern habit of passing lengthy bills with inadequate public review, our best course would be to review the EU’s GDPR as a starting point and Americanize it.  Legislation needs to transfer physical constitutional privacy rights to our data rights and set up a framework that establishes privacy by design as a guiding principle.  Tech/Commerce will have to get consent to use our data and be held accountable for securing that data.

Americans should be allowed to choose how Digital You is created and used in a way that is informed, that gives them control over Digital You, and that provides transparency into when and how Tech/Commerce uses their digital footprints.  These rights have to be protected by legislation with clear outcome guidelines, backed up by significant penalties, to ensure that adequate data security and privacy by design are baked into all digital handling of our data.

Recent high-profile data breaches continue to highlight the need.  Tech/Commerce giants like Apple, AT&T, and even Facebook are suggesting we need thoughtful legislation.  The time to translate physical rights to digital rights is now.  The next decade’s quote should read:

2019 – “We no longer settle for choosing between privacy and technology.  Our technology can and will complement our privacy.”
