Privacy 2.0: An Uncomfortable Compliance

If the GDPR formalized our natural rights to digital privacy, why does so much of the work around data seem so unnatural with regard to privacy?

This new state of discomfort is manifested in both the consumer experience and corporations’ use of data.  In a future where everything is data driven, companies have to move beyond compliance to resolve this awkward state.

Privacy Policy 2.0 – An Awkward Customer Experience

Is anyone else tired of consent popups and emails asking for consent to receive more emails?  How about being redirected to a Privacy Policy to be told how cookies really work?

Privacy Policies are now in clear and understandable language instead of legalese, but you can’t help feeling like it was inspired by a lawyer protecting the company instead of your rights.

Most privacy policies are not being read.  They weren’t being read in Privacy 1.0, and even with clear language, they are still not being read in Privacy 2.0.  So where is the value?  Is this really what was meant by transparent and informed choice?

Privacy Policy 2.0 – Transparency Alone Falls Short

In fairness, it is no easy task to be understandable and transparent, protect your company from fines, and still be compelling and interesting.  My own Privacy Policy fell short of what I wanted to do, and I am not even collecting data for secondary usage.

It shouldn’t be this hard to respect privacy and use data appropriately.  So why is it?  What are we missing here?

It has been said that data is the new oil.

If data is the new oil, then Privacy is the new dollar

Companies are coming up short on Privacy Capital, and this uncomfortable compliance can’t pay the bill for the data they want to use.

Where Are We on the Path to Privacy?

We are at the necessary, but temporary, state of Privacy 2.0 – Privacy as an Afterthought or Compliance.  The initial emphasis of GDPR enforcement on transparency is resulting in attempts to do the right thing the wrong way (patching up 1.0 systems not designed for privacy).  Again, it is necessary, but it is awkward, and it is manifested in Privacy Policies 2.0.

We went from Privacy Policy 1.0 – no privacy, “get over it” – to hefty monetary penalty avoidance on May 25, 2018.  This change created a scramble to compliance, illustrated below as Privacy 2.0, where most companies tried to remediate 1.0 systems instead of redesigning them.

The volume of usage should go down from the Wild West days of Privacy 1.0, and this decrease is to be expected.  The goal, however, ought to be the increase in the legitimate use of data, and Privacy 2.0 won’t get us there.

Facebook’s Privacy Capital Deficit Only Grew with 2.0 Transparency

Facebook’s privacy impact stats show the usage effect of exposing Privacy 1.0 practices.  The Cambridge Analytica revelation resulted in ~25% of Facebook users removing the Facebook app from their phone.  GDPR rights giving Facebook users the ability to download the info collected on them resulted in 47% of those users removing the app from their phone.

Facebook has incurred a growing Privacy Capital deficit which has impacted their stock due to a decreased use of data, increased security costs, and as this stock analysis article cites, impending U.S. privacy regulation.

The breach of trust has to be repaired, and privacy as an afterthought won’t do it.  Privacy 2.0 raises the question: how do we get to Privacy 3.0 – Privacy by Design?

Identify Self-Defeating Organizational Factors

What’s preventing you from moving to Privacy 3.0 today?  Your legal team may be too busy worrying about fines to think about privacy that enables data usage.  Your business wants to hide anything that will reduce the amount of data it can collect and use.  IT has to redesign its end-to-end data flow with privacy as the default while dealing with unclear guidelines, competing interests, and the lack of will or priority to invest in privacy.  Most organizations have not aligned all three of these groups to rationalize how they use data.

Privacy 2.0 is at best a transition phase.  Trying to duct-tape privacy on as an afterthought may get you compliant on primary data (whether or not that effort is needed), but it won’t enable you to use data for secondary purposes.  On the contrary, the longer a company puts off Privacy 3.0, the less data it will have to use and the more likely it will be to have consent leakage.  Consent leakage is when a company unwittingly violates the consent choices of its customers because it never designed for privacy.  This approach is Lawsuit by Design, and it is the inevitable result of Privacy 2.0 mindsets.

So what does Privacy 3.0 look like?

  1. It understands and respects data privacy.  This is basic Golden Rule stuff, as Senator Durbin pointed out in the Facebook hearing when he asked, “Mr. Zuckerberg, would you be comfortable sharing with us the name of the hotel you stayed in last night?”  The answer was no.  Well, then, if you don’t want to be tracked, don’t track others.  If you do want to be tracked, fine, but give others the choice just like you have.
  2. When privacy is respected and there is a Golden Rule commitment not to use data for secondary purposes without explicit and narrow consent, then companies must build systems that are designed to respect privacy programmatically and procedurally – Privacy by Design.
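As a minimal sketch of what “respecting privacy programmatically” could look like, the following gates every secondary use of data behind explicit, narrowly scoped, revocable consent.  The class and purpose names are illustrative, not from any particular consent-management framework:

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    """Illustrative: explicit, narrow consent tracked per secondary purpose."""
    user_id: str
    purposes: set[str] = field(default_factory=set)  # e.g. {"ad_profiling"}

    def grant(self, purpose: str) -> None:
        self.purposes.add(purpose)

    def withdraw(self, purpose: str) -> None:
        # Consent can be removed at any time.
        self.purposes.discard(purpose)


def may_process(record: ConsentRecord, purpose: str, primary: bool) -> bool:
    """Primary use is covered by the EULA/TOS; secondary use needs explicit consent."""
    if primary:
        return True
    return purpose in record.purposes


# Usage: secondary processing is blocked until the user opts in,
# and blocked again the moment consent is withdrawn.
c = ConsentRecord("user-42")
assert may_process(c, "billing", primary=True)
assert not may_process(c, "ad_profiling", primary=False)
c.grant("ad_profiling")
assert may_process(c, "ad_profiling", primary=False)
c.withdraw("ad_profiling")
assert not may_process(c, "ad_profiling", primary=False)
```

The point of the design is that the default answer for any secondary purpose is “no” unless an explicit grant exists, which is the opposite of the Privacy 2.0 duct-tape approach.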

Where to Start? Separate Primary Data from Secondary Data

Don’t ask for consent to cover your assets when you already have a legitimate reason to process the data.  I am sure your lawyers have you covered in the EULA and TOS for primary data usage (check with them).  This justification requires that you fully know your Data Ecosystem Network (DEN) and have established a legitimate/legal basis for all collection, processing, and usage of data.  Your documentation and justifications should be sufficient for an audit.

Where your business has tried to sneak secondary purpose usage of the data into the EULA/TOS, remove it and properly, visibly, and transparently ask for explicit consent.  This includes sharing data with “third parties” when it is not required to fulfil the primary service.  Be prepared to demonstrate how you value consumers’ privacy and what you have put in place to keep their data secure and private.

You will want to offer some value in your secondary data usage opt-in program in exchange for people letting you use their data.  Remember that digital privacy means consumers maintain control over their data, and consent can be withdrawn at any time along with that data.

If your company’s business model doesn’t ever need to use secondary data, then maybe Privacy 2.0 is sufficient since you don’t need consent.  Compliance in security and privacy (CPNI, PII, SPI) may be sufficient in those cases.

Privacy Pays

It is time to re-imagine the Privacy Policy in a way that raises Privacy Capital instead of chasing people away.

Privacy Policy 3.0 could change this dead space into the most read and most heavily trafficked page on your site.  This page should be the digital bedrock of the two-way, dynamic relationship companies will have when their audience has a reason to trust them with their data.  In this oil rush, it is privacy or bust.

Comments are disabled here to consolidate comments on LinkedIn.

Secondhand Lions GDPR Cookie Consent Privacy Taste Test

I was challenged by readers in my previous post “Real World Consent Translated to Digital” to describe a digital experience that follows the seven key factors derived from the traveling salesman in Secondhand Lions (SHL).

We are weeks past the start of GDPR enforcement, and in addition to being flooded with emails trying to keep you on mailing lists, you have probably noticed some changes in cookie consent banners on websites.  I’ve mocked up a privacy taste test, if you will, by looking at 4 examples of post-GDPR cookie consent to illustrate how well companies are following the SHL traveling salesman’s 7 keys to successful digital interactions.

Cookie consent is not the end-all, be-all of evaluating a company’s privacy stance, but it is the most immediately visible aspect, which is why it was chosen to illustrate keys 1-5.  If you are not identified, cookies are mostly collecting semi-pseudonymous data based on your internet browsing.

Official Rules of the SHL Cookie Consent Privacy Taste Test

Below are the seven SHL key factors for successful digital advertising interactions:

  1. Respecting privacy
  2. Being transparent
  3. Asking for consent, realizing that consent can be withdrawn at any time
  4. Being clear about what he was asking consent for
  5. Earning trust
  6. Offering value – consent and trust got the brothers “off the porch” and value determined further interaction
  7. Personalizing value and tailoring convenience

The goal of these seven keys is to win consumer trust and begin a conversation in which the advertiser can present something of value to the consumer.  Ultimately, brands and advertisers want to get to a two-way conversation with consumers at Key 7.  Brands will be evaluated on how well keys 1-4 earn trust (key 5).  Keys 6 and 7 are for another day and another post that evaluates personally identifiable information.

Key 1 – Respecting privacy:  We need a working definition to evaluate Key 1.  Combining two definitions of privacy from Westin and Wolfe (see post discussing what we mean by the right to privacy) we can define digital privacy as:

The right to control who watches or learns about you and maintaining that control over your personal information.

Key 2 – Being transparent:  I would call this the Hub McCann test after the encounter where Hub tells the traveling salesman hiding behind the car “to come out where we can see you”.  Does the brand show us what they are doing with our data?

Key 3 – Asking for consent that can be withdrawn at any time

Key 4 – Being clear about what you are asking consent for.

The approach and execution of Keys 1-4 will determine the level of trust earned – key 5.  Brands will be scored in each category using the following scale:  0 = Fail; 1 = Low; 2 = Medium; 3 = High

Brand A

Landing on the site of Brand A shows a cookie consent banner on the bottom of the screen on 5/29/2018 seen in figure 1 below.

Figure 1

The site informs you that they use cookies for the benefit of your experience.  Then they ask you to accept the cookies blindly or choose a cookie preference.  It really wasn’t clear what they were using the cookies for, so this notice fails the Key 4 criterion, which is why they changed it as of 6/7/2018 to the notice in figure 2 below.

Figure 2

They now give us some examples from which we can make an informed decision to “Accept All Cookies”.  This is an improvement over their original attempt at clear and understandable transparency.

You can choose to control how your data is collected and used by clicking on “Cookie Preferences”, which opens a pop-up sidebar on the left-hand side of the screen, illustrated in figure 3 below.

Figure 3

Brand A has divided their cookie preferences into 3 categories:

  1. Necessary Cookies – no control
  2. Functional Cookies – You have a choice of all on or all off for functional cookies.  This, however, lacks full control and transparency.  It is one thing to analyze general website usage.  It is another to tell me you are going to suit my needs and improve my user experience.  That sounds like personalization that requires profiling me, but it is unclear.  If I want good website performance, I also have to accept personalization, which is implied but not disclosed.
  3. Marketing Cookies – I understand I am going to be profiled if I accept these cookies.  The intent is there, but it is not transparent who is going to get my data.  This statement lacks the clarity required by transparency, and the choice is all or nothing based on insufficient information.  Are they sending my data to 5 ad agencies or 250 ad agencies?  It is too general to be transparent; what are they hiding?

Brand A is a good example of doing the minimum to achieve compliance.  They do have a link for more information, where we can visit their Privacy Policy, which became effective the night before GDPR and is written in clear and understandable language instead of legalese.

Brand B

Brand B’s cookie consent banner below (figure 4) gives the cookie intent with examples and data sharing activity along with a link to a cookie specific policy.  You have the choice to accept the cookie, or visit the cookie settings.

Figure 4

Clicking on Cookie Settings opens a Privacy Preference Center (figure 5) with 4 categories of cookies to opt into and 2 information tabs.  They are transparent and clear about intent, and they list the specific cookies used for each category.  I have control of each category, and I don’t have to accept profiling to get good site performance.  This site had 49 targeting cookies which had to be accepted all or nothing.

Figure 5
Brand C

Brand C’s cookie consent banner is brief but straightforward (Figure 6).  They list 3 ways cookies are used and offer a link to their more detailed cookie notice.  If you don’t agree to the cookies, you can click on Manage, and a cookie control box pops up (Figure 7).

Figure 6
Figure 7

Brand C gives full control to check which of the 5 cookie categories you opt into, or in this case out of.  They also allow you to opt out of each of their 265 cookies, spread over 61 screens averaging 4.3 partners per screen.

Brand D

Brand D has really done nothing more than meet the pre-GDPR EU cookie notice requirement, pictured below (figure 8), with a link to a privacy policy and an accept button.

Figure 8

They give no real control, and they are not transparent about what is being done.  Your only choice is to accept or deny the cookie – all or nothing.

Taste Test Results

Brand D is not respecting privacy or being transparent and is likely not compliant.  Brand A is the example of doing the minimum for compliance.  As such, Brand A fails the SHL transparency test of “coming out where we can see you”, as we have no idea which third parties are getting our data.  Choice and control are also limited.  Overall, Brand A is compliant but is not respecting privacy or being transparent enough to warrant much trust.

Brands B and C are both respecting privacy and being transparent.  Brand B has the better UX design and is clearer, but Brand C gives more control.  Brand B could use more control, as it prevents me from, say, allowing personalization from Google but not from Facebook.  Both have moved beyond compliance and are winning trust.

Companies who actually respect privacy will make it easy to choose and maintain control of personal information.  Their efforts will be rewarded with greater trust.

Advertising Implications

The takeaway of this taste test is that compliance with no change in respecting privacy, transparency, control, and consent loses out to going beyond compliance.  The foundation of successful digital interactions is trust, and merely complying doesn’t win that trust.

Smart companies will respect privacy before they are compelled to do so.  In a post-GDPR world, this means that people outside of the EU are not treated with less respect for their privacy than EU citizens.  Companies may want to consider the messages they are sending about who they are if they comply for EU citizens’ data privacy but continue business as usual for everyone else.  Respecting privacy wins in the new era of Advertising and Analytics done correctly.  What are you waiting for?  Get your audience off the porch with transparent privacy and engage at the level of personalization (SHL Key 7) that leads to higher-value relationships.


The DPO – Beyond Compliance

Compliance day for GDPR is just over a month away.  I know that your company is fully compliant, so you can probably skip this for yourself, but perhaps you have a friend who might benefit from it.  Wouldn’t it be nice if your friend’s company, which might be slightly less than 100% compliant, could prioritize their GDPR efforts?

The good folks at the IAPP have created an infographic of the European Supervisory Authorities’ Top 8 GDPR Enforcement Priorities that your friend’s company will need to have completed to merit a good-faith effort at compliance.  The #1 priority – the DPO.

“Have you appointed a data protection officer (DPO) who is responsible for processing activities,” they ask.

Note that the DPO responsibility is for processing activities.  Out of all of the requirements, why are they focusing on processing activities?

The DPO Priority and Focus

Processing data is the most widely ignored pre-GDPR aspect of Privacy by Design.  It is also the foundation of protecting personal data, as the eight enforcement priorities go on to describe.  Prior to the GDPR, there was some general awareness of things like data retention.  The ad tech world had cookie consent and tag management, which dealt with the collection of data.  Marketing departments and their tools had some notion of outgoing contact consent so that offers were not sent to people who didn’t want to receive them.

These efforts dealt with data collection and data usage, but they completely ignored the most important part of internal data privacy: data processing.  The DPO’s job is to fix this gap by systematically understanding and orchestrating, enterprise-wide, the entire flow of data from collection to processing to usage.  This is no small charge, especially for medium to large companies with large data assets.

The primary requirement for the DPO role is to fully understand a company’s Data Ecosystem Network (DEN).  The DEN is the network of connected data that flows into, through, and out of the company.  It is the synthesis of data products that form a cohesive system that includes the collection, engineering, analysis, presentation, and action of or upon data.

Prior to the advent of Big Data, most data was siloed in applications.  While it might have been consolidated in a Data Warehouse, this data was not networked and combined on an ongoing basis to feed a data driven organization.  With the increase in machine learning, IoT, automation, and AI, the networking and combination of data will multiply.  The DPO must be the master of this Data Ecosystem Network and that mastery starts with understanding how data is processed in the DEN.
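As one hypothetical way a DPO might start mastering the DEN, each data flow can be captured as a lineage record: where the data comes from, how it is processed, what it is used for, and on what legal basis.  The field names and example entries below are illustrative only:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DataFlowEntry:
    """Illustrative inventory row: one record per data flow in the DEN."""
    dataset: str
    collected_from: str   # source: app, device, network, third party
    processing: str       # what is done to the data
    used_for: str         # purpose of the use
    legal_basis: str      # e.g. "contract (primary service)" or "explicit consent"


inventory = [
    DataFlowEntry("call_detail_records", "network", "rating/billing",
                  "invoice generation", "contract (primary service)"),
    DataFlowEntry("browsing_history", "device", "interest profiling",
                  "targeted ads", "explicit consent"),
]

# A simple audit view: flows that rest on consent must remain revocable,
# so the DPO can surface them separately from primary-service flows.
consent_based = [e.dataset for e in inventory
                 if e.legal_basis == "explicit consent"]
```

Even a flat table like this gives the DPO something to align Business, Legal, and IT around, because every flow must have an answer in every column before it can be defended in an audit.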

The DPO must be Independent

The DPO will face many issues in moving a company to compliance.  Not the least of these challenges will be the native tension between The Business, The Legal Team, and IT.  For this reason, the GDPR Recital 97 states that:

the DPO “should be in a position to perform their duties and tasks in an independent manner.”

This tension is illustrated in the following diagram showing the connected relationship between these three groups and the data.  All three have to be aligned to succeed in Privacy by Design and all three will have competing interests that will work against Privacy by Design.

The DPO’s job is to align all three divisions so that Privacy by Design is institutionalized.  Success means that, by aligning these three groups, the DPO enables the Business to profit from data while enforcing the requirements from Legal in a way that can be rationalized in a cohesive IT architecture in the DEN.  To pull off this balancing act, the DPO will need to understand the business enterprise, the legal regulations, and the technical IT architecture.  The DPO will also need authority over how data flows into, through, and out of the company, which is why the position has to be independent of Business, Legal, and IT.

The following diagrams show the pitfalls of not having an independent DPO vs. creating an organizational structure with an independent DPO that can implement Privacy by Design.

The green and red examples are not the only way to position the DPO, and there may be other ways to achieve an independent DPO.  The Legal Team and IT need to ultimately be enabling the Business and all three have to be aligned around data protection.  The point is, if you want Privacy by Design, then you need to find a way to give independent authority to the DPO so that the other three can’t derail the DPO’s mandate.  Otherwise, the DPO will be caught in the middle of the three groups, and the company will be the loser.

The DPO Must Know your Data

The IAPP lists the second priority as Data Inventory and Mapping.  A lot of IT departments have been inventorying their data to identify their data assets for additional use in a Big Data environment.  What may be new with the DPO role is the need to classify data not in terms of technical or business metadata, but in terms of ontology.

The DPO will need to find a scalable way to distinguish legitimate from non-legitimate processing before privacy can be enforced in the DEN.  Classifying data based on the nature of the data and the purpose of its processing will need to be done alongside the data inventory and mapping.  In my previous post “Modifying the GDPR”, I discuss the difference between Primary Data and Secondary Data and suggest that this is the starting point for an ontological classification of personal data usage.

The DPO – Beyond Compliance

Most of the work being done for GDPR privacy preparations at this point is probably better labeled Privacy as an Afterthought.  Going beyond May 25, the DPO needs to influence the company culture to move privacy into the business planning stage.  Privacy that enables companies to use Secondary Data will need to be part of the corporate strategy.  The DPO should be sitting at the table with the strategy team.  It only makes sense that, with the central role data plays in IoT, automation, and AI, the DPO becomes a thought leader who drives new revenue while respecting your customers’ privacy rights.

Some of you are saying to yourselves, “this guy just blew his credibility as a privacy advocate; privacy and business shouldn’t mix” (Article 38(6)).  In response, I would suggest that unless privacy moves beyond compliance and is seen as both a foundational right and a strategic asset, privacy could become a back-office checklist to be ignored or worked around.  I would rather have a DPO sitting at the table changing the mindset around privacy than have privacy forgotten after the GDPR anxiety passes.

Beyond changing the business mindset, privacy needs to be inserted into the design process.  The DPO will need to insert data privacy into the Dev/Sec/Ops cycle.  Data driven companies in the middle of digital transformation have to become data centric and as such, must start with data Privacy by Design.   The DPO will need to forge the path to Sec/Priv-Dev/Ops to ensure that privacy starts before development begins.

Who is ready for GDPR?

Well, this was probably a review for you because you have had your DPO in place for a long time, and they have implemented privacy by design beyond just good faith.  You know what the DPO should be doing and are ready for May 25th.  The rest of us should probably review those GDPR Enforcement Priorities.



Modifying the GDPR

I generally like the General Data Protection Regulation (GDPR).  It acknowledges privacy rights exist in the digital world.  It defines the following three key areas of a comprehensive data protection law:

  1. Data Security
  2. Data Privacy
  3. Data Consent

Security and Privacy were already regulated, and all companies that control personal data should already have been taking measures to secure that data internally and externally.  The sad fact is that even big businesses have failed at the most basic data security protections – Equifax.  Self-regulation has failed, and like an old car that costs more to maintain than to replace, it needs to be replaced with well-thought-out legislation because the cost of its failures is too high.  The Digital Revolution has matured, and like many disruptive industries in transition from new to normal, it needs the right legislation to continue to grow.  We can talk about personalization, automation, and AI all we want, but until we solve the privacy issue, new business models and innovations will be stunted, as they are entirely dependent on using data.

Americanizing the GDPR

Since security and privacy have enjoyed a long and healthy discussion, I will focus more on consent related issues.

For all its glory, there is something in the EU’s GDPR that rubs some Americans the wrong way.  I see two structural differences between the U.S. and the EU that may elicit this reaction:

  1. What it means to Anonymize and Aggregate data
  2. The difference between Primary Data and Secondary Data

In general, the American view differs from the European approach on these two topics.  How we define the first difference determines where we categorize data in the second.

Anonymization or Pseudonymisation

When Americans think about anonymization, they think about removing the identity from data so that a specific individual cannot be identified.  This may also include aggregation to further obfuscate identifying an individual.  When the EU speaks of anonymizing data, it not only means that an individual can’t be identified, but also that no two pieces of data can be correlated back to the same individual.  This extreme obfuscation prevents anything from being known about anyone.  While some may see this as a highly desirable state for data privacy, American pragmatism doesn’t consider this state at all, as it renders data useless for analytics.  We don’t talk about it in these terms because there is no point in even storing such data.  Whereas the European usage is technically correct, the American usage is pragmatically correct.  Think Standard vs. Metric or Fahrenheit vs. Celsius.  Each has its merits.

The GDPR labels useful anonymized data as pseudonymous data.  What this means is that there is enough useful data to correlate data element A created by unknown person #1 with data element B created by unknown person #1.  This correlation yields information that allows data processors to analyze it and gain insights of value while shielding the identity of the person.  So, the popular American idea of anonymization is equal to the European pseudonymisation.  Not a big deal, as long as we understand the equivalency of the two terms.  It becomes relevant in understanding the U.S. Federal Communications Commission’s (FCC) existing regulation (Title 47 § 222), which requires telecommunications providers to “aggregate customer information” and practically defines the process of anonymizing (removing identity) and aggregating (grouping unidentified individuals) data before using it for secondary purposes.
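As a minimal sketch of this equivalency, pseudonymisation can be implemented by replacing each identity with a stable keyed token: records about the same (unknown) person still correlate, but the person is not directly identified.  The secret key and the truncation to 16 hex characters are illustrative choices, not a prescribed scheme:

```python
import hashlib
import hmac

# Illustrative only: in practice the key is stored separately from the data
# and rotated, since whoever holds it can re-link pseudonyms to identities.
SECRET_KEY = b"rotate-and-protect-me"


def pseudonymize(identifier: str) -> str:
    """Replace an identity with a stable pseudonym via a keyed hash (HMAC)."""
    digest = hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]


# Two events from the same (unknown) person share a pseudonym...
a = pseudonymize("alice@example.com")
b = pseudonymize("alice@example.com")
assert a == b
# ...while a different person yields a different pseudonym.
assert a != pseudonymize("bob@example.com")
```

This is exactly the property the post describes: data element A and data element B can be correlated to unknown person #1 for analysis, while the identity itself stays shielded behind the key.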

The anonymization discussion is really a case of overlapping terms with different definitions and can be resolved functionally by defining what each side means by anonymization.  The catch, however, is that the difference in definition also causes Americans to classify data in the Primary or Secondary Data class, which in turn merits a different level of explicit regulation than the GDPR provides.  I want to explore these distinctions now and arrive at an understanding that will serve as a starting point for determining how data should be regulated.

Primary Data

Let’s define Primary Data as data that is consumer provided or consumer generated in the course of using a discrete service.  For example, I want to consume a postpaid mobile phone service from a communications provider.  To use the service, I have to supply personal information for a credit check and a billing address.  I will need a device which will be addressable by a phone number, device number, and several other telco ids which will all be associated with me as a user of the service. In using the service, I will generate Call Detail Records (CDRs) that the provider will use to calculate my bill, monitor network service, process internal accounting, and understand internal business metrics.

All of these processing events use the data collected from the CDRs which were generated by me, the data subject.  It is personal and it contains Sensitive Personal Information (SPI), Customer Proprietary Network Information (CPNI), and Personally Identifiable Information (PII).   This data includes the history of where you have been (geo-location), who you have talked to, and what you have browsed on your smart phone.

The key point is that all of this data is required to provide me mobile phone service.  As long as the data collector (the communications provider) uses this data (as the data processor) for the fulfilment of the primary service, it is Primary Data.  Primary Data does not need additional data subject consent beyond an informed and comprehensible End User License Agreement (EULA) or Terms of Service (TOS).  Since the data is required and must be processed to provide this primary service, consent is implied upon accepting the EULA or TOS.  If you don’t accept those terms, you can’t use the service.

Secondary Data

Secondary Data is defined as taking data generated for a contracted service (via EULA or TOS) and using it for a purpose other than providing that primary service.  For example, a communications company uses my Primary Data to build a profile of my interests based on my browsing history and matches my interests with ads which are sent to my phone as I walk by a place of business catering to my interests.  This same data generated as Primary Data just became Secondary Data when it was processed (to profile my interests) and used for a purpose (generating additional money from advertising) outside of the primary service that it was collected for.

Primary or Secondary Data

What about services like Netflix?  They have Primary Data on what we watch.  They also profile us, minimally at the household level, for our interests.  Optionally, each member can register a separate identity and be profiled at the individual level.  They process Primary Data and track my interests just like the communications provider does when it wants to send me relevant advertising.  Should the use of this Primary Data be classified as Secondary since processing the data has the same result of producing a profile of my interests?  No, Netflix is processing that Primary Data as part of the primary service to recommend video content you might be interested in.  These recommendations aid the primary service, as I do not have 45 minutes to waste searching for content I haven’t seen every time I want to use the service.  If Netflix were to turn around and use that profile of my interests to start showing me advertisements, then they would be using the data for a secondary purpose other than what I am paying them for.
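The Netflix example shows that the classification hinges on the purpose of processing, not on the data or even on the kind of processing.  A sketch of that rule, with an illustrative registry of purposes that deliver each contracted service (the service and purpose names are hypothetical):

```python
# Illustrative registry: the purposes needed to deliver each contracted service.
# Note that "recommendations" is listed as primary for the streaming service,
# because it aids the service the customer is paying for.
PRIMARY_PURPOSES = {
    "telco": {"billing", "network_monitoring", "internal_accounting"},
    "streaming": {"playback", "recommendations"},
}


def classify_use(service: str, purpose: str) -> str:
    """Same data, different classification: what matters is the purpose."""
    if purpose in PRIMARY_PURPOSES.get(service, set()):
        return "primary"
    return "secondary"


assert classify_use("telco", "billing") == "primary"
# Profiling interests to sell ads falls outside the primary service.
assert classify_use("telco", "ad_profiling") == "secondary"
assert classify_use("streaming", "recommendations") == "primary"
assert classify_use("streaming", "ad_profiling") == "secondary"
```

Anything that classifies as "secondary" is exactly the usage the post argues should require explicit, revocable opt-in rather than implied EULA/TOS consent.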

Regulation that doesn’t kill Business

The GDPR doesn’t explicitly delineate between Primary Data usage and Secondary Data usage.  By not distinguishing between the two types of data, it has left data collectors/processors and their lawyers with some confusion about what exactly they need to get explicit consent (opt-in) for.  If you read the entire regulation, I think you will see that most collection and usage of data that I have described as Primary Data has been exempted from requiring explicit consent.  I believe many EU companies are wasting a lot of money securing consent for Primary Data that should be handled as implied consent in the EULA/TOS.  U.S. legislation can avoid unnecessary expenses and the burdens of consent gathering and enforcement on Primary Data by allowing implied consent for Primary Data usage in the EULA/TOS.  All Secondary Data usage should be optional and require consent management to be implemented.

Leave comments as replies on LinkedIn.

The Battle for Digital You

It’s About Rights

The internet and the Digital Revolution are producing a representation of you based on the data that you and your devices generate.  This revolution needs to acknowledge the creation of a virtual Digital You and grant it the same privacy rights that the real, physical you has.

The American Revolution and the subsequent founding of the United States crystallized the social contract of the new country.  The protections of life, liberty, and the pursuit of happiness were laid down, and the boundaries of the federal government and the unalienable rights of the individual were codified.  These unalienable rights in the physical world were legally protected by the government, and individuals had the right to protection against unreasonable searches and seizures.

This foundation is the basis for our privacy laws today, specifically the Fourth Amendment, which gave people the right to be secure in their persons, houses, papers, and effects.  The Fourth Amendment was later expanded by court interpretations to cover electronic intrusions such as wiretapping, and the FCC extended its applications to commercial intrusions.

We are now several years into the Digital Revolution, but our ethics and rights have not kept pace with the technology.  A battle is being fought over Digital You, and without a translation of privacy protections into the digital realm, we will lose it.

The Current State

I will leave out the discussion of how government handles our data here and focus on Technology and Commerce, which are waging war on Digital You’s privacy rights.

How did we arrive at a point that has effectively allowed Technology and Commerce to violate Digital You’s privacy with no real recourse to date?  The following quotes should have been a warning sign:

1999 – “You already have zero privacy.  Get over it.” – Scott McNealy, CEO of Sun Microsystems

2009 – “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” – Eric Schmidt, CEO of Google

Unsurprisingly, given these views, Digital You is being built by Technology and Commerce without your consent, without your control, and in most cases without your knowledge.  How do we take back unalienable rights that we never consented to surrender in the digital world?

“No Harm, No Foul” vs. Rights

Today, the United States handles oversight between Commerce and Consumers via the FTC.  The FTC was commissioned to ensure adequate competition among businesses and fair treatment of consumers.  This mandate worked in the physical world, prior to Technology’s communications advances, because invasions of privacy had little to do with commerce.  If someone violated your physical privacy, the trespass was seen.  There was a clear boundary between the areas of commerce and private property: the Bill of Rights addressed private property, and the FTC governed commerce.

The Commerce-driven Digital Revolution now bleeds into privacy issues, and the FTC is not equipped with adequate legislation to enforce privacy based on rights.  To seek redress through the FTC, courts require proof that you have been harmed by a commercial practice.  This approach works in the physical world, where an aggressive salesman might receive a warning shot if he trespassed on private property.  Violations were visible to the individual, who was allowed to protect his property.

This arrangement doesn’t translate into the digital realm, where tech companies routinely peer into our private lives uninvited and undetected.  We have no way of defending Digital You from privacy violations when we are unarmed and unaware.  The FTC is of zero use here because it is not equipped to enforce rights.

Rights vs. “No Harm, No Foul” Illustrated

Let’s look at a physical example of a privacy issue to illustrate the difference between approaching it as a rights issue and as a matter of fair play.  You have a home and are away for the day.  While you and your family are out, someone wanders up to your home, opens the door, walks inside, and looks around.  From this view, the intruder surmises that you are of moderate wealth, married, and have 2.5 kids.  He learns that you like a variety of music styles, what political party you belong to, and your religious preference.  He then walks out of the house and records his insights into your lives.  You come home and discover that someone has entered your house and looked through all of your rooms and pictures.  You learn from your neighbor who it was, and you are upset that your privacy has been violated by someone trespassing on your private property and private effects.

The question is: has a crime occurred in this illustration?  In real life, the answer is yes, the crime of trespassing without consent, or unlawful entry.  The basis of this charge is the Fourth Amendment and other clarifying laws.  Now let’s say the person who improperly walked onto your property was a businessman conducting his own research into what goods you would really need and want, so that he could bring them to you to buy.  Would this change your verdict?  In the eyes of the law, it would not: the man entered the property without consent or permission and was trespassing, regardless of intent or harm.  It is a rights issue and a law issue.

If the FTC were in charge of enforcing the privacy of this home, there would be no way to bring a charge against the trespasser unless the property owner could prove harm from this unconsented act.  That would be ridiculous, but it is exactly how Technology and Commerce are being regulated in areas of Digital You.  Fair play says that as long as tech companies don’t cause demonstrable harm, they can look at whatever they want wherever they can.  If privacy is a right, however, you shouldn’t have to prove that harm was done to Digital You; it is enough that there was a trespass without consent to bring charges against the trespasser.

Taking Back Our Rights

Letting the FTC determine what can and cannot be done via lawsuits will not ensure our rights.  Enforcement by fair-play lawsuits is inadequate to protect rights that, in the physical world, are constitutional guarantees.  We need real legislation defining the rights of Digital You, so that they are not left in the hands of commerce regulators who do not deal with defining and enforcing unalienable rights.

Privacy rights are one of the few truly non-partisan issues, and yet the public perception and debate are hobbled by half-informed rhetoric aimed at scoring points over the opposing party rather than the debate of statesmen.  Given this inappropriate polarization and the modern habit of passing lengthy bills with inadequate public review, our best course would be to take the EU’s GDPR as a starting point and Americanize it.  Legislation needs to transfer physical Constitutional privacy rights to our data rights and set up a framework that establishes privacy by design as a guiding principle.  Tech/Commerce will have to get consent to use our data and be held accountable for securing that data.

Americans should be allowed to choose how Digital You is created and used in a way that is informed, that gives them control over Digital You, and that provides transparency into when and how Tech/Commerce uses our digital footprints.  These rights have to be protected by legislation with clear outcome guidelines, backed by significant penalties, to ensure adequate data security and privacy by design are baked into all digital handling of our data.

Recent high-profile data breaches continue to highlight the need.  Tech/Commerce giants like Apple, AT&T, and even Facebook are suggesting we need thoughtful legislation.  The time to translate physical rights into digital rights is now.  The next decade’s quote should read:

2019 – “We no longer settle for choosing between privacy and technology.  Our technology can and will complement our privacy.”

Comments are disabled here to consolidate replies on LinkedIn.