Five Things Companies Can Do


Earlier this week I wrote a long-winded post describing steps companies can take – in light of recent concerns about companies misusing personal data – to make sure their technologies are offering us all something of value.

Here are the five things, in abbreviated form, that companies can start doing now:

  1. Privacy by Design (and security by design): Put the end user at the center of your technology’s architecture, minimize the amount of personal data you need to provide the service, give the end user control, and be transparent. If you concentrate on what the end user will be comfortable with and empower her with control over her data, then you are on the right track.
  2. Value Proposition: Make privacy protections and good practice a central point of differentiation. Make it core to your overall value proposition.
  3. Business Model: Re-think the business model. Propose different fee structures or revenue-sharing options that give end users more control and something of value in return for handing over their data.
  4. Product Ethics: Before thinking about the legality of a new product or service, focus on it from an ethical viewpoint. Consider a product ethics committee, including bringing in an ethicist. Look not just at data use but also at the potential for a product or service to be misused (even if hacked) with results that are contrary to the company’s values. Remember, the last thing you want is for your CEO to have to sit in front of lawmakers struggling to explain why your service was linked to a major human rights violation, political scandal, or massive leak of sensitive personal data.
  5. Data Use as a Corporate Social Responsibility: Make data use and innovation part of your company’s CSR policies where you commit to (i) not use the personal data and technology at your disposal in a way that has a negative effect on your community and stakeholders, and (ii) affirmatively use technology and innovation for the good of your community and stakeholders.

Put all together, the most important thing a company can do is take the time to have open, internal conversations about the effects that its products and services may have on users and society. That way senior management can make informed decisions in line with the company’s core values and identity. Lawyers don’t like surprises, and neither do their clients.


Brave New World, Inc.


Earlier this week, Rana Foroohar wrote in the Financial Times that “Companies are the cops in our modern-day dystopia”:

The mass surveillance and technology depicted in the [2002 movie Minority Report] — location-based personalised advertising, facial recognition, newspapers that updated themselves — are ubiquitous today. The only thing director Steven Spielberg got wrong was the need for psychics. Instead, law enforcement can turn to data and technologies provided by companies like Google, Facebook, Amazon and intelligence group Palantir.

The dystopian perspective on these capabilities is worth remembering at a time when the private sector is being pulled ever more deeply into the business of crime fighting and intelligence gathering. Last week, the American Civil Liberties Union and several other rights groups called on Amazon to stop selling its Orwellian-sounding Rekognition image processing system to law enforcement officials, saying it was “primed for abuse in the hands of government”.


I have written a few posts already about the potential for governments and private companies to use new technologies such as cryptocurrencies, biometrics and data mining to engage in activities that we would normally associate with the fictional totalitarian regimes of George Orwell or Aldous Huxley. With regard to state actors, like China, using biometrics for crime prevention, I wrote:

But still, if we move to a system of Big Brother with ubiquitous cameras capturing our facial images 24/7 and the system is only 80% accurate, that leads to arguably an unbearably high threshold for potential abuse. Democracies are supposed to accept some criminals getting away with crime in exchange for the innocent not being locked up. It’s the authoritarian regimes who place law and order above the protection of the innocent.

Between companies, governments and new technologies, the potential for opportunities, efficiencies and abuse are endless. It is a Brave New World.
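The arithmetic behind that worry is worth spelling out. Here is a minimal sketch, with illustrative numbers of my own (a city of one million, 100 genuine suspects, and a system that is “80% accurate” in both directions) rather than anything from the quoted post:

```python
# Illustrative base-rate arithmetic; all numbers are assumptions for the sketch.
population = 1_000_000   # people scanned by the cameras
suspects = 100           # genuine suspects among them
sensitivity = 0.80       # share of real suspects correctly flagged
specificity = 0.80       # share of innocent people correctly cleared

true_positives = suspects * sensitivity
false_positives = (population - suspects) * (1 - specificity)
precision = true_positives / (true_positives + false_positives)

print(f"Innocent people flagged: {false_positives:,.0f}")
print(f"Chance a flagged person is a genuine suspect: {precision:.2%}")
```

Even with those generous assumptions, the hundred real suspects drown in roughly two hundred thousand false alarms, and a flagged person has well under a one percent chance of actually being a suspect. That is the “unbearably high” potential for abuse in concrete terms.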

And with regard to cryptocurrencies, I wrote:

Although neither George Orwell’s nor Aldous Huxley’s dystopian futures predicted a world governed by corporations as opposed to authoritarian governments, it may be more plausible to imagine a world where corporations control the money supply, not with coins and bills but cryptocurrencies. In fact, the fad amongst many technologists today is to encourage the disintermediation (or deregulation) of money by moving to Blockchain-based cryptocurrencies like Bitcoin. But instead of removing the middleman, we are more likely – contrary to the idealists’ ambitions — to open the door to empower big tech companies like Amazon, Facebook and Google to tokenize their platforms, replacing public currency regulators with corporate ones.

But private companies are able to do so much more with the data that we so generously (and often naively) hand them. The possibilities for abuse seem endless. To a large degree, the new GDPR mitigates this risk by giving the consumer visibility into and control over how her data is being used, and hopefully building trust between consumers and their service providers. As stated here before, more important than complying with strict new laws, “to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.”

But what happens if consumers are not given the opportunity to intelligently grant consent or agree to use a service that shares their data? The first GDPR complaints have been filed precisely on these grounds:

Across four complaints, related to Facebook, Instagram, WhatsApp and Google’s Android operating system, European consumer rights organisation Noyb argues that the companies have forced users into agreeing to new terms of service, in breach of the requirement in the law that such consent should be freely given.


Someone Got a Privacy Lawyer (or Didn’t)


This morning, on the day the GDPR came into effect, I tried to access the Los Angeles Times – one of the US’s leading newspapers – but found myself redirected to this landing page. Apparently, the LA Times is not prepared for the new law. At least they had the sense and seriousness to stop European access. Four percent of global annual revenue or €20 million (whichever is higher) in fines for non-compliance is no joking matter.

So, did the LA Times have a privacy lawyer who said they weren’t ready, or a lawyer smart enough to know that until they got that privacy lawyer, they wouldn’t be ready? Or, did they simply make a business decision that their European market wasn’t big enough to justify the costs of compliance?

UPDATE:


It looks like the LA Times isn’t the only one. The Chicago Tribune has the exact same landing page, so it appears that all of the publications managed by tronc, inc. are following the same protocol.

GDPR: You’re Now the Most Popular Kid at School

Have you noticed that you’re now receiving dozens of emails from all sorts of services that you forgot you ever used? Everyone is suddenly sending you requests to accept their new privacy terms. Yes, that’s the imminent arrival of the EU’s new General Data Protection Regulation, due out in stores on May 25, 2018, talking to you. So if you’re feeling like the prettiest one at the dance or the most popular kid at school, then you have the GDPR to thank (along with its threat of massive fines).

Personally, I am taking advantage of the opportunity to clean house by NOT accepting the terms of all of those services I am no longer interested in using.

Note: The above image compares the length of end user terms of various social media platforms. From left to right: illegible company, Google, Tinder, Twitter, Facebook, Snapchat, and Instagram. I found the image a few weeks back on Twitter but unfortunately cannot attribute the image to the original tweet or its owner.  

Maybe the GDPR Isn’t So Bad for Blockchain After All?

An innocent “oops”

In two recent posts, thinking that I was really smart and clever, I questioned whether the GDPR posed a major hurdle for Blockchain-based technologies. Again, thinking that I was so smart and clever, I proudly took my arguments to one of my favorite privacy lawyers, confident I was about to impress her.

And as is often the case when I am feeling smart and clever, I was quickly put in my place by someone who actually knew what she was talking about. First, let’s be fair to me. The GDPR does pose challenges for Blockchain-based technologies, as it does for any service, online or off, that stores personal data. Data controllers will need to obtain consent from data subjects, and storage of data will need to be limited in time based on the purpose for which it is being stored.

The concern I originally raised was the conflict between a Blockchain’s main feature – a permanent and unalterable record – and the legal right of a data subject to have her personal data modified or deleted upon request (aka, the right to be forgotten). But a much smarter and more clever colleague – let’s call her Jane – explained to me that the right to be forgotten is not absolute.

Imagine you buy property. The local property registrar records the purchase with your name listed as the property owner. You may later sell that property, but you do not have a right under the GDPR to have your name removed from the public records relating to your purchase and ownership of that property. The purpose of registering property ownership is to have a permanent record of chain of ownership.

By the same token, should you consent to making a transaction through a Blockchain-based service knowing that the record of that transaction will be permanent, your right to delete your personal data only comes into play when the purpose for retaining your data ceases to exist. For a Blockchain, that will likely be never.


Think of a newspaper that publishes an article which features my name. The newspaper circulates thousands of copies. Like a Blockchain, the newspaper is distributed amongst thousands of people who have copies of the exact same story. We can verify that a single copy of that story has not been manipulated because we can compare it with thousands of other ones. Fake news aside, newspapers have the goal of being official accounts of events or newspapers of record. We should not then expect that upon request, every library and individual who has a copy of that newspaper article be required to destroy it or remove my name. Besides not being practical, it is contrary to the reason for having newspapers in the first place.
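The verification step in that analogy can be sketched in a few lines of Python: hash each copy of the article and compare fingerprints, which is essentially how Blockchain nodes detect a tampered record. The article text and the altered copy below are invented for illustration:

```python
import hashlib

def fingerprint(text: str) -> str:
    """Hash an article's text so copies can be compared cheaply."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# A made-up article, four faithful copies, and one tampered copy.
original = "Registrar records sale of 1 High Street to A. Reader."
copies = [original] * 4 + ["Registrar records sale of 1 High Street to B. Reader."]

reference = fingerprint(original)
for i, copy in enumerate(copies):
    status = "matches" if fingerprint(copy) == reference else "TAMPERED"
    print(f"copy {i}: {status}")
```

No single library’s copy is authoritative; the agreement of the many fingerprints is what vouches for the text, just as the thousands of identical newspapers do.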

This morning I read a recent Grant Thornton report written by the Spanish lawyer Sara Esclapés Membrives on how the GDPR actually presents an opportunity for Blockchain-based technologies. The report corroborates Jane’s interpretation of the law, stating that the challenge for a Blockchain is to find methods for the future removal of an individual’s personal data “when the purpose for which the data were collected has finished.” But as with the newspaper example, the purpose of storing data in the Blockchain is permanency, which means that unless the Blockchain ceases to have activity and a reason for remaining in existence, it should be allowed to continue storing my name without me being able to invoke the right to erase my personal data.

Ultimately, Blockchain-based technologies that store personal data need to focus on privacy by design, meaning developing an architecture that maximizes the individual’s ability to grant consent and opt out of the service while providing the appropriate level of security for the storage of the data. But more importantly, to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.
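One privacy-by-design pattern sometimes floated for squaring permanence with erasure – offered here as an illustrative sketch, not as settled practice or anything the Grant Thornton report prescribes – is to keep personal data in a conventional, erasable store and commit only a salted hash to the ledger. All of the names and data below are hypothetical:

```python
import hashlib
import os

off_chain_store = {}   # stand-in for a conventional, erasable database
ledger = []            # stand-in for an append-only Blockchain

def record_transaction(user_id: str, personal_data: str) -> None:
    """Commit a salted hash to the ledger; keep the raw data off-chain."""
    salt = os.urandom(16)
    commitment = hashlib.sha256(salt + personal_data.encode("utf-8")).hexdigest()
    off_chain_store[user_id] = (salt, personal_data)
    ledger.append({"commitment": commitment})  # no raw personal data on-chain

def erase(user_id: str) -> None:
    """Honour an erasure request by deleting the off-chain record and its salt.
    The on-chain hash alone can no longer be linked back to the person."""
    off_chain_store.pop(user_id, None)

record_transaction("alice", "Alice Example, 1 High Street")
erase("alice")
print("alice" in off_chain_store)  # False: the personal data is gone
print(len(ledger))                 # 1: the ledger itself was never altered
```

The appeal of the pattern is exactly the point made above: the permanent record keeps its purpose, while the individual keeps a meaningful way to opt out.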