You all are going to think I am the Grim Reaper of new technologies, crying that the sky is falling at every turn. Yes, I am using this blog as a forum, amongst other things, to discuss the difficult decisions that businesses, lawyers and society need to face when looking at how new technologies like Artificial Intelligence, Blockchain and Biometrics may impact our lives (examples here, here and here).
Working for a tech company that invests millions in innovation, I am very interested in seeing how we can use new technologies to improve society. But in order to do that, we need to be very vigilant. The consequences of not doing so could be disastrous and significantly change the course of humankind.
More practically, and more immediately, if we want to prevent the concentration of all wealth and power in the hands of a small elite, we must regulate the ownership of data. In ancient times, land was the most important asset, so politics was a struggle to control land. In the modern era, machines and factories became more important than land, so political struggles focused on controlling these vital means of production. In the 21st century, data will eclipse both land and machinery as the most important asset, so politics will be a struggle to control data’s flow.
Unfortunately, we don’t have much experience in regulating the ownership of data, which is inherently a far more difficult task than regulating land or machines. Data are everywhere and nowhere at the same time, they can move at the speed of light, and you can create as many copies of them as you want. Do the data collected about my DNA, my brain, and my life belong to me, or to the government, or to a corporation, or to the human collective?
. . . Currently, humans risk becoming similar to domesticated animals. We have bred docile cows that produce enormous amounts of milk but are otherwise far inferior to their wild ancestors. They are less agile, less curious, and less resourceful. We are now creating tame humans who produce enormous amounts of data and function as efficient chips in a huge data-processing mechanism, but they hardly maximize their human potential. If we are not careful, we will end up with downgraded humans misusing upgraded computers to wreak havoc on themselves and on the world.
If you find these prospects alarming—if you dislike the idea of living in a digital dictatorship or some similarly degraded form of society—then the most important contribution you can make is to find ways to prevent too much data from being concentrated in too few hands, and also find ways to keep distributed data processing more efficient than centralized data processing. These will not be easy tasks. But achieving them may be the best safeguard of democracy.
The world my children and their children will inhabit will be vastly different from ours in ways we cannot even begin to imagine.
Earlier this week I wrote a long-winded post describing steps companies can take – in light of recent concerns about companies misusing personal data – to make sure their technologies are offering us all something of value.
Here are the five things, in abbreviated form, that companies can start doing now:
Privacy by Design (and security by design): Put the end user at the center of your technology’s architecture, minimize the amount of personal data you will need to provide the service, give the end user control, and be transparent. If you concentrate on what the end user will be comfortable with and empower her with control over her data, then you are on the right track.
Value Proposition: Make privacy protections and good practice a central point of differentiation. Make it core to your overall value proposition.
Business Model: Re-think the business model. Propose different fee structures or revenue sharing options that give end users more control and something of value in return for handing over their data.
Product Ethics: Before thinking about the legality of a new product or service, focus on it from an ethical viewpoint. Consider a product ethics committee, including bringing in an ethicist. Look not just at data use but at the potential for a product or service to be misused (even if hacked) with results that are contrary to the company’s values. Remember, the last thing you want is for your CEO to have to sit in front of lawmakers struggling to explain why your service was linked to a major human rights violation, political scandal, or massive leak of sensitive personal data.
Data Use as a Corporate Social Responsibility: Make data use and innovation part of your company’s CSR policies where you commit to (i) not use the personal data and technology at your disposal in a way that has a negative effect on your community and stakeholders, and (ii) affirmatively use technology and innovation for the good of your community and stakeholders.
Putting it all together, the most important thing a company can do is to take the time to have open, internal conversations about the effects that its products and services may have on users and society. That way senior management can make informed decisions in line with the company’s core values and identity. Lawyers don’t like surprises, and neither do their clients.
I just saw this interview with Tim Cook where he says that privacy is a fundamental right and core to the American identity. With all of my recent ranting and raving about private companies and dystopia, I could start proposing solutions or become really cynical about the Apple CEO’s words. Maybe Apple is desperate to distance itself from Facebook’s recent scandals, including news that phone manufacturers reached secret deals with the social media giant to access user data.
Of course, there is also the stark contrast between the FCC now permitting American ISPs to sell user data to third parties and the fact that the new European data law (the infamous GDPR), not Americans’ passion for privacy rights, is the primary catalyst for the current public conversation around privacy rights in the digital era.
But instead of complaining, maybe I should look at what Apple is saying as a teachable moment. Likely Apple sees a major marketing opportunity to remind its customers (and promote the fact) that it is not in the data sharing business and that monetizing its customers’ data is contrary to its core values. At a time when companies (for example, U.S. ISPs) are licking their chops at the chance to rake in big bucks by reselling their paying customers’ data, there is a huge niche – analogous to the organic/bio foods business – for privacy-friendly products and technologies.
So, taking advantage of this potentially positive turn of events, I should walk the walk and do what I keep saying lawyers ought to do: propose solutions. Here goes:
I encourage lawyers in private practice to continue to play a pivotal and activist role in bringing claims against, and putting pressure on, companies and governments that misuse our personal data or infringe on our rights. Holding them accountable does not happen by osmosis. You need lawyers leading as change agents (think about the essential role of lawyers in the Civil Rights Movement).
Now as an in-house counsel in a tech company, I would suggest leaving your beret and Che Guevara t-shirt at home. Instead concentrate on how your company can build trust amongst its customers so that they will feel comfortable using your services, and in turn, you will feel comfortable providing them with those services. Here are some basic things, some of which I have mentioned before, that in-house lawyers can propose within their companies:
Privacy by Design (and security by design): Put the end user at the center of your technology’s architecture. Try to minimize the amount of personal data you will need to provide the service, limit what you share with third parties to only what needs to be shared, and give the end user the ability to opt out of features that share more detailed personal data. Be transparent. If you concentrate on what the end user will be comfortable with and empower her with control over her data, then you are on the right track. (A short code sketch of what this can look like in practice follows this list.)
Value Proposition: As the tech giants, especially ones where consumers have little bargaining power (like ISPs, Facebook, and Google), demand more access to their users’ data, companies can use privacy protections as a strong differentiating value proposition. As mentioned above, the market opportunity is huge. For example, if European consumers are not comfortable with the ease with which their data may be swept up and monitored by American spy agencies (as revealed by Edward Snowden), why not offer European-based services that guarantee greater freedom from the intrusion of a foreign government? As with Apple, if what you sell is sleek, cool, and lifestyle, then the minute your customers perceive that you are no longer any of those things (and by the way, selling customer data is creepy, not cool), it’s game over.
Business Model: Re-think the business model. Propose different fee structures or revenue sharing options that give end users more control and something of value in return for handing over their data. For example, offer customers discounted fees (like Amazon does with the Kindle) if they allow the company to monetize their data. Alternatively, how about giving a piece of the revenue to the customer when the company makes money off her data? I worked for a WiFi sharing start-up where, keeping true to the company’s value of sharing, we shared revenues with users who shared back. If my ISP is making money off my data, then why not demand something in return?
Product Ethics: Before thinking about the legality of a new product or service, focus on it from an ethical viewpoint. Many companies now have data governance committees, but consider a broader product ethics committee made up of a cross section of the company. Look not just at data use but at the potential for a product or service to be misused (even if hacked) with results that are contrary to the company’s values. If you build products that resell or rely on processing large volumes of personal data, put an ethicist on staff. Remember, no matter who your CEO is or how much of a celebrity he may be, the last thing you want is for him to have to sit in front of lawmakers struggling to explain why your service was linked to a major human rights violation, political scandal, or massive leak of sensitive personal data.
Data Use as a Corporate Social Responsibility: Include data use and innovation in your company’s CSR policies. Call it your Innovation for Good Policy where you commit to (i) not use the personal data and technology at your disposal in a way that has a negative effect on your community and stakeholders, and (ii) affirmatively use technology and innovation for the good of your community and stakeholders. For example, at my current company, Amadeus, I am very proud to have been involved in two CSR initiatives with UNICEF where we used technology and aggregated travel data to help suffering children and to predict the spread of diseases like Ebola and Zika.
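To make the Privacy by Design point above a bit more concrete, here is a minimal sketch, in Python, of what data minimization and off-by-default sharing settings can look like at the code level. All of the names (REQUIRED_FIELDS, UserProfile, sign_up) are invented for illustration; this is the shape of the idea, not a compliance recipe.

```python
# A minimal, illustrative sketch of "privacy by design" at the code level.
# All names are hypothetical.

from dataclasses import dataclass, field

# Collect only what the service actually needs to function.
REQUIRED_FIELDS = {"email"}             # needed to deliver the service
OPTIONAL_FIELDS = {"birthday", "city"}  # nice to have, never required

@dataclass
class PrivacySettings:
    # Every data-sharing feature defaults to OFF; the user opts in.
    share_with_partners: bool = False
    personalized_ads: bool = False
    analytics: bool = False

@dataclass
class UserProfile:
    email: str
    optional: dict = field(default_factory=dict)
    settings: PrivacySettings = field(default_factory=PrivacySettings)

def sign_up(submitted: dict) -> UserProfile:
    """Create a profile, keeping only required fields plus whatever
    optional data the user explicitly chose to provide."""
    missing = REQUIRED_FIELDS - submitted.keys()
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    optional = {k: v for k, v in submitted.items() if k in OPTIONAL_FIELDS}
    # Anything outside the declared fields is dropped, so the service
    # never stores data it did not ask for.
    return UserProfile(email=submitted["email"], optional=optional)

profile = sign_up({"email": "user@example.com", "city": "Madrid", "shoe_size": 44})
print(profile)  # shoe_size was never stored; all sharing defaults to off
```

The point is simply that the architecture itself refuses to hold what the service does not need and leaves every sharing decision with the user.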
Putting it all together, the most important thing a company can do is to take the time to have open, internal conversations about the effects that its products and services may have on users and society. That way senior management can make informed decisions in line with the company’s core values and identity. Lawyers don’t like surprises, and neither do their clients.
The mass surveillance and technology depicted in the [2002 movie Minority Report] — location-based personalised advertising, facial recognition, newspapers that updated themselves — are ubiquitous today. The only thing director Steven Spielberg got wrong was the need for psychics. Instead, law enforcement can turn to data and technologies provided by companies like Google, Facebook, Amazon and intelligence group Palantir.
The dystopian perspective on these capabilities is worth remembering at a time when the private sector is being pulled ever more deeply into the business of crime fighting and intelligence gathering. Last week, the American Civil Liberties Union and several other rights groups called on Amazon to stop selling its Orwellian-sounding Rekognition image processing system to law enforcement officials, saying it was “primed for abuse in the hands of government”.
I have written a few posts already about the potential for governments and private companies to use new technologies such as cryptocurrencies, biometrics and data mining to engage in activities that we would normally associate with the fictional totalitarian regimes of George Orwell or Aldous Huxley. With regard to state actors, like China, using biometrics for crime prevention, I wrote:
But still, if we move to a system of Big Brother with ubiquitous cameras capturing our facial images 24/7 and the system is only 80% accurate, that leaves an arguably unbearable margin of error and potential for abuse. Democracies are supposed to accept some criminals getting away with crime in exchange for the innocent not being locked up. It’s the authoritarian regimes that place law and order above the protection of the innocent.
Between companies, governments and new technologies, the potential for opportunities, efficiencies and abuse is endless. It is a Brave New World.
Although neither George Orwell’s nor Aldous Huxley’s dystopian futures predicted a world governed by corporations as opposed to authoritarian governments, it may be more plausible to imagine a world where corporations control the money supply, not with coins and bills but with cryptocurrencies. In fact, the fad amongst many technologists today is to encourage the disintermediation (or deregulation) of money by moving to Blockchain-based cryptocurrencies like Bitcoin. But instead of removing the middleman, we are more likely – contrary to the idealists’ ambitions – to open the door for big tech companies like Amazon, Facebook and Google to tokenize their platforms, replacing public currency regulators with corporate ones.
But private companies are able to do so much more with the data that we so generously (and often naively) hand them. The possibilities for abuse seem endless. To a large degree, the new GDPR mitigates this risk by giving the consumer visibility into and control over how her data is being used, and hopefully building trust between consumers and their service providers. As stated here before, more important than complying with strict new laws, “to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.”
But what happens if consumers are not given the opportunity to intelligently grant consent or agree to use a service that shares their data? The first GDPR complaints have been filed precisely on these grounds:
Across four complaints, related to Facebook, Instagram, WhatsApp and Google’s Android operating system, European consumer rights organisation Noyb argues that the companies have forced users into agreeing to new terms of service, in breach of the requirement in the law that such consent should be freely given.
This morning, on the day the GDPR came into effect, I tried to access the Los Angeles Times – one of the US’s leading newspapers – but found myself redirected to this landing page. Apparently, the LA Times is not prepared for the new law. At least they had the sense and seriousness to stop European access. Fines of up to four percent of global annual revenue or €20 million (whichever is higher) for non-compliance are no joking matter.
So, did the LA Times have a privacy lawyer who said they weren’t ready, or a lawyer smart enough to know that until they got that privacy lawyer, they wouldn’t be ready? Or, did they simply make a business decision that their European market wasn’t big enough to justify the costs of compliance?
UPDATE:
It looks like the LA Times isn’t the only one. The Chicago Tribune has the exact same landing page, so it appears that all of the publications managed by tronc, inc. are following the same protocol.
Have you noticed that you’re now receiving dozens of emails from all sorts of services that you forgot you ever used? Everyone is suddenly sending you requests to accept their new privacy terms. Yes, that’s the imminent arrival of the EU’s new General Data Protection Regulation, due out in stores on May 25, 2018, talking to you. So if you’re feeling like the prettiest one at the dance or the most popular kid at school, then you have the GDPR to thank (along with its threat of massive fines).
Personally, I am taking advantage of the opportunity to clean house by NOT accepting the terms of all of those services I am no longer interested in using.
Note: The above image compares the length of end user terms of various social media platforms. From left to right: illegible company, Google, Tinder, Twitter, Facebook, Snapchat, and Instagram. I found the image a few weeks back on Twitter but unfortunately cannot attribute the image to the original tweet or its owner.
Last week on LinkedIn, I saw a post written by Miguel Benavides, calling for data use to be deemed part of companies’ corporate social responsibility policies:
It took decades of pressure for social agents to persuade businesses and regulators that companies should be responsible for their impact on society and on the environment. The idea of Corporate Social Responsibility (CSR), with all its variances in names and forms, established the principle of ethics and accountability of companies as social agents. That caused all sorts of watchdog agencies to emerge. Businesses even included CSR as a permanent component in their strategy, allowing it to move towards stronger concepts like Corporate Citizenship, Shared Value…
Well, if we talk about data being “the new oil”, shouldn’t we need new Environmental Protection Agencies watching data leaks, new Labor Protection Agencies to ensure new labor models meet the minimum social protection criteria? What about Community Protection Agencies to watch how using all that data affects communities, social life, or even human behavior?
Data privacy is not like a consumer good, where you click “I accept” and all is well. Data privacy is more like air quality or safe drinking water, a public good that cannot be effectively regulated by trusting in the wisdom of millions of individual choices. A more collective response is needed.
In theory this is what the new GDPR hopes to achieve (at least in Europe), but will that be enough?
It will be interesting to see not just how legislators and regulators react, but more importantly how consumers’ online behavior and expectations of privacy will change, if at all. Will we become more demanding or more resigned? Regardless of whether you are a willing online exhibitionist, the unavoidable truth is that we consumers are getting played. Sure, there is a huge benefit in getting personalized offers and content, but consumers give it all away for free to companies that are laughing all the way to the bank.
In two recent posts, thinking that I was really smart and clever, I questioned whether the GDPR posed a major hurdle for Blockchain-based technologies. Again, thinking that I was so smart and clever, I proudly took my arguments to one of my favorite privacy lawyers, confident I was about to impress her.
And as is often the case when I am feeling smart and clever, I was quickly put in my place by someone who actually knew what she was talking about. First, let’s be fair to me. The GDPR does pose challenges for Blockchain-based technologies, as it does for any service, online or offline, that stores personal data. Data controllers will need to obtain consent from data subjects, and storage of data will need to be limited in time based on the purpose for which it is being stored.
The concern I originally raised was the conflict between a Blockchain’s main feature of creating a permanent and unalterable record and the legal right of a data subject to modify or delete her personal data upon request (aka, the right to be forgotten). But a much smarter and more clever colleague – let’s call her Jane – explained to me that the right to be forgotten is not absolute.
Imagine you buy property. The local property registrar records the purchase with your name listed as the property owner. You may later sell that property, but you do not have a right under the GDPR to have your name removed from the public records relating to your purchase and ownership of that property. The purpose of registering property ownership is to have a permanent record of chain of ownership.
In the same way, should you consent to making a transaction through a Blockchain-based service knowing that the record of that transaction will be permanent, your right to delete your personal data only comes into play when the purpose for retaining your data ceases to exist. For a Blockchain, that will likely be never.
Think of a newspaper that publishes an article which features my name. The newspaper circulates thousands of copies. Like a Blockchain, the newspaper is distributed amongst thousands of people who have copies of the exact same story. We can verify that a single copy of that story has not been manipulated because we can compare it with thousands of other ones. Fake news aside, newspapers have the goal of being official accounts of events or newspapers of record. We should not then expect that upon request, every library and individual who has a copy of that newspaper article be required to destroy it or remove my name. Besides not being practical, it is contrary to the reason for having newspapers in the first place.
This morning I read a recent Grant Thornton report written by the Spanish lawyer Sara Esclapés Membrives on how the GDPR actually presents an opportunity for Blockchain-based technologies. The report corroborates Jane’s interpretation of the law, stating that the challenge for a Blockchain is to find methods for the future removal of an individual’s personal data “when the purpose for which the data were collected has finished.” But as with the newspaper example, the purpose of storing data in the Blockchain is permanency, which means that unless the Blockchain ceases to have activity and a reason for remaining in existence, it should be allowed to continue storing my name without me being able to invoke the right to erase my personal data.
Ultimately, Blockchain-based technologies that store personal data need to focus on privacy by design, meaning developing an architecture that maximizes the individual’s ability to grant consent and opt out of the service while providing the appropriate level of security for the storage of the data. But more importantly, to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.
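For what it’s worth, one pattern practitioners discuss in this context is “crypto-shredding”: personal data only ever touches the chain in encrypted form, the key lives off-chain, and “erasure” means destroying the key so that the immutable ciphertext becomes permanently unreadable. A minimal sketch, assuming a simple off-chain key store and using the Python cryptography package; the names and structure are mine, not Jane’s or the report’s:

```python
# Sketch of "crypto-shredding": personal data goes on-chain only in encrypted
# form; the key stays off-chain. Deleting the key makes the immutable
# ciphertext permanently unreadable, which functions as erasure.
# Hypothetical structure; requires the `cryptography` package.

from cryptography.fernet import Fernet, InvalidToken

chain = []       # stand-in for an append-only ledger
key_store = {}   # off-chain, deletable key storage: user_id -> key

def record_transaction(user_id: str, personal_data: bytes) -> None:
    key = key_store.setdefault(user_id, Fernet.generate_key())
    ciphertext = Fernet(key).encrypt(personal_data)
    chain.append({"user": user_id, "payload": ciphertext})  # immutable entry

def read_transactions(user_id: str):
    key = key_store.get(user_id)
    for entry in chain:
        if entry["user"] != user_id:
            continue
        try:
            yield Fernet(key).decrypt(entry["payload"]) if key else b"<erased>"
        except InvalidToken:
            yield b"<erased>"

def forget_user(user_id: str) -> None:
    # "Erasure": the ledger is untouched, but the data can no longer be read.
    key_store.pop(user_id, None)

record_transaction("alice", b"Alice Smith, passport X1234567")
forget_user("alice")
print(list(read_transactions("alice")))  # [b'<erased>']
```

Whether key destruction satisfies the legal test for erasure is exactly the kind of question to put to the privacy lawyers, but it shows how architecture choices can reconcile an unalterable ledger with an individual’s rights.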
Last week I wrote about the upcoming GDPR and mentioned that it posed a potential risk for Blockchain-based technologies:
The more I think about it, the more I see the GDPR posing a problem for a Blockchain’s permanent, irreversible and inerasable ledger whenever any personal data (even when encrypted) is included in a node. Individuals will have the right to delete their data and be forgotten. If one of the values of Blockchain technology is that no one person or entity can modify a node, then the Blockchain will need to modify its architecture and governance to allow for such node modification. And if it is a public Blockchain with no centralized intermediation, then who is the data controller? And who will be able to delete your data upon your request and protect your rights? Will each miner become a data controller, potentially subject to fines?
The bloc’s General Data Protection law, which will come into effect in a few months’ time, says people must be able to demand that their personal data is rectified or deleted under many circumstances. A blockchain is essentially a growing, shared record of past activity that’s distributed across many computers, and the whole point is that this chain of transactions (or other fragments of information) is in practice unchangeable – this is what ensures the reliability of the information stored in the blockchain.
For blockchain projects that involve the storage of personal data, these two facts do not mix well. And with sanctions for flouting the GDPR including fines of up to €20 million or 4 percent of global revenues, many businesses may find the ultra-buzzy blockchain trend a lot less palatable than they first thought.
€20 million is a great incentive for technologists to find creative ways to keep personal data outside of their Blockchain aspirations. Start the brainstorming now!
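Here is one obvious brainstorm, sketched in Python with invented names: keep the personal data in an ordinary, erasable database and anchor only a salted hash of it on the chain. The ledger can still prove that a record existed and was never altered, while the data itself remains deletable on request.

```python
# Keep personal data off-chain; put only a salted hash on the chain.
# Illustrative names only.

import hashlib, os, uuid

chain = []         # stand-in for the immutable ledger
off_chain_db = {}  # ordinary database where deletion is possible

def commit(personal_data: bytes) -> str:
    record_id = str(uuid.uuid4())
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + personal_data).hexdigest()
    off_chain_db[record_id] = {"data": personal_data, "salt": salt}
    chain.append({"id": record_id, "hash": digest})  # no personal data here
    return record_id

def verify(record_id: str) -> bool:
    """Prove the off-chain record still matches what was anchored on-chain."""
    entry = next(e for e in chain if e["id"] == record_id)
    stored = off_chain_db.get(record_id)
    if stored is None:
        return False  # data was erased; the anchor alone reveals nothing
    digest = hashlib.sha256(stored["salt"] + stored["data"]).hexdigest()
    return digest == entry["hash"]

def erase(record_id: str) -> None:
    off_chain_db.pop(record_id, None)  # right-to-erasure lives off-chain

rid = commit(b"name=Maria;dob=1980-01-01")
print(verify(rid))   # True
erase(rid)
print(verify(rid))   # False; the on-chain hash is useless without the salted data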
The GDPR is coming, the GDPR is coming. At a recent offsite leadership meeting I attended with the business unit I support, I was dubbed Mr. GDPR. They all knew it was coming and because I am their lawyer, I became their GDPR guy.
To be honest, I am no GDPR expert and certainly don’t want to become one. I have these really great privacy lawyers who sit next to me. They answer my questions but, more importantly, help steer our company in the right direction to make sure that data privacy is one of the key value propositions we offer our customers.
Because the GDPR is coming, it is worth saying something about it here. Today I read eMarketer’s Western European Digital Trends for 2018 which gave an excellent summary of how the new law will affect companies and consumers:
GDPR requires that any entity collecting or handling consumers’ personal data must know how and where those processes take place, what data is kept, where it is kept, where it goes if it is distributed further, and how data integrity is preserved at every point where that entity is responsible—and be prepared to divulge those details. The rules also require digital devices and browsers to make consumers aware that their data is about to be collected, and let users make a single decision about how their data can be gathered and handled, which all companies, websites and apps must adhere to. Individuals will be able to refuse any entity access to their personal data. Individuals will also be empowered to access, manage and delete their personal data held in digital databases. Firms failing to comply face a fine of €20 million ($22.1 million) or up to 4% of global revenues, whichever is greater.
In a December 2017 blog post, Jean-Michel Franco, senior director of product marketing at Talend, wrote that “the stakes go well beyond regulatory compliance. In this data-driven world, trust has become the new currency. Now that insights and innovations depend on big data, there’s no option but to have total control [over] your data, otherwise, your customer won’t buy in. … Most of the privacy rules that come with GDPR were already expressed in former regulations, but the principle of accountability makes it game-changing.”
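The individual rights in the eMarketer summary above (access, manage, delete) translate fairly directly into engineering work. A deliberately simplified sketch, with a hypothetical in-memory store and invented function names, just to show the shape of it:

```python
# Minimal sketch of data subject rights handlers (access and erasure).
# Hypothetical store and names; a real system also has to reach backups,
# logs, analytics copies and anything shared with processors.

import json

user_data = {
    "u1": {"email": "u1@example.com", "orders": [{"item": "book", "price": 12}]},
}
processors = []  # third parties the data was passed on to

def handle_access_request(user_id: str) -> str:
    """Right of access / portability: return everything held on the user."""
    return json.dumps({
        "data": user_data.get(user_id, {}),
        "shared_with": processors,
    }, indent=2)

def handle_erasure_request(user_id: str) -> None:
    """Right to erasure: delete locally and notify downstream processors."""
    user_data.pop(user_id, None)
    for processor in processors:
        processor.request_deletion(user_id)  # hypothetical downstream call

print(handle_access_request("u1"))
handle_erasure_request("u1")
print(handle_access_request("u1"))  # nothing left to report
```

The hard part, of course, is not these two functions but knowing every place the data actually lives, which is exactly the accountability point Franco makes above.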
This will likely pose a challenge to companies like Google and Facebook, which want you to give it away when using their platforms. I mean give it all away: your photos, your posts, your instant messages, with very limited ability to opt out without having to forgo using the entire platform. This is from an article from last summer on how the GDPR will disrupt Google and Facebook:
Google and Facebook cannot confront their users with broad, non-specific, consent requests that cover the entire breadth of their activities. Data protection regulators across the EU have made clear what they expect:
“A purpose that is vague or general, such as for instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future research’ will – without further detail – usually not meet the criteria of being ‘specific’”.
A business cannot, for example, collect more data for a purpose than it needs and then retroactively ask to use those data for additional purposes.[4]
It will be necessary to ask for consent, or present an opt-out choice, at different times, and for different things. This creates varying levels of risk. We estimate these risks on the “GDPR scale”, shown below.
The scale ranges from zero to five. Five, at the high end of the scale, describes the circumstances that many adtech companies that have no direct relationship with Internet users will find themselves in. They need to get the consent of the people whose data they rely on. But they have no channel of communication through which they can do so.
Four, next highest on the scale, refers to companies that have direct relationships with users, and can use this to ask for consent. However, users have little incentive to “opt-in” to being tracked for advertising. Whereas a user might opt-in to some form of profiling that comes with tangible benefits, such as a loyalty scheme, the same user might not be willing to opt-in to more extensive profiling that yields no benefit. The extensiveness of the profiling is important because, as the note at the bottom of this page shows, users will be aware of the uses of their data when consent is sought. Thus adtech tracking across the web might rank as four, but a loyalty scheme might rank as three on the GDPR scale.
A slightly more attractive prospect, from Google and Facebook’s perspective, is to inform a user about what they want to do with the personal data, and give the user a chance to “opt-out” beforehand.[5] This is two on the scale. This opt-out approach has the benefit – from the company’s perspective – that some users’ inaction may allow their data to be used. The GDPR permits the opt-out approach when the purposes that the companies want to use the data for are “compatible” with the original purpose for which personal data were shared by users. In addition to the opt-out notice, users also have to be told of their right to object at any time to the use of their data for direct marketing.
One on the scale refers to activities that currently involve the processing of personal data, but that do not need to do so. With modification, these activities could be put beyond the scope of the Regulation.
Activities at the zero end of the scale are outside the scope of the Regulation, because they use no personal data.
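Purely as a reading aid, the quoted scale can be restated as a simple lookup table (my paraphrase of the excerpt above, not the article’s own wording):

```python
# The "GDPR scale" from the excerpt above, restated as a simple lookup.
GDPR_SCALE = {
    5: "No direct user relationship; consent needed but no channel to ask for it (much of adtech)",
    4: "Direct relationship, but users have little incentive to opt in to tracking",
    3: "Direct relationship with a tangible benefit to opting in (e.g. a loyalty scheme)",
    2: "Opt-out notice suffices because the new use is 'compatible' with the original purpose",
    1: "Uses personal data today but could be redesigned not to",
    0: "No personal data involved; outside the Regulation",
}

for level in sorted(GDPR_SCALE, reverse=True):
    print(level, "-", GDPR_SCALE[level])
```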
The more I think about it, the more I see the GDPR posing a problem for a Blockchain’s permanent, irreversible and inerasable ledger whenever any personal data (even when encrypted) is included in a node. Individuals will have the right to delete their data and be forgotten. If one of the values of Blockchain technology is that no one person or entity can modify a node, then the Blockchain will need to modify its architecture and governance to allow for such node modification. And if it is a public Blockchain with no centralized intermediation, then who is the data controller? And who will be able to delete your data upon your request and protect your rights? Will each miner become a data controller, potentially subject to fines?