Why Technology Favors Tyranny


You all are going to think I am the Grim Reaper of new technologies, crying that the sky is falling at every turn. Yes, I am using this blog as a forum, amongst other things, to discuss the difficult decisions that businesses, lawyers and society need to face when looking at how new technologies like Artificial Intelligence, Blockchain and Biometrics may impact our lives (examples here, here and here).

Working for a tech company that invests millions in innovation, I am very interested in seeing how we can use new technologies to improve society. But in order to do that, we need to be very vigilant. The consequences of not doing so could be disastrous and significantly change the course of humankind.

Am I exaggerating? In a must-read article in The Atlantic, Yuval Noah Harari (author of Sapiens: A Brief History of Humankind and Homo Deus: A Brief History of Tomorrow) makes precisely that argument:

More practically, and more immediately, if we want to prevent the concentration of all wealth and power in the hands of a small elite, we must regulate the ownership of data. In ancient times, land was the most important asset, so politics was a struggle to control land. In the modern era, machines and factories became more important than land, so political struggles focused on controlling these vital means of production. In the 21st century, data will eclipse both land and machinery as the most important asset, so politics will be a struggle to control data’s flow.

Unfortunately, we don’t have much experience in regulating the ownership of data, which is inherently a far more difficult task than regulating land or machines. Data are everywhere and nowhere at the same time, they can move at the speed of light, and you can create as many copies of them as you want. Do the data collected about my DNA, my brain, and my life belong to me, or to the government, or to a corporation, or to the human collective?

. . . Currently, humans risk becoming similar to domesticated animals. We have bred docile cows that produce enormous amounts of milk but are otherwise far inferior to their wild ancestors. They are less agile, less curious, and less resourceful. We are now creating tame humans who produce enormous amounts of data and function as efficient chips in a huge data-processing mechanism, but they hardly maximize their human potential. If we are not careful, we will end up with downgraded humans misusing upgraded computers to wreak havoc on themselves and on the world.

If you find these prospects alarming—if you dislike the idea of living in a digital dictatorship or some similarly degraded form of society—then the most important contribution you can make is to find ways to prevent too much data from being concentrated in too few hands, and also find ways to keep distributed data processing more efficient than centralized data processing. These will not be easy tasks. But achieving them may be the best safeguard of democracy.

The world my children and their children will inhabit will be vastly different from ours in ways we cannot even begin to imagine.

Five Things Companies Can Do


Earlier this week I wrote a long-winded post describing steps companies can take – in light of recent concerns about companies misusing personal data – to make sure their technologies are offering us all something of value.

Here are the five things, in abbreviated form, that companies can start doing now:

  1. Privacy by Design (and security by design): Put the end user at the center of your technology’s architecture, minimize the amount of personal data you will need to provide the service, give the end user control, and be transparent. If you concentrate on what the end user will be comfortable with and empower her with control over her data, then you are on the right track.
  2. Value Proposition: Make privacy protections and good practice a central point of differentiation. Make it core to your overall value proposition.
  3. Business Model: Re-think the business model. Propose different fee structures or revenue-sharing options that give end users more control and something of value in return for handing over their data.
  4. Product Ethics: Before thinking about the legality of a new product or service, focus on it from an ethical viewpoint. Consider a product ethics committee, including bringing in an ethicist. Look not just at data use but at the potential for a product or service to be misused (even if hacked) with results that are contrary to the company’s values. Remember, the last thing you want is for your CEO to have to sit in front of lawmakers struggling to explain why your service was linked to a major human rights violation, political scandal, or massive leak of sensitive personal data.
  5. Data Use as a Corporate Social Responsibility: Make data use and innovation part of your company’s CSR policies, where you commit to (i) not use the personal data and technology at your disposal in a way that has a negative effect on your community and stakeholders, and (ii) affirmatively use technology and innovation for the good of your community and stakeholders.

Taken all together, the most important thing a company can do is take the time to have open, internal conversations about the effects that its products and services may have on users and society. That way senior management can make informed decisions in line with the company’s core values and identity. Lawyers don’t like surprises, and neither do their clients.

I’m going to stop whining. Here are a few simple things companies can do

I just saw this interview with Tim Cook where he says that privacy is a fundamental right and core to the American identity. With all of my recent ranting and raving about private companies and dystopia, I could either start proposing solutions or get really cynical about the Apple CEO’s words. Maybe Apple is desperate to distance itself from Facebook’s recent scandals, including news that phone manufacturers reached secret deals with the social media giant to access user data.

Of course, there is also the glaringly stark contrast between the FCC now permitting American ISPs to sell user data to third parties and the fact that the new European data law (the infamous GDPR), not Americans’ passion for privacy rights, is the primary catalyst for the current public conversation around privacy rights in the digital era.

But instead of complaining, maybe I should treat what Apple is saying as a teachable moment. Likely Apple sees a major marketing opportunity to remind its customers (and promote the fact) that it is not in the data-sharing business and that monetizing its customers’ data is contrary to its core values. At a time when companies (for example, U.S. ISPs) are licking their chops at the chance to rake in big bucks by reselling their paying customers’ data, there is a huge niche, analogous to the organic/bio foods business, for privacy-friendly products and technologies.

So, taking advantage of this potentially positive turn of events, I should walk the walk and do what I keep saying lawyers ought to do. Propose solutions! So here goes:

As for lawyers in private practice, I encourage them to continue to play a pivotal and activist role in bringing claims against and putting pressure on companies and governments that misuse our personal data or infringe on our rights. Holding them accountable does not happen by osmosis. You need lawyers leading as change agents (think of the essential role of lawyers in the Civil Rights Movement).

Now, as in-house counsel at a tech company, I would suggest leaving your beret and Che Guevara t-shirt at home. Instead, concentrate on how your company can build trust amongst its customers so that they will feel comfortable using your services and, in turn, you will feel comfortable providing them with those services. Here are some basic things, some of which I have mentioned before, that in-house lawyers can propose within their companies:

Privacy by Design (and security by design): Put the end user at the center of your technology’s architecture. Try to minimize the amount of personal data you need to provide the service, limit what you share with third parties to only what needs to be shared, and give the end user the ability to opt out of features that share more detailed personal data. Be transparent. If you concentrate on what the end user will be comfortable with and empower her with control over her data, then you are on the right track.
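To make the data-minimization point concrete, here is a minimal sketch of what that architecture can look like in practice. It is my illustration, not any particular company's schema, and all names are invented: collect only what the service needs, default every sharing flag to off, and make opting out as easy as opting in.

```python
from dataclasses import dataclass, field

@dataclass
class SharingConsent:
    # Explicit, revocable opt-ins; everything defaults to off.
    analytics: bool = False
    third_party_marketing: bool = False

@dataclass
class UserProfile:
    # Only the data strictly needed to deliver the service.
    user_id: str
    email: str  # needed to reach the user; nothing else is collected
    consent: SharingConsent = field(default_factory=SharingConsent)

def fields_shared_with_partners(profile: UserProfile) -> dict:
    # Share nothing the user has not explicitly opted into.
    shared = {}
    if profile.consent.third_party_marketing:
        shared["email"] = profile.email
    return shared

alice = UserProfile(user_id="u-123", email="alice@example.com")
print(fields_shared_with_partners(alice))    # {} -- default is to share nothing
alice.consent.third_party_marketing = True   # user opts in...
print(fields_shared_with_partners(alice))    # {'email': 'alice@example.com'}
alice.consent.third_party_marketing = False  # ...and can opt back out any time
```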

Value Proposition: As the tech giants, especially ones where consumers have little bargaining power (like ISPs, Facebook, and Google), demand more access to their users’ data, companies can use privacy protections as a strong differentiating value proposition. As mentioned above, the market opportunity is huge. For example, if European consumers are not comfortable with the ease with which their data may be swept up and monitored by American spy agencies (as revealed by Edward Snowden), why not offer European-based services that guarantee greater freedom from the intrusion of a foreign government? As with Apple, if you sell sleek, cool, and lifestyle, the minute your customers perceive that you are no longer any of those things (and by the way, selling customer data is creepy, not cool), it’s game over.

Business Model: Re-think the business model. Propose different fee structures or revenue-sharing options that give end users more control and something of value in return for handing over their data. For example, offer customers discounted fees (like Amazon does with the Kindle) if they allow the company to monetize their data. Alternatively, how about giving a piece of the revenue to the customer when the company makes money off her data? I worked for a WiFi-sharing start-up where, keeping true to the company’s value of sharing, we shared revenues with users who shared back. If my ISP is making money off my data, then why not demand something in return?
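As a toy illustration of the revenue-sharing idea (the function name and the 30% rate are invented; a real program would set the rate by contract):

```python
def user_payout(revenue_from_user_data: float, share_rate: float = 0.30) -> float:
    """Slice of data-driven revenue credited back to the opted-in user.

    The 30% share_rate is a made-up number for illustration only.
    """
    if not 0.0 <= share_rate <= 1.0:
        raise ValueError("share_rate must be between 0 and 1")
    return round(revenue_from_user_data * share_rate, 2)

# If my ISP earns $12.50 this month monetizing my data and shares 30%,
# I should see $3.75 back -- as cash, a bill credit, or a discount.
print(user_payout(12.50))  # 3.75
```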


Product Ethics: Before thinking about the legality of a new product or service, focus on it from an ethical viewpoint. Many companies now have data governance committees, but consider a broader product ethics committee made up of a cross-section of the company. Look not just at data use but at the potential for a product or service to be misused (even if hacked) with results that are contrary to the company’s values. If you build products that resell or rely on processing large volumes of personal data, put an ethicist on staff. Remember, no matter who your CEO is or how much of a celebrity he may be, the last thing you want is for him to have to sit in front of lawmakers struggling to explain why your service was linked to a major human rights violation, political scandal, or massive leak of sensitive personal data.

Data Use as a Corporate Social Responsibility: Include data use and innovation in your company’s CSR policies. Call it your Innovation for Good Policy where you commit to (i) not use the personal data and technology at your disposal in a way that has a negative effect on your community and stakeholders, and (ii) affirmatively use technology and innovation for the good of your community and stakeholders. For example, at my current company, Amadeus, I am very proud to have been involved in two CSR initiatives with UNICEF where we used technology and aggregated travel data to help suffering children and to predict the spread of diseases like Ebola and Zika.

Taken all together, the most important thing a company can do is take the time to have open, internal conversations about the effects that its products and services may have on users and society. That way senior management can make informed decisions in line with the company’s core values and identity. Lawyers don’t like surprises, and neither do their clients.

Brave New World, Inc.


Earlier this week, Rana Foroohar wrote in the Financial Times that “Companies are the cops in our modern-day dystopia”:

The mass surveillance and technology depicted in the [2002 movie Minority Report] — location-based personalised advertising, facial recognition, newspapers that updated themselves — are ubiquitous today. The only thing director Steven Spielberg got wrong was the need for psychics. Instead, law enforcement can turn to data and technologies provided by companies like Google, Facebook, Amazon and intelligence group Palantir.

The dystopian perspective on these capabilities is worth remembering at a time when the private sector is being pulled ever more deeply into the business of crime fighting and intelligence gathering. Last week, the American Civil Liberties Union and several other rights groups called on Amazon to stop selling its Orwellian-sounding Rekognition image processing system to law enforcement officials, saying it was “primed for abuse in the hands of government”.


I have written a few posts already about the potential for governments and private companies to use new technologies such as cryptocurrencies, biometrics and data mining to engage in activities that we would normally associate with the fictional totalitarian regimes of George Orwell or Aldous Huxley. With regards to state actors, like China, using biometrics for crime prevention, I wrote:

But still, if we move to a system of Big Brother with ubiquitous cameras capturing our facial images 24/7 and the system is only 80% accurate, that arguably leaves an unbearably high potential for abuse. Democracies are supposed to accept some criminals getting away with crime in exchange for the innocent not being locked up. It’s the authoritarian regimes that place law and order above the protection of the innocent.

Between companies, governments and new technologies, the potential for opportunities, efficiencies and abuse are endless. It is a Brave New World.
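To see why that 80% figure is so troubling, it helps to run the base-rate arithmetic. A quick sketch, with all numbers invented for illustration: when almost everyone scanned is innocent, even a small false-positive rate means the overwhelming majority of people the system flags are innocent.

```python
# Invented but plausible numbers: a city of 1,000,000 scanned faces,
# 100 genuine suspects, a system that spots 80% of them (the "80%
# accurate" figure) and wrongly flags just 1% of innocent people.
population = 1_000_000
suspects = 100
sensitivity = 0.80          # share of real suspects correctly flagged
false_positive_rate = 0.01  # share of innocents wrongly flagged

true_hits = suspects * sensitivity                          # 80 people
false_hits = (population - suspects) * false_positive_rate  # 9,999 people

precision = true_hits / (true_hits + false_hits)
print(f"Share of flagged people who are actual suspects: {precision:.1%}")
# -> roughly 0.8%: more than 99% of those flagged are innocent.
```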

And with regards to cryptocurrencies, I wrote:

Although neither George Orwell’s nor Aldous Huxley’s dystopian futures predicted a world governed by corporations as opposed to authoritarian governments, it may be more plausible to imagine a world where corporations control the money supply, not with coins and bills but with cryptocurrencies. In fact, the fad amongst many technologists today is to encourage the disintermediation (or deregulation) of money by moving to Blockchain-based cryptocurrencies like Bitcoin. But instead of removing the middleman, we are more likely, contrary to the idealists’ ambitions, to open the door for big tech companies like Amazon, Facebook and Google to tokenize their platforms, replacing government currency regulators with corporate ones.

But private companies are able to do so much more with the data that we so generously (and often naively) hand them. The possibilities for abuse seem endless. To a large degree, the new GDPR mitigates this risk by giving the consumer visibility into and control over how her data is being used, and hopefully building trust between consumers and their service providers. As stated here before, more important than complying with strict new laws, “to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.”

But what happens if consumers are not given the opportunity to intelligently grant consent or agree to use a service that shares their data? The first GDPR complaints have been filed precisely on these grounds:

Across four complaints, related to Facebook, Instagram, WhatsApp and Google’s Android operating system, European consumer rights organisation Noyb argues that the companies have forced users into agreeing to new terms of service, in breach of the requirement in the law that such consent should be freely given.


GDPR: You’re Now the Most Popular Kid at School

Have you noticed that you’re now receiving dozens of emails from all sorts of services that you forgot you ever used? Everyone is suddenly sending you requests to accept their new privacy terms. Yes, that’s the imminent arrival of the EU’s new General Data Protection Regulation, due out in stores on May 25, 2018, talking to you. So if you’re feeling like the prettiest one at the dance or the most popular kid at school, then you have the GDPR to thank (along with its threat of massive fines).

Personally, I am taking advantage of the opportunity to clean house by NOT accepting the terms of all of those services I am no longer interested in using.

Note: The above image compares the length of end user terms of various social media platforms. From left to right: illegible company, Google, Tinder, Twitter, Facebook, Snapchat, and Instagram. I found the image a few weeks back on Twitter but unfortunately cannot attribute the image to the original tweet or its owner.  

Re Facebook, Cambridge Analytica, the GDPR and CSR


A political data firm, Cambridge Analytica, was able to access the private information of 50 million Facebook users for marketing campaigns during the 2016 presidential election without the knowledge of those users. Facebook is now under pressure to explain “what the social network knew about the misuse of its data ‘to target political advertising and manipulate voters’”.

Last week on LinkedIn, I saw a post written by Miguel Benavides, calling for data use to be deemed part of companies’ corporate social responsibility policies:

It took decades of pressure for social agents to persuade businesses and regulators that companies should be responsible for their impact on society and on the environment. The idea of Corporate Social Responsibility (CSR), with all its variances in names and forms, established the principle of ethics and accountability of companies as social agents. That caused all sorts of watchdog agencies to emerge. Businesses even included CSR as a permanent component in their strategy, allowing it to move towards stronger concepts like Corporate Citizenship, Shared Value…

Well, if we talk about data being “the new oil”, shouldn’t we need new Environmental Protection Agencies watching data leaks, and new Labor Protection Agencies to ensure new labor models meet the minimum social protection criteria? What about Community Protection Agencies to watch how using all that data affects communities, social life, or even human behavior?

Along these same lines, tech writer Zeynep Tufekci wrote recently in The New York Times that

Data privacy is not like a consumer good, where you click “I accept” and all is well. Data privacy is more like air quality or safe drinking water, a public good that cannot be effectively regulated by trusting in the wisdom of millions of individual choices. A more collective response is needed.

In theory this is what the new GDPR hopes to achieve (at least in Europe), but will that be enough?

It will be interesting to see not just how legislators and regulators react, but more importantly how consumers’ online behavior and expectations of privacy will change, if at all. Will we become more demanding or more resigned? Regardless of whether you are a willing online exhibitionist, the unavoidable truth is that we consumers are getting played. Sure, there is a huge benefit in getting personalized offers and content, but consumers give it all away for free to companies that are laughing all the way to the bank.

Maybe the GDPR Isn’t So Bad for Blockchain After All?


In two recent posts, thinking that I was really smart and clever, I questioned whether the GDPR posed a major hurdle for Blockchain-based technologies. Again, thinking that I was so smart and clever, I proudly took my arguments to one of my favorite privacy lawyers, confident I was about to impress her.

And as is often the case when I am feeling smart and clever, I was quickly put in my place by someone who actually knew what she was talking about. First, let’s be fair to me: the GDPR does pose challenges for Blockchain-based technologies, as it does for any service, on- or offline, that stores personal data. Data controllers will need to obtain consent from data subjects, and storage of data will need to be limited in time based on the purpose for which it is being stored.

The concern I originally raised was the conflict between Blockchain’s main feature, creating a permanent and unalterable record, and the legal right of a data subject to have her personal data modified or deleted upon request (aka the right to be forgotten). But a much smarter and more clever colleague, let’s call her Jane, explained to me that the right to be forgotten is not absolute.

Imagine you buy property. The local property registrar records the purchase with your name listed as the property owner. You may later sell that property, but you do not have a right under the GDPR to have your name removed from the public records relating to your purchase and ownership of that property. The purpose of registering property ownership is to have a permanent record of chain of ownership.

By the same token, should you consent to making a transaction through a Blockchain-based service knowing that the record of that transaction will be permanent, your right to delete your personal data only comes into play when the purpose for retaining your data ceases to exist. For a Blockchain, that will likely be never.


Think of a newspaper that publishes an article which features my name. The newspaper circulates thousands of copies. Like a Blockchain, the newspaper is distributed amongst thousands of people who have copies of the exact same story. We can verify that a single copy of that story has not been manipulated because we can compare it with thousands of other ones. Fake news aside, newspapers have the goal of being official accounts of events or newspapers of record. We should not then expect that upon request, every library and individual who has a copy of that newspaper article be required to destroy it or remove my name. Besides not being practical, it is contrary to the reason for having newspapers in the first place.
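That “compare it with thousands of other copies” check is exactly what cryptographic hashes make cheap, and it is easy to picture in code. A minimal sketch (the article text and copy counts are invented for illustration):

```python
import hashlib
from collections import Counter

def fingerprint(text: str) -> str:
    # A copy's SHA-256 hash; changing a single character changes it completely.
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Independently held copies of the same article (four here, thousands in
# reality), plus one copy that someone has quietly altered.
copies = ["Registry notice: parcel 7 sold to J. Smith."] * 4
copies.append("Registry notice: parcel 7 sold to J. Smyth.")

tally = Counter(fingerprint(copy) for copy in copies)
consensus_hash, agreeing = tally.most_common(1)[0]
print(f"{agreeing} of {len(copies)} copies share hash {consensus_hash[:8]}...; "
      "the outlier is the forgery.")
```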

This morning I read a recent Grant Thornton report written by the Spanish lawyer Sara Esclapés Membrives on how the GDPR actually presents an opportunity for Blockchain-based technologies. The report corroborates Jane’s interpretation of the law, stating that the challenge for a Blockchain is to find methods for the future removal of an individual’s personal data “when the purpose for which the data were collected has finished.” But as with the newspaper example, the purpose of storing data in the Blockchain is permanence, which means that unless the Blockchain ceases to have activity and a reason for remaining in existence, it should be allowed to continue storing my name without my being able to invoke the right to erase my personal data.
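One technique often floated for exactly this removal problem (to be clear, my illustration rather than anything the report or Jane proposes) is “crypto-shredding”: write only encrypted data to the immutable chain, keep the encryption key off-chain, and destroy the key when the retention purpose ends, leaving the permanent record permanently unreadable. A minimal sketch, assuming the third-party Python cryptography package and invented names throughout:

```python
from cryptography.fernet import Fernet  # assumes the 'cryptography' package

class OffChainKeyStore:
    """Keys live off-chain; deleting a key 'erases' the on-chain ciphertext."""
    def __init__(self):
        self._keys = {}

    def new_key(self, record_id: str) -> bytes:
        key = Fernet.generate_key()
        self._keys[record_id] = key
        return key

    def shred(self, record_id: str) -> None:
        # Once the key is gone, the immutable ciphertext is just noise.
        self._keys.pop(record_id, None)

    def decrypt(self, record_id: str, ciphertext: bytes) -> str:
        key = self._keys[record_id]  # raises KeyError if already shredded
        return Fernet(key).decrypt(ciphertext).decode()

store = OffChainKeyStore()
key = store.new_key("tx-001")
on_chain_blob = Fernet(key).encrypt(b"buyer: Alice Example")  # written to chain
print(store.decrypt("tx-001", on_chain_blob))  # readable while the key exists
store.shred("tx-001")  # key destroyed: the on-chain record is now unrecoverable
```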

Ultimately, Blockchain-based technologies that store personal data need to focus on privacy by design, meaning developing an architecture that maximizes the individual’s ability to grant consent and opt out of the service while providing the appropriate level of security for the storage of the data. But more importantly, to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.