Re Facebook, Cambridge Analytica, the GDPR and CSR

When you know they are watching

Cambridge Analytica, a political data firm, was able to access the private information of 50 million Facebook users, without their knowledge, and use it for marketing campaigns during the 2016 presidential election. Facebook is now under pressure to explain “what the social network knew about the misuse of its data ‘to target political advertising and manipulate voters’”.

Last week on LinkedIn, I saw a post written by Miguel Benavides calling for data use to be deemed part of companies’ corporate social responsibility policies:

It took decades of pressure for social agents to persuade businesses and regulators that companies should be responsible for their impact on society and on the environment. The idea of Corporate Social Responsibility (CSR), with all its variances in names and forms, established the principle of ethics and accountability of companies as social agents. That caused all sorts of watchdog agencies to emerge. Businesses even included CSR as a permanent component in their strategy, allowing it to move towards stronger concepts like Corporate Citizenship, Shared Value…

Well, if we talk about data being “the new oil”, shouldn’t we need new Environmental Protection Agencies watching data leaks, or new Labor Protection Agencies to ensure new labor models meet minimum social protection criteria? What about Community Protection Agencies to watch how using all that data affects communities, social life, or even human behavior?

Along these same lines, tech writer Zeynep Tufekci wrote recently in the New York Times that

Data privacy is not like a consumer good, where you click “I accept” and all is well. Data privacy is more like air quality or safe drinking water, a public good that cannot be effectively regulated by trusting in the wisdom of millions of individual choices. A more collective response is needed.

In theory this is what the new GDPR hopes to achieve (at least in Europe), but will that be enough?

It will be interesting to see not just how legislators and regulators react, but more importantly how consumers’ online behavior and expectations of privacy will change, if at all. Will we become more demanding or more resigned? Regardless of whether you are a willing online exhibitionist, the unavoidable truth is that we consumers are getting played. Sure, there is a huge benefit in getting personalized offers and content, but consumers give it all away for free to companies that are laughing all the way to the bank.


Maybe the GDPR Isn’t So Bad for Blockchain After All?

An innocent “oops”

In two recent posts, thinking that I was really smart and clever, I questioned whether the GDPR posed a major hurdle for Blockchain-based technologies. Again, thinking that I was so smart and clever, I proudly took my arguments to one of my favorite privacy lawyers, confident I was about to impress her.

And as is often the case when I am feeling smart and clever, I was quickly put in my place by someone who actually knew what she was talking about. First, let’s be fair to me. The GDPR does pose challenges for Blockchain-based technologies, as it does for any service, on- or offline, that stores personal data. Data controllers will need to procure consent from data subjects, and storage of data will need to be limited in time based on the purpose for which it is being stored.

The concern I originally raised was the conflict between a Blockchain’s main feature, creating a permanent and unalterable record, and the legal right of a data subject to have her personal data modified or deleted upon request (aka, the right to be forgotten). But a much smarter and more clever colleague – let’s call her Jane – explained to me that the right to be forgotten is not absolute.

Imagine you buy property. The local property registrar records the purchase with your name listed as the property owner. You may later sell that property, but you do not have a right under the GDPR to have your name removed from the public records relating to your purchase and ownership of that property. The purpose of registering property ownership is to have a permanent record of chain of ownership.

By the same token, should you consent to making a transaction through a Blockchain-based service knowing that the record of that transaction will be permanent, your right to delete your personal data only comes into play when the purpose for retaining it ceases to exist. For a Blockchain, that will likely be never.


Think of a newspaper that publishes an article featuring my name. The newspaper circulates thousands of copies. Like a Blockchain, the newspaper is distributed amongst thousands of people who hold copies of the exact same story. We can verify that a single copy of that story has not been manipulated because we can compare it with thousands of others. Fake news aside, newspapers have the goal of being official accounts of events, newspapers of record. We should not then expect that, upon request, every library and individual who has a copy of that newspaper article be required to destroy it or remove my name. Besides being impractical, it is contrary to the reason for having newspapers in the first place.
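The comparison is more than a metaphor; checking a record against many independent copies is, in rough outline, how the integrity of a distributed ledger can be verified. Here is a minimal sketch in Python (the article text and the number of copies are invented for illustration):

```python
import hashlib
from collections import Counter

def fingerprint(text: str) -> str:
    """A short, tamper-evident fingerprint of a document."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def matches_majority(my_copy: str, other_copies: list) -> bool:
    """Check my copy against the version held by most other parties."""
    tallies = Counter(fingerprint(c) for c in other_copies)
    consensus_hash, _count = tallies.most_common(1)[0]
    return fingerprint(my_copy) == consensus_hash

# Thousands of identical copies circulate; one has been doctored.
article = "Mayor opens new bridge, praises engineers."
copies = [article] * 9999 + ["Mayor opens new bridge, blames engineers."]
print(matches_majority(article, copies))  # True: my copy matches the consensus
```

A lone doctored copy stands out precisely because everyone else holds the same story.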

This morning I read a recent Grant Thornton report, written by the Spanish lawyer Sara Esclapés Membrives, on how the GDPR actually presents an opportunity for Blockchain-based technologies. The report corroborates Jane’s interpretation of the law, stating that the challenge for a Blockchain is to find methods for the future removal of an individual’s personal data “when the purpose for which the data were collected has finished.” But as with the newspaper example, the purpose of storing data in the Blockchain is permanency, which means that unless the Blockchain ceases to have activity and a reason for remaining in existence, it should be allowed to continue storing my name without my being able to invoke the right to erase my personal data.

Ultimately, Blockchain-based technologies that store personal data need to focus on privacy by design, meaning developing an architecture that maximizes the individual’s ability to grant consent and opt out of the service while providing the appropriate level of security for the storage of the data. But more importantly, to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.
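To make the consent point concrete, here is a toy Python sketch of what per-purpose consent might look like in an architecture built this way; the class and names are invented for illustration, not a description of any real system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRegistry:
    """One record per (user, purpose): consent is specific, not blanket."""
    records: dict = field(default_factory=dict)

    def grant(self, user: str, purpose: str) -> None:
        self.records[(user, purpose)] = datetime.now(timezone.utc)

    def withdraw(self, user: str, purpose: str) -> None:
        # Opting out should be as easy as opting in.
        self.records.pop((user, purpose), None)

    def may_process(self, user: str, purpose: str) -> bool:
        return (user, purpose) in self.records

registry = ConsentRegistry()
registry.grant("jane", "loyalty_scheme")
print(registry.may_process("jane", "ad_targeting"))    # False: no blanket consent
registry.withdraw("jane", "loyalty_scheme")
print(registry.may_process("jane", "loyalty_scheme"))  # False after opt-out
```

The design choice that matters is the key: consent attaches to a specific purpose, never to the platform as a whole.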

Hey Mr. Blockchain, the GDPR is Coming at You Fast


Last week I wrote about the upcoming GDPR and mentioned that it posed a potential risk for Blockchain-based technologies:

The more I think about it, the more I see the GDPR posing a problem for a Blockchain’s permanent, irreversible and inerasable ledger whenever any personal data (even when encrypted) is included in a node. Individuals will have the right to delete their data and be forgotten. If one of the values of Blockchain technology is that no one person or entity can modify a node, then the Blockchain will need to modify its architecture and governance to allow for such node modification. And if it is a public Blockchain with no centralized intermediation, then who is the data controller? And who will be able to delete your data upon your request and protect your rights? Will each miner become a data controller, potentially subject to fines?

Just now I read that Blockchain is on a collision course with the new GDPR, making the exact same point:

The bloc’s General Data Protection law, which will come into effect in a few months’ time, says people must be able to demand that their personal data is rectified or deleted under many circumstances. A blockchain is essentially a growing, shared record of past activity that’s distributed across many computers, and the whole point is that this chain of transactions (or other fragments of information) is in practice unchangeable – this is what ensures the reliability of the information stored in the blockchain.

For blockchain projects that involve the storage of personal data, these two facts do not mix well. And with sanctions for flouting the GDPR including fines of up to €20 million or 4 percent of global revenues, many businesses may find the ultra-buzzy blockchain trend a lot less palatable than they first thought.

€20 million is a great incentive for technologists to find creative ways to keep personal data outside of their Blockchain aspirations. Start the brainstorming now!
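To start the brainstorming off, one frequently floated idea is to anchor only a fingerprint of the data on the chain and keep the personal data itself in an ordinary, erasable store. Below is a minimal Python sketch under those assumptions; the stores and function names are hypothetical, and whether a bare hash of personal data still counts as personal data under the GDPR is itself an open question:

```python
import hashlib
import json

chain = []   # stand-in for an append-only ledger: stores only fingerprints
vault = {}   # ordinary, erasable database: stores the personal data itself

def record_transaction(personal_data: dict) -> str:
    """Anchor a hash of the data on-chain; keep the data itself off-chain."""
    blob = json.dumps(personal_data, sort_keys=True)
    digest = hashlib.sha256(blob.encode("utf-8")).hexdigest()
    vault[digest] = blob    # erasable copy
    chain.append(digest)    # permanent, but contains no personal data as such
    return digest

def forget(digest: str) -> None:
    """Honor an erasure request: delete the off-chain data.
    The on-chain hash remains but no longer resolves to anything."""
    vault.pop(digest, None)

ref = record_transaction({"name": "Jane", "purchase": "1 Main St"})
forget(ref)
print(ref in vault)  # False: the personal data is gone
print(ref in chain)  # True: the ledger itself is untouched
```

The immutable record survives, but after an erasure request it points at nothing.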

The GDPR is Coming

A very American icon for a very European law

The GDPR is coming, the GDPR is coming. At a recent offsite leadership meeting I attended with the business unit I support, I was dubbed Mr. GDPR. They all knew it was coming and because I am their lawyer, I became their GDPR guy.

To be honest, I am no GDPR expert and certainly don’t want to become one. I have these really great privacy lawyers who sit next to me. They answer my questions but, more importantly, help steer our company in the right direction to make sure that data privacy is one of the key value propositions we offer our customers.

Because the GDPR is coming, it is worth saying something about it here. Today I read eMarketer’s Western European Digital Trends for 2018, which gave an excellent summary of how the new law will affect companies and consumers:

GDPR requires that any entity collecting or handling consumers’ personal data must know how and where those processes take place, what data is kept, where it is kept, where it goes if it is distributed further, and how data integrity is preserved at every point where that entity is responsible—and be prepared to divulge those details. The rules also require digital devices and browsers to make consumers aware that their data is about to be collected, and let users make a single decision about how their data can be gathered and handled, which all companies, websites and apps must adhere to. Individuals will be able to refuse any entity access to their personal data. Individuals will also be empowered to access, manage and delete their personal data held in digital databases. Firms failing to comply face a fine of €20 million ($22.1 million) or up to 4% of global revenues, whichever is greater.

In a December 2017 blog post, Jean-Michel Franco, senior director of product marketing at Talend, wrote that “the stakes go well beyond regulatory compliance. In this data-driven world, trust has become the new currency. Now that insights and innovations depend on big data, there’s no option but to have total control [over] your data, otherwise, your customer won’t buy in. … Most of the privacy rules that come with GDPR were already expressed in former regulations, but the principle of accountability makes it game-changing.”

This will likely pose a challenge to companies like Google and Facebook that want you to give it away when using their platforms. I mean give it all away: your photos, your posts, your instant messages, with very limited ability to opt out short of forgoing the entire platform. This is from an article last summer on how the GDPR will disrupt Google and Facebook:

Google and Facebook cannot confront their users with broad, non-specific, consent requests that cover the entire breadth of their activities. Data protection regulators across the EU have made clear what they expect:

“A purpose that is vague or general, such as for instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future research’ will – without further detail – usually not meet the criteria of being ‘specific’”.

A business cannot, for example, collect more data for a purpose than it needs and then retroactively ask to use those data for additional purposes.[4]

It will be necessary to ask for consent, or present an opt-out choice, at different times, and for different things. This creates varying levels of risk. We estimate these risks on the “GDPR scale”, shown below.

[Figure: the “GDPR scale”, ranging from zero to five]

The scale ranges from zero to five. Five, at the high end of the scale, describes the circumstances that many adtech companies that have no direct relationship with Internet users will find themselves in. They need to get the consent of the people whose data they rely on. But they have no channel of communication through which they can do so.

Four, next highest on the scale, refers to companies that have direct relationships with users, and can use this to ask for consent. However, users have little incentive to “opt-in” to being tracked for advertising. Whereas a user might opt-in to some form of profiling that comes with tangible benefits, such as a loyalty scheme, the same user might not be willing to opt-in to more extensive profiling that yields no benefit. The extensiveness of the profiling is important because, as the note at the bottom of this page shows, users will be aware of the uses of their data when consent is sought. Thus adtech tracking across the web might rank as four, but a loyalty scheme might rank as three on the GDPR scale.

A slightly more attractive prospect, from Google and Facebook’s perspective, is to inform a user about what they want to do with the personal data, and give the user a chance to “opt-out” beforehand.[5] This is two on the scale. This opt-out approach has the benefit – from the company’s perspective – that some users’ inaction may allow their data to be used. The GDPR permits the opt-out approach when the purposes that the companies want to use the data for are “compatible” with the original purpose for which personal data were shared by users. In addition to the opt-out notice, users also have to be told of their right to object at any time to the use of their data for direct marketing.

One on the scale refers to activities that currently involve the processing of personal data, but that do not need to do so. With modification, these activities could be put beyond the scope of the Regulation.

Activities at the zero end of the scale are outside the scope of the Regulation, because they use no personal data.

The more I think about it, the more I see the GDPR posing a problem for a Blockchain’s permanent, irreversible and inerasable ledger whenever any personal data (even when encrypted) is included in a node. Individuals will have the right to delete their data and be forgotten. If one of the values of Blockchain technology is that no one person or entity can modify a node, then the Blockchain will need to modify its architecture and governance to allow for such node modification. And if it is a public Blockchain with no centralized intermediation, then who is the data controller? And who will be able to delete your data upon your request and protect your rights? Will each miner become a data controller, potentially subject to fines?

Do We Want our IDs Verified on a Blockchain?

One of the use cases most commonly discussed today for Blockchain is identity verification or authentication. This could come in the form of storing bits of encrypted data on a Blockchain that would facilitate identifying individuals for any number of purposes, from buying groceries to making online purchases, validating a state-issued ID (like a passport or driver’s license), checking in at a hotel, passing security at an airport, or voting in an election.

The argument, as always with Blockchain, is that by having a distributed database of encrypted and validated entries, you are able to create trusted and secure transactions, avoid fraud, reduce errors, save money, and leave an indelible trace of activities.

Personally, I think that the Blockchain use case for identity verification is fantastic for voting, especially where we can quickly validate that a citizen is authorized to vote while avoiding revealing how she voted.

But what about other types of transactions? One area I am struggling with is whether consumers will be comfortable leaving immutable traces of their movements and activities on a Blockchain, even if their ID is revocable (meaning that the individual could change her passport, ID, or biometric). From a consumer-centric standpoint, one would think that a person would want to be able to remove, not just revoke, her biometric or public ID. Will consumers want the right to have their bad biometric selfies or other transactions “forgotten”?
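One commonly discussed workaround for the remove-versus-revoke problem is “crypto-shredding”: the ledger stores only ciphertext, the decryption key lives off-chain, and destroying the key renders the on-chain record permanently unreadable. Here is a minimal Python sketch under those assumptions, using the third-party cryptography package and invented names; whether regulators would accept key destruction as genuine erasure remains an open question:

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

chain = []  # stand-in for an append-only ledger: stores only ciphertext
keys = {}   # off-chain, erasable key store

def enroll(user_id: str, id_template: bytes) -> int:
    """Encrypt the ID data and append the ciphertext to the ledger."""
    key = Fernet.generate_key()
    keys[user_id] = key
    chain.append(Fernet(key).encrypt(id_template))
    return len(chain) - 1  # position of the immutable record

def verify(user_id: str, index: int, candidate: bytes) -> bool:
    """Match a presented ID against the on-chain record, if still readable."""
    key = keys.get(user_id)
    if key is None:
        return False  # key shredded: the record is permanently unreadable
    return Fernet(key).decrypt(chain[index]) == candidate

def shred(user_id: str) -> None:
    """'Remove' the ID by destroying the key, not by rewriting the chain."""
    keys.pop(user_id, None)

idx = enroll("jane", b"face-template-bytes")
print(verify("jane", idx, b"face-template-bytes"))  # True
shred("jane")
print(verify("jane", idx, b"face-template-bytes"))  # False: effectively forgotten
```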

Just because it can go into a Blockchain doesn’t automatically mean it should.

What do you think?

The Legal Implications are Not My First Concern

Whenever I look at a new product, business model or technology, the legal implications are never my first concern. I prefer to focus on whether there is a viable business model, whether we can actually deliver the product or service, and how end users will feel about it.

This short article lists the main legal implications of using Artificial Intelligence:

  • Personal Data
  • IP
  • Liability

To be honest, for those of us working with these issues every day, this article isn’t particularly informative. Whether we’re talking about AI, Blockchain, Biometrics or some other new service, I would argue that I am much less concerned about those issues than the article is, mainly because I work with very capable privacy and IP specialists and know that both of those issues can be addressed in the product’s design and contract drafting.

For privacy, what is most important is not so much the law, but that, if your product involves processing personal data, the end users’ interests are at the heart of the design (i.e., what is called privacy by design).

With regards to liability, we will have worked closely with the business to define our risk profile, factoring it into the business case and then reflecting that in the liability clauses. In other words, the liabilities and indemnities clauses will look pretty much the same as they do in any other IT supply agreement.

What I will be most concerned about is reputation. Will our service actually work? Will the end users whose data is being processed through our service feel comfortable with how their data is being used? Assuming we have leverage, we can draft circles around our contractual risk to protect our intellectual property, limit our liability in case of service failure, and define our privacy obligations. But what happens if our service doesn’t live up to expectations, or if users find it creepy? Will anyone want to contract with us in the future?

That’s reputation, pure and simple. And nothing you draft in a contract is going to save a bad reputation. So first figure out if you can deliver, put the end user at the center of the product architecture, get your business case in order, and then you can do the easy part which is to put together the contract.

Big Brother, Cars, Face Recognition and Riding Like the Wind

Since its inception the automobile has been a romantic figure in American popular culture and emblematic of the American way of life. In short, the automobile has been freedom incarnate. On our sixteenth birthdays, we Americans take the day off and go straight to the DMV to take our driver’s exam. With our newly minted licenses, we are set free from the bounds of our parents and their ever-watching eye. It is our first rite of passage.

As explained in an article in yesterday’s Washington Post, car companies can now store and mine lots of information about car owners, from how fast you drive to where and when you travel. That makes it much harder for you to use your wheels to be free. Your spouse or your parents may not know where you are going or where you have been, but your car company does. If you’re cheating, you had better hope your spouse doesn’t have a friend at your car company. And what if the police get a warrant to search your car company’s data on your vehicle’s performance? Forget contesting that speeding ticket in court. Who needs the Fifth Amendment when your car can incriminate you instead?

Am I overreacting? Maybe, but the U.S. Senate did just approve, with support from Democrats, the extension and expansion of Donald Trump’s ability to spy on U.S. citizens, and that includes reading your emails without a warrant. In fact, there were Democrats who said the matter didn’t even deserve to be debated. I would imagine that means mining data from our car companies as well.

Earlier this month, the Washington Post also reported on China’s intention to use facial recognition technology to keep a watchful eye on all citizens to help predict and combat crime. We should all be concerned about the government and private companies as Big Brother, but with facial recognition there is also the issue of accuracy.


False positives can mean that certain people are regularly stopped and potentially harassed by the police. Now imagine that the biometric engineers who set the algorithms all come from the same racial and ethnic groups; whether on purpose or not, their biases will be factored into the accuracy of the results. This will likely translate into minority groups taking the brunt of the false positives.

For artificial intelligence and machine learning to be effective, it needs to be accurate at least 80% of the time. When that happens, it will always be better than humans. But still, if we move to a system of Big Brother with ubiquitous cameras capturing our facial images 24/7 and the system is only 80% accurate, that leaves an unbearably high potential for abuse. Democracies are supposed to accept some criminals getting away with crime in exchange for the innocent not being locked up. It is the authoritarian regimes that place law and order above the protection of the innocent.
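To see why, run the back-of-the-envelope arithmetic. The figures below are invented for illustration, but they show how a 20% error rate across a large scanned population swamps the handful of genuine matches:

```python
# All figures below are invented for illustration.
population = 1_000_000   # faces scanned per day in one city (assumed)
wanted = 100             # genuine persons of interest among them (assumed)
accuracy = 0.80          # hit rate and true-negative rate, per the 80% figure

true_hits = wanted * accuracy                             # 80 real matches
false_positives = (population - wanted) * (1 - accuracy)  # ~200,000 innocents flagged

print(f"true hits: {true_hits:.0f}")
print(f"false positives: {false_positives:.0f}")
# Almost every flag points at an innocent person:
print(f"chance a flag is correct: {true_hits / (true_hits + false_positives):.2%}")
```

In other words, at that accuracy nearly every person the system flags is innocent, which is exactly the abuse problem described above.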

Between companies, governments and new technologies, the potential for opportunities, efficiencies and abuse is endless. It is a Brave New World.