Hey Mr. Blockchain, the GDPR is Coming at You Fast

[Image: an Etch A Sketch]

Last week I wrote about the upcoming GDPR and mentioned that it posed a potential risk for Blockchain-based technologies:

The more I think about it, the more I see the GDPR posing a problem for a Blockchain’s permanent, irreversible and inerasable ledger whenever any personal data (even when encrypted) is included in a node. Individuals will have the right to delete their data and be forgotten. If one of the values of Blockchain technology is that no one person or entity can modify a node, then the Blockchain will need to modify its architecture and governance to allow for such node modification. And if it is a public Blockchain with no centralized intermediation, then who is the data controller? And who will be able to delete your data upon your request and protect your rights? Will each miner become a data controller, potentially subject to fines?

Just now I read that Blockchain is on a collision course with the new GDPR, making exactly the same point:

The bloc’s General Data Protection law, which will come into effect in a few months’ time, says people must be able to demand that their personal data is rectified or deleted under many circumstances. A blockchain is essentially a growing, shared record of past activity that’s distributed across many computers, and the whole point is that this chain of transactions (or other fragments of information) is in practice unchangeable – this is what ensures the reliability of the information stored in the blockchain.

For blockchain projects that involve the storage of personal data, these two facts do not mix well. And with sanctions for flouting the GDPR including fines of up to €20 million or 4 percent of global revenues, many businesses may find the ultra-buzzy blockchain trend a lot less palatable than they first thought.

€20 million is a great incentive for technologists to find creative ways to keep personal data outside of their Blockchain aspirations. Start the brainstorming now!
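Here is one pattern technologists keep floating, sketched in Python with hypothetical names of my own (a brainstorm, not anyone's production design): keep the personal data in an ordinary, erasable store and anchor only a salted hash of it on the chain. Deleting the off-chain record, salt included, honors an erasure request while the ledger itself stays untouched.

```python
import hashlib
import json
import os

# Hypothetical off-chain store: personal data lives in an ordinary,
# erasable database; only an opaque fingerprint is anchored on-chain.
off_chain_store = {}  # record_id -> {"data": ..., "salt": ...} (deletable)
on_chain_ledger = []  # append-only list of hashes (stands in for the chain)

def register(record_id, personal_data):
    """Store personal data off-chain; anchor a salted hash on-chain."""
    salt = os.urandom(16).hex()
    off_chain_store[record_id] = {"data": personal_data, "salt": salt}
    payload = salt + json.dumps(personal_data, sort_keys=True)
    digest = hashlib.sha256(payload.encode()).hexdigest()
    on_chain_ledger.append(digest)  # the chain never sees the raw data
    return digest

def erase(record_id):
    """Honor a right-to-erasure request: delete the data *and* its salt.
    The on-chain hash survives, but without the salt it can no longer
    be linked back to the individual."""
    off_chain_store.pop(record_id, None)
```

Whether a leftover salted hash still counts as personal data under the GDPR is itself a live legal question, but at least this architecture keeps the deletable bits deletable.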


Challenges Blockchain May Face

We keep hearing that governments will kill cryptocurrencies or at least regulate them to death. They fear what they cannot control (or tax). I tend to keep a more open mind.

But according to my colleague Sara Pavan, Blockchain currently has other pressing challenges:

Scalability is a major concern: most existing blockchains (Bitcoin, Ethereum, etc.) have significant scalability challenges. For example, the Bitcoin network can only process about 7 [transactions] per second. In comparison, Amadeus processes 100,000 end-user transactions per second in peak times.

Transaction cost will be another issue to consider. Blockchains typically require a lot of computing resources given that data is held multiple times and there is significant cryptographic computation to be undertaken.

In systems like Bitcoin and Ethereum, this means there is often a prohibitive fee associated with each transaction, which can represent several percent of the value being exchanged, making them inappropriate for many use cases.

Finally, integration with existing systems will be another major hurdle. Today it is hard to make blockchain interoperable with existing IT systems.

If a hotel booking is made on a blockchain system, how will it integrate with a system that isn’t on blockchain?
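To make Sara's integration question concrete, here is a rough sketch (all names are hypothetical, not any real system's API) of the adapter layer such a hybrid setup would need: something that watches the chain for booking events and replays them into the legacy reservation system.

```python
from dataclasses import dataclass

@dataclass
class BookingEvent:
    """Shape of a hypothetical on-chain hotel booking record."""
    booking_id: str
    hotel_code: str
    check_in: str   # ISO date, e.g. "2018-05-25"
    guest_ref: str  # pseudonymous reference, ideally no personal data

class LegacyReservationSystem:
    """Stand-in for a conventional, non-blockchain reservation backend."""
    def __init__(self):
        self.reservations = {}

    def create_reservation(self, event):
        self.reservations[event.booking_id] = event

def sync_from_chain(events, legacy, cursor):
    """Replay on-chain booking events into the legacy system.
    Returns the new cursor position."""
    for event in events[cursor:]:
        if event.booking_id not in legacy.reservations:  # idempotent write
            legacy.create_reservation(event)
    return len(events)
```

The idempotency check is the whole game: dual-writing between an immutable ledger and a mutable backend invites double-bookings the moment the adapter crashes and replays.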

Even if Blockchain technology could eventually be faster and less expensive, one of Blockchain’s key raisons d’être – to squeeze out the middleman – may face its own challenge. From what Sara says, intermediation isn’t going anywhere and will continue to be central to the value chain in a Blockchain world. Besides needing someone to establish and evolve a Blockchain’s governance and drive critical mass, you will always need someone to build interfaces and applications, and to support, maintain and enhance them. It’s like the Internet. We love it, not because it has value in and of itself, but because of the Facebooks, Amazons, and millions of other intermediaries that deliver content, products and services to our screens (or, with Siri, Alexa and the like, to our ears).

And what happens when Blockchain’s other key selling point of secure and trusted transactions is blown to smithereens by faster encryption-busting technologies? What if what we are hearing about quantum computing is true?

. . . the advent of quantum computing will jeopardize the security of all existing cryptographic encryption methods, including RSA tokens. Quantum computers will affect the security of the entire finance and banking industry, not just the blockchain.

Even so, I’m surprised that security is not a more common conversation throughout the blockchain community. For a group deeply rooted in futurism, this seems shockingly shortsighted. It feels as if we’re building the blockchain for the next 50 years, but what if we only get to the next five or 10? What can be done to ensure that blockchain is dynamic enough to outlive quantum computing?
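One answer that circulates in that conversation is “crypto-agility”: don’t hard-wire a signature algorithm into the protocol, but make it a versioned, pluggable parameter so a post-quantum scheme can be phased in later. A rough Python sketch of the idea, with names of my own invention:

```python
from typing import Protocol

class SignatureScheme(Protocol):
    """The chain's validation rules reference this abstraction,
    never a concrete algorithm baked into the protocol."""
    def verify(self, public_key: bytes, message: bytes, sig: bytes) -> bool: ...

# Registry keyed by a scheme identifier carried in each transaction,
# so ECDSA and a post-quantum scheme can coexist during a migration window.
SCHEMES = {}  # scheme_id -> SignatureScheme

def verify_transaction(tx):
    scheme = SCHEMES.get(tx["scheme_id"])
    if scheme is None:
        return False  # unknown scheme: reject rather than guess
    return scheme.verify(tx["public_key"], tx["payload"], tx["signature"])
```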

I tend to be an optimistic skeptic, meaning that I don’t trust the hype but do trust that there is always a solution to every problem. As a lawyer, though, I know the solution is usually an imperfect, negotiated one.

The GDPR is Coming

[Image: Paul Revere’s ride – a very American icon for a very European law]

The GDPR is coming, the GDPR is coming. At a recent offsite leadership meeting I attended with the business unit I support, I was dubbed Mr. GDPR. They all knew it was coming, and because I am their lawyer, I became their GDPR guy.

To be honest, I am no GDPR expert and certainly don’t want to become one. I have these really great privacy lawyers who sit next to me. They answer my questions but, more importantly, help steer our company in the right direction to make sure that data privacy is one of the key value propositions we offer our customers.

Because the GDPR is coming, it is worth saying something about it here. Today I read eMarketer’s Western European Digital Trends for 2018, which gave an excellent summary of how the new law will affect companies and consumers:

GDPR requires that any entity collecting or handling consumers’ personal data must know how and where those processes take place, what data is kept, where it is kept, where it goes if it is distributed further, and how data integrity is preserved at every point where that entity is responsible—and be prepared to divulge those details. The rules also require digital devices and browsers to make consumers aware that their data is about to be collected, and let users make a single decision about how their data can be gathered and handled, which all companies, websites and apps must adhere to. Individuals will be able to refuse any entity access to their personal data. Individuals will also be empowered to access, manage and delete their personal data held in digital databases. Firms failing to comply face a fine of €20 million ($22.1 million) or up to 4% of global revenues, whichever is greater.
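Read as an engineering requirement, that paragraph describes a data inventory plus purpose-specific consent records. Here is a toy sketch of the bookkeeping it implies; the field names are mine, not the regulation’s:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConsentRecord:
    purpose: str        # must be specific: "booking confirmations",
                        # not "improving user experience"
    granted: bool
    timestamp: datetime

@dataclass
class DataSubjectLedger:
    subject_id: str
    data_locations: list = field(default_factory=list)         # where the data lives
    downstream_recipients: list = field(default_factory=list)  # who else received it
    consents: list = field(default_factory=list)

    def may_process(self, purpose):
        """Allow processing only under a current, specific consent."""
        matching = [c for c in self.consents if c.purpose == purpose]
        return bool(matching) and max(matching, key=lambda c: c.timestamp).granted

    def erasure_targets(self):
        """Right to erasure: every location and recipient that must be
        purged or notified -- you cannot delete what you cannot find."""
        return self.data_locations + self.downstream_recipients
```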

In a December 2017 blog post, Jean-Michel Franco, senior director of product marketing at Talend, wrote that “the stakes go well beyond regulatory compliance. In this data-driven world, trust has become the new currency. Now that insights and innovations depend on big data, there’s no option but to have total control [over] your data, otherwise, your customer won’t buy in. … Most of the privacy rules that come with GDPR were already expressed in former regulations, but the principle of accountability makes it game-changing.”

This will likely pose a challenge to companies like Google and Facebook, which want you to give it away when using their platforms. I mean give it all away: your photos, your posts, your instant messages, with very limited ability to opt out without forgoing use of the entire platform. This is from an article from last summer on how the GDPR will disrupt Google and Facebook:

Google and Facebook cannot confront their users with broad, non-specific, consent requests that cover the entire breadth of their activities. Data protection regulators across the EU have made clear what they expect:

“A purpose that is vague or general, such as for instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future research’ will – without further detail – usually not meet the criteria of being ‘specific’”.

A business cannot, for example, collect more data for a purpose than it needs and then retroactively ask to use those data for additional purposes.[4]

It will be necessary to ask for consent, or present an opt-out choice, at different times, and for different things. This creates varying levels of risk. We estimate these risks on the “GDPR scale”, shown below.

[Figure: the GDPR scale]

The scale ranges from zero to five. Five, at the high end of the scale, describes the circumstances that many adtech companies that have no direct relationship with Internet users will find themselves in. They need to get the consent of the people whose data they rely on. But they have no channel of communication through which they can do so.

Four, next highest on the scale, refers to companies that have direct relationships with users, and can use this to ask for consent. However, users have little incentive to “opt-in” to being tracked for advertising. Whereas a user might opt-in to some form of profiling that comes with tangible benefits, such as a loyalty scheme, the same user might not be willing to opt-in to more extensive profiling that yields no benefit. The extensiveness of the profiling is important because, as the note at the bottom of this page shows, users will be aware of the uses of their data when consent is sought. Thus adtech tracking across the web might rank as four, but a loyalty scheme might rank as three on the GDPR scale.

A slightly more attractive prospect, from Google and Facebook’s perspective, is to inform a user about what they want to do with the personal data, and give the user a chance to “opt-out” beforehand.[5] This is two on the scale. This opt-out approach has the benefit – from the company’s perspective – that some users’ inaction may allow their data to be used. The GDPR permits the opt-out approach when the purposes that the companies want to use the data for are “compatible” with the original purpose for which personal data were shared by users. In addition to the opt-out notice, users also have to be told of their right to object at any time to the use of their data for direct marketing.

One on the scale refers to activities that currently involve the processing of personal data, but that do not need to do so. With modification, these activities could be put beyond the scope of the Regulation.

Activities at the zero end of the scale are outside the scope of the Regulation, because they use no personal data.
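For what it’s worth, the quoted scale reads almost like an enumeration, so here it is restated as one; this is my paraphrase of the article’s taxonomy, not an official classification:

```python
from enum import IntEnum

class GDPRScale(IntEnum):
    """The article's zero-to-five risk scale, restated for reference."""
    NO_PERSONAL_DATA = 0           # outside the Regulation entirely
    AVOIDABLE_PERSONAL_DATA = 1    # could be redesigned out of scope
    OPT_OUT_COMPATIBLE_USE = 2     # opt-out allowed for "compatible" purposes
    OPT_IN_WITH_BENEFIT = 3        # e.g. a loyalty scheme users may accept
    OPT_IN_NO_BENEFIT = 4          # extensive tracking users won't opt in to
    CONSENT_NEEDED_NO_CHANNEL = 5  # adtech with no way to even ask
```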

The more I think about it, the more I see the GDPR posing a problem for a Blockchain’s permanent, irreversible and inerasable ledger whenever any personal data (even when encrypted) is included in a node. Individuals will have the right to delete their data and be forgotten. If one of the values of Blockchain technology is that no one person or entity can modify a node, then the Blockchain will need to modify its architecture and governance to allow for such node modification. And if it is a public Blockchain with no centralized intermediation, then who is the data controller? And who will be able to delete your data upon your request and protect your rights? Will each miner become a data controller, potentially subject to fines?

Do We Want our IDs Verified on a Blockchain?

One of the use cases most commonly discussed today for Blockchain is identity verification or authentication. This could come in the form of storing bits of encrypted data on a Blockchain that would facilitate identifying individuals for any number of purposes, from buying groceries to making online purchases, validating a state-issued ID (like a passport or driver’s license), checking in at a hotel, passing security at an airport, or voting in an election.

The argument, as always with Blockchain, is that by having a distributed database of encrypted and validated entries, you are able to create trusted and secure transactions, avoid fraud, reduce errors, save money, and leave an indelible trace of activities.

Personally, I think that the Blockchain use case for identity verification is fantastic for voting, especially where we can quickly validate that a citizen is authorized to vote without revealing how she voted.

But what about other types of transactions? One area I am struggling with is whether consumers will be comfortable leaving immutable traces of their movements and activities on a Blockchain, even if their ID is revocable (meaning that the individual could change her passport, ID, or biometric). From a consumer-centric standpoint, one would think that a person would want to be able to remove, not just revoke, her biometric or public ID. Will consumers want the right to have their bad biometric selfies or other transactions “forgotten”?
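To make that revocable-versus-removable distinction concrete, here is a sketch of the commitment design usually proposed (my own illustration, not any particular product): the chain stores only a salted hash of the credential, verification recomputes it, and “revocation” means destroying the old salt.

```python
import hashlib
import os

def commit(credential: bytes):
    """Create an on-chain commitment to an ID credential.
    The salt stays off-chain with the user or issuer."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + credential).hexdigest()
    return digest, salt  # digest goes on-chain; the salt does not

def verify(credential: bytes, salt: bytes, on_chain_digest: str) -> bool:
    """A verifier (hotel desk, airport gate) checks a presented
    credential against the immutable on-chain commitment."""
    return hashlib.sha256(salt + credential).hexdigest() == on_chain_digest

# "Revoking" the ID means issuing a fresh commitment and destroying the
# old salt. The old digest sits on the chain forever: without its salt
# it is just opaque bytes -- revocable, but never truly removable.
```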

Just because it can go into a Blockchain doesn’t automatically mean it should.

What do you think?

The Legal Implications are Not My First Concern

Whenever I look at a new product, business model or technology, the legal implications are never my first concern. I prefer to focus on whether there is a viable business model, whether we can actually deliver the product or service, and how end users will feel about it.

This short article lists the main legal implications of using Artificial Intelligence:

  • Personal Data
  • IP
  • Liability

To be honest, for those of us working with these issues every day, this article isn’t particularly informative. Whether we’re talking about AI, Blockchain, Biometrics or some other new service, I am much less concerned about these issues than the article suggests, mainly because I work with very capable privacy and IP specialists and know that all of them can be addressed in the product’s design and in the contract drafting.

For privacy, what matters most is not so much the law but that, if your product involves processing personal data, the end users’ interests are at the heart of the design (ie, what is called privacy by design).

With regard to liability, we will have worked closely with the business to define our risk profile, factoring it into the business case and then reflecting it in the liability clauses. In other words, the liabilities and indemnities clauses will look pretty much the same as they do in any other IT supply agreement.

What I will be most concerned about is reputation. Will our service actually work? Will end users whose data is processed through our service feel comfortable with how it is used? Assuming we have leverage, we can draft circles around our contractual risk to protect our intellectual property, limit our liability in case of service failure, and define our privacy obligations. But what happens if our service doesn’t live up to expectations, or if users find it creepy? Will anyone want to contract with us in the future?

That’s reputation, pure and simple. And nothing you draft in a contract is going to save a bad reputation. So first figure out if you can deliver, put the end user at the center of the product architecture, get your business case in order, and then you can do the easy part, which is putting together the contract.