Solutions Not Problems

I have written before about how effective lawyers focus on providing solutions, as opposed to acting like a mere traffic light that says stop or go. A few months back, I was asked to talk about an experience that had an impact on my career development. I told the following story about my first day at a tech start-up more than a decade ago:

On my first day at the start-up where I previously worked, the CEO and founder, who had literally made hundreds of millions selling businesses he had founded, proposed a certain marketing campaign. I sent him a lengthy email with a very well-reasoned description of why his idea was misguided. He called me into his office and said, “Eric, I know you are very smart, but I didn’t hire you to tell me that my ideas are bad. Either you make them better, come up with a better idea, or don’t come back to work tomorrow.” I was very embarrassed because I knew he was right. So often we lawyers get caught up in telling our clients what they shouldn’t do, when we need to focus on providing solutions. Whenever I find myself becoming a naysayer, I remember that conversation and what my role is supposed to be.

Here is a good checklist of things legal teams can do to avoid becoming the “Department of No”. Nevertheless, finding solutions is easier said than done.

When I wrote back in April about what it takes to be an effective in-house lawyer, one of my favorite business-side colleagues sent me a message asking me to put my money where my mouth was on a particular transaction we were working on. My team had reviewed a prospective customer contract and identified dozens of potentially unacceptable terms, and my internal clients were not impressed. In that instance, I had to explain that we weren’t trying to “kill” the deal, just describing the ugly commercial conditions hidden in a contract that they likely had not read yet. Contracts aren’t really about the law. They are mainly about the business, so often lawyers are simply informing our clients of the inconvenient reality of what the terms of the deal are.

But point taken. Lawyers shouldn’t aspire to be mere messengers and contract readers. We need to find ways to make a deal better than the one sitting on our desks. In the last two weeks, I have had to remind myself of that story from my start-up days to make sure I was part of the solution, not the problem. Just always keep in mind that if you ignore the problems, then you are not providing a solution either.

Five Things Companies Can Do

Earlier this week I wrote a long-winded post describing steps companies can take – in light of recent concerns about companies misusing personal data – to make sure their technologies are offering us all something of value.

Here are the five things, in abbreviated form, that companies can start doing now:

  1. Privacy by Design (and security by design): Put the end user at the center of your technology’s architecture, minimize the amount of personal data you will need to provide the service, give the end-user control, and be transparent. If you concentrate on what the end user will be comfortable with and empower her with control over her data, then you are on the right track.
  2. Value Proposition: Make privacy protections and good practice a central point of differentiation. Make it core to your overall value proposition.
  3. Business Model: Re-think the business model. Propose different fee structures or revenue sharing options that give end users more control and something of value in return for handing over their data.
  4. Product Ethics: Before thinking about the legality of a new product or service, focus on it from an ethical viewpoint. Consider a product ethics committee, including bringing in an ethicist. Look not just at data use but also at the potential for a product or service to be misused (even if hacked) with results that are contrary to the company’s values. Remember, the last thing you want is for your CEO to have to sit in front of lawmakers struggling to explain why your service was linked to a major human rights violation, political scandal, or massive leak of sensitive personal data.
  5. Data Use as a Corporate Social Responsibility: Make data use and innovation part of your company’s CSR policies where you commit to (i) not use the personal data and technology at your disposal in a way that has a negative effect on your community and stakeholders, and (ii) affirmatively use technology and innovation for the good of your community and stakeholders.

Putting it all together, the most important thing a company can do is take the time to have open, internal conversations about the effects that its products and services may have on users and society. That way, senior management can make informed decisions in line with the company’s core values and identity. Lawyers don’t like surprises, and neither do their clients.

I’m going to stop whining. Here are a few simple things companies can do

I just saw this interview with Tim Cook where he says that privacy is a fundamental right and core to the American identity. With all of my recent ranting and raving about private companies and dystopia, I could either start proposing solutions or get really cynical about the Apple CEO’s words. Maybe Apple is desperate to distance itself from Facebook’s recent scandals, including news that phone manufacturers reached secret deals with the social media giant to access user data.

Of course, there is also the stark contrast between the FCC now permitting American ISPs to sell user data to third parties and the fact that the new European data law (the infamous GDPR), not Americans’ passion for privacy rights, is the primary catalyst for the current public conversation around privacy rights in the digital era.

But instead of complaining, maybe I should look at what Apple is saying as a teachable moment. Apple likely sees a major marketing opportunity to remind its customers (and promote the fact) that it is not in the data-sharing business and that monetizing its customers’ data is contrary to its core values. At a time when companies (for example, U.S. ISPs) are licking their chops at the chance to rake in big bucks by reselling their paying customers’ data, there is a huge niche – analogous to the organic/bio foods business – for privacy-friendly products and technologies.

So, taking advantage of this potentially positive turn of events, I should walk the walk and do what I keep saying lawyers ought to do. Propose solutions! So here it goes:

I encourage lawyers in private practice to continue to play a pivotal and activist role in bringing claims against, and putting pressure on, companies and governments that misuse our personal data or infringe on our rights. Holding them accountable does not happen by osmosis. You need lawyers leading as change agents (think of the essential role of lawyers in the Civil Rights Movement).

Now, as in-house counsel at a tech company, I would suggest leaving your beret and Che Guevara t-shirt at home. Instead, concentrate on how your company can build trust with its customers so that they will feel comfortable using your services and, in turn, you will feel comfortable providing them with those services. Here are some basic things, some of which I have mentioned before, that in-house lawyers can propose within their companies:

Privacy by Design (and security by design): Put the end user at the center of your technology’s architecture. Try to minimize the amount of personal data you need to provide the service, limit what you share with third parties to only what needs to be shared, and give the end user the ability to opt out of features that share more detailed personal data. Be transparent. If you concentrate on what the end user will be comfortable with and empower her with control over her data, then you are on the right track.
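
To make the data-minimization and opt-out points a bit more concrete, here is a minimal sketch in Python of what “collect only what you need, share only what the user has opted into” could look like in a product’s data model. It is purely illustrative: the field names and the ConsentSettings structure are my own assumptions, not a reference to any particular product or legal requirement.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentSettings:
    # Every data-sharing feature defaults to off; the user must opt in.
    share_coarse_location: bool = False
    personalized_ads: bool = False

@dataclass
class UserProfile:
    # Collect only the fields the service actually needs to function.
    user_id: str
    email: str
    consent: ConsentSettings = field(default_factory=ConsentSettings)

def data_shared_with_partners(profile: UserProfile) -> dict:
    """Return only what this user has explicitly agreed to share."""
    shared = {}
    if profile.consent.share_coarse_location:
        # Minimize precision: city-level location, never exact coordinates.
        shared["location"] = "city-level"
    if profile.consent.personalized_ads:
        shared["ad_profile"] = {"interests": []}  # built only from opted-in signals
    return shared
```

The design choice that matters here is the direction of the defaults: sharing happens only when a consent flag is affirmatively switched on, and anything not listed in the profile simply is not collected.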

Value Proposition: As the tech giants, especially ones where consumers have little bargaining power (like ISPs, Facebook, and Google), demand more access to their users’ data, companies can use privacy protections as a strong differentiating value proposition. As mentioned above, the market opportunity is huge. For example, if European consumers are not comfortable with the ease with which their data may be swept up and monitored by American spy agencies (as revealed by Edward Snowden), why not offer European-based services that guarantee greater freedom from the intrusion of a foreign government? As with Apple, if you sell sleek, cool, and lifestyle, the minute your customers perceive that you are no longer any of those things – and by the way, selling customer data is creepy, not cool – then it’s game over.

Business Model: Re-think the business model. Propose different fee structures or revenue sharing options that give end users more control and something of value in return for handing over their data. For example, offer customers discounted fees (like Amazon does with the Kindle) if they allow the company to monetize their data. Alternatively, how about giving a piece of the revenue to the customer when the company makes money off her data? I worked for a WiFi-sharing start-up where, keeping true to the company’s value of sharing, we shared revenues with users who shared back. If my ISP is making money off my data, then why not demand something in return?
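
Purely to illustrate the kind of revenue-sharing arithmetic I have in mind, here is a tiny sketch; the 30% share rate and the dollar figures are invented numbers, not terms from any real company (or from my former employer).

```python
def user_revenue_share(monthly_data_revenue: float, share_rate: float = 0.30) -> float:
    """Credit a slice of the data-derived revenue back to the user.

    Both the revenue figure and the 30% share rate are hypothetical.
    """
    return round(monthly_data_revenue * share_rate, 2)

# Example: if the company earns $4.00 in a month monetizing one user's data,
# that user gets a $1.20 credit toward her bill.
print(user_revenue_share(4.00))  # 1.2
```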

Product Ethics: Before thinking about the legality of a new product or service, focus on it from an ethical viewpoint. Many companies now have data governance committees, but consider a broader product ethics committee made up of a cross-section of the company. Look not just at data use but also at the potential for a product or service to be misused (even if hacked) with results that are contrary to the company’s values. If you build products that resell or rely on processing large volumes of personal data, put an ethicist on staff. Remember, no matter who your CEO is or how much of a celebrity he may be, the last thing you want is for him to have to sit in front of lawmakers struggling to explain why your service was linked to a major human rights violation, political scandal, or massive leak of sensitive personal data.

Data Use as a Corporate Social Responsibility: Include data use and innovation in your company’s CSR policies. Call it your Innovation for Good Policy where you commit to (i) not use the personal data and technology at your disposal in a way that has a negative effect on your community and stakeholders, and (ii) affirmatively use technology and innovation for the good of your community and stakeholders. For example, at my current company, Amadeus, I am very proud to have been involved in two CSR initiatives with UNICEF where we used technology and aggregated travel data to help suffering children and to predict the spread of diseases like Ebola and Zika.

Putting it all together, the most important thing a company can do is take the time to have open, internal conversations about the effects that its products and services may have on users and society. That way, senior management can make informed decisions in line with the company’s core values and identity. Lawyers don’t like surprises, and neither do their clients.

Brave New World, Inc.

Earlier this week, Rana Foroohar wrote in the Financial Times that “Companies are the cops in our modern-day dystopia”:

The mass surveillance and technology depicted in the [2002 movie Minority Report] — location-based personalised advertising, facial recognition, newspapers that updated themselves — are ubiquitous today. The only thing director Steven Spielberg got wrong was the need for psychics. Instead, law enforcement can turn to data and technologies provided by companies like Google, Facebook, Amazon and intelligence group Palantir.

The dystopian perspective on these capabilities is worth remembering at a time when the private sector is being pulled ever more deeply into the business of crime fighting and intelligence gathering. Last week, the American Civil Liberties Union and several other rights groups called on Amazon to stop selling its Orwellian-sounding Rekognition image processing system to law enforcement officials, saying it was “primed for abuse in the hands of government”.

I have written a few posts already about the potential for governments and private companies to use new technologies such as cryptocurrencies, biometrics and data mining to engage in activities that we would normally associate with the fictional totalitarian regimes of George Orwell or Aldous Huxley. With regards to state actors, like China, using biometrics for crime prevention, I wrote:

But still, if we move to a system of Big Brother with ubiquitous cameras capturing our facial images 24/7 and the system is only 80% accurate, that arguably creates an unbearably high potential for abuse. Democracies are supposed to accept some criminals getting away with crime in exchange for the innocent not being locked up. It’s the authoritarian regimes who place law and order above the protection of the innocent.

Between companies, governments and new technologies, the potential for opportunities, efficiencies and abuse is endless. It is a Brave New World.
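
To spell out why an “80% accurate” dragnet is so troubling, here is a back-of-the-envelope calculation with assumed numbers: one genuine suspect per 1,000 people scanned, and “80% accurate” read loosely as an 80% true-positive rate with a 20% false-positive rate. The inputs are illustrative, not real statistics.

```python
population = 1_000_000
prevalence = 1 / 1_000          # assumed: 1 in 1,000 people scanned is a genuine suspect
true_positive_rate = 0.80       # assumed: suspects correctly flagged 80% of the time
false_positive_rate = 0.20      # assumed: innocents wrongly flagged 20% of the time

suspects = population * prevalence               # 1,000
innocents = population - suspects                # 999,000

flagged_suspects = suspects * true_positive_rate       # 800
flagged_innocents = innocents * false_positive_rate    # 199,800

share_innocent = flagged_innocents / (flagged_suspects + flagged_innocents)
print(f"{share_innocent:.1%} of the people flagged would be innocent")  # ~99.6%
```

Under those assumed numbers, roughly 99.6% of the people the system flags would be innocent, which is exactly the trade-off a democracy is not supposed to accept.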

And with regards to cryptocurrencies, I wrote:

Although neither George Orwell’s nor Aldous Huxley’s dystopian futures predicted a world governed by corporations as opposed to authoritarian governments, it may be more plausible to imagine a world where corporations control the money supply, not with coins and bills but with cryptocurrencies. In fact, the fad amongst many technologists today is to encourage the disintermediation (or deregulation) of money by moving to Blockchain-based cryptocurrencies like Bitcoin. But instead of removing the middleman, we are more likely, contrary to the idealists’ ambitions, to open the door for big tech companies like Amazon, Facebook and Google to tokenize their platforms, replacing one currency regulator with corporate ones.

But private companies are able to do so much more with the data that we so generously (and often naively) hand them. The possibilities for abuse seem endless. To a large degree, the new GDPR mitigates this risk by giving the consumer visibility into and control over how her data is being used, and hopefully by building trust between consumers and their service providers. As stated here before, more important than complying with strict new laws, “to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.”

But what happens if consumers are not given the opportunity to intelligently grant consent or agree to use a service that shares their data? The first GDPR complaints have been filed precisely on these grounds:

Across four complaints, related to Facebook, Instagram, WhatsApp and Google’s Android operating system, European consumer rights organisation Noyb argues that the companies have forced users into agreeing to new terms of service, in breach of the requirement in the law that such consent should be freely given.
