Brave New World, Inc.

Minority Report

Earlier this week, Rana Foroohar wrote in the Financial Times that “Companies are the cops in our modern-day dystopia”:

The mass surveillance and technology depicted in the [2002 movie Minority Report] — location-based personalised advertising, facial recognition, newspapers that updated themselves — are ubiquitous today. The only thing director Steven Spielberg got wrong was the need for psychics. Instead, law enforcement can turn to data and technologies provided by companies like Google, Facebook, Amazon and intelligence group Palantir.

The dystopian perspective on these capabilities is worth remembering at a time when the private sector is being pulled ever more deeply into the business of crime fighting and intelligence gathering. Last week, the American Civil Liberties Union and several other rights groups called on Amazon to stop selling its Orwellian-sounding Rekognition image processing system to law enforcement officials, saying it was “primed for abuse in the hands of government”.

I have written a few posts already about the potential for governments and private companies to use new technologies such as cryptocurrencies, biometrics and data mining to engage in activities that we would normally associate with the fictional totalitarian regimes of George Orwell or Aldous Huxley. With regard to state actors like China using biometrics for crime prevention, I wrote:

But still, if we move to a system of Big Brother with ubiquitous cameras capturing our facial images 24/7 and the system is only 80% accurate, that arguably creates an unbearably high potential for abuse. Democracies are supposed to accept some criminals getting away with crime in exchange for the innocent not being locked up. It’s the authoritarian regimes that place law and order above the protection of the innocent.

Between companies, governments and new technologies, the potential for opportunities, efficiencies and abuse is endless. It is a Brave New World.

And with regards to cryptocurrencies, I wrote:

Although neither George Orwell’s nor Aldous Huxley’s dystopian futures predicted a world governed by corporations as opposed to authoritarian governments, it may be more plausible to imagine a world where corporations control the money supply, not with coins and bills but with cryptocurrencies. In fact, the fad amongst many technologists today is to encourage the disintermediation (or deregulation) of money by moving to blockchain-based cryptocurrencies like Bitcoin. But instead of removing the middleman, we are more likely, contrary to the idealists’ ambitions, to open the door for big tech companies like Amazon, Facebook and Google to tokenize their platforms, replacing one currency regulator with corporate ones.

But private companies are able to do so much more with the data that we so generously (and often naively) hand them. The possibilities for abuse seem endless. To a large degree, the new GDPR mitigates this risk by giving the consumer visibility into and control over how her data is being used, and, hopefully, by building trust between consumers and their service providers. As stated here before, more important than complying with strict new laws, “to be commercially viable, these technologies need to gain consumers’ confidence and trust. Otherwise consumers will not be comfortable sharing their data and will simply not use the service.”

But what happens if consumers are not given the opportunity to intelligently grant consent or agree to use a service that shares their data? The first GDPR complaints have been filed precisely on these grounds:

Across four complaints, related to Facebook, Instagram, WhatsApp and Google’s Android operating system, European consumer rights organisation Noyb argues that the companies have forced users into agreeing to new terms of service, in breach of the requirement in the law that such consent should be freely given.

Big Brother, Cars, Face Recognition and Riding Like the Wind

Since its inception, the automobile has always been a romantic figure in American popular culture and emblematic of the American way of life. In short, the automobile has been freedom incarnate. On our sixteenth birthdays, we Americans take the day off and go straight to the DMV to take our driver’s exam. With our newly minted licenses, we are set free from the bounds of our parents and their ever-watching eye. It is our first rite of passage.

As explained in an article in yesterday’s Washington Post, car companies can now store and mine lots of information about car owners, from how fast you are driving to where and when you are traveling. That makes it much harder for you to use your wheels to be free. Your spouse or your parents may not know where you are going or where you have been, but your car company does. If you’re cheating, you better hope your spouse doesn’t have a friend at your car company. And what if the police get a warrant to search your car company’s data on your vehicle’s performance? Forget contesting that speeding ticket in court. Who needs the Fifth Amendment when your car can incriminate you instead?

Am I overreacting? Maybe, but the U.S. Senate did just approve, with support from Democrats, the extension and expansion of Donald Trump’s ability to spy on U.S. citizens, including reading your emails without a warrant. In fact, some Democrats said the matter didn’t even deserve to be debated. I would imagine that means mining data from our car companies as well.

Earlier this month, the Washington Post also reported on China’s intention to use facial recognition technology to keep a watchful eye on all citizens to help predict and combat crime. We should all be concerned about the government and private companies acting as Big Brother, but with facial recognition there is also the issue of accuracy.

False positives can mean that certain people are regularly stopped and potentially harassed by the police. Now imagine that the biometric engineers who build the algorithms all come from the same racial and ethnic groups; whether intentionally or not, their biases will be baked into the accuracy of the results. This will likely translate into minority groups taking the brunt of the false positives. For artificial intelligence and machine learning to be effective, they need to be accurate at least 80% of the time, and when that happens they will arguably outperform humans. But still, if we move to a system of Big Brother with ubiquitous cameras capturing our facial images 24/7 and the system is only 80% accurate, that arguably creates an unbearably high potential for abuse. Democracies are supposed to accept some criminals getting away with crime in exchange for the innocent not being locked up. It’s the authoritarian regimes that place law and order above the protection of the innocent.
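
To make concrete why an “80% accurate” system scanning everyone is so dangerous, here is a minimal back-of-the-envelope sketch in Python. The population size, number of genuine suspects and error rates are purely hypothetical assumptions chosen for illustration (treating “80% accurate” loosely as an 80% true-positive rate and a 20% false-positive rate); the point is the base-rate arithmetic, not the specific figures.

```python
# Hypothetical illustration of the false-positive problem described above.
# None of these numbers are real statistics; they are assumptions for the sketch.

population = 10_000_000      # hypothetical city scanned by ubiquitous cameras
actual_suspects = 1_000      # hypothetical number of people the police truly seek
true_positive_rate = 0.80    # chance a real suspect is correctly flagged
false_positive_rate = 0.20   # chance an innocent person is wrongly flagged

innocents = population - actual_suspects

true_positives = actual_suspects * true_positive_rate    # suspects caught
false_positives = innocents * false_positive_rate        # innocents flagged

# Of everyone the system flags, what fraction is actually a suspect?
precision = true_positives / (true_positives + false_positives)

print(f"Suspects correctly flagged:      {true_positives:,.0f}")
print(f"Innocent people wrongly flagged: {false_positives:,.0f}")
print(f"Chance a flag is a real suspect: {precision:.2%}")
```

With these made-up numbers, roughly two million innocent people get flagged and fewer than one flag in two thousand points at a real suspect, which is the “unbearably high potential for abuse” in arithmetic form.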

Between companies, governments and new technologies, the potential for opportunities, efficiencies and abuse is endless. It is a Brave New World.