Tom Upchurch, Wired:
One person unsurprised by the unfolding data scandals surrounding Cambridge Analytica and Facebook is Cathy O’Neil. In 2016 Cathy published her book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. In the book O’Neil reveals how a silent bureaucracy governed by algorithms and big data is emerging across every corner of society.
This new bureaucracy is increasingly deciding who gets a job, who gets credit (and at what rate), who goes to prison and what information people read. Some of these systems may be making accurate decisions. However, Cathy argues that accuracy and efficiency alone are not sufficient metrics for success. Fairness, equity and other social considerations need to be built into the algorithms. Until this is done, these secretive and unaudited algorithms will continue to make unfair, biased and discriminatory decisions on a systemic level.
First, I just want to make it clear that a Hippocratic oath alone is insufficient for the task that lies ahead, because at the end of the day data scientists are not corporations. They work within corporations, and they get fired if they don’t do what the corporations tell them to do. So, I don’t want to make it seem like once this ethical framework has been set up we’re going to be good, because it’s just not true.
Having said that, so many of the data scientists who are working right now think of themselves as technicians and believe they can blithely follow textbook definitions of optimisation, without considering the wider consequences of their work. So, when they choose to optimise some kind of ratio of false positives to false negatives, for example, they are not required by their bosses or their educational history to actually work out what that will mean for the people affected by the algorithms they're optimising, which means that they don't really have any direct connection to the worldly consequences of their work.
…I think the biggest thrust of a Hippocratic oath would be to realise that we have the ability and the potential to exert an enormous amount of influence on society, but not the wisdom to understand the true impact of this influence.
…we’re still focusing on the wrong things. We’re not measuring the actual damage being done to democracy, because it is hard to measure democracy. Instead, we’ll focus on things like pedestrian deaths with self-driving cars. That is of course a tragedy when it happens, but it is not actually the biggest, systemic problem. The biggest problem is all the invisible failures of algorithms that we’re not keeping track of at all.
I witnessed the pathetic response to the illegal goings-on of the banks before, during and even after the financial crisis, and I saw the numerous settlements where the penalty fees charged were less than the profits the banks made on their fraudulent dealing. I think it is fair to say that this is not a very strong incentive to keep them from doing it in the future.
So, I think the point is that you have to threaten these organisations with a meaningful and existential threat.