Algorithms are often presented as silver bullets: inherently fair and morally neutral by virtue of their mathematical nature. They are not. They are better understood as social constructs that have been formalized and automated.
The irony is that algorithms, typically introduced to make decisions cleaner and more consistent, often end up obscuring important moral considerations and embedding difficult questions of inequality, privacy, and justice. As a result, they do not spare us the hard work. If we as a society want to make the best use of them, we will have to grapple with these tough questions before deploying algorithms, particularly in an area as sensitive as the lives of our children.