I’m currently really interested in advancements in AI – algorithms in particular are pretty fascinating and pretty terrifying. Especially since I don’t really have that much knowledge about how they work.
This article definitely falls into the “shit – we’re living in a dystopian science fiction novel” camp – you’re arrested for a crime, asked a series of questions, and a computer decides how much of a “risk” you are to society – your score then affects your sentencing.
I think this is only being used in the US so far – I hope it never comes to the UK. The idea that my entire future could be decided by a computer program really scares me. Just imagine: you answer a set of standardised questions and, based on those answers, a computer decides whether you are a danger to society.
Or maybe it decides whether you get a job… or a loan.
As per my recent experiences, it’s scary enough being asked a standard set of questions by a person who has power over your freedom. If my mental state had been assessed by a computer… well, who knows if I’d have been able to get out of hospital.
Now, people are imperfect. We make bad decisions all the time. But our decisions can be challenged – how do you challenge the decision of a computer without advanced knowledge of how the algorithm works? You can’t appeal to an algorithm’s humanity or empathy, because it doesn’t have any. You can’t change an algorithm’s viewpoint through reasoned argument. In fact, unless you’re allowed to be reassessed… that’s it. Computer says dangerous. End of.
Yesterday I watched a TED Talk about algorithmic bias. Programmer and campaigner Joy Buolamwini demonstrates how bias can be found in computer programs, and how it can proliferate around the world at the click of a button. She refers to this inherent bias as “The Coded Gaze.”
You can get involved in her campaign, the Algorithmic Justice League, to help spread awareness and contribute to research, to ensure that the algorithms being used to make important decisions about our futures are inclusive and fair.