Math washing

Are algorithms more neutral than people? No. Algorithms can have hidden biases. This makes them a popular way to hide power and avoid scrutiny.

The discriminatory effects of algorithms are becoming increasingly visible in the news, such as when research showed that Google’s algorithms were showing job-seeking women fewer ads for high-paying jobs than men. But most people, even in the technology and policy fields, remain blissfully unaware.

To address this problem I created the website www.mathwashing.com, which explains the fairness and transparency issues surrounding algorithms and the data fed into them.

This is important, as we increasingly employ algorithms to give advice or make important decisions. One example is our justice system, where there is a lot of excitement about ‘predictive policing’ algorithms, which are still widely seen as ‘neutral tools’. In reality these algorithms are a new form of law, one that takes the shape of computer code and is not subject to democratic checks.
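
To make this concrete, here is a minimal sketch of how a ‘neutral’ scoring rule can reinforce the bias already present in its data. The neighbourhood names and numbers are invented for illustration; this is not code from any real policing system.

    # Historical arrest counts per neighbourhood -- a product of past
    # patrol decisions, not of underlying crime rates. (Invented numbers.)
    recorded_arrests = {"Northside": 120, "Southside": 30}

    def priority_score(neighbourhood):
        """The 'objective' rule: rank areas by recorded arrests."""
        return recorded_arrests[neighbourhood]

    # Each cycle, patrols go where the score is highest, and more
    # patrols there produce more recorded arrests there.
    for cycle in range(3):
        target = max(recorded_arrests, key=priority_score)
        recorded_arrests[target] += 10
        print(f"cycle {cycle}: patrol {target} -> {recorded_arrests}")

After a few cycles the score looks like it has objectively identified a problem area, when it has mostly amplified earlier patrol decisions.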

Talks and workshops

I regularly give talks about algorithmic issues and ‘data discrimination’. If you or your organization are interested in this, please contact me.

The term ‘math washing’ was coined by Fred Benenson.