Is MATHS creating an unfair society? How algorithms decide our insurance costs, where police patrol – and even how long we’re left on hold
Masses of information are collected about each one of us all the time – what we buy and where we shop, how we drive, where we live, who we're friends with, where we go, how healthy we are.
A lot of the time this information is used to our benefit: it helps us get lower insurance premiums if we can show we drive well; it means we receive promotions for things we might actually want; and it means much more of what we consume is 'personalised'.
But there is a darker side to the algorithms that process this information, says mathematician Cathy O'Neil. The so-called 'big data' being accumulated also means we might be ruled out of jobs we apply for because we're not similar to existing, successful employees. It means we might be kept on hold for longer, because a company's data suggests we're not such a valuable customer. It can also mean that police target specific areas, and in so doing drive up the recorded crime rates in those same areas – which then feeds back into the algorithms.
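The policing feedback loop can be seen in a toy simulation – this is a sketch with entirely made-up numbers, not any real system: two districts have identical true crime rates, but patrols are allocated according to *recorded* crime, and more patrols mean more incidents get recorded, so an initial imbalance in the data never washes out.

```python
# Toy model of a predictive-policing feedback loop (illustrative only).
# Both districts have the same true crime rate; only the recorded data differs.

TRUE_RATE = 100.0            # actual incidents per period, identical in both districts
DETECTION_PER_PATROL = 0.02  # fraction of incidents recorded per patrol unit (assumed)
TOTAL_PATROLS = 40

# Start with a slight historical imbalance in the recorded-crime data.
recorded = {"district_a": 55.0, "district_b": 45.0}

for period in range(10):
    total_recorded = sum(recorded.values())
    # Allocate patrols in proportion to each district's recorded crime.
    patrols = {d: TOTAL_PATROLS * r / total_recorded for d, r in recorded.items()}
    # More patrols -> more of the (identical) true crime gets recorded.
    recorded = {d: TRUE_RATE * min(1.0, DETECTION_PER_PATROL * patrols[d])
                for d in recorded}

# District A still shows higher recorded crime every period, so it keeps
# receiving more patrols -- even though the true rates were always equal.
print(recorded)
```

Under these assumptions the initial 55/45 split in the data perpetuates itself indefinitely: the algorithm keeps 'confirming' that district A is the high-crime area using figures its own patrol allocation produced.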
In this latest episode of the Big Money Questions, Cathy O’Neil talks through these issues and her latest book: Weapons of Math Destruction.
Cathy is the author of the blog mathbabe.org, has a PhD in mathematics from Harvard and is a data specialist.