Cathy O’Neil’s book, Weapons of Math Destruction, is thought-provoking and may even shock you. Even its subtitle, “How Big Data Increases Inequality and Threatens Democracy,” will get your attention.
As someone who routinely quizzes software vendors about their Big Data and analytics efforts, I already knew about some of the problems in this world. Still, I was surprised by the insidiousness of so many of the other facets she writes about.
Ms. O’Neil is a quant/data scientist whose career has taken her down some rarefied data analytics and algorithmic avenues. Her career started on Wall Street with a hedge fund and moved into other spaces. Along the way, she came to realize many things.
For starters, algorithms (fed by Big Data) are everywhere. Credit scores, for example, are computed values that lenders rely on to set interest rates and to decide whether or not to extend credit. Ms. O’Neil shows why credit scores are not weapons of math destruction (WMDs): the scoring factors are transparent to those affected by them.
People know the factors that can positively or negatively affect one’s credit score. Late or missed payments, for example, are a well-known adverse scoring factor. Additionally, the credit scoring process has a corrective feedback mechanism where affected persons can get errors remedied in a timely fashion.
Now, let’s contrast that with many of the analytics and algorithms that HR vendors are unleashing on the market. Some products (like candidate screening/resume screening tools) eliminate potential jobseekers based on factors that are unknown to them.
Should people be removed from consideration because of their credit score, zip code, personality test score, or other poorly correlated factors? I’d argue “no way,” but vendors keep rolling more of this out and companies keep buying it.
What makes these products WMDs is that the decision factors are not transparent to jobseekers – they are opaque. Worse, there is no mechanism for firms to correct the errors in their algorithms. Employers have no idea whether a discarded resume might have come from a great candidate after all. Instead, these tools simply reinforce past prejudices. These analytic or algorithmic tools don’t learn and don’t improve.
When you read this book, you’ll find all kinds of misuses of Big Data-powered algorithms. Police departments, school districts, Wall Street, and more are designing and rolling out poorly thought-out tools that create real inequities.
The smart firms and executives out there will read this book and will ensure that their mechanisms are fair, transparent, learning systems. Those that don’t will likely find themselves on the wrong end of some publicly embarrassing litigation.
There are a number of great WMD examples in this book, as well as guidance for evaluating whether a tool is a WMD or not. If I could, I’d send a copy of this book to every HR technology vendor CEO out there. The HR space has always felt like it has far too many amateurs playing with fire. These are the people who need to read this book.
Business executives should read it too. This isn’t a dry, dusty, dense math book. It takes readers through a number of industries (e.g., insurance, education, law enforcement) to make powerful, but easy-to-understand, arguments about how these new data sets and technologies are creating untold amounts of harm.
I doubt that many of today’s data scientists think they are creating problems for others. But education and awareness might make more of them hip to the sometimes unseen and unintended consequences of their creations. If this book teaches us anything, it’s that we need to look at these tools a lot more carefully.
Weapons of Math Destruction comes highly recommended.
Bonus points: Cathy talks about her book in this video:
Image credit - via Amazon