Algorithms are at the heart of the Big Data/machine learning/AI changes that are propelling computerized decision-making. In their book, The Ethical Algorithm, Michael Kearns and Aaron Roth, two Computer Science professors at Penn, flag some of the social and ethical choices these changes are forcing upon us. My interview with them touches on many of the hot-button issues surrounding algorithmic decision-making.
I have long suspected that much of the fuss over bias in machine learning is a way of smuggling racial and gender quotas and other academic social values into the algorithmic outputs. Michael and Aaron may not agree with that formulation, but the conversation provides a framework for testing it – and leaves me more skeptical that "AI bias" is as big a problem as it has been portrayed to be.
Less controversial, but equally fun, is our dive into the ways in which Big Data and algorithms defeat old-school anonymization – and the ways in which that problem can be solved. The cheating husbands of Philadelphia help me understand the value and technique of differential privacy.
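The "cheating husbands" setup is a classic motivation for randomized response, one of the simplest differentially private techniques. The sketch below is a hypothetical illustration (not necessarily the exact mechanism discussed in the episode): each respondent flips a coin before answering a sensitive yes/no question, so any individual "yes" is deniable, yet the population rate can still be recovered from the noisy tallies.

```python
import random

def randomized_response(truth: bool) -> bool:
    """Flip a coin: heads, answer truthfully; tails, flip again and report that."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses) -> float:
    """Invert E[yes] = 0.5 * true_rate + 0.25 to recover the population rate."""
    yes_rate = sum(responses) / len(responses)
    return (yes_rate - 0.25) / 0.5

# Simulate 100,000 respondents, 10% of whom hold the sensitive trait.
random.seed(0)
truths = [random.random() < 0.10 for _ in range(100_000)]
responses = [randomized_response(t) for t in truths]
estimate = estimate_true_rate(responses)
```

The point is that no single answer reveals anything certain about its giver – the coin provides plausible deniability – but the aggregate estimate lands close to the true 10% rate.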
And if you've ever wondered why so much of the social science and nutrition research of the last 50 years doesn't hold up to scrutiny, blame Big Data and algorithms that will reliably generate a "significant" correlation once in every 20 tries.
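The "once in every 20 tries" figure comes straight from the conventional p < 0.05 significance threshold: test enough hypotheses on pure noise and about 5% of them will look significant. A minimal simulation (my own illustration, not the authors') using a two-sample z-test on data with no real effect:

```python
import math
import random

def two_sample_p(a, b) -> float:
    """Two-sided z-test for a difference in means, assuming known unit variance
    (valid here because both samples are drawn from N(0, 1))."""
    z = (sum(a) / len(a) - sum(b) / len(b)) / math.sqrt(1 / len(a) + 1 / len(b))
    # P(|Z| > |z|) for a standard normal Z
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

random.seed(42)
trials, hits = 1000, 0
for _ in range(trials):
    # Two groups drawn from the SAME distribution: any "effect" is spurious.
    a = [random.gauss(0, 1) for _ in range(50)]
    b = [random.gauss(0, 1) for _ in range(50)]
    if two_sample_p(a, b) < 0.05:
        hits += 1

false_positive_rate = hits / trials  # hovers around 0.05, i.e. ~1 in 20
```

Run enough of these null comparisons – or equivalently, dredge one big dataset for enough candidate correlations – and "discoveries" are guaranteed.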
Michael and Aaron also take us into the unexpected social costs of algorithmic optimization. It turns out that a recommendation engine that produces exactly what we want, even when we didn’t know we wanted it, is great for the user, at least in the moment, but maybe not so great for society. In this regard, it's a little like creating markets in areas once governed by social norms. The switch to market pricing instead of societal mores often optimizes individual choice but at considerable social cost. It turns out that algorithms can do the same – optimize individual gratification in the moment while roiling our social and political order in unpredictable ways. We would react badly to a proposal that dating choices be turned into more efficient microeconomic transactions (otherwise known as prostitution) but we don’t feel the same way about reducing them to algorithms.
Maybe we should.
As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!
The views expressed in this podcast are those of the speakers and do not reflect the opinions of the speakers' families, friends, a growing number of former friends, clients, or institutions. Or spouses. I've been instructed to specifically mention spouses.