Beyond Transparency and Accountability: Three Additional Features Algorithm Designers Should Build into Intelligent Platforms
35 Pages · Posted: 17 Aug 2020 · Last revised: 3 Feb 2021
Date Written: August 9, 2020
In the age of artificial intelligence, innovative businesses are eager to deploy intelligent platforms to detect and recognize patterns, predict customer choices and shape user preferences. Yet such deployment has brought along the widely documented problems of automated systems, including coding errors, corrupt data, algorithmic biases, accountability deficits and dehumanizing tendencies. In response to these problems, policymakers, commentators and consumer advocates have increasingly called on businesses seeking to ride the artificial intelligence wave to build transparency and accountability into algorithmic designs.
While acknowledging these calls for action and appreciating the benefits and urgency of building transparency and accountability into algorithmic designs, this article highlights the complications the growing use of artificial intelligence and intelligent platforms has brought to this area. Commissioned for the 2020 Northeastern University Law Review Symposium entitled "Eyes on Me: Innovation and Technology in Contemporary Times," this article argues that owners of intelligent platforms should pay greater attention to three I’s: inclusivity, intervenability and interoperability.
This article begins with a brief background on the black-box designs that now dominate intelligent platforms. It then explains why the I in AI has greatly complicated ongoing efforts to build transparency and accountability into algorithmic designs. The article further identifies three additional I’s that owners of intelligent platforms should build into these designs: inclusivity, intervenability and interoperability. These in-built design features will achieve win-win outcomes that help innovative businesses be both socially responsible and commercially successful.