Danilo Bortoli
1 min read · Apr 20, 2021


In The Verge,

“It’s important to hold yourself accountable for your algorithm’s performance. Our recommendations for transparency and independence can help you do just that. But keep in mind that if you don’t hold yourself accountable, the FTC may do it for you,” writes Jillson.

Artificial intelligence holds the potential to mitigate human bias in processes like hiring, but it can also reproduce or exaggerate that bias, particularly if it is trained on data that reflects it. Facial recognition, for instance, produces less accurate results for Black subjects, potentially leading to false identifications and arrests when police use it. In 2019, researchers found that a popular health care algorithm made Black patients less likely to receive important medical care, reflecting preexisting disparities in the system. Automated gender recognition tech can rely on simplistic methods that misclassify transgender or nonbinary people. And automated processes, which are frequently proprietary and secret, can create "black boxes" where it is difficult to understand or challenge faulty results.
