Can auditing eliminate bias from algorithms?

For more than a decade, journalists and researchers have been writing about the dangers of relying on algorithms to make weighty decisions: who gets locked up, who gets a job, who gets a loan — even who has priority for COVID-19 vaccines. Rather than removing bias, one algorithm after another has codified and perpetuated it, while the companies behind them have largely shielded their algorithms from public scrutiny. The big question ever since: How do we solve this problem? Lawmakers and researchers have advocated for algorithmic audits, which would dissect and stress-test algorithms to see how they work…

This story continues at The Next Web
