In 2015, we learned that Google ads are unintentionally sexist: https://www.degruyter.com/downloadpdf/j/popets.2015.1.issue-1/popets-2015-0007/popets-2015-0007.xml
And in 2019, we learned why: https://pubsonline.informs.org/doi/abs/10.1287/mnsc.2018.3093
Can you expand on how this influenced you?
It represents my transition from thinking that algorithms were basically fair and fine, to thinking they're biased because people are biased and so bias gets baked in (e.g. through bad data), to realising there is a very wide variety of ways that algorithms can unintentionally discriminate.
They’re not a particularly EA-related pair of papers, but they are very interesting.