Edit: I didn’t see how old this post was! It came up on my feed somehow and I’d assumed it was recent.
Thanks for this. I’ve only skimmed the report, and don’t have expertise in the area. So the below is lightly held.
Section 4.2.3 talks about negative wellbeing effects. I think these are a serious downside risk, but other than noting that severe harms are indeed faced by guest workers, the report's response rests on a single paper (Clemens 2018) and the idea that the intervention could improve things via surveys and ratings. Most of the benefits/harms considered in the rest of the report appear (on a skim) to be financial.
I think the risk of facilitating severe harms against individuals (participating in the ‘repugnant transaction’) is very unsettling, and would be my main reason not to donate to such a charity. If I were a prospective donor I would want to see deeper exploration and red teaming of this worry.
I’d also note that this issue has characteristics that EA/AIM is likely to systematically underrate:
- harms that are hard to quantify and compare against financial benefits, and
- harms that may be wrong in a deontological sense to inflict, and that are not properly appreciated or respected by an all-things-considered cost-effectiveness analysis.