For the issue in general (not specific to the area of IIDM or how EAIF thinks about things), there's some discussion from 80k here and in other parts of that article. Probably also in some other posts tagged accidental harm.
(Though note that 80k include various caveats and counterpoints, and conclude the article with:
We hope this discussion of ways to do bad hasn't been demotivating. We think most projects that have gone forward in the EA community have had a positive expected value and when we hear about new projects we're typically excited, not wary. Even projects that are ill-conceived to start with typically improve over time as the founders get feedback and learn from experience. So whenever you consider these risks (and our advice for mitigating them) make sure to weigh them against the potentially massive benefits of working on some of the world's most pressing problems.
I say that just to avoid people being overly discouraged by reading a single section from the middle of that article, without the rest of the context. I don't say this to imply I disagree with Max's comments on the IIDM grant.)
Thanks! The 80,000 Hours article kind of makes it sound like it's not supposed to be a big consideration, and that it can be addressed by things IIDM has clearly done, right?
Get advice from people you trust to be honest about whether you're a reasonable fit for the project you're considering. Ask around to see if anybody else in your field has similar plans; maybe you should merge projects, collaborate, or coordinate on which project should move forward.
My impression is that the IIDM group is happy to work with anyone interested in collaborating, and called for collaboration about a year ago. The space of improving institutions also seems very big (in comparison to 80k's examples of career advice for EAs and local EA chapters).
(Disclaimer: speaking for myself here, not the IIDM group.)
My understanding is that Max is concerned about something fairly specific here: a situation in which we succeed in capturing a significant share of the EA community's interest, talent, and/or funding, yet fail to either imagine or execute on the best ways of leveraging those resources.
While I could imagine something like this happening, it's only really a big problem if either a) the ways in which we're falling short remain invisible to the relevant stakeholders, or b) our group proves difficult to influence. I'm not especially worried about a), given that critical feedback is pretty much the core competency of the EA community and most of our work will have some sort of public-facing component. b) is something we can control, and while it's not always easy to judge how to balance external feedback against our inside-view perspectives, as you've pointed out, we've been pretty intentional about trying to work well with other people in the space and to cede responsibility/consider changing direction where that seems appropriate.