Thanks for writing this!
I think you make some reasonable points in your post, but I don’t think you make a strong argument for what appears to be your central claim: that more uncertainty would and should lead to greater diversity in cause areas.
I think I’d like to see your models for the following points before I’d buy your conclusion:
- How much uncertainty does the current amount of cause diversity in EA imply, and is this less than you think we ‘should’ have? My sense is that you’re going more off a vibe that we should have more causes than off an explicit model.
- Why does more diversity fall out of more uncertainty? This seems to be assumed, but I think the only argument made here was the timelines one, which feels like the wrong way to think about this (at least to me).
- A few concrete claims about uncertainty over some crux, and why you think that uncertainty means we are missing [specific cause area].
(You do point to a few reasons why this diversity of causes may not exist, which I think is very helpful, although I probably disagree with the object-level takes.)
Thanks for your comment!
By diversity do you mean diversity of cultural origin and gender, or diversity of career focus, or both?
I’d say my view is that the optimal distribution of career focuses would have more people working on less popular EA causes (e.g. democracy / IDM, climate change, supervolcanoes, other items on 80K’s lists) than we have now. I don’t have a particular cause area in mind which EA is entirely ignoring.
No, I mean roughly the total number of cause areas.
It’s a bit different from the total number of causes, since each cause area adds more diversity if it is uncorrelated with the other cause areas. Maybe a better operationalisation is ‘total amount of the cause area space covered’.