Diversity is always an interesting word, and it's notable that the call for more of it comes after two of the three scandals mentioned in the opening post are about EA being diverse along an axis that many EAs disagree with.
Similarly, it's very strange that a post that spends a lot of time on the problems of EAs caring too much about whether other people are value-aligned then recommends more scrutiny of whether funders are aligned with certain ethical values.
This gives me the impression that the main thesis of the post is that EA values differ from woke values and should be changed to be more aligned with woke values.
The post doesn't seem to have any self-awareness about pushing in different directions on these same axes. If your goal is to convince people to think differently about diversity or about the importance of value alignment, it would make sense to make arguments that are more self-aware.
When Stuart Russell argues that AI could pose an existential threat to humanity, he is held up as someone worth listening to – "He wrote the book on AI, you know!" However, if someone of comparable standing in Climatology or Earth-Systems Science, e.g. Tim Lenton or Johan Rockström, says the same for their field, they are ignored, or even pilloried.
To me, this looks like it mistakes why people hold the views that they do and strawmans them.
Saying Stuart Russell is worth listening to because of his book boils down to "If you actually want to understand why AI is an existential threat to humanity, read his book; it's likely to convince you." On the other hand, Tim Lenton and Johan Rockström have not written books making arguments for the importance of climate change that many EAs find convincing.
Quantification
On the topic of quantification, the post simultaneously criticizes EAs for quantifying everything and for not quantifying the value of paying community organizers relatively high salaries.
EAs seem to me very willing to do a lot of things, especially in the field of longtermism, without quantification being central. Generally, EA thought leaders don't tend to hold naive positions on topics like diversity or quantification, but complex ones. If you want to change views, you need to be clearer about cruxes and how you think about the underlying tradeoffs.