Specific cause areas like AI safety and pandemic preparedness were generally better liked than broader concepts like EA or longtermism.
The summary was generally good, but I wouldn’t put it exactly that way. In the one study where we tested specific causes against broader concepts, AI Safety and Pandemic Preparedness were roughly neck and neck with the broader concept Global Catastrophic Risk Reduction. Those three were more popular than Climate Change (specific) and Effective Altruism and Effective Giving (broader), which were neck and neck with one another. And all of these were more popular than Longtermism. So the results didn’t show a clear split along the specific-cause-area vs. broader-concept distinction.