In future, I’d like to see changes in the ‘other causes’ over time and across engagement level, if possible. For instance, it would be interesting to see if causes such as IIDM or S-risk are becoming more or less popular over time, or are mainly being suggested by new or experienced EAs.
Yeah, I agree that would be interesting. Unfortunately, if we were basing it on open comment “Other” responses, it would be extremely noisy due to low n, as well as some subjectivity in identifying categories. (Fwiw, it seemed like people mentioning S-risk were almost exclusively high engagement, which is basically what I’d expect, since I think it requires some significant level of engagement before people would usually be exposed to these ideas).
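To illustrate the noise and subjectivity problems, here is a toy sketch: the responses, the keyword rules, and the categories are all invented for illustration, but they show how the categorisation step is subjective and how quickly the engagement cross-tab cells fall to n=1.

```python
from collections import Counter

# Invented free-text "Other" responses paired with self-reported
# engagement level; real survey data would be messier than this.
responses = [
    ("s-risk", "high"), ("S-risks", "high"), ("IIDM", "medium"),
    ("improving institutional decision-making", "high"),
    ("s risk reduction", "high"), ("IIDM", "low"),
]

def categorise(text: str) -> str:
    """Crude keyword normalisation -- this is the subjective step."""
    t = text.lower()
    if "s-risk" in t or "s risk" in t:
        return "S-risk"
    if "iidm" in t or "institutional decision" in t:
        return "IIDM"
    return "Other"

# Cross-tabulate category x engagement; note how small the cells get.
crosstab = Counter((categorise(text), level) for text, level in responses)
for (cause, level), n in sorted(crosstab.items()):
    print(f"{cause:7s} {level:7s} n={n}")
```

With only a handful of write-ins per category, any split by engagement or by year produces cells too small to support real trend claims.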
I think that it would be very interesting if we could compare the EA community’s results on this survey against a sample of ‘people who don’t identify as EAs’ and people who identify as being in one or more ‘activist groups’ (e.g., vegan/climate etc.) and explore the extent of our similarities and differences in values (and how these are changing over time).
I agree this would be interesting. I’m particularly interested in examining differences in attitudes between EA and non-EA audiences. Examining differences in cause ratings directly might be more challenging due to a conceptual gap between EA understanding of certain causes and the general population (who may not even be familiar with what some of these terms mean). I think surveying more general populations on their support for different things (e.g. long-termist interventions, suitably explained) and observing changes in these over time would be valuable though. Another way to examine differences in cause prioritisation would be to look at differences in the charitable portfolios of the EA community vs wider donors, since that aggregate data is more widely available.
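The portfolio comparison could be as simple as differencing each cause area’s share of total giving between the two donor populations. The figures below are invented placeholders, not real aggregates:

```python
# Invented placeholder shares of total giving by cause area, one dict
# per donor population; real figures would come from survey/aggregate data.
ea_portfolio = {"global health": 0.45, "animal welfare": 0.15,
                "long-term future": 0.30, "religious": 0.00, "other": 0.10}
wider_portfolio = {"global health": 0.10, "animal welfare": 0.03,
                   "long-term future": 0.01, "religious": 0.29, "other": 0.57}

def share_gaps(a: dict, b: dict) -> dict:
    """Per-cause difference in share of total giving (a minus b)."""
    return {cause: round(a[cause] - b.get(cause, 0.0), 2) for cause in a}

gaps = share_gaps(ea_portfolio, wider_portfolio)
for cause, gap in gaps.items():
    print(f"{cause:17s} {gap:+.2f}")
```

The sign of each gap shows which population over-weights that cause relative to the other; tracking the gaps year over year would show convergence or divergence.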
> Unfortunately, if we were basing it on open comment “Other” responses, it would be extremely noisy due to low n, as well as some subjectivity in identifying categories.
Thanks for explaining. I see what you mean. If it seems worth it (i.e., more people than me care!), you could potentially add a closed-ended ‘other potential cause areas’ item. These options could be generated from the most popular options in the prior year’s open-ended responses. E.g., you could have IIDM and S-risk as closed-ended ‘other’ options for that question next year (in addition to n other common responses). You could keep an open-ended ‘Are there any other causes you feel should be priorities for the EA community that we haven’t mentioned?’ option alongside it. You could also grow the closed-ended items list as needed each year.
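Promoting last year’s popular write-ins to closed-ended options could be a simple frequency count over the already-categorised open-ended answers. The category labels and cutoff below are invented for illustration:

```python
from collections import Counter

# Invented example: last year's open-ended "Other" answers after they
# have been manually grouped into categories.
last_year_categories = ["IIDM", "S-risk", "IIDM", "mental health",
                        "S-risk", "IIDM", "space governance", "S-risk"]

def top_n_options(categories: list, n: int = 3) -> list:
    """Return the n most common categories (ties keep first-seen order)."""
    return [cause for cause, _ in Counter(categories).most_common(n)]

# These would become next year's closed-ended 'other' options.
options = top_n_options(last_year_categories, n=3)
print(options)  # ['IIDM', 'S-risk', 'mental health']
```

A minimum-count threshold (rather than a fixed n) might be preferable, so that one-off write-ins never get promoted to the closed-ended list.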
> I agree this would be interesting. I’m particularly interested in examining differences in attitudes between EA and non-EA audiences. Examining differences in cause ratings directly might be more challenging due to a conceptual gap between EA understanding of certain causes and the general population (who may not even be familiar with what some of these terms mean). I think surveying more general populations on their support for different things (e.g. long-termist interventions, suitably explained) and observing changes in these over time would be valuable though.
Yes, I agree
> Another way to examine differences in cause prioritisation would be to look at differences in the charitable portfolios of the EA community vs wider donors, since that aggregate data is more widely available.
Thanks—this link is interesting. Great to see that religious institutions get the most. That’s definitely ideal :)
I hadn’t thought about comparing donation portfolio trends. That could be very useful if we had good data!
> If it seems worth it (i.e., more people than me care!), you could potentially add a closed-ended ‘other potential cause areas’ item. These options could be generated from the most popular options in the prior year’s open-ended responses. E.g., you could have IIDM and S-risk as closed-ended ‘other’ options for that question next year
Yeah, that seems like it could be useful. It’s helpful to know what kinds of things people find valuable, because space in the survey is always very tight.