Announcing EA Pulse, large monthly US surveys on EA
Rethink Priorities is excited to announce EA Pulse—a large, monthly survey of the US population aimed at measuring and understanding public perceptions of Effective Altruism and EA-aligned cause areas! This project has been made possible by a grant from the FTX Future Fund.
What is EA Pulse?
EA Pulse aims to serve two primary purposes:
Tracking changes in responses to key questions relevant to EA and longtermism over time (e.g. awareness of and attitudes towards EA and longtermism, and support for different cause areas).
Running ad hoc questions requested by EA orgs (e.g. support for particular policies, responses to different messages EAs are considering).
We welcome requests for questions to include in the survey of either of these types. Please comment below or e-mail david@rethinkpriorities.org, ideally by October 20th.
By tracking beliefs and attitudes towards issues related to effective altruism and longtermism, we can keep our finger on the pulse of movement-building efforts over time and potentially identify unforeseen risks to the movement. We will also be able to determine whether particular subgroups of the population appear to be missed or turned off by our outreach efforts.
We also believe that surveying the broader public can provide a new window onto how the ideas generated by the EA community are being taken up by the wider population. In turn, this can help us communicate more effectively and efficiently about what matters most.
Due to space constraints, this survey is best suited to relatively short, straightforward questions. If you are interested in surveys with more complex designs, a larger number of questions, experimental manipulations, complex instructions, or lengthy texts or videos for respondents to read or view, we may be able to accommodate these in separate surveys (funding permitting). Please feel free to reach out to discuss possibilities.
Straight into my veins.
(slang which means “This is very good and I want more of it”)
We feel the same!
I’m really excited to see this survey idea getting developed. Congratulations to the Rethink team on securing funding for this!
A few questions on design, content and purpose:
Who are the users for this survey, how will they be involved with the design, and how will findings be communicated with them?
In previous living / repeated survey work that I've done (SCRUB COVID-19), having research users involved in the design was crucial for it to influence their decision-making. This also got complex when the survey became successful and there were different groups of research users, all of whom had different needs.
Because “what gets measured, gets managed”, there is a risk / opportunity in who decides which questions should be included in order to measure “awareness and attitudes towards EA and longtermism”.
Will data, materials, code and documentation from the survey be made available for replication, international adaptation, and secondary analysis?
This could include anonymised data, Qualtrics survey instruments, R code, Google Docs of data documentation, etc.
Secondary analysis could significantly boost the current and long-term value of the project by opening it up to other interested researchers to explore hypotheses relevant to EA.
Providing materials and good code & documentation can help international replication and adaptation.
Was there a particular reason to choose a monthly cycle for the survey? Do you have an end date in mind or are you hoping to continue indefinitely?
Do you anticipate that attitudes and beliefs would change that rapidly? In other successful 'pulse'-style national surveys, it's more common to see yearly or even less frequent measurement (here's one great example of a longitudinal values survey from New Zealand).
Is there capacity to effectively design, conduct, analyse, and communicate at this pace? In previous work I’ve found that this cycle—especially in communicating with / managing research users, survey panel companies, etc—can become exhausting, especially if the idea is to run the survey indefinitely.
In terms of specific questions to add, my main thought is to include behavioural items, not just attitudes and beliefs.
Ways of measuring this could include "investigated the effectiveness of a charity before donating on the last occasion you had a chance", "donated to an effective charity in the past 12 months", or "number of days in the past week that you ate only plant-based products (no meat, seafood, dairy or eggs)".
Through the SCRUB COVID-19 project, we (several of us at Ready) ran a survey of 1700 Australians every 3 weeks for about 15 months (2020-2021) in close consultation with state policymakers and their research users. Please reach out if you’d like to discuss / share experiences.
Thanks Alexander! I appreciate the offer to meet to talk about your experiences, that sounds very useful!
We envisage the main users of the survey being EA orgs and decision-makers. We’ve already been in touch with some of the main groups and will reach out to some key ones to co-ordinate again now that we’ve formally announced. That said, we’re also keen to receive suggestions and requests from a broader set of stakeholders in the community (hence this announcement).
The exact composition of the survey, in terms of serving different users, will depend on how many priority requests we get from different groups, so we’ll be working that out over the course of the next month as different groups make requests.
Related to the above, we don’t know exactly how much we’ll be making public, because we don’t know how much of the survey will be part of the core public tracker vs bespoke requests from particular decision makers (which may or may not be private/confidential). That said, I’m optimistic we’ll be able to make a large amount public (or shared with relevant researchers) regarding the core tracker (e.g. for things we are reporting publicly).
We’re essentially trialing this for 12 months, to see how useful it is and how much demand there seems to be for it, after which, if all goes well, we would be looking to continue and/or expand.
The monthly cadence is influenced by multiple considerations. One is that, ideally, we would be able to detect changes over relatively short time-scales (e.g. in response to media coverage), and part of this trial will be to identify what is feasible and useful. Another consideration is that running more surveys within the same time span will allow us to include more ad hoc, time-sensitive requests from orgs (i.e. things they want to know within a given month, rather than things we are tracking across time). I think it's definitely quite plausible we might switch to a different cadence later, perhaps due to resource constraints (including availability of respondents).
I would agree that more general or fundamental attitudes are unlikely to change on a monthly cadence. I think it's more plausible to see changes on a short time-frame for some of the more specific things we're looking at (e.g. awareness of or attitudes towards particular, currently low-salience issues or ideas).
Looking forward to talking more about this.
This is amazing! I never knew how much I want to know about changes in people’s attitudes over time 😊
Some questions that interest me here would be changes over time in the following [written with haste]:
1. Moral attitudes (utilitarianism vs deontology, welfare vs preferential, human vs animal, maximizing vs satisficing, …)
2. Particular moral values like nationalism, globalism, pacifism, consumerism, pursuit of happiness, importance of freedom, justice, equality, …
3. Trust in institutions (government, academia, types of media, large tech companies, other countries [China, Russia])
4. Attitudes about the future (extinction? AI? pandemics? war? settling other star systems? utopia? digital life?)
5. Attitudes toward philanthropy, EA, other social movements, particular and nonparticular billionaires
Great ideas.
For the options in (3), trust in institutions, there is some overlap with what some other large and established surveys cover. You may, for example, be interested to look up the General Social Survey (GSS), which has been running for decades and covers a range of social issues like trust in the medical establishment, trust in 'science', and general social trust, alongside all sorts of other things. You can try searching for variables here: https://gssdataexplorer.norc.org/variables/vfilter
Thanks!💕
Some quick thoughts:
Thanks for all your work on this, it's really great to see it finally happening! I would love it if the survey could identify and compare 'social movement' subgroups such as EA, social justice, socialism, animal welfare, etc. This could be assessed in terms of activism/participation in those subgroups and/or awareness of and attitudes towards them.
This would be helpful in several ways. As an example, I think it will be very helpful to better understand the relative differences in values, receptiveness to messages, etc. that exist between such groups, and how these change over time.
It could be interesting to explore how attitudes within such groups change when new books and articles are widely publicized, etc.
From a movement-building and impact perspective, it seems important to really understand our adjacent social movements. Where are the overlaps and disconnects in shared values? What are each group's major gripes/misconceptions, etc.?
I’d welcome any attempt to eventually grow this service to the point where it will allow EA orgs and researchers to easily and affordably survey large samples of key audiences (e.g., AI professionals, policy makers etc). I think that the absence of this is an upstream barrier to lots of important research and message testing.
Agreed, I think “EAs surveying and engaging with the general public on issues we care about” is still relatively neglected.
I’m so so happy to see this project growing this way!
I’d be interested in surveying on whether people believe that AI [could presently/might one day] do a better job governing the [United States/major businesses/US military/other important institutions] than [elected leaders/CEOs/generals/other leaders].
Great initiative!
It may be worth including a question to assess the fraction of people who think their lives are positive (i.e. whether or not they would prefer never to have been born, neglecting effects on other beings).
Did EA Pulse ever happen? Was this project abandoned?
Considering the events of November 2022, I wouldn't be surprised if it was de-prioritized, but I thought I'd check in, since I haven't heard of it since this initial announcement.
Thanks for asking. We just re-announced it!
It was originally going to be supported by the FTX Future Fund and was therefore delayed while we sought alternative funding. We have now acquired alternative funding for this project for one year. However, the project will now be running on a quarterly basis, rather than monthly, to make the most efficient use of limited funds.
You might be interested in the polling that Data for Progress has recently done on longtermism among U.S. voters (disclosure: I work for them). The organization is interested in EA (we've attended EA Global several times), has a substantial polling infrastructure that is lower cost than our typical competitors, and routinely runs repeated surveys on the same questions to track trends over months to years. Feel free to reach out if we could be helpful!
Many thanks! This is all very helpful.
Was this grant associated with the Clearer Thinking regrants?
Hi! Co-CEO of Rethink Priorities here.
FTXFF funded EA Pulse, which covered funding for 0.3 FTE of staff time as well as the costs of implementing the survey and compensating participants. We are still seeking funding for the remainder of the staff costs of our survey team, which would assist in implementing EA Pulse as well as other projects we'd like to do to understand how people think about EA and associated topics. This would allow us to dive into more detail on findings we see in EA Pulse, give us funding for the overhead needed to manage and maintain EA Pulse (e.g., management, operations), and allow us to explore work unrelated to EA Pulse. One of the funders we are seeking this from is Clearer Thinking's regranting program.
Thanks for asking. No, this was supported by the FTX Future Fund.
Sounds excellent! Roughly how large is large?
Several thousand per month (though not double-digit thousands).
I am very glad this will exist!
Thanks, we’re excited about it!
Sounds good! Do you plan to publish the results each month on the forum, or if not what is a good way to get a quick summary of the results each month?
Thanks for asking. We’re thinking of having a public dashboard showing the results for each month. At present, we’re not thinking of posting each month’s results on the Forum, but rather posting key results and intermittent updates. We think separate Forum posts every month might be unnecessary, since many of the results of the monthly tracker element of EA Pulse will make most sense in the context of multiple months having been run.
OK great, I’d be keen to bookmark the dashboard to check each month or get an email reminder if you set up a mailing list.
Sounds worthwhile. I’m curious why this is restricted to the US population?
Hi Craig! The current restriction to the US population certainly doesn't mean that we're not ultimately interested in expanding further. The current funding covers the US, and we started here for several reasons:
Logistical reasons to do with access to samples, which is not as feasible in a lot of countries
Lots of up-to-date and readily available information on US population demographics, which makes the analysis choices involved in getting to a representative sample more feasible (see the weighting sketch below)
Lots of existing organisations are particularly interested in the US in relation to public opinion and how it might inform policy
But other countries are of interest too for sure!
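(An illustrative aside on the representativeness point above: the snippet below is a minimal sketch of raking, i.e. iterative proportional fitting, one standard way of weighting a raw survey sample toward known population margins. All column names, categories, and target shares are hypothetical toy values, and this is not a description of how EA Pulse will actually be weighted.)

```python
import numpy as np
import pandas as pd

# Toy respondent data. All variables and categories here are hypothetical,
# purely to illustrate the mechanics of raking survey weights.
respondents = pd.DataFrame({
    "gender": ["F", "F", "M", "M", "F", "M", "F", "M"],
    "age":    ["18-34", "35-64", "18-34", "65+", "65+", "35-64", "18-34", "35-64"],
})

# Hypothetical population margins (shares), e.g. taken from census tables.
targets = {
    "gender": {"F": 0.51, "M": 0.49},
    "age":    {"18-34": 0.30, "35-64": 0.50, "65+": 0.20},
}

weights = np.ones(len(respondents))

# Iterative proportional fitting: repeatedly rescale the weights so that each
# variable's weighted distribution matches its target population margin.
for _ in range(50):
    for var, margin in targets.items():
        weighted_share = (
            pd.Series(weights).groupby(respondents[var]).sum() / weights.sum()
        )
        adjustment = respondents[var].map(
            {cat: margin[cat] / weighted_share[cat] for cat in margin}
        )
        weights = weights * adjustment.to_numpy()

# Normalise so the average weight is 1, then attach to the data.
respondents["weight"] = weights / weights.mean()
print(respondents)
```

After a few iterations the weighted gender and age distributions match the target margins, which is the basic idea behind making an online panel sample look demographically representative of the US population.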