Co-founder of High Impact Engineers. Background in materials science and mechanical engineering. Also chat to me about community building, career decisions, etc.
Jessica Wen
How Engineers can Contribute to Civilisation Resilience
Announcing Two Events around EAG London in Collaboration with the STEM (Science, Technology, Engineering, and Mathematics) Communities
Thanks for this post – as a woman in STEM and a community builder, gender diversity is something I’m trying to improve. I don’t have any answers, but I want to say that I wholeheartedly support this initiative and would love to hear any suggestions on how to appeal to and retain minority groups in EA!
I was originally skeptical of drawing a direct analogy between economic mobility and impact mobility, but after reading the paper I think the mechanisms are pretty similar: upward income mobility comes from increased exposure across economic strata, which exposes lower-income people to opportunities outside their communities and to ways of attaining them – this shapes aspirations and provides access to those opportunities.
This mechanism resembles the process I went through to start doing EA work: I met one person in particular who was doing something really cool and impactful, and realised it was achievable for someone like me. Then I met more people, started a project, and I've been doing that work ever since.
I think EA equivalents for inter-status exposure could be through things like reading groups, fellowships, and conferences; friending bias can be reduced through activities like speed-friending, mentoring, and meet-ups, but I think there could definitely be more programs to introduce “new EAs” to people doing impactful work. For larger groups, perhaps a coffee roulette would do the trick?
Also, this line in the paper caught my eye:
For other outcomes [other than increasing economic mobility], other social capital indices that we construct here may be stronger predictors. For example, differences in life expectancy among individuals with low income across counties are more strongly predicted by network cohesiveness measures (clustering coefficients and support ratios) than EC [economic connectedness].
I wonder if there could be a tenuous analogy from a prediction of life expectancy in this study to something like the longevity of engagement with EA. Highly unsure about this – the mechanisms are likely to be very different!
41 out of ~1800 is an extremely low response rate (~2%) – from what I’ve heard, one would usually expect around 10%. Combined with the fact that there was only one female respondent, this survey does not seem particularly representative of their “STEM community”.
Thanks for putting together this survey and sharing it on the forum! A brief background on me: I co-direct High Impact Engineers, where we aim to help (non-software) engineers do more high-impact work. If the Foresight Institute does decide to go ahead with the outlined ideas/next steps for bridging the gap between EA and STEM, I can see some collaborations being beneficial. I expand on this in my general comments below. (The rest of this comment is written in a personal capacity, rather than on behalf of HI-Eng.)
General comments (not very heavily edited, apologies for the length and for any rambling/unclear parts. Main points in bold):
Although people in STEM share many important attributes (numerical and analytical skills, problem-solving ability, etc.), there isn’t really such a thing as a single STEM community: its members are split across many different domains of expertise, interests, applications, and skills.
It seems unlikely that the survey respondents span all of STEM, and I think grouping them all under the “STEM” banner obscures a lot of useful information – e.g. in which fields professionals think EA and STEM are already well integrated and in which they aren’t, or which skills professionals think would be useful for developing technology to mitigate X-risks or for DTD research projects.
It would be useful to see the breakdown of respondents’ fields – I suspect that the life sciences are under-represented here, as those fields tend to be less male-dominated.
Somewhat related: I think we also need more social scientists in EA – understanding how people behave, especially around new technologies, is probably very useful and important for X-risk mitigation, and I’d argue that psychologists and sociologists belong under the S for Science in the STEM acronym. But I assume there were very few such scientists in this survey, given that it was sent to a group of people mostly in “hard” tech and sciences.
In terms of demographics, the majority were male and living in the US. A few respondents were based in the UK, and only one woman answered.
If only one woman responded, this survey obviously doesn’t cover the whole STEM community. Hot take: this seems to me to be more a survey of men rather than a survey of the “STEM community” (in quotes due to point 1 and also because a survey of the STEM community would probably be too broad to be useful).
If this is going to become a community-building effort, I think we need to be careful about spending more time, effort, and money building communities among the respondents’ networks. EA is already overwhelmingly white and male; I somewhat worry that increasing overlap with the respondents’ communities will cause EA to become a boys’ club.
Since the majority of the respondents were US- (and some UK-) based, I would also be interested in seeing the ethnic breakdown (if this was collected).
A final demographic that I’d like to see from this survey is the age breakdown – mid-career professionals, in my experience, tend to have important and useful perspectives that EA generally lacks.
Like Linda, I’d also like to know the response rate for this survey. [EDIT]: 41/~1800 is an awfully low response rate, so the sample doesn’t seem particularly representative of Foresight’s STEM community, which makes me question the validity of these conclusions.
Technology is very important in solving X-risk challenges: Many respondents believe that technology has the potential to help mitigate X-risks. They feel that getting technologists involved in the conversation is essential to developing effective solutions.
What does technology mean here? Usually it means information technology, but I think “physical” technologies also have the potential to help mitigate X-risks. It feels obvious to me that getting technologists involved will help solve problems effectively – this is what technologists do!
I agree with the conclusions and recommended actions, and my intuition says that it’s better to focus on the ways to get STEM professionals working on X-risks and EA cause areas without integrating the EA and STEM communities.
High Impact Engineers has already been doing point 3 of the recommended actions (introducing existential risks and EA cause areas for the STEM community) at universities as a low-risk test of whether people with an engineering background respond well to these ideas – we’ve found that they do! We plan to do more outreach to professional engineering institutes in the next 6 months, and this is a project where collaboration could be beneficial. Happy to discuss more.
Specific technological areas you deem promising for bridge-building
At HI-Eng, the disciplines we’ve found to be most in demand for biosecurity, civilisation resilience, AI governance, differential neurotechnology development, and other X-risk mitigation are electrical/electronic engineering, mechanical engineering, and materials engineering (particularly expertise in semiconductors). Bioengineers and biomedical engineers seem to be useful for biosecurity. Happy to discuss this further – we also expect to have a presentation/poster to share on this soon.
My overall takeaway: a survey like this is probably useful and important, but I would need to know more about the types of people surveyed for the data to be useful and for the actions to be convincing.
This isn’t specifically my area of expertise, but in my research into “physical” engineering roles I often find accompanying software engineering roles: bioreactors for lab-grown meat, open sustainable technology projects, companies like Zipline working in low-income countries, start-ups that have gone through Y Combinator’s non-profit program, and various biosecurity orgs/companies all need bioinformaticians, data scientists, or software engineers. You can find more ideas in the 80k career review of software engineering.
Thanks for these reflections – I generally agree with the points where you’ve expanded on Thorn’s lack of nuance. However, I think the point where MacAskill mentions reproductive rights in WWOTF not only lacks force but seems to come completely out of the blue: it isn’t backed up by the reasoning he has used up to that point, and is never expanded on afterwards. As a result it comes across as a throwaway statement lacking conviction.
“Observation is the chief source of knowledge” falls under the Empiricism school of thought in epistemology, as opposed to Rationalism, which is perhaps where my misunderstanding came about.
(A minor gripe I have about LW, and EA by extension, is that words with a specific meaning in philosophy are misused and so take on a different meaning – take “epistemic status”, which has grown beyond its original meaning of how confident one is in one’s claim, and is now used more to describe someone’s background and to flag general caveats and possible blind spots.)
In general, I’d agree that using different tools to help you better understand the world and succeed in life is a good thing; however, my point here is that LW and the Rationality community in general view certain tools and ways of looking at the world as “better” (or are only exposed to these tools and ways of looking at the world, and therefore don’t come across other methods). I have further thoughts on this that I might write up in a future post, but in short, I think this leads the Rationality community (and EA to some extent) to be biased in certain ways that could be mitigated by greater recognition of the value of other tools and worldviews, and maybe even some reading of academic philosophy (although I recognise not everyone has the time for this).
Weird, does it work now?
Hey, I’ve written up my thoughts here. I thought the criticisms were fair on the whole. Would be keen to hear your thoughts!
Watched it as it premiered, got distracted by the live chat, and had to watch it again! I’ve written up my thoughts and reactions here.
On Philosophy Tube’s Video on Effective Altruism
Hi Elliot, I replied to Markus’ tweet but here’s High Impact Engineers’ situation to hopefully give a bit more info:
Three weeks ago, we received a message that EV had paused their grantmaking and payment process, even though our grant had been approved back in September.
We got confirmation last week that they were going to continue processing our grant and signed a grant agreement.
We heard yesterday that there’s still a delay in our funds being sent through by the finance team but no further details were provided.
I’m very excited about this idea because the main barrier to prototyping/testing physical things is cheap access to equipment, materials, and space. As far as I know, there is a network of makerspaces across the UK that might be a low-cost place to start.
This is a great idea! Would also add Magnify Mentoring, which provides (free!) services to support more people with mentorship, particularly those from traditionally underrepresented groups.
That pretty accurately describes my thoughts on this :)
Updated to reflect that, thanks!
Thanks Vlad! We spoke at EAG SF, but I wanted to leave a comment here for anyone interested in contributing to or viewing the map of the technical problem landscape: please reach out to me or Vlad :) We will have updates soon, so stay posted!
I would be interested in more data on the professional backgrounds of people in EA, to see whether the gender ratios differ from those of the wider (non-EA) professions. It seems plausible to me that EA’s framing (heavily data-centric, emotionally detached) appeals more to men, whereas the focus on social impact and doing good might attract more women.