I think it’s good if I comment, as working mostly outside of EA orgs has been an explicit strategy of mine for a while. I’ll answer the questions, but first give some reasons why I decided to focus mostly on working in ‘non-EA orgs’. (I won’t back them up unless someone asks, so that I can just write without worrying about spending too much time.)
1) I think there are many more neglected opportunities to do good in non-EA orgs, where you can be truly irreplaceable, and I see so much low-hanging fruit for impact in so many places. When I say impact here, I do mean it through an EA lens.
2) There is lots more optionality if I default to non-EA org jobs, and see EA orgs as just another place to work in the world, to be evaluated alongside other opportunities. I don’t understand why I would ever limit my options (except when I have to in order to progress).
3) I can bring much more to a place with fewer EA ideas in it, as someone who has thought about EA ideas a bit; what I bring will be more highly valued and novel for people (though I usually won’t talk about EA explicitly, I’ll just imbue the underlying values, because it turns out lots of ‘non-EAs’ share them when you get down to it).
4) I will learn which other communities are also working on doing good, be able to work with them, make and facilitate connections between EA and ‘non-EA’, and build a better map of who is making what impact where. Restricting this to the EA space seems unnecessarily limiting.
5) Related to 4), I actually see integrating EA and non-EA as very important work in itself if EA is to achieve many of its goals. Sometimes it might be better to have a more inclusive branding than the EA one, both to attract a more diverse set of views and simply as an instrument to EA goals.
6) So far, I have found lots of EA ideas in ‘non-EA’ orgs and spaces—yes, fragments of them, worded differently, or just bubbling under the surface, but that is such an opportunity!
Where do you work, and what do you do?
I work in the civil service as a data scientist. In practice my role also involves tech policy work. I code, do data analysis, talk to people, and try to understand what’s really going on, both internally and in the world, with the real-world problems we work on. I try to get on projects I think are impactful, or that will help me get to a place where I can be more impactful, or ideally both.
What are some things you’ve worked on that you consider impactful?
In my current role I’ve done research and advising on digital markets which (definitely not just due to me, though I am there to contribute and learn) has led to the establishment of the first independent big tech sector regulator that I know of (the Digital Markets Unit). As this unit finds its feet over the next couple of years, I’m excited to be part of its initial shape-taking, and I think there will be lots of opportunities to have impact (if you’re interested in it, send me a message).
I’ve also done quite a few significant efficiency-enhancing things that I guess make an impact, but I feel like I’m more replaceable on those, so I’m less sure about the impact.
For the most part, though, I see my current role as upskilling/preparing/positioning for making impact. Lots of the impact I make might just be getting in the room and saying why we should not do something, and might not be the kinds of things I can just talk about on the internet.
Separately, I founded a not-for-profit educational/talent programme which aims, in 5 years or so, to make a significant contribution to the number of highly capable individuals on their way to making lots of impact on pressing problems. You might consider this an EA project, but from the perspective of my own career development it’s not really. I didn’t join an existing EA org; I made ‘non-EA’ connections at a university and went from there (with lots of help from some awesome volunteers, some of whom may identify as EA). Also, we want to be inclusive, so we don’t explicitly identify as an EA organisation.
What are a few ways in which you bring EA ideas/mindsets to your current job?
The following are not exclusively EA ideas, but I would say they’re pretty relevant to EA:
I try to work on projects related to big tech, from a perspective of risks of emerging technologies, focused on but not limited to AI. This is where the bulk of my time goes.
I’m trying to help develop the impact measurement/evaluation activities we do in various parts of the organisation, although I haven’t made that much progress so far, mostly because there’s just too much to do and there are competing priorities.
I met with the head of risk, and we discussed launching a monthly ‘Mistakes Monday’ with presentations from different teams on mistakes that were made internally, and prizes for the biggest/best. I think this kind of culture is quite important for the internal decision-making of a regulator.
I brought in Ozzie to talk about forecasting, and his tools are now being used by our team (so far just on things of no consequence, sorry Ozzie, but little steps).
Networking across departments to find out where it is tractable to make impact, and using my role to make connections with people at other organisations.
Overall, I do think that looking outside of EA first to make impact is a good strategy for many people. It’s not that I’m closed to EA jobs, just that I’m not primarily focusing on them. Having said this, I’m not sure where we are right now as a community on this, but I’d be really sad if we ended up over-correcting as a community and EA orgs stopped getting a good supply of talent. Obviously I really care about this, as I spend most of my ‘side-project’ time on the talent pipeline.
I think the best idea overall is to talk to lots of people about your specific situation to get a good variety of career advice, and to make sure the advice you are getting is varied enough (find someone who will give you the advice you don’t want to hear!). However, I expect a lot of people involved in the EA community to have social motivators pushing them towards jobs at EA orgs, and I think a little bit of correction in this direction is still due (probably? how can we measure this?).
OK I’ll stop there as this is getting long. Thanks for reading if you made it this far, hope it was useful.
Small nitpick: I responded to this post even though I’d prefer it to be framed in terms of people who are into EA, EA-aligned people, or people who work in EA orgs, rather than ‘EAs’. But I’m probably just being a pedant.