How did you end up choosing to go to DarwinAI? Why not something else like GR in GPR or FAANG?
I'd say it was kind of decided for me, since those other options were ruled out at the time. I applied to internships at some EA orgs, but didn't have any luck. Then, I did a Master's in computational math. Then, I started working part-time at a machine learning lab at the university while I looked for full-time work. I applied to AI internships at the big tech companies, but didn't have any luck. I got my job at DarwinAI because I was working for two of its cofounders at the lab. I had no industry experience before that.
I'm currently trying to transition to effective animal advocacy research, reading more research, offering to review research before publication, applying to internships and positions at the orgs, and studying more economics/stats, one of the bottlenecks discussed here, with quantitative finance as a second choice, and back to deep learning in the industry as my third. I feel that EA orgs have been a bit weak on causal inference (from observational data), which falls under econometrics/stats.
I'm currently trying to transition to effective animal advocacy research, reading more research, offering to review research before publication, applying to internships and positions at the orgs, and studying more economics/stats, one of the bottlenecks discussed here,
Your options sound solid. I guess you're 28 and can thus still get into a relatively different field like quantitative finance.
But how did you decide that it is best for you to dedicate your time to AAR? You could be working at GiveWell/Open Phil as a GR, or at OpenAI/MIRI in AI safety research (especially with your CS and math background), or you could be doing ETG at a FAANG company. Also, 80,000 Hours nowhere seems to suggest that AAR, of all things, is a "high-impact career", nor does the EA survey say anything about it. In fact, the survey talks about GR and AI safety.
And did you account for replaceability and other factors? If so, how did you arrive at these numbers?
I feel that EA orgs have been a bit weak on causal inference (from observational data), which falls under econometrics/stats.
So you hope to apply causal inference in AAR?
Lastly, I want to thank you from the heart for taking your time and effort to respond to me. Appreciate it, brother.
I guess you're 28 and can thus still get into a relatively different field like quantitative finance.
26, but 2 years isn't a big difference. :)
But how did you decide that it is best for you to dedicate your time to AAR? You could be working at GiveWell/Open Phil as a GR, or at OpenAI/MIRI in AI safety research (especially with your CS and math background), or you could be doing ETG at a FAANG company. Also, 80,000 Hours nowhere seems to suggest that AAR, of all things, is a "high-impact career", nor does the EA survey say anything about it. In fact, the survey talks about GR and AI safety.
So I'm choosing AAR over other causes due to my cause prioritization, which depends on both my ethical views (I'm suffering-focused) and my empirical views (I have reservations about longtermist interventions, since there's little feedback, and I don't feel confident in any of their predictions and hence cost-effectiveness estimates). 80,000 Hours is very much pushing longtermism now. I'm more open to being convinced about suffering risks, specifically.
I'm leaning against a job consisting almost entirely of programming, since I've come to not enjoy it that much, so I don't think I'd be motivated to work hard enough to make it to $200K/year in income. I like reading and doing research, though, so AI research and quantitative finance might still be good options, even if they involve programming.
And did you account for replaceability and other factors? If so, how did you arrive at these numbers?
(...)
So you hope to apply causal inference in AAR?
I didn't do any explicit calculations. The considerations I wrote about replaceability in my post and the discussion here have had me thinking that I should take ETG to donate to animal charities more seriously.
I think econometrics expertise is not very replaceable in animal advocacy research right now, and it could influence the grants made by OPP and the animal welfare funds, as well as ACE's recommendations.
I'll try a rough comparison now. I think there's more than $20 million going around each year in effective animal advocacy, largely from OPP. I could donate ~1% of that ($200K) per year in ETG if I'm lucky. On the other hand, if I do research for which I'd be hard to replace and that leads to different prioritization of interventions, I could counterfactually shift a good chunk of that money to (possibly far) more cost-effective opportunities. I'd guess that corporate campaigns alone are taking >20% of EAA's resources; good intervention research (on corporate campaigns or other interventions) could increase or decrease that considerably. Currently, only a few people at Humane League Labs and a few (other) economists (basically studying the effects of reforms in California) have done or are doing this kind of econometrics and causal inference research; maybe the equivalent of 4 people are working on this full-time now. So my guess is that another person working on this could counterfactually shift >1% of EAA funding in expectation to opportunities twice as cost-effective. This seems to beat ETG donating $200K/year.
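(If it helps to see that comparison written out, here's a rough back-of-envelope sketch; every number in it is just one of the illustrative guesses above, not data.)

```python
# Back-of-envelope sketch of the ETG vs. research comparison above.
# Every number is an illustrative guess from the paragraph, not data.

eaa_funding = 20e6       # total EAA funding per year (USD): "more than $20 million"
etg_donation = 200e3     # plausible annual ETG donation, ~1% of the above

# Research scenario: counterfactually shift some fraction of EAA funding
# to opportunities some multiple as cost-effective as where it sat before.
shifted_fraction = 0.01  # ">1% of EAA funding"
multiplier = 2.0         # "twice as cost-effective"

# Express both options in "baseline cost-effectiveness dollars" per year.
etg_value = etg_donation                                            # donated at baseline cost-effectiveness
research_value = eaa_funding * shifted_fraction * (multiplier - 1)  # extra value created by the shift

print(f"ETG:      ~${etg_value:,.0f}-equivalent per year")
print(f"Research: ~${research_value:,.0f}-equivalent per year")
# At these point estimates the two come out about the same; the case for
# research rests on the shifted fraction and/or the multiplier plausibly
# being larger than these lower-bound guesses.
```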
Lastly, I want to thank you from the heart for taking your time and effort to respond to me. Appreciate it, brother.
Happy to help! This was useful for me, too. :)
(Oh, besides economics, I'm also considering grad school in philosophy, perhaps for research on population ethics, suffering-focused views and consciousness.)