LW is a rape cult
If you wouldn’t bail out a bank then would you bail out EA?
Thanks for the encouragement, Ryan!
I’ve been tentatively considering a career in the actuarial sciences recently. It seems like the field compensates people pretty well, is primarily merit-based, doesn’t require much, if any, programming ability (which I don’t really have), and doesn’t have very many prerequisites to get into, other than strong mathematical ability and a commitment to taking the actuarial exams.
Also, actuarial work seems much slower-paced than the work done in many careers that are frequently discussed on 80K Hours, which would make me super happy. I’m a bit burnt out on life right now, and I really don’t want to go into a high-stress job, or a job with unusually long hours, after I graduate at the end of this semester. I guess that if I weren’t a failure, I would have figured out what I was doing after graduation by now.
Are there any actuaries in the EA movement, or does anyone have any insights about this field that I might not have? My main concern regarding potentially becoming a trainee actuary is that the field is somewhat prone to automation. Page 71 of this paper, which was linked to in 80K Hours’ report on career automation, suggests that there’s a 21% chance that actuarial work can be automated. The automation of certain tasks done by actuaries is frequently discussed on the actuary subreddit, as well.
Thanks for reading, and for any advice or thoughts that you might have for me!
I’m an emotivist—I believe that “x is immoral” isn’t a proposition, but, rather, is just another way of saying “boo for x”. This didn’t keep me from becoming an EA, though; I would feel hugely guilty if I didn’t end up supporting GiveWell and other similar organizations once I have an income, and being charitable just feels nice anyways.
I agree with everything in your two replies to my post.
You know, I’m probably more susceptible to being dazzled by de Grey than most—he’s a techno-optimist, he’s an eloquent speaker, he’s involved in Alcor, and I personally have a stake in life-extension tech being developed. I’m not sure how much these factors have influenced me in subtle ways while I was writing up my thoughts on SENS.
Anyhow, doing cost-effectiveness estimates is one of my favorite ways of thinking about and better understanding problems, even when I end up throwing out the cost-effectiveness estimates at the end of the day.
I haven’t found any such breakdown, even after looking around for a while. The 80,000 Hours interview with Aubrey, as well as a number of YouTube interviews featuring Aubrey (I don’t remember which ones, sorry), note that Aubrey thinks SENS could make good use of $1 billion over the next ten years, but none of these sources justify why this much money is needed.
Thank you for sharing this! I hadn’t known that Bronies for Good had switched to fundraising for organizations recommended by GiveWell—given the variety of organizations that Bronies for Good has supported in the past, I certainly hope that they continue to support EA-approved organizations in the future, rather than moving on to another cause.
Anti-aging seems like a plausible area for effective altruists to consider giving to, so thank you for raising this thought. It looks like GiveWell briefly looked into this area before deciding to focus its efforts elsewhere.
I’ve seen a few videos of Aubrey de Grey speaking about how SENS could make use of $100 million per year to fund research on rejuvenation therapies, so presumably SENS has plenty of room for more funding. SENS’s Form 990 tax filings show that the organization’s assets jumped by quite a lot in 2012, though this was because of de Grey’s donations during that year, and though I can’t find SENS’s Form 990 for 2013, I would naively guess that they’ve been able to start spending the money donated in 2012 during the last couple of years. I still think that it would be worthwhile to ask someone at SENS where a marginal donation to the foundation would go in the short term—maybe a certain threshold of donations needs to be reached before rejuvenation research can be properly begun in the most cost-effective way.
I agree with Aubrey that too much money is spent researching cures to specific diseases, relative to the amount spent researching rejuvenation and healthspan-extension technology. I’ve focused this response on SENS because, as a person with a decent science background, I feel like Aubrey’s assertion that (paraphrased from memory) “academic research is constrained in a way that rewards low expected value projects which are likely to yield results quickly over longer term, high expected value projects” is broadly true, and that extra research into rejuvenation technologies is, on the margin, more valuable than extra research into possible treatments for particular diseases.
Hi there! In this comment, I will discuss a few things that I would like to see 80,000 Hours consider doing, and I will also talk about myself a bit.
I found 80,000 Hours in early/mid-2012, after a poster on LessWrong linked to the site. Back then, I was still trying to decide what to focus on during my undergraduate studies. By that point in time, I had already decided that I needed to major in a STEM field so that I would be able to earn to give. Before this, in late 2011, I had been planning on majoring in philosophy, so my decision in early 2012 to do something in a STEM field was a big change from my previous plans. I hadn’t known which STEM field I wanted to major in at this point; I had only realized that STEM majors generally had better earning potentials than philosophy majors.
This ties back into 80,000 Hours because I would have liked someone to help me decide which STEM field to go into. Actually, I can’t find any discussion of choosing a college major anywhere on the 80,000 Hours site, though there are a couple of threads on this topic posted to LessWrong. I would like to see an in-depth discussion of major choice as one of the core posts on 80,000 Hours.
Anyhow, I ended up majoring in chemistry because it seemed like one of the toughest things that I could major in—I made this decision under the rule of thumb that doing hard things makes you stronger. I probably should have majored in mathematics, because I actually really enjoy math, and have gotten good grades in most of my math classes; neither of those things is true of the chemistry classes I have taken. I think that my biggest previous misconception about major choice was that all STEM majors were roughly equal in how well they prepared you for the job market—looking back, I feel that CS and math are two of the best choices for earning to give, followed by engineering and then biology, with chemistry and physics as the two worst options for students interested in earning to give. Of course, YMMV, and people with physics degrees do go into quantitative finance, but I do think that not all STEM majors are equally useful for earning to give.
The second thing that I would like to mention is that, from my point of view, 80,000 Hours seems very elitist. I don’t mean this in a bad way, really, I don’t, but it is hard to be in the top third of mathematics graduates from an Ivy League university. The first time that I had a face-to-face conversation with an effective altruist who had been inspired by 80,000 Hours, I told them that I was planning on doing important scientific research, and they just gave me a look and asked me why I wasn’t planning on going into one of the more lucrative earning-to-give type careers.
I am sure that this person is a good person, but this episode leads me to wonder whether it would be a good idea to add, to the top careers page on 80,000 Hours’ site, more jobs suited to very smart people who aren’t quite ready to go into quantitative finance or strategic consulting. Specifically, mechanical, chemical, and electrical engineering, as well as the actuarial sciences, could be acceptable fields for one to go into for earning to give.
Does anyone have any thoughts on how much we should value leading other people to donate? I mean this in a very narrow sense, and my thoughts on this topic are quite muddled, so I’ll try to illustrate what I mean with a simplified example. I apologize if my confusion ends up making my writing unclear.
If I talk with a close friend of mine about EA for a bit, and she donates $100 to, say, GiveWell, and then she disengages from EA for the rest of her life, how much should I value her donation to GiveWell? In this scenario, it seems like I’ve put some time and effort into getting my friend to donate, and she presumably wouldn’t have donated $100 if I hadn’t chatted with her, so it feels like maybe I did a few dollars worth of good by chatting with her. At the same time, she’s the one who donated the money, so it feels like she should get credit for all of the good that was done because of her donation. But wait—if I did a few dollars of good, then does that mean that she did less than $100 worth of good?
At this point, my moral intuitions on this issue are all over the place. I guess that positing that the story above actually has a problem implies that the good done by my friend and me should sum to $100, but the only reason I’ve tacitly assumed that to be true is because it intuitively feels true. I previously wrote a comment on LessWrong on this topic that wasn’t any clearer than this comment, and this response was quite clear, but I’m still confused.
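One way to see why the intuitions clash is to write out the naive counterfactual arithmetic explicitly. This is just a toy sketch of the scenario above (the $100 figure and the assumption that the donation happens only if both of us act are taken from the example, not from any real accounting method):

```python
def counterfactual_impact(outcome_with_actor, outcome_without_actor):
    """Naive counterfactual credit: what actually happened,
    minus what would have happened without this actor."""
    return outcome_with_actor - outcome_without_actor

donation = 100  # dollars donated to GiveWell in the example

# Without my chat, she wouldn't have donated; without her, no donation either.
my_credit = counterfactual_impact(donation, 0)
her_credit = counterfactual_impact(donation, 0)

# Each of us is counterfactually responsible for the full $100,
# so naive counterfactual credit sums to $200, not $100.
print(my_credit, her_credit, my_credit + her_credit)
```

So the tension isn’t a calculation error: counterfactual credit for the two of us genuinely sums to more than the $100 that was donated, and the feeling that credit “should” sum to exactly $100 is an extra assumption, not something the counterfactual framing gives you for free.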