Thanks for this, gavintaylor!
I’ve been compiling a list of lessons from EA that seem applicable across a wide array of cause areas/charities. It seems relevant to this post, so here’s a link:
(as of now it is still in its infancy, but I’m planning to continue working on it)
Interesting question! I have upvoted. A (very) minor issue (and it perhaps is just me): you may want to consider spelling out what ‘ACE’ stands for. It took me a minute to realise that Animal Charity Evaluators hadn’t gone completely off the wall.
A well-written post with a good level of depth on an important topic. Thank you! If I were to give a suggestion, I would say that the title isn’t a very good flag of the content.
This isn’t a Task Y (at least it doesn’t obviously fulfil your outlined components), but a small-scale interaction with EA not currently mentioned is the 80,000 Hours career guide email scheme, which sends you a part of their career guide each week to read and interact with. It supposedly takes 180 minutes to complete over 12 weeks, and it seems like it may have up-scaling capabilities too.
Another potential quick fix along similar lines could be a more robust internship system within EA. The 80,000 Hours job board lists internships, and we could encourage people to use it to fill part of the role of a Task Y (lower commitment, clear positive effect, builds career capital).
However, I would stress that neither of these ideas seem like they would completely fulfil the role of a good Task Y.
Here are some individual podcasts I would recommend as being especially good sources of conversation for podcast discussion meetings:
From The 80,000 Hours Podcast:
#45 - Prof Tyler Cowen’s stubborn attachments to maximising economic growth, making civilization more stable & respecting human rights
#25 - Prof Robin Hanson on why we have to lie to ourselves about why we do what we do
#24 - Stefan Schubert on why it’s a bad idea to break the rules, even if it’s for a good cause
Sam Harris’ Waking Up - #44 - Being Good and Doing Good with Will MacAskill
Here are some of the notes I thought would make good discussion points from our event on the 80,000 Hours podcast episode #25, ‘Why we have to lie to ourselves about why we do what we do, according to Prof Robin Hanson’:
- Are we at university just to show off?
- Should we all cave to religion for practical reasons? If it is practically useful, why isn’t everyone religious?
- How does EA accommodate people wanting to show they care? Wear badges?
- Showing we care vs big-headedness
- Are we EAs to show off?
- Should we be saving all our money till we hit a peak point of effectiveness in our 40s?
- Should we start an NGO interested in streamlining marginal charity? (I’m a big fan of posing ideas for NGOs within the podcast talk)
- Should we all start a pact to pool our money and give it all away in 200 years’ time? What would be the optimal amount of time to wait? (Interestingly, this question sparked a movement in the group: about half of us were convinced that this was highly effective, and for the next few weeks we would bring it up. The idea was only quelled when we saw a statistic about how much more charities valued a donation now against the same amount next year; it was more than any interest rate we could hope to get on our stored cash)
- Would you pay $50 to know the hospital death rates for the surgery you’re about to undergo?
- Should we be selling EA to everyone?
- Are we a youth movement?
- Is identity the most important part of EA?
Interesting concept! A few considerations on your impact calculation, though:
The 500 users would all need to be non-EAs (as EAs would probably have given a similar if not identical amount to effective charities anyway, perhaps only using the app for its UI, progress tracking, etc.). Also, I don’t know if you have already considered this, but the 500 members would likely have counterfactually given money to (albeit probably less effective) charities anyway, so you would need to account for this in an accurate estimate of your impact.
Another thing to consider is that the range of charities you provide can have a big effect on your impact calculation. Your EA investors would presumably have given their money to top EA causes, and if your users don’t allocate their money as effectively, that lessens your impact.
Also worth considering: your EA investors might actually not have given their money to charities at all, but to another EA start-up. That makes the counterfactual much harder to pin down (evidently they thought that giving to Sparrow had higher expected value than top EA charities, and presumably they could have found another group with similar expected value).
Although none of this probably makes much difference at your higher user counts, I would worry that your lower estimates may be misleading.
Interesting post. It is a shame that people have to travel so far to get to one of the EA hubs.
I was interested in the offices in Oxford you referenced. Which offices were these? They sound like a fantastic place to get some work done on an EA project!