Nathan Young
Builds web apps (e.g. viewpoints.xyz) and makes forecasts. Currently I have spare capacity.
I have only scanned this, but it seems to have flaws I’ve seen elsewhere. In general, I recommend reading @Charles Dillon 🔸’s article on comparative advantage (Charles, I couldn’t find it here, so I suggest posting it):
The quickest summary is:
Comparative advantage guarantees me work, but not that that work will pay enough for me to eat
If comparative advantage is a panacea, why are there fewer horses than there used to be?
My AI Vibes are Shifting
I don’t have time to research this take, but one of my economist friends criticised this study for the following two reasons:
They claimed the averted deaths occurred during a famine, so there was regression to the mean (in a normal period there wouldn’t have been so many deaths in the control group)
They claimed the averted deaths were close to hospitals, so areas without existing healthcare infrastructure would not see this benefit, and the counterfactual value of the money is therefore lower
I haven’t looked into this robustly, so if someone has, please agree- or disagree-vote on this comment accordingly.
Thanks to GiveDirectly for their work.
I was looking around for one of these numbers and Perplexity sent me here, which is I suppose a bit ironic.
Let’s discuss this on the other blog; I’m not sure it’s good to do it in two places at once.
I agree that it could be easier for people in EA to build a track record that funders take seriously.
I struggle to know whether your project is underfunded: many projects aren’t great, and some have to be rejected. To figure that out we have to actually discuss the project, and I’ve appreciated our back and forth on the other blog you posted.
How have you factored this into your calculations? Surely if the returns are much lower, the total % of the market that could be run like this is much smaller?
Surely it’s going to be much more difficult for a PFG company to raise capital? Stock prices are (in some way) related to future profits. If you are locked into giving 90% away, doesn’t that mean the stock will trade at a much lower price, and hence that it will be much harder for VCs to get their return?
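To make that intuition concrete, here is a minimal back-of-envelope sketch. All the numbers (the profit figure, the valuation multiple) are my assumptions for illustration, not anything from the PFG proposal:

```python
# Hypothetical sketch: value equity as a simple multiple of the profits
# that actually accrue to shareholders. All numbers are assumed.
annual_profit = 1_000_000       # assumed steady annual profit, in dollars
valuation_multiple = 10         # assumed price-to-earnings-style multiple

# Normal company: shareholders have a claim on 100% of profits.
normal_value = annual_profit * valuation_multiple

# PFG company locked into giving 90% away: shareholders claim only 10%.
pfg_value = (annual_profit // 10) * valuation_multiple

print(normal_value // pfg_value)  # 10: equity ~10x cheaper under this model
```

Under this toy model a VC’s exit is ten times smaller for the same underlying business, which is the mechanism behind the capital-raising worry; a fuller model would also have to account for growth and reinvestment.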
I guess my questions are:
What is “earning to give”? Is the typical earner-to-give giving $1m? $10m? At what point do we want people to switch?
Is there a genuinely different skill set? Like, are there some people who are very mediocre at EA jobs but great at earning money?
My guess is that people should have some sense of the salary at which they would earn to give, and the level of impact at which they would stop earning to give and do direct work, and then move between the two. That would also create some great on-the-job learning, because I imagine earning-to-give roles teach different skills, which can be fed back into the EA community.
It feels like if there were more money held by EAs some projects would be much easier:
Lots of animal welfare lobbying
Donating money to the developing world
AI lobbying
Paying people more for work trials
I don’t know whether some people are much more suited to earning than to direct work; the skill sets seem quite similar to me. But if they are at all different, then you should want quite different people working on quite different things.
I really like this format. Props to the forum team.
The percentage of EAs earning to give is too low
Resources are useful, and the movement is currently built around one large donor.
Feels a bit unrelated to the topic at hand?
My friend Barak Gila wrote about spending $10k offsetting plane and car miles in Cars and Carbon.
This seems way too expensive? I feel like Make Sunsets suggests you can offset a lifetime of carbon for around $500.
I think a big problem is that it’s hard to know what to believe here, and hence people don’t offset.
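A rough sanity check shows why the two figures are hard to reconcile. The lifetime-emissions numbers here are my own assumptions for illustration (roughly 15 tonnes of CO2 per year for a high-income lifestyle, over 70 years), not Barak’s or Make Sunsets’:

```python
# Hedged back-of-envelope: the price per tonne of CO2 implied by each figure.
# The 15 t/year and 70-year lifetime are assumptions for illustration only.
lifetime_tonnes = 15 * 70                        # ~1,050 tonnes over a lifetime

price_per_tonne_10k = 10_000 / lifetime_tonnes   # implied by the $10k figure
price_per_tonne_500 = 500 / lifetime_tonnes      # implied by the ~$500 figure

print(round(price_per_tonne_10k, 2))  # ~9.52 dollars per tonne
print(round(price_per_tonne_500, 2))  # ~0.48 dollars per tonne
```

Under the same assumed lifetime footprint, the two offers imply per-tonne prices about twenty times apart, which is exactly the “hard to know what to believe” problem.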
To add some thoughts/anecdotes:
I’m sad this happens. I have had similar and it’s hard.
It seems like orgs and individuals have different incentives here: orgs want as many applicants as possible, while individuals want to get jobs.
I have been asked to apply for one to three jobs that seemed wildly beyond my qualifications, then failed at the first hurdle without any feedback. This was quite frustrating, but I guess I understand why it happens.
I like that work trials are paid well
If we believe the best person for a job might be 5-10x better than the next best, then perhaps it’s worth really trying to get that marginal person.
I wish orgs would publish the number of applicants they have and give more feedback about where one stands relative to the level required to proceed. Clear feedback is another kind of pay.
I feel annoyed when there isn’t honesty that, for some, EA work can be kind of drudgery. I don’t think we should expect doing good to be all sunshine. I have never managed to find an EA job that was a good fit (across tens of applications, I guess), and perhaps I shouldn’t expect to, but that expectation can be set: some people will find jobs easily, some won’t, and we don’t all have skills that map neatly onto what EA orgs want.
EA orgs aren’t, on average, great forecasters about the future; forecasting is really hard. Perhaps they are better than the median person at judging which skills to get into (AI has seemed like a big win in terms of ways to have impact), but getting any well-paid, non-evil job (especially in AI) will likely build skills that are valuable. Down the line those skills (and some of the money) can hopefully be pushed into doing good, even if now isn’t an easy time for that. So it’s worth considering taking a lucrative, non-evil job now and keeping more options open later. I hold this advice weakly.
I tend to think one falls in love very hard the first time, and EA is a little like that. Many people have never had a community before, and it feels great to be part of something. Many want to do exactly the right thing; the orgs seem wise, the leaders hyper-competent. But to me, many EA orgs and leaders seem pretty good, not superhuman, just focused on neglected topics. This is worth bearing in mind while applying. I weakly think that building my own skills will probably do more good than working at just the right place or for just the right person. In a small number of cases a specific org, person, or project seems 10x better than everything else; in those cases I push myself to go for it, but that seems different from applying for “EA jobs” in general.
Debate experiments at The Curve, LessOnline and Manifest
I dunno, by how much? Seems contingent on lots of factors.
I like Ben personally.
I don’t intend to quote-tweet him, but I’d like someone to make a kind of defence.
Ben Landau-Taylor tweeted this a couple of days back:
It has been annoying me, since I don’t think it’s accurate. Here is my proposed response (these aren’t tweets; it’s a scheduling app where I draft):
I would appreciate criticism.
Thanks for doing this and for adding the shortcut portal.