Thanks for sharing. I think it was brave and I appreciated getting to read this. I’m sorry you’ve had to go through this and am glad to hear you’re feeling optimistic.
I think the people responsible for EA Global admissions (including Amy Labenz, Eli Nathan, and others) have added a bunch of value to me over the years by making it more likely that a conversation or meeting with somebody at EA Global who I don't already know will end up being productive. Making admissions decisions at EAG (and being the public face of an exclusive admissions policy) sounds like a really thankless job, and I know a bunch of the people involved end up having to make decisions that make them pretty sad because they think it's best for the world. I mostly just wanted to express some appreciation for them and to mention that I've benefitted from their work, since this kind of thing feels uncomfortable to say out loud and so probably goes underexpressed.
One positive effect of selective admissions that I don’t often see discussed is that it makes me more likely to take meetings with folks I don’t already know. I’d guess that this increases the accessibility of EA leaders to a bunch of folks in the community.
Fwiw, I’ve sometimes gotten overambitious with the number of meetings I take at EAG and ended up socially exhausted enough to be noticeably less productive for several days afterwards. This is a big enough cost that I’ve skipped some years. So, I think in the past I’ve probably been on the margin where if the people at EAG had not been selected for being folks I could be helpful to, I’d have been less likely to go.
I think this request undermines how karma systems should work on a website. ‘Only people who have engaged with a long set of prerequisites can decide to make this post less visible’ seems like it would systematically prevent posts people want to see less of from being downvoted.
Hi EarlyVelcro,
Howie from 80k here.
As Ben said in his comment, the key ideas page, which is the most current summary of 80k’s views, doesn’t recommend that “EA should focus on AI alone”. We don’t think the EA community’s focus should be anything close to that narrow.
That said, I do see how the page might give the impression that AI dominates 80k’s recommendations since most of the other paths/problems talked about are ‘meta’ or ‘capacity building’ paths. The page mentions that “we’d be excited for people to explore [our list of problems we haven’t yet investigated] as well as other areas that could foreseeably have a positive effect on the long-term future” but it doesn’t say anything about what those problems are (other than a link to our problem profiles page, which has a list).
I think it makes sense that people end up focusing on the areas we mention directly, and the page could do a better job of communicating that our priorities are more diverse than that.
The good news is that we’re currently putting together a more thorough list of areas that we think might be very promising but aren’t among our priority paths/problems.[1] Unfortunately, it didn’t quite get done in time to add it to this version of key ideas.
More generally, I think 80k’s content was particularly heavy on AI over the last year and, while it will likely remain our top priority, I expect it will make up a smaller portion of our content over the next few years.
[1] Many of these will be areas we haven’t yet investigated or areas that are too niche to highlight among our priority paths.
Know that other people have gone through the disillusionment pipeline, including (especially!) very smart, dedicated, caring, independent-minded people who felt a strong affinity for EA. Including people who you may have seen give talks at EA Global or who have held prestigious jobs at EA orgs.
Also, I think even people like this who haven’t gone through the disillusionment pipeline are often a lot more uncertain about many (though not all) things than most newcomers would guess.
Hi Kevin,
Howie from 80k here. Note that I’m just one staff member and don’t speak on behalf of the whole team, but I wanted to give some thoughts on your comment.
First of all, I appreciate the feedback and I’m sorry to hear that you had a frustrating experience applying for coaching from 80k. Unfortunately, we can only work with a small fraction of the people who apply. Our current coaching page tries to make this clear and I apologize if we’ve done a bad job communicating this (either currently or in the past).
I understand that this situation might be particularly frustrating in light of the EA community’s recent emphasis on ‘talent gaps.’ We think this term has ended up causing a lot of confusion and it’s partly our fault, so we wrote a post back in November that tries to clarify our thinking on this issue.
In particular, we believe that very specific skill sets are the biggest constraint for most of our priority problems, as opposed to ‘talent’ in general. To address this, we’ve increased our focus on what we call ‘priority paths,’ which are the best ways we know of for people to use their careers to relieve these specific constraints. Our current priority in coaching is to help people who have demonstrated an interest in, and the ability to excel in, (at least) one of these paths. This is partly because we think the need for people in these paths is particularly pressing, but also because we’re most able to help people in areas that are our focus. We also try to make sure we talk to the people we think we’re best placed to help in other ways: for example, some of our advice and many of the connections we can make are particularly valuable for people who don’t already have lots of links to other effective altruists.
You don’t literally need an ML PhD from Harvard to be accepted but we agree that there are a lot of very talented people who we don’t end up advising. In our experience, these people aren’t “wandering in the dark;” many of them are already on their way to incredibly high impact careers. Our aim is also for the research and content we put out (both in writing and through the podcast) to be useful to the vast majority of our audience that doesn’t receive formal coaching.
We do hope to add to our coaching capacity sometime this year. That said, we continue to endorse our decision to hire relatively slowly. As you say, we think it’s important to stay ‘lean’ at this stage in our development. Hiring and training are time intensive and directly trade off against our capacity to develop and improve our various products. Moreover, we’ve found that it’s quite hard to hire qualified coaches. Coaches need to have a very broad and deep understanding of EA, as well as being skilled advisors, which means most good candidates have several other excellent career options. Our users often make major career decisions based on our coaching, so we think it’s essential to maintain a very high bar for these positions.
Hope this helps to at least somewhat explain where we’re coming from.
I also donated $5,800 (though not due to this post).
Yes — since the first week of the crisis, Nick and Will have been recused from the relevant discussions / decisions on the boards of both EV entities to avoid any potential conflict of interest. Staff in both EV entities were informed about that decision in mid-November.
I didn’t actually become a member until after the wording of the pledge changed but I do vividly remember the first wave of press because all my friends sent me articles showing that there were some kids in Oxford who were just like me.
Learning about Giving What We Can (and, separately, Jeff and Julia) made me feel less alone in the world and I feel really grateful for that.
Probably the best thing I got out of two years in law school...
Fwiw, for mental health I’m not sure whether therapy is more likely to treat the ‘root causes’ than medications. You could have a model where some ‘chemical thingie’ that can be treated by meds is the root cause of mental illness, and the cognitive patterns treated by therapy are the symptoms.
In reality, I’m not sure the distinction is even meaningful given all the feedback loops involved.
I think it’s especially dangerous to use this word when talking about high schoolers, especially given the number of cult and near-cult groups that have arisen in communities adjacent to EA.
Hi Jan, thanks for your thoughts. Kit’s response is fairly close to our views.
The most important thing we want to emphasize is that at 80,000 Hours we definitely don’t think that working at an EA org is the only valuable thing for people to do. I think that, taken as a whole, our writing reflects that.
The best way to quickly get a sense of our views is reading through our high impact careers article, especially the list of 10 priority paths. Only one of these is working at an EA org.
I think our job board, problem profiles, podcast and so on give a similar sense of how much we value people working outside EA orgs.
A second key point is that when we score plan changes, we do not have a strict formula. We score changes based on our overall views of which paths are high-impact and assess many of the plan changes, especially the larger ones, on an individual basis, rather than simply putting them in a category. As an approximation, those we most prioritise are those represented by the 10 priority paths.
Of our top-rated plan changes, only 25% involve people working at EA orgs.
Fortunately, scoring on a case-by-case basis makes our scoring less vulnerable to Goodharting. Unfortunately, it means that it’s difficult for us to communicate exactly how we score plan changes to others. When we do so, it’s generally a few sentences, which are just aimed at giving people a sense of how impactful 80,000 Hours is as an organisation. These explanations are not intended to be career advice and it would be a shame if people have been taking them as such.
The specific sentences you quote are a bit out of date and we explain the categories differently in a draft of our annual review, which we hope to publish in the coming months. For example, we often score a plan change as rated-10 if somebody takes up a particularly valuable skill-building opportunity within one of our priority paths.
I hope that helps answer your concern!
Hi Jan,
I just wrote a bit more about how we measure IASPCs in another comment on this thread. We don’t use a precise formula, and the details are important, so I can’t say exactly how we’d rate a particular change at this level of generality.
That said, we take someone’s degree of follow-through into account when we score their plan change, such that very few of our highest-rated plan changes (rated-100 or more) are from people who are not currently doing impactful work.
Of the rated-10s, our analysis in 2017 found:
30% have changed their plans but not yet passed a major “milestone” in their shift. Most of these people have applied to a new graduate programme but not yet received an offer.
30% have reached a milestone, but are still building career capital (e.g. entered graduate school, or taken a high-earning job but not yet donated much).
40% have already started having an impact (e.g. have published research, taken a non-profit job).
I agree that if we didn’t take follow-through into account, it would lead to some scores far removed from expected impact, such as in the hypothetical you’ve described.
Hope this clarifies things.
“A foreign national may not direct, dictate, control or directly or indirectly participate in the decision-making process of any person (such as a corporation, labor organization, political committee or political organization) with regard to the person’s federal or nonfederal election-related activities. This includes decisions concerning the making of contributions, donations, expenditures or disbursements in connection with any federal, state or local election or decisions concerning the administration of a political committee.”
https://www.fec.gov/help-candidates-and-committees/foreign-nationals/
I think it would be pretty hard to argue that a donation swap didn’t at least involve indirectly participating in someone’s decision to donate.
Indeed, IIRC, EAs tend to be more progressive/left-of-center than the general population. I can’t find the source for this claim right now.
The 2019 EA Survey says:
“The majority of respondents (72%) reported identifying with the Left or Center Left politically and just over 3% were on the Right or Center Right, very similar to 2018.”
Note that Lentzos has also been critical of Bill Gates for drawing attention to the risk of terrorists using bioweapons. She thinks that terrorists are unlikely to deploy powerful bioweapons (because they won’t have the capabilities and because they won’t have the motivation) and by talking about bioterrorists, Gates might draw attention away from state actors. https://thebulletin.org/2017/07/ignore-bill-gates-where-bioweapons-focus-really-belongs/
She’s written more about why she thinks concerns about terrorists using synthetic biology to create WMDs are based on myths. Her main points of disagreement with what she calls the dominant narrative:
1. Synthetic biology is not easy.
2. Do-it-yourself biology is not particularly sophisticated.
3. Building a dangerous virus from scratch is hard.
4. Even experts have a hard time enhancing disease pathogens.
5. Terrorists aren’t interested in making WMD bioweapons.
6. There are serious technical and logistical barriers to creating a bioweapon that’s a WMD.
https://thebulletin.org/2014/09/the-myths-and-realities-of-synthetic-bioweapons/
My impression is that a lot of her quick success came from the fact that her antitrust work tapped into progressive anti-Big-Tech sentiment. It’s possible EAs could somehow fit into the biorisk zeitgeist, but otherwise I think it would take a lot of thought to figure out how an EA could replicate this.
I briefly and informally looked into this several years ago and, at the time, had a few additional concerns. (Can’t promise I’m remembering this perfectly and the research may have progressed since then).
1) Many of the best studies on mindfulness’s effect on depression and anxiety were specifically on populations where people had other medical conditions (especially, I think, chronic pain or chronic illness) in addition to mental illness. But most people I know who are interested in mindfulness aren’t specifically interested in this population.
My impression is that Jon Kabat-Zinn initially developed Mindfulness-Based Stress Reduction (MBSR) for people with other conditions and my intuition from my experience with it is that it might be especially helpful for things like chronic pain. So I had some external validity concerns.
2) There were few studies of long-term effects and it seems pretty plausible the effects would fade over time. This is especially true if we care about intention-to-treat effects. The fixed cost of an MBSR course might only be justified if it can be amortized over a fairly long period. But it wouldn’t be surprising if there are short-to-medium term benefits that fade over time as people stop practicing.
By contrast, getting a prescription for anti-depressant or anti-anxiety medication has a much lower fixed cost, and it’s cheaper and easier to take a pill every day (or as needed) than to keep up a meditation practice. (On the other hand, some meds have side effects for many people.) There’s a rough back-of-envelope version of this comparison after the list.
3) You already mention that “many of those researching it seem to be true believers” but it seems worth reemphasizing this. When I looked over the studies included in a meta-analysis (I think it was the relevant Cochrane Review), I think a significant proportion of them literally had Jon Kabat-Zinn (the founder of MBSR) as a coauthor.
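To put entirely made-up numbers on the amortization point in 2): suppose an MBSR course costs something like $500 plus ~30 hours of class and practice time, while a generic prescription costs on the order of $100–$200 a year plus a few appointments. On those invented figures, the course only looks cost-competitive if its benefits persist for a few years after it ends, which is exactly the kind of long-term evidence that seemed to be missing.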
---
All that said, my personal subjective experience is that meditating has had a moderate but positive effect on my anxiety and possibly my depression when I’ve managed to keep it up.
Hey Bob—Howie from EV UK here. Thanks for flagging this! I definitely see why this would look concerning so I just wanted to quickly chime in and let you/others know that we’ve already gotten in touch with relevant regulators about this and I don’t think there’s much to worry about here.
The thing going on is that EV UK has an extended filing deadline (from 30 April to 30 June 2023) for our audited accounts,[1] which are one of the things included in our Annual Return. So back in April, we notified the Charity Commission that we’ll be filing our Annual Return by 30 June.
[1] This is due to a Covid extension, which the UK government has granted to many companies.