Is helpful/friendly :-) Loves to learn. Wants to solve neglected problems. See website for current progress.
Madhav Malhotra
Antifungal Resistance—The Neglected Cousin of Antibiotic Resistance
Summary: The Case for Halting AI Development—Max Tegmark on the Lex Fridman Podcast
Summary of “Technology Favours Tyranny” by Yuval Noah Harari
Biodiversity Loss: In the Shadow of Climate Change
For what it’s worth, I run an EA university group outside of the U.S. (at the University of Waterloo in Canada). I haven’t observed any of the points you mentioned in my experience with our group:
We don’t run intro to EA fellowships because we’re a smaller group, and we’re not trying to convert more students to be ‘EA’. Instead, we focus on supporting whoever’s interested in working on EA-relevant projects (ex: a cheap air purifier, a donations advisory site, a cybersecurity algorithm, etc.), whether or not they identify with the EA movement.
Since we’re not trying to get people to become EA members, we’re not hosting any discussions where a group organiser could convince people to work on AI safety over all else.
No one’s getting paid here. We have grant money that we’ve used for things like hosting an AI governance hackathon. But that money gets used for things like marketing, catering, prizes, etc. - not salaries.
Which university EA groups specifically did you talk to before proclaiming “University EA Groups Need Fixing”? Based only on what I read in your article, a more accurate title seems to be “Columbia EA Needs Fixing”.
What are the top 2-3 issues Rethink Priorities is facing that prevent you from achieving your goals? What are you currently doing to work on these issues?
Next Generation Biosecurity—Summary of Course by University of Bath
Preventing AI Misuse: State of the Art Research and its Flaws
What have you been intentional about prioritising in the workplace culture at Rethink Priorities? If you focus on making it a great place for people to work, how do you do that?
Young People of EA—Database of Friendly Contacts
AI Safety For Dummies (Like Me)
Internship Lessons at the Happier Lives Institute
To any staff brave enough to answer :D
You’re fired tomorrow and replaced by someone more effective than you. What do they do that you’re not doing?
A lot of people have gotten the message from EA: “Direct your career towards AI safety!” Yet there seem to be far too few opportunities to get mentorship or a paying job in AI safety. (I say this having seen others’ comments on the forum and having personally applied to 5+ fellowships where there were 500-3000% more applicants than spots.)
What advice would you give to those feeling disenchanted by their inability to make progress in AI safety? How is 80K hours working to better (though perhaps not entirely) balance the supply and demand for AI safety mentorship/jobs?
Effective Giving Recommendations in India?
The UX has improved so much since the 2022 version of this :-) It feels concise, and the scrolling to each new graph makes it interesting to learn each new thing. Kudos to whoever designed it this way!
I’d be interested in hearing someone from Anthropic discuss the upsides or downsides of this arrangement. From an entirely personal standpoint, it seems odd that Anthropic gave up equity AND accepted restrictions on how the investment could be used. That said, I imagine there are MANY other details I’m not aware of, since I wasn’t involved in the decision.
In your past experiences, what are the biggest barriers to getting your research in front of governmental organisations? (ex: official development aid grantmakers or policy-makers)
What are the biggest barriers to getting them to act on it?
It takes courage to share such detailed stories of goals going awry. Good on you for doing so :-)
It seems that two kinds of improvements within EA might be helpful to reduce the probability of other folks having similar experiences.
Proactively, we could adjust the incentives promoted (especially by high-visibility organisations like 80K hours). Specifically, I think it would be helpful to:
Recommend that early-career folks try out university programs with internships/co-ops in the field they think they’d enjoy. This would help them error-correct earlier rather than later.
Adjust the articles on high-visibility sites to focus less on finding the “most” impactful career path and more on finding one of many impactful career paths. I especially say this because sites like 80K hours have gotten a lot more general traffic ever since they vastly increased marketing. When you’re reaching a broader target audience (especially for the first time), it’s not as essential to urgently direct someone to the exact right career path. It might be a more reasonable goal to get them thinking about a few options. Then, those who want to refine their plan can be directed to more specialised resources within EA (ex: biosecurity → reading list).
To be more specific about what I mean by making content focus on “one of many impactful paths,” here are example rewrites of content from 80K hours’ career reviews:
Original: “The highest-impact career for you is the one that allows you to make the biggest contribution to solving one of the world’s most pressing problems.”
Rewrite: The highest-impact career for you depends on your unique skills and motivations. Out of the careers that suit you, which ones increase your contributions to solving one of the world’s most pressing problems?
Original: “Below we list some other career paths that we don’t recommend as often or as highly as those above, but which can still often be top options for people we advise.”
Rewrite: Below, we list some career paths that we recommend less frequently than those above. However, they might specifically be a good fit for your unique preferences.
Original: “The lists are based on 10 years of research and experience advising people, and represent the careers it seems to us will be most impactful over the long run if you get started on them now — though of course we can’t be sure what the future holds.”
Rewrite: None, the ending clause on uncertainty is good :-)
Reactively, various efforts have been trying to improve mental health support within EA. I look forward to seeing continued progress in creating easily-accessible collections of resources!
I’m not sure this is a good idea.
It seems possible that the individual interventions whose research you link to aren’t representative of every possible skill-development intervention.
Also, it seems possible that future interventions may integrate building both human capital and economic capital to enable recipients to make changes in their lives (i.e. skill-building plus direct cash transfers).
Also, it’s generally uncertain whether GiveDirectly will remain the most effective or most endorsed donation recommendation. I say this given changes in how we measure wellbeing (admittedly, a topic where opinions are frequently updated and mistakes corrected).
Why potentially reduce the effectiveness of those future interventions by launching this campaign?