As technicalities noted, it's easy to see the merits of these arguments in general, but harder to see who should actually do things, and what they should do.
To summarize the below:
EA orgs already look at a wide range of causes, and the org with the most money looks at perhaps the widest range of causes
Our community is small and well-connected; new causes can get attention and support pretty easily if someone presents a good argument, and there's a strong historical precedent for this
People should be welcoming and curious toward people from many different backgrounds, and attempts to do more impactful work should be celebrated across many kinds of work
If this isn't the case now, people should do better about this
If you have suggestions for what specific orgs or funders should do, I'm interested in hearing them!
My comments often look like: "When you say that 'EA should do X', which people and organizations in EA are you referring to?"
Open Philanthropy does more funding and research than anyone, and they work in a broad range of areas. Maybe the concrete argument here is that they should develop more shallow investigations into medium-depth investigations?
Rethink Priorities probably does the second-most research among EA orgs, and they also look at a lot of different topics.
Founders Pledge is probably top-five among orgs, and... again, lots of variety.
Past those three organizations, most research orgs in EA have pretty specific areas of focus. Animal Charity Evaluators looks at animal charities. GiveWell looks at global health and development interventions with strong RCT support. If you point ACE at a promising new animal charity to fund, or GiveWell at a new paper showing a cool approach to improving health in the developing world, they'd probably be interested! But they're not likely to move into causes outside their focus areas, which seems reasonable.
After all of this, which organizations are left that actually have "too narrow" a focus? 80,000 Hours? The Future of Humanity Institute?
A possible argument here is that some new org should exist to look for totally new causes; on the other hand, Open Philanthropy already does a lot of this, and if they were willing to fund other people to do more of it, I assume they'd rather hire those people (and they have, in fact, been rapidly expanding their research team).
*****
On your example of cancer: Open Philanthropy gave a $6.5 million grant to cancer research in 2017, lists cancer as one of the areas they support on their "Human Health and Wellbeing" page, and notes it as a plausible focus area in a 2014 report. I'm guessing they've looked at other cancer research projects and found that they fell somewhat short of their funding bar.
Aside from Open Phil, I don't know which people or entities in EA are well-positioned to focus on cancer. It seems like someone would have to encourage existing bio-interested people to focus on cancer instead of biosecurity or neglected tropical diseases, which doesn't seem obviously good.
In the case of a cancer researcher looking for funding from an EA organization, there just aren't many people who have the necessary qualifications to judge their work, because EA is a tiny movement with a lot of young people and few experienced biologists.
The best way for someone who isn't a very wealthy donor to change this would probably be to write a compelling case for cancer research on the Forum; lots of people read this website, including people with money to spend. Same goes for other causes someone thinks are neglected.
This path has helped organizations like ALLFED and the Happier Lives Institute get more attention for their novel research agendas, and posts with the "less-discussed causes" tag do pretty well here.
As far as I can tell, we're bottlenecked on convincing arguments that other areas and interventions are worth funding, rather than willingness to consider or fund new areas and interventions for which convincing arguments exist.
*****
Fortunately, there's good historical precedent here: EA is roughly 12 years old, and has a track record of integrating new ideas at a rapid pace. Here's my rough timeline (I'd welcome corrections on this):
2007: GiveWell is founded
2009: Giving What We Can is founded, launching the "EA movement" (though the term "effective altruism" didn't exist yet). The initial focus was overwhelmingly on global development.
2011: The Open Philanthropy Project is founded (as GiveWell Labs). Initial shallow investigations include climate change, in-country migration, and asteroid detection (conducted between 2011 and 2013).
2012: Animal Charity Evaluators is founded.
2013: The Singularity Institute for Artificial Intelligence becomes MIRI
2014: The first EA Survey is run. The most popular orgs people mention as donation targets are (in order) AMF, SCI, GiveDirectly, MIRI, GiveWell, CFAR, Deworm the World, Vegan Outreach, the Humane League, and 80,000 Hours.
To be fair, the numbers look pretty similar for the 2019 survey, though they are dwarfed by donations from Open Phil and other large funders.
Depending on where you count the "starting point", it took between 5 and 7 years to get from "effective giving should exist" to something resembling our present distribution of causes.
In the seven years since, we've seen:
The launch of multiple climate-focused charity recommenders (I'd argue that the Clean Air Task Force is now as well-established an "EA charity" as most of the charities GiveWell recommends)
The rise of wild animal suffering and AI governance/policy as areas of concern (adding a ton of depth and variety to existing cause areas: it hasn't been that long since "AI" meant MIRI's technical research and "animal advocacy" meant lobbying against factory farming when those things came up in EA)
The founding of the Good Food Institute (2016) and alternative protein becoming "a thing"
The founding of Charity Entrepreneurship and resultant founding of orgs focused on tobacco taxation, lead abatement, fish welfare, family planning, and other "unusual" causes
Open Philanthropy going from a few million dollars in annual grants to roughly $200 million. Alongside "standard cause area" grants, 2021 grants include $7 million for the Centre for Pesticide Suicide Prevention, $1.5 million for Fair and Just Prosecution, and $0.6 million for Abundant Housing Massachusetts (over two years; but given that the org has a staff of one person right now, I imagine that's a good chunk of their total funding)
Three of the ten highest-karma Forum posts of all time (1, 2, 3) discuss cause areas with little existing financial support within EA
I'd hope that all this would also generate a better social environment for people to talk about different types of work; if not, individuals need better habits.
*****
Everyone reasonably familiar with EA knows that AI safety, pandemic preparedness, animal welfare and global poverty are considered EA cause areas, whereas feminism, LGBT rights, wildlife conservation and dental hygiene aren't.
I think that any of these causes could easily get a bunch of interest and support if someone published a single compelling Forum post arguing that putting some amount of funding into an existing organization or intervention would lead to a major increase in welfare. (Maybe not wildlife conservation, because it seems insanely hard for that to be competitive with farmed animal welfare, but I'm open to having my mind blown.)
Until that post exists (or some other resource written with EA principles in mind), there's not much for a given person in the community to do. Though I do think that individuals should generally try to read more research outside of the EA-sphere, to get a better sense for what's out there.
If someone is reading this and wants to try writing a compelling post about a new area, I'd be psyched to hear about it!
Or, if you aren't sure what area to focus on, but want to embrace the challenge of opening a new conversation, I've got plenty of suggestions for you (starting here).
*****
However, this calculus can be somewhat incomplete, as it doesn't take into account the personal circumstances of the particular biologist debating her career. What if she's a very promising cancer researcher (as a result of her existing track record, reputation or professional inclinations) but it's not entirely clear how she'd do in the space of clean meat? What if she feels an intense inner drive to work on cancer (since her mother died of melanoma)? These considerations should factor in when she tries to estimate her expected career-long impact.
I think that very few people in this community would disagree, at least in the example you've put forth.
*****
From my experience, a biologist choosing to spend her career doing cancer research would often feel inferior to other EAs choosing a more EA-stereotypic career such as pandemic preparedness or clean meat. When introducing herself in front of other EAs, she may start with an apology like "What I'm working on isn't really related to EA".
What if we tried more actively to let people feel that whatever they want to work on is really fine, and simply tried to support and help them do it better through evidence and reason?
This is where I agree with you, in that I strongly support "letting people feel that what they want to work on is fine" and "not making people feel apologetic about what they do".
But I'm not sure how many people actually feel this way, or whether the way people respond to them actually generates this kind of feeling. My experience is that when people tell me they work on something unusual, I try to say things like "Cool!" and "What's that like?" and "What do you hope to accomplish with that?" and "Have you thought about writing this up on the Forum?" (I don't always succeed, because small talk is an imperfect art, but that's the mindset.)
I'd strongly advocate for other people in social settings also saying things like this. Maybe the most concrete suggestion from here is for EA groups, and orgs that build resources for them, to encourage this more loudly than they do now? I try to be loud, here and in the EA Newsletter, but I'm one person :-(
*****
I think that the EA community should be a big tent for people who want to do a better job of measuring and increasing their impact, no matter what they work on.
I think that EA research should generally examine a wide range of options in a shallow way, before going deeper on more promising options (Open Phil's approach). But EA researchers should look at whatever seems interesting or promising to them, as long as they understand that getting funded to pursue research will probably require presenting strong evidence of impact/promise to a funder.
I think that EA funding should generally be allocated based on the best analysis we can do on the likely impact of different work. But EA funders should fund whatever seems interesting or promising to them, as long as they understand that they'll probably get less impact if they fund something that few other people in the community think is a good funding target. (Value of learning is real, and props to small funders who make grants with a goal of learning more about some area.)
I think that EA advice should try to work out what the person being advised actually wants: is it "have an impactful career in dental hygiene promotion", or "have an impactful career, full stop"? Is it "save kids from cancer", or "save kids, full stop"?
And I think we should gently nudge people to consider the "full stop" options, because the "follow your passions wherever they go" argument seems more common in the rest of society than it ought to be. Too many people choose a cause or career based on a few random inputs ("I saw a movie about it", "I got into this lab and not that lab", "I needed to pay off my student loans ASAP") without thinking about a wide range of options first.
But in the end, there's nothing wrong with wanting to do a particular thing, and trying to have the most impact you can with the thing you do. This should be encouraged and celebrated, whether or not someone chooses to donate to it.
Thank you, Aaron, for taking the time to write this detailed and thoughtful comment on my post!
I'll start by saying that I pretty much agree with everything you say, especially in your final remarks: that we should be really receptive to what people actually want and advise them accordingly, and maybe try to gently nudge them into taking a more open-minded, general-impact-oriented approach (but not try to force it on them if they don't want to).
I also totally agree that most EA orgs are doing a fantastic job at exploring diverse causes and ways to improve the world, and that the EA movement is very open-minded to accepting new causes in the presence of good evidence.
To be clear, I don't criticize specific EA orgs. The thing I do criticize is pretty subtle, and refers more to the EA community itself: sometimes to individuals in the community, but mostly to our collective attitude and the atmospheres we create as groups.
When I say "I think we need to be more open to diverse causes", it seems that your main answer is "present me with good evidence that a new cause is promising and I'll support it", which is totally fair. I think this is the right attitude for an EA to have, but it doesn't exactly address what I allude to. I don't ask EAs to start contributing to new unproven causes themselves, but rather that they be open to others contributing to them.
I agree with you that most EAs would not confront a cancer researcher and blame her for doing something un-EA-like (and I presume many would even be kind and approach her with curiosity about the motives for her choice). But in the end, I think it is still very likely she would nonetheless feel somewhat judged. Because even if every person she meets at EA Global tries to nudge her only very gently ("Oh, that's interesting! So why did you decide to work on cancer? Have you considered pandemic preparedness? Do you think cancer is more impactful?"), those repeating comments can accumulate into a strong feeling of unease. To be clear, I'm not blaming any of the imaginary people who met the imaginary cancer researcher at the imaginary EAG conference for having done anything wrong, because each one of them tried to be kind and welcoming. It's only their collective action that made her feel off.
I think the EA community should be more welcoming to people who want to operate in areas we don't consider particularly promising, even if they don't present convincing arguments for their decisions.
In the end, I think it is still very likely she would nonetheless feel somewhat judged. Because even if every person she meets at EA Global tries to nudge her only very gently ("Oh, that's interesting! So why did you decide to work on cancer? Have you considered pandemic preparedness? Do you think cancer is more impactful?"), those repeating comments can accumulate into a strong feeling of unease.
I like this example! It captures something I can more easily imagine happening (regularly) in the community.
One proposal for how to avoid this collective action problem would be for people to ask the same sorts of questions, no matter what area someone works on (assuming they don't know enough to have more detailed/specific questions).
For example, instead of:
Have you considered X?
Do you think your thing, Y, is more impactful than X?
You'd have questions like:
What led you to work on Y?
And then, if they say something about impact, "Were there any other paths you considered? How did you choose Y in the end?"
What should someone not involved in Y know about it?
What are your goals for this work? How is it going so far?
What are your goals for this event? (If it's a major event and not e.g. a dinner party)
These should work about equally well for people in most fields, and I think that "discussing the value/promise of an area" conversations will typically go better than "discussing whether a new area 'beats' another area by various imperfect measures". We still have to take the second step at some point as a community, but I'd rather leave that to funders, job-seekers, and Forum commentators.
I think the EA community should be more welcoming to people who want to operate in areas we don't consider particularly promising, even if they don't present convincing arguments for their decisions.
Depends on the context.
Plenty of people in the EA space are doing their own thing (disconnected from standard paths) but still provide interesting commentary, ask good questions, etc. I have no idea what some Forum users do for work, but I don't feel the need to ask. If they're a good fit for the culture and the community seems better for their presence, I'm happy.
The difficulty comes when certain decisions have to be made: whose work to fund, which people are likely to get a lot of benefit from EA Global, etc. At that point, you need solid evidence or a strong argument that your work is likely to have a big impact.
In casual settings, the former "vibe" seems better; but sometimes, I think that people who thrive in casual spaces get frustrated when they "hit a wall" in the latter situations (not getting into a conference, not getting a grant, etc.)
In the end, EA can't really incorporate an area without having a good reason to do so. I'd be satisfied if we could split "social EA" from "business EA" in terms of how much evidence and justification people are asked for, but we should be transparent about the difference between enjoying the community and looking for career or charity support.
I like your suggestions for questions one could ask a stranger at an EA event!
About "social EA" vs. "business EA", I think I'd make a slightly different distinction. If you ask for someone else's (or some org's) time or money, then of course you need to come up with good explanations for why the thing you are offering (whether it is your employment or some project) is worthwhile. It's not even a unique feature of EA. But, if you are just doing your own thing and not asking for anyone's time or money, and just want to enjoy the company of other EAs, then this is the case where I think the EA community should be more welcoming and be happy to just let you be.