There seems to be an opportunity for founding an org for “EA mental health”:
It’s plausible that, even ignoring the wellbeing of the recipients, the cost-effectiveness in terms of impact alone could be enough. For example, even if you had to pay the costs of treating a thousand EAs, if doing so pulled a dozen of those people out of depression, the impact might be worth it (see the rough numerical sketch after these points). Depression and burnout are terrible. Also, the preventative value from understanding and protecting against burnout and other issues seems high.
Conditional on mental health being a viable cause area, there are probably spillover effects from founding and executing the org (normalizing mental health, gears-level knowledge for other interventions, partnerships and visibility).
More delicately/pragmatically, EA meta orgs tend to be attractive to founders, as such orgs tend to be high status and more stable in funding. There are probably many versions of this org where a competent generalist EA founder, without mental health experience, can execute well by recruiting/partnering with established mental health practitioners.
On the other hand, I could see both the “ceiling” and the “floor” for success of such an org being high. For example, it may be cheap and highly effective to get virtually every senior EA leader to contribute snippets of their own experiences. I already see this happening in private, high-trust situations (or see Peter Wildeford, right above my comment on the EA Forum). A founder probably needs to be trusted, established, and experienced to do the intervention well.
Mental health may be a unique case that helps overcome the normal reluctance of charities to spend on staff. There seems to be an opportunity for a skilled founder to play the historical reluctances around mental health and around charity spending on staff against each other, so that they cancel out.
There’s a sort of “virtue ethics” reason for having an org focused on EA mental health. It just feels right. Long-time EAs know a lot of people who have given up a lot of their lives and income (which means giving up safety and the ability to buy mental health services). This seems especially important in less technical cause areas.
EAs are different in how they view the world and why they think their choices have value. There are probably benefits from the focus that an EA-specific org would have.
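To make the first point above a bit more concrete, here is a minimal back-of-envelope sketch of that cost-effectiveness comparison. Every number in it is a placeholder I made up for illustration (treatment cost, recovery count, value of a productive EA-year), not an estimate, so the ratio it prints carries no evidential weight.

```python
# Back-of-envelope sketch of the "treat a thousand EAs, pull a dozen out of
# depression" comparison above. All numbers are hypothetical placeholders.

n_treated = 1_000            # EAs receiving some service in a year (placeholder)
cost_per_person = 1_500      # USD per person per year (placeholder)
recoveries = 12              # people pulled out of depression/burnout (placeholder)
output_regained = 0.5        # fraction of a productive year regained per recovery (placeholder)
value_per_ea_year = 500_000  # USD of impact credited to one productive EA-year (placeholder)

total_cost = n_treated * cost_per_person
total_benefit = recoveries * output_regained * value_per_ea_year

print(f"Cost:    ${total_cost:,}")                              # $1,500,000
print(f"Benefit: ${total_benefit:,.0f}")                        # $3,000,000
print(f"Benefit/cost ratio: {total_benefit / total_cost:.1f}")  # 2.0
```

The sketch ignores the welfare gains to the treated people themselves and the preventative value mentioned above, both of which would push the ratio up, and it is obviously very sensitive to the placeholder values; the point is only that the comparison reduces to a product of a few quantities that could actually be estimated.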
My gut reaction is to think we should just make use of existing mental health resources out there which are abundant. I’m not sure why it would help for it to be EA specific.
It would certainly be useful for someone to make a summary of available resources and/or to do a meta-review of what works for mental health, but I can’t see that this would require a whole organisation to be set up. CEA could hire one person to work on this for example and that would seem to me to be sufficient.
I can’t see that this would require a whole organisation to be set up. CEA could hire one person to work on this for example and that would seem to me to be sufficient.
This org would be set up to provide actual mental health services or programs to EAs, free or at low cost.
To be really concrete, imagine a pilot with 1-2 EA founders and maybe 2-4 FTE practitioners or equivalent partnerships.
It would certainly be useful for someone to make a summary of available resources and/or to do a meta-review of what works for mental health,
There are perspectives under which reviews and compendium websites have limited value, and my first impression is that this may apply here.
My gut reaction is to think we should just make use of existing mental health resources out there which are abundant. I’m not sure why it would help for it to be EA specific.
This is a very strong statement. I have trouble relating to this belief.
Your profile says you work in management consulting or economics. You also seem to live in the UK. You seem to have, and directly use, high human capital in management or highly technical fields. In totality, you probably enjoy substantial access to mental health services, and while such jobs can be stressful, they usually do not involve direct emotional trauma.
Not all EAs enjoy the same experiences. For example:
In animal welfare, for example, people who love animals have to pore over graphic footage of factory farming, or physically enter factory farms, risking targeted legal retaliation from industry, to create movies like this. I assure you that funding for such work is low and there may be no mental health support.
Similar issues exist in global health and poverty, where senior leaders often take large pay cuts and reductions in benefits.
I know an EA-like person who had to work for free for 3-4 months during 2020 in a CEO role, working 20-hour days. They regularly faced pleas for medical supplies from collapsing institutions, as well as personal attacks and fires inside and outside the org, for not doing enough.
Many of these roles are low status or have zero pay or benefits.
Many of the people who do the work described above have very high human capital and could enjoy high pay and status, but actively choose this work because no one else will do it, or will even understand it.
While I am sympathetic to the idea of doing lots of well-being stuff, it’s not obvious why this needs a new EA-specific org.
To restate, I take it the thought is that improving the mental health of EAs could be a plausible priority because of the productivity gains to those people, which allow them to do more good; saliently, the main benefit of this isn’t supposed to come from the welfare gains to the treated people.
Seeing as people can buy mental health treatments for themselves, and orgs can pay for them for their staff, I suppose the intervention you have in mind is to improve the mental health of the organisation as a whole: that is, to change the system, rather than keep the system fixed but help the people in it. This is a classic organisational psychology piece, and I’m sure there are consultants EA orgs could hire to help them with this. Despite being a huge happiness nerd, I’m actually not familiar with the world of happiness/mental health organisational consultancies. One I do know of is Friday Pulse, but I’m sure they aren’t the only people who try to do this sort of thing.
Given such things exist, it’s not obvious why self-described effective altruists should prioritise setting up more things of this type.
I think broadly what you’re saying is “Well, if impact can be improved by mental health, then orgs can provision this without our help.”
I’m pattern-matching this to a “free market” sort of argument, which I don’t think is right.
Most directly, I argue that mental health services can be very unapproachable and are effectively under-provisioned. Many people do not have access to them, contrary to what you’re saying. Secondly, there are large differences in quality and fit across services, and I suspect many EAs would benefit from a specific set of approaches that could be developed for them.
More meta: I think a reasonable worldview is that mental health is a resource that normally gets depleted. Despite, or because, someone is a strong contributor, they can make use of mental health resources. In this worldview, mental health services should be far more common, since using them is less a matter of fixing a defect.
I suppose the intervention you have in mind is to improve the mental health of the organisation as a whole: that is, to change the system, rather than keep the system fixed but help the people in it. This is a classic organisational psychology piece, and I’m sure there are consultants EA orgs could hire to help them with this.
No, this isn’t what I’m thinking about. I don’t understand what you’re saying here.
Given my original comment, I think it’s appropriate to give a broad view of the potential forms the intervention can take and what can be achieved by a strong founding team.
These services can take forms that don’t currently exist. I think it’s very feasible to find multiple useful programs or approaches that could be implemented.
When people say “EAs should do X”, it’s usually wise to reflect on whether that is really the case—are there skills or mindsets that members of the EA community are bringing to X?
The case I would like to see made here is why EA orgs would benefit from getting mental health services from some EA provider rather than the existing ones available. Could you elaborate on why you think this is the case? I’m not sure why you think current mental health services, e.g. regular therapists, are unapproachable, and how having an ‘EA’ service would get around this. I don’t buy the access point, at least not for EA orgs: access is a question of funding, and that’s something EA orgs plausibly have. Demand for a service leads to more of it being supplied (of course, there are elasticities). If I buy more groceries, it’s not like someone else goes hungry; it’s more like more groceries get produced.
No, this isn’t what I’m thinking about. I don’t understand what you’re saying here.
I assume you didn’t mean it this way, but I found the tone of this comment rather brusque and dismissive. Please be mindful of that in discussions, particularly on the EA Forum.
I’m not sure how else to explain my point. One approach to MH is to talk to each individual about what they can do. Another approach, the organisational psychology one, is to think about how to change office culture and working practices. Sort of bottom-up vs top-down.
Given my original comment, I think it’s appropriate to give a broad view of the potential forms the intervention can take and what can be achieved by a strong founding team.
These services can take forms that don’t currently exist. I think it’s very feasible to find multiple useful programs or approaches that could be implemented.
I’d be interested to hear you expand on what you mean here!
The case I would like to see made here is why EA orgs would benefit from getting mental health services from some EA provider rather than the existing ones available.
My parent comment is a case for an organization that provides mental health services to EAs in general.
I don’t know why a case needs to be made that it should replace the mental health services already available to EA orgs, which seems to be a major element of your objection.
Replacing or augmenting mental health services in EA orgs is one aspect/form/subset of the services that could be provided. This isn’t necessary for the org to be successful; the case is broader.
However, some of the points given might suggest how it could do this, and at least be helpful to EA orgs.
I’m not sure why you think current mental health services, e.g. regular therapists, are unapproachable and how having an ‘EA’ service would get around this
Ok, here’s another response. In one of the comments here, someone brought up a navigator service (which may be fantastic, or it may not be that active).
On the website it says:
I can imagine objections related to the stats/validity of this one figure, but it’s part of a class of evidence that seems ample.
Separately, I also have models that further support this view.
However, honestly, as indicated by your objection, I’m concerned it’s not going to be practical/productive to try to lay them out.
I view myself as “steel-manning” an intervention (which I have no intention of implementing and which brings no personal benefit to me), which makes this discourse acceptable to me.