Here’s a list I came up with from thinking about this for ~30 minutes:
Better ways of measuring what matters
Better neuroimaging tech to parse out the neurological basis of desirable & undesirable subjective states
Better measures of subjective well-being
Help EAs see more clearly, unpack + resolve personal traumas, and boost their efficacy + motivation
Emotional healing as a prerequisite to rationality
CFAR, OAK, Leverage, etc.
Plus building methods to audit which projects are working, which are failing, which are stagnating
Perhaps also a data collection project that vacuums up outcomes from the object-level projects?
Strengthen EA community ties / our sense of fellowship
More honesty about how weird effective research methods can be
More acknowledgement of the interdependent causal complex that gives rise to good research (e.g. Alex Flint’s introduction here)
More Ben Franklin-esque Juntos
Import more of Silicon Valley’s “pay it forward” culture
Less reputation management / more psychological safety
Less sniping
OAK, Bay Area group houses, EA Hotel
Again, building out (non-dominating) ways to audit & collect data from the object-level projects
Less scrupulosity
Ties into the above but deserves its own bullet given how our collective psychology skews
Compassionate fighting against the thought-pattern Scott Alexander describes here
Make EA sexier
Market to retail donors / the broader public (e.g. Future Perfect, e.g. 80k, e.g. GiveWell running ads on Vox podcasts)
Market to impact investors (e.g. Lionheart) and big philanthropy
Cultivating more “I want to be like that” energy
Seems easy to walk back if it isn’t working because so many interest groups are competing for mindshare
Support EA physical health
Propagate effective treatments for RSI & back problems, as above
Take the mind-body connection seriously
Propagate best practices for nutrition, sleep, exercise; make the case that attending to these is prerequisite to having impact (rather than trading off against having impact)
Advance our frontier of knowledge
e.g. GPI’s research agenda, e.g. the stuff Michael Dickens laid out in his comment
More work on how to solve coordination problems
More work on governance (e.g. Vitalik’s stuff, e.g. the stuff Palladium is exploring)
Fund many moonshots / speculative projects
Fund projects that can be walked back if they aren’t working out (which is most projects, though some tech projects may be hard-to-reverse)
Worry less about brand management
That’s an interesting list, especially for 30 minutes :) (Makes me wonder what you or others could do with more time.)
Much of it is focused on EA community stuff. I kind of wonder if funders are extra resistant to some of this because it seems like they’re just “giving money to their friends”, which, in some ways, they are. I could see some of it feeling odd and looking bad, but I think if done well it could be highly effective.
Many religious and ethnic groups devote a lot of attention to helping each other, and it seems to have very positive effects. Right now EA (and the subcommunities I know of in EA) still seem fairly far from that.
https://www.nationalgeographic.com/culture/2018/09/south-asia-america-motels-immigration/
A semi-related point on that topic: I’ve noticed that for many intelligent EAs, it feels like EA is a competition, not a collaboration. Individuals at social events will be trying to one-up each other with their cleverness. I’m sure I’ve contributed to this. I’ve noticed myself becoming jealous when I hear of others who are similar in some ways doing well, which really should make no sense at all. I think in the anonymous surveys 80K did a while back, a bunch of people complained that there was a lot of signaling going on and that status was a big deal.
Many companies and open source projects live or die depending on their cultural health. Investments in the cultural health of EA may be difficult to measure, but pay off heavily in the long run.
Thanks!
100% agree that cultural health is very important, and that EA is under-investing in it. (The “we don’t want to just give money to our friends” point resonates, and other scrupulosity-related stuff is probably at play here as well.)
Individuals at social events will be trying to one-up each other with their cleverness. I’m sure I’ve contributed to this. I’ve noticed myself becoming jealous when I hear of others who are similar in some ways doing well, which really should make no sense at all.
Thank you for talking about this!
I’ve noticed similar patterns in my own mind, especially around how I engage with this Forum. (I’ve been stepping back from it more this year because I’ve noticed that a lot of my engagement wasn’t coming from a loving place.)
These dynamics may not make any sense, but there are deep biological & psychological forces giving rise to them. [insert Robin Hanson’s “everything you do is signaling” rant here]
… I think in the anonymous surveys 80K did a while back, a bunch of people complained that there was a lot of signaling going on and that status was a big deal.
Right. Last year, concerns about status generated a lot of heat on the Forum (1, 2, 3), but as far as I know nothing has really changed since then, except perhaps that more folks now acknowledge that status is a thing.
(Status seems closely related to scrupulosity & to EA being vetting-constrained; I haven’t unpacked this yet.)
(A bunch of those ideas seem interesting, but I’ll just comment on the one where I have something to say.)
Seems easy to walk back if it isn’t working because so many interest groups are competing for mindshare
This does seem to make it easy to walk back efforts to make EA sexier, but it doesn’t seem to make it easy to try again later in a different way (without the first attempt impairing the odds of success).
Essentially:
I think we could make EA relatively small/non-prominent/whatever again if we wanted to
But it also seems plausible to me that EA can only make “one big first impression”, and that that’ll colour a lot of people’s perceptions of EA if it tries to make a splash again later (even perhaps 10-30 years later).
Put another way:
People might stop thinking about EA if we stop actively reminding them of it
But then if we start competing for their attention again later, they’ll be like “Wait, aren’t those the people who [whatever impression they got of us the first time]?”
Posts that informed my thinking here:
Hard-to-reverse decisions destroy option value (which I see you also referenced yourself)
The fidelity model of spreading ideas
How valuable is movement growth?
Why not to rush to translate effective altruism into other languages