Hmmm, I’d disagree based on my experience running GWWC events and EA Taskmaster, but I’m probably not going in with the mindset of optimising for short-term impact/urgency based on recruitment into AI safety.
I think getting the ratio of newbies to experienced EAs right can be hard, so I try to think carefully about how to attract the “hard side” of the network (experienced EAs). My approach is more about creating a fun environment that people who are currently doing direct work and/or donating significantly and effectively want to hang out in or find useful.
My model is that significant behaviour change requires multiple positive interactions and time in between for reflection.
Thank you for the perspective!
I certainly agree with your model of behaviour change. Likewise, my approach has simplified over the years from more convoluted ideas to one simple maxim: “Just make sure you feed them. The rest will often take care of itself.”
I’m concerned about animal welfare, human welfare and AI safety—without the urgency of AI dominating entirely.
I think what I’m highlighting is similar to how many professional communities are optimized for matching prospective employers with employees rather than for the happiness and enjoyment of their members. If there are 100 members but employers are only interested in one candidate, you will have 99 less-happy members. But this is not a bad thing, as the goal of the community is to match members with particular employers. It could easily be a mistake to find different employers and different events to make it more likely that you’ll have more happy members—risks include value drift and diluting your actual goal of maximizing impact. Still, the tradeoff is that 99% of your members end up disgruntled.
Professional-adjacent communities, like, say, “computer tinkerers who just do it for fun”, do not have this problem. If 99% of the community are unhappy, then you either change what you are doing to match what the tinkerers are interested in, or the community ceases to exist—or at least that is a much more likely outcome.