People have argued for a while for i) a flatter organizational structure, ii) pivoting from charity evaluation to more fundamental research (in order to add more value over and above GiveWell), and iii) a growing emphasis on the EA brand, so it’s good to see this feedback incorporated.
The Institute for EA and the reported success with high net-worth outreach are awesome developments, as is Will’s direct participation.
Great news, and cheers on all of your terrific work!
> People have argued for a while for i) a flatter organizational structure, ii) pivoting from charity evaluation to more fundamental research (in order to add more value over and above GiveWell), and iii) a growing emphasis on the EA brand, so it’s good to see this feedback incorporated.
Yeah, I want CEA strategy to be guided significantly by the views of engaged members of the EA community. (Of course, that doesn’t mean we’ll always go with others’ views, not least because different people regularly disagree.) This, it seems to me, has both inside- and outside-view support. Inside view: when I talk to engaged EAs, they often have interesting and well-reasoned views about what CEA should or should not be doing. Outside view: the current dedicated EAs are the equivalent of the ‘early users’ of EA as an idea, and the standard advice for startups is to pay a huge amount of attention to what early users want and be responsive to that. I also simply see CEA’s role in significant part as serving the EA community, so it’s obviously important to know what that community thinks is most important.
“Early users” of EA would be the beneficiaries, not the participants, right? This relates to the fundamental reason why charities can get away with being ineffective: the sentient beings receiving the benefit are not the ones deciding to contribute money and effort. Your goal shouldn’t be to please EAs; it should be to help people. Usually, pleasing donors doesn’t align with helping people, although that’s probably less true in CEA’s case.
Thanks!