From the early sections, I thought you were going in the opposite direction: how already-involved EAs can be mindful of their secret motives for being involved. (I think that's super important, btw.) For outreach, I would have thought the implication was that we should balance the need to appeal to and accommodate the human need for status against the risk that EA gets diluted by attempts to market it in a low-fidelity way. I agree with CEA's emphasis on the high-fidelity model: there's no point in growing EA if it stops being EA in the process.
I think there is some very low-hanging fruit EA orgs can pick re: the prestige they can offer recruits. #1 is making sure the names of organizations and positions are as impressive and un-loaded as possible. The Foundational Research Institute, for example, went with that title over "The Future of Suffering Institute" because they got feedback from academics that they wouldn't be able to put the latter on their CVs. At Harvard EA, we have multiple named fellowships for students (the undergraduate one is the "Arete Fellowship"). There is no reason we can't call our programs fellowships or give them names, even though they are just student club programming. But being able to put "2016 Fellow of the Harvard College Effective Altruism Arete Fellowship" on a resume gives Harvard students the prestige they need to justify spending their time on us. There is a ton of cheap status EA can confer without it costing us anything (it just requires us to contribute to the inflation of terms for volunteering, employment, and awards, and I'm not losing any sleep over that).