I really appreciate how thoughtful you’ve been about this, including sensitivity to downside risks. Do you have any plans to monitor the downside risks? A lot of them seem quite verifiable/testable.
Thanks Peter!
Yep; in the order I mentioned them in the post:
(Note: some of these are things I’ve done already, some are things I have vague plans to do at some point, and some are much firmer plans. You shouldn’t take this as a commitment from me to take any of these particular actions, but more as a sense of the kind of stuff I’ve thought about here. I’d greatly appreciate suggestions you or others have about how I could do better!)
Community growth rate —
Staying up-to-date on the views of others in community building;
Learning from data on the current growth rate of EA (much of which is gathered by Rethink, so thanks!);
User interviews.
Annoying people —
This one seems the hardest to learn about because if you annoy someone and cause them to bounce off your stuff, they don’t want to talk to you and give you feedback on why! Aaargh.
I did do one survey (with the help of the Good Impressions team) to ask people if they got annoyed by some very high-frequency ads we ran, but I don’t trust the results too much (small, biased sample).
I ran a message-testing survey last year that included a free-text response for one question, which surfaced some evidence of things that annoyed people.
We also monitor discussion about 80k on social media (Twitter, Facebook, LinkedIn, Reddit) as well as the comments on any ads we run.
Idea inoculation —
For this one I think I can mostly use our usual feedback mechanisms for 80k (experts/advisors who read our drafts, our annual user survey, user interviews, some feedback forms on our site, message testing, feedback we get opportunistically via email or at EAGs, etc.).
Demographic diversity —
We gather info on our audience’s demographics in our annual user survey, so that’s the most useful source here.
Also the EA Survey :D
I just want to second Peter’s comments—your discussion of downside risks was really thoughtful and cogent.
Of the risks you identified, I’m personally most concerned about 80k contributing to a negative feedback loop on demographic diversity (public perception of EA becomes that it is demographically homogeneous → people who don’t match those demographic characteristics are rightfully more skeptical of EA → recruitment efforts target demographically skewed areas because engagement in them is higher → public perception of EA as demographically homogeneous solidifies → …). I look forward to seeing what you find in your investigation of methods of improving 80k’s demographic reach!