The second cohort (‘logical ceiling’) are people who basically already run their entire lives around EA principles (and I met several at EAGx). They’ve taken the 10% pledge, they work at EA orgs, they are vegan, they attend every EA event they reasonably can, they volunteer, they are active online, etc. It’s hard to imagine how people this committed could meaningfully increase their engagement with EA.
I think ‘engagement’ can be a misleading way to think about this: you can be fully engaged, but still increase your impact by changing how you spend your efforts.
Thinking back over my personal experience, three years ago I think I would probably be counted in this “fully engaged” cohort: I was donating 50%, writing publicly about EA, co-hosting our local EA group, had volunteered for EA organizations and at EA conferences, and was pretty active on the EA Forum. But since then I’ve switched careers from earning to give to direct work in biosecurity and am now leading a team at the NAO. I think my impact is significantly higher now (ex: I would likely reject an offer to resume earning to give at 5x my previous donation level), but the change here isn’t that I’m putting more of my time into EA-motivated work, but instead that (prompted by discussion with other EAs, and downstream from EA cause prioritization work) my EA-motivated work time is going into doing different things.
Yeah, I think this is an excellent point that you have made more clearly than I did: we are measuring engagement as a proxy for effectiveness. It might be a decent proxy for something like ‘probability of future effectiveness’ when considering young students in particular—if an intervention meaningfully increases the likelihood that some well-meaning undergrads make EA friends and read books and come to events, then I have at least moderate confidence that it also increases impact because some of those people will go on to make more impactful choices through their greater engagement with EA ideas. But I don’t think it’s a good proxy for the amount of impact being made by ‘people who basically run their whole lives around EA ideas already.’ It’s hard to imagine how these people could increase their ENGAGEMENT with EA (they’ve read all the books, they RUN the events, they’re friends with most people in the community, etc etc) but there are many ways they could increase their IMPACT, which may well be facilitated/prompted by EAGx but not captured by the data.
Out of curiosity, would you say that since switching careers, your engagement measured by these kind of metrics (books read, events attended, number of EA friends, frequency of forum activity, etc) has gone up, gone down, or stayed the same?
I think it’s up, but a lot of that is pretty confounded by other things going on in the community. For example, my five most-upvoted EA Forum posts are since switching careers, but several are about controversial community issues, and a lot of the recency effect goes away when looking at inflation-adjusted voting. I did attend EAG in 2023 for the first time since 2016, though, which was driven by wanting to talk to people about biosecurity.