I liked this post, and agree with many of these comments regarding types of analysis that are less common within EA.
However, I’ll make the same comment here that I do on many other posts: Given that EA doesn’t have much of (thing X), how should we get more of it?
For your post, my questions are:
Which of the types of analysis you mentioned, if any, do you think would be most useful to pursue? You could make an argument for this based on the causes you think are most important to gain information about, the types of analysis you think EA researchers are best-suited to pursue, the missions of the organizations best-positioned to make new research grants, etc.
Are there types of EA research you think should be less popular? Do the research agendas of current EA orgs overlap in ways that could be “fixed” by one org agreeing to move in a different direction? Does any existing research stand out as redundant, or as “should have been done using/incorporating other methodologies”?
Are there fields you think have historically been better than EA at getting “correct answers” about questions that are, or ought to be, interesting to EAs? Or, if not “better”, at least at getting some correct answers EA missed or would have missed? What are those answers?
This question might run headlong into problems around methodologies EA doesn’t use or give credit to, but I’d hope that certain answers derived by methods unpopular within EA might still be “verifiable” by EA, e.g. by generating results the movement can appreciate/understand.
This is a good question, but it makes a bit of a leap, and I don’t think answers to it should have been included in the original article. The article doesn’t actually say EA shouldn’t be ideological, just that it is. I read it as descriptive, not prescriptive. I think the article was strong just pointing out ideological aspects of EA and letting readers think about whether they’re happy with that.
I don’t think the post was wrong not to address any of these questions (they would all require serious effort to answer). I only meant to point out that these are questions that occurred to me as I read the post and thought about it afterward. I’d be happy if anything in my response inspired someone to write a follow-up post.
I misunderstood—thanks for clarifying!