I loved this critique and sympathize with it. As somebody studying Public Policy, which requires interfacing with all the different social science departments in my university, I know exactly what you mean by the “insularity” of the economics department!
But I think you are missing a bigger critique here: EA philosophy has a lack of rigor stemming from an intrinsic insularity that is not inherited from elsewhere. If anything, EA has to start acknowledging ideas from other disciplines and stop reinventing the wheel! This includes acknowledging economics and, as you rightly say, the other humanities. In fact, this opinion is shared by an economist—Tyler Cowen. See his critique of Parfit’s ‘On What Matters’ (Link to article). My reading of his critique is that Parfit (who planted the seeds for EA philosophy) was too much of a philosopher, to his own detriment! He couldn’t engage with scholars from other disciplines to see the progress already made on many of the questions he was interested in.
Overall, I agree with you that EA philosophy must engage widely with ideas from outside. But I disagree that this is a problem it inherited from economics. I think it is intrinsic to EA philosophy because of the circumstances in which it was born, i.e. an academic sitting in some cabin in the woods. That is why I don’t find it appealing at all when EA people plan to live with other EAs, date other EAs, or find employment only in EA-aligned organizations. All of this works against EA evolving into new areas and new X-risks! To me, the best EA people are the ones who know of EA philosophy but engage with it only once in a while, to check in and see what’s going on. These are the folks who will act as bridges and ultimately make EA philosophy much better. We don’t post as much or come to all the events, but we come when we feel the need to!
In my own head, the way I deal with this is to draw a distinction between “Parfitianism” and “EA”. “Parfitianism” is done, and it is a good starting point for us to learn from. But to build “EA” we need to engage mostly outside of EA philosophy, as you rightly say, and start building something better. In fact, I think Parfit himself believed in something like this when he concluded Reasons and Persons with a section titled “How both human history, and the history of ethics, may be just beginning”.
I probably won’t read another post on the forum for a while. Out of the stuff that is currently on the forum, this was the one worth reading. Good post!
I think it’s a bit misleading to say EA philosophy “lacks rigor”, because it could be taken to imply that it falls below some known disciplinary standard of reasoning/evidence that at least some other philosophy reaches. I don’t think this is even close to being true. EA philosophy to me means mostly “Bostrom, Ord and MacAskill’s academic papers, and stuff that came out of the Global Priorities Institute”. And that stuff has been published in very good journals over and over again. Even MIRI’s unorthodox ideas about decision theory have been written up and published in a very good philosophy journal! EA philosophy is about as academically mainstream as philosophy gets. It’s true that a large majority of academic philosophers disagree with at least some of it, but that is also true of any comparable rival body of philosophical work.
Great comment, David! It made me focus in on the heart of the question here, which is simply this—what is the right counterfactual to EA, the philosophy/academic discipline? OP is comparing EA philosophy/discipline to Econ. But is that fair? When I read your comment this morning, I noticed how I utterly failed to clarify that EA is less rigorous… than what?! It got me thinking empirically, and I quickly whipped up some analysis using OpenAlex, an open-source repository of publications used by bibliometricians. This preliminary analysis doesn’t resolve the question of what the counterfactual to EA is, but it begins to describe the problem better.
If you’re interested, see my GitHub repo for more details on the research design I have in mind and the Python/R code. I wonder if OP will like this, because I’m doing the kind of stuff a (design-based) econometrician would do :-)
First up, let’s check whether what David said holds up—are EA publications in top journals?
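A minimal sketch of how this can be checked against the OpenAlex API. The “effective altruism” search string and the plain `requests` call are simplifying assumptions on my part, not necessarily exactly what my repo does:

```python
# Sketch only: tally the venues of works matching an "effective altruism"
# search on OpenAlex. The search string is an assumed stand-in for a proper
# definition of the "EA corpus".
import requests
from collections import Counter

resp = requests.get(
    "https://api.openalex.org/works",
    params={"search": "effective altruism", "per-page": 200},
    timeout=30,
)
resp.raise_for_status()

venues = Counter()
for work in resp.json()["results"]:
    source = (work.get("primary_location") or {}).get("source") or {}
    name = source.get("display_name")
    if name:
        venues[name] += 1

# Print the most common journals/venues in this sample.
for venue, n in venues.most_common(15):
    print(f"{n:3d}  {venue}")
```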
Yep, David is not wrong. But of course that is not the question! The claim I want to make is that EA publications are more insular than “other stuff”, and the obstacle to making this claim is pinning down what that “other stuff” is. This is where OpenAlex topics come in: OpenAlex uses a clustering + classification pipeline to assign each paper to a topic. Here is a plot showing the spread of topics:
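For anyone who wants the raw numbers behind a plot like that rather than eyeballing it, here is a similarly hedged sketch that counts each work’s primary topic; the same caveat applies about the search string standing in for a proper definition of the EA corpus:

```python
# Sketch only: the same query, but counting each work's OpenAlex
# `primary_topic` (the output of their clustering + classification
# pipeline) instead of its venue.
import requests
from collections import Counter

resp = requests.get(
    "https://api.openalex.org/works",
    params={"search": "effective altruism", "per-page": 200},
    timeout=30,
)
resp.raise_for_status()

topics = Counter()
for work in resp.json()["results"]:
    topic = (work.get("primary_topic") or {}).get("display_name")
    topics[topic or "unclassified"] += 1

# Spread of primary topics across the sample: the raw numbers behind
# a plot like the one above.
for topic, n in topics.most_common():
    print(f"{n:3d}  {topic}")
```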
Now, the next step would be to ask ourselves, “EA has such a spread of topics, but field X has a much wider spread. This is why EA is insular.” But what is that field X? EA compared to Deontology? Utilitarianism? These have been around for decades—how is that a fair comparison group? What exactly is the benchmark to weigh EA, the discipline, against?
Now maybe the way to do this is to pull all the papers in the topics I have plotted above from OpenAlex and compare against those. But I suspect a better way would be to pull the abstracts of all publications, clean them up, tokenize, cluster, see what sits close by, and compare against that. Can someone else make this pipeline more concrete?
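To get the ball rolling while someone improves on it, here is a very rough first pass. Everything in it is an assumption to react to, not a finished design: the search strings, the choice of “deontology” as a comparison corpus, TF-IDF instead of proper embeddings, and the arbitrary number of clusters.

```python
# Rough sketch of the pipeline: reconstruct abstracts from OpenAlex's
# inverted index, embed them with TF-IDF, cluster everything together,
# and inspect how mixed each cluster is between the EA corpus and a
# placeholder comparison corpus.
import requests
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

def fetch_abstracts(search, n=200):
    """Pull works from OpenAlex and rebuild plain-text abstracts."""
    resp = requests.get(
        "https://api.openalex.org/works",
        params={"search": search, "per-page": n},
        timeout=30,
    )
    resp.raise_for_status()
    texts = []
    for w in resp.json()["results"]:
        inv = w.get("abstract_inverted_index")
        if not inv:
            continue
        # OpenAlex stores abstracts as word -> positions; invert that back.
        positions = [(pos, word) for word, plist in inv.items() for pos in plist]
        texts.append(" ".join(word for _, word in sorted(positions)))
    return texts

# "deontology" is just a placeholder comparison corpus.
ea = fetch_abstracts("effective altruism")
comparison = fetch_abstracts("deontology")

vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
X = vectorizer.fit_transform(ea + comparison)

# Cluster everything together and see how mixed each cluster is.
labels = KMeans(n_clusters=10, n_init="auto", random_state=0).fit_predict(X)
for k in range(10):
    in_cluster = [i for i, lab in enumerate(labels) if lab == k]
    n_ea = sum(i < len(ea) for i in in_cluster)
    print(f"cluster {k}: {n_ea} EA / {len(in_cluster) - n_ea} comparison")
```

Clusters that mix EA and comparison papers heavily would suggest overlap; clean separation would suggest insularity. Swapping TF-IDF for sentence embeddings and adding more candidate comparison corpora would be the obvious next steps.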
EDIT: Fixed broken links
You may be interested in Siobhan’s classic 2022 post Learning from non-EAs who seek to do good.