Welcome to the forum! Apologies that the rest of my comment may seem overly critical/nitpicky.
While I agree with some other parts of your complaint, the implicit assumption behind
> The fact they don’t even bother to get most papers peer reviewed is really bad. Yes, peer review sucks and is a lot of work, but every paper I had peer reviewed was improved by the process.
seems unlikely to be correct to me, at least on a naive interpretation. The implication here is that EA research orgs would be better if they aimed for academic publishing incentives. I think this is wrong because academic publishing incentives frequently make you prioritize bad things*. The problem isn’t an abstract issue of value-neutral “quality” but what you are allowed to think about and consider important.
As an example,
> Additionally, peer review and being able to publish in a good journal is a useful (although noisy) signal to outsiders and funders that your work is at least novel and not highly derivative.
is indicative of one way in which publishing incentives may warp someone’s understanding: specifically, constraining research quality to be primarily defined by “novelty” as understood by an academic field (as opposed to, e.g., truth, or decision-relevance, or novelty defined in a more reasonable way).
Holden Karnofsky’s interview might be relevant here, specifically the section on academia and the example of David Roodman’s research on criminal justice reform.
> Holden Karnofsky [..]: recently when we were doing our Criminal Justice Reform work and we wanted to check ourselves. We wanted to check this basic assumption that it would be good to have less incarceration in the US.
>
> David Roodman, who is basically the person that I consider the gold standard of a critical evidence reviewer, someone who can really dig on a complicated literature and come up with the answers, he did what, I think, was a really wonderful and really fascinating paper, which is up on our website, where he looked for all the studies on the relationship between incarceration and crime, and what happens if you cut incarceration, do you expect crime to rise, to fall, to stay the same? He picked them apart. What happened is he found a lot of the best, most prestigious studies and about half of them, he found fatal flaws in when he just tried to replicate them or redo their conclusions.
>
> When he put it all together, he ended up with a different conclusion from what you would get if you just read the abstracts. It was a completely novel piece of work that reviewed this whole evidence base at a level of thoroughness that had never been done before, came out with a conclusion that was different from what you naively would have thought, which concluded his best estimate is that, at current margins, we could cut incarceration and there would be no expected impact on crime. He did all that. Then, he started submitting it to journals. It’s gotten rejected from a large number of journals by now. I mean starting with the most prestigious ones and then going to the less.
>
> Robert Wiblin: Why is that?
>
> Holden Karnofsky: Because his paper, it’s really, I think, it’s incredibly well done. It’s incredibly important, but there’s nothing in some sense, in some kind of academic taste sense, there’s nothing new in there. He took a bunch of studies. He redid them. He found that they broke. He found new issues with them, and he found new conclusions. From a policy maker or philanthropist perspective, all very interesting stuff, but did we really find a new method for asserting causality? Did we really find a new insight about how the mind of a …
>
> Robert Wiblin: Criminal.
>
> Holden Karnofsky: A perpetrator works. No. We didn’t advance the frontiers of knowledge. We pulled together a bunch of knowledge that we already had, and we synthesized it. I think that’s a common theme is that, I think, our academic institutions were set up a while ago. They were set up at a time when it seemed like the most valuable thing to do was just to search for the next big insight.
>
> These days, they’ve been around for a while. We’ve got a lot of insights. We’ve got a lot of insights sitting around. We’ve got a lot of studies. I think a lot of the times what we need to do is take the information that’s already available, take the studies that already exist, and synthesize them critically and say, “What does this mean for what we should do? Where we should give money, what policy should be.”
>
> I don’t think there’s any home in academia to do that. I think that creates a lot of the gaps. This also applies to AI timelines where it’s like there’s nothing particularly innovative, groundbreaking, knowledge frontier advancing, creative, clever about just … It’s a question that matters. When can we expect transformative AI and with what probability? It matters, but it’s not a work of frontier advancing intellectual creativity to try to answer it.
*Also, society already has very large institutions organized around academic publishing incentives, called “universities,” so from a strategic diversification perspective we may not want to replicate them exactly.