Some off-the-cuff thoughts (I don’t have any expertise in this area so this might be totally off-base):
Founders Pledge might have relevant research regarding the impact of focusing on climate policy in the US: https://founderspledge.com/stories/the-implications-of-bidens-victory-for-impact-focused-climate-philanthropy, https://founderspledge.com/stories/climate-change-executive-summary
Somehow the field of machine learning has developed a norm of posting articles to arxiv.org (an open-access preprint archive) before submitting them for publication in paywalled journals. Apparently this is because researchers want to publish their ideas/results as soon as possible, lest other researchers hit upon the same idea first. Would it be possible to establish a norm at RTI of freely publishing the preprint version of each article? Elsevier, at least, allows preprints.
Would it be possible to collaborate with Vox Future Perfect and see if they would be interested in publishing an article about new research from RTI?
Thank you for these suggestions!
It looks like Founders Pledge could be useful for thinking, specifically, about climate change. At the moment, I’m really unsure about whether, practically, it would make more sense for me to try to implement a general framework for evaluating the criticality of our research portfolio vs. trying to rank the criticality of potential interventions within one small sub-section of our research portfolio to give the organization an example of what ranking criticality looks like (e.g., climate change). The answer is probably that I’ll need to do both. I will definitely keep Founders Pledge in mind as a resource.
Re: open access publications—Thank you for raising this point! This touches on a larger, tangential problem I’ve been thinking about: namely, the lack of meaningful public access to academic research, and how that relates to the gap between research and policy. I think there is certainly room for ideas like these as I get further along in the implementation process. I will add that idea to my idea tracker, which includes all the things I need to create momentum around until they’re implemented. (It’s amazing how long each small idea will probably take to implement.)
As an aside: I recently learned that the word “scooped” is used to refer to when someone else publishes similar results to yours first. Like, “Oh no! We got scooped!” I think it’s a funny word to use, so I thought I’d share it in case it brings joy to others as well.
I will look into it! Off the top of my head, RTI may not want research published there as I think Vox is perceived as somewhat “left” leaning, and RTI fancies itself a deeply non-partisan organization.
On the ‘publishing’ and peer-review front, I’d like to propose a move to a different model. We can do very strong peer review, feedback, rating, filtering, curating, and ‘publishing’ of research without needing to go through the traditional ‘frozen 0/1 pdf-prison for-profit publication houses’.
We can use our new platform to subtly move the agenda to consider EA ideas and metrics.
I discuss this here [link fixed]. I’d love to get a critical mass together.
But the ‘gitbook’ link may actually be better going forward as a project planning and info-aggregation space.
Thanks for your post!
Would an open access repository plus an open peer review system like PREreview or the Open Peer Review Module meet your needs?
Also, is there a need to create an open access multidisciplinary repository (green open access) for effective altruism researchers? Or is the existing network of repositories enough?
Not sure if this was meant for me or Lauren. Anyway, I’ve been in touch with the people at PREreview, and I think it ticks the right boxes.
I propose this in my “let’s do this already action plan HERE”.
I think the crucial steps are:
Set up an “experimental space” on PREreview allowing us to include additional, more quantitative metrics (they have offered this as a possibility)
Most important: Get funding and support (from Open Phil etc) and commitments (from GPI, RP, etc)
for people to do reviewing, rating, and feedback activities in our PREreview space
for ‘editorial’ people to oversee which research projects are relevant and assign relevant reviewers
Link arms with Cooper Smout and the “Free our Knowledge” pledges and initiatives like this one as much as possible
I don’t think setting up an OA journal with an impact factor is necessary. I think “credible quantitative peer review” is enough, and in fact the best model. (But I am also supportive of open-access journals with good feedback/rating models like SciPost, and it might be nice to have an EA-relevant venue like this.)
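To make the “quantitative peer review” idea above a bit more concrete, here is a minimal sketch of how ratings from several reviewers could be aggregated into summary metrics. Everything here is a hypothetical illustration: the metric names (`methods`, `novelty`, `ea_relevance`), the 0–100 scale, and the simple averaging rule are all my own assumptions, not anything PREreview actually implements.

```python
# Hypothetical sketch of aggregating quantitative peer-review ratings.
# Metric names, the 0-100 scale, and plain averaging are illustrative
# assumptions only -- not an actual PREreview feature.
from statistics import mean

def aggregate_review(ratings):
    """Combine per-reviewer ratings (each a dict of metric -> 0-100 score)
    into per-metric averages plus a single 'overall' score."""
    metrics = ratings[0].keys()
    summary = {m: mean(r[m] for r in ratings) for m in metrics}
    summary["overall"] = mean(summary[m] for m in metrics)
    return summary

# Two hypothetical reviewers scoring the same paper:
reviews = [
    {"methods": 80, "novelty": 60, "ea_relevance": 90},
    {"methods": 70, "novelty": 75, "ea_relevance": 85},
]
print(aggregate_review(reviews))
```

A real system would need to handle disagreement and reviewer calibration (e.g., reporting score spread or weighting experienced reviewers), but even a simple aggregate like this would give the filtering and curation signal the plan describes.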