Do you intend to submit “Logical Induction” to a relevant journal for peer review and publication? Do you still hold, with ~Eliezer2008, that people who currently object that MIRI doesn’t participate in the orthodox scientific process would still object for other reasons, even if you tried to address the lack of peer review?
Also, why no /r/IAmA or /r/science AMA? The audience on this site seems limited from the start. Are you specifically trying to target people who are already EAs?
We’re submitting “Logical Induction” for publication, yeah. Benya and Jessica (and Stuart Armstrong, a MIRI research associate based at FHI) co-authored papers this year at UAI, a top-10 AI conference, and we plan to publish in similarly high-visibility venues in the future.
We’ve thought about doing a Reddit AMA sometime. It sounds fun, though it would probably need to focus more on basic background questions; EAs share a lot of overlapping knowledge, priorities, and styles of thinking with MIRI, so we can take a lot of stuff for granted here that we couldn’t on /r/science. I usually think of orgs like FHI, Leverhulme CFI, and Stuart Russell’s new alignment research center as better suited to that kind of general outreach.