We’re submitting “Logical Induction” for publication, yeah. Benya and Jessica (and Stuart Armstrong, a MIRI research associate based at FHI) co-authored papers at UAI, a top-10 AI conference, this year, and we plan to publish in similarly high-visibility venues in the future.
We’ve thought about doing a Reddit AMA sometime. It sounds fun, though it would probably need to focus more on basic background questions; EAs have a lot of overlapping knowledge, priorities, styles of thinking, etc. with MIRI, so we can take a lot of stuff for granted here that we couldn’t on /r/science. I usually think of orgs like FHI and Leverhulme CFI and Stuart Russell’s new alignment research center as better-suited to that kind of general outreach.