My attitude, and the attitude of many of the alignment researchers I know, is that this problem seems really important and neglected, but we overall don't want to stop working on alignment in order to work on it. If I spotted a research opportunity here that looked surprisingly good (e.g. if, for some reason, I thought I'd be 10x my usual productivity working on it), I'd probably take it.
It's plausible that I should spend a weekend sometime seriously considering what research opportunities are available in this space.
My guess is that a lot of the skills involved in doing a good job of this research are the same as the skills involved in doing good alignment research.