One question: I am curious to hear anyone’s perspective on the following “conflict”:
The former is more important for influencing labs, the latter is more important for doing alignment research.
And yet, as I say, I believe both of these are necessary.
FWIW when I talk about the “specific skill”, I’m not talking about having legible experience doing this, I’m talking about actually just being able to do it. In general I think it’s less important to optimize for having credibility, and more important to optimize for the skills needed. Same for ML skill—less important for gaining credibility, more important for actually just figuring out what the best plans are.
Also, are there any good online courses anyone would recommend?
See the resources listed here.
Thanks Richard, this is clear now.
And thank you (and others) for sharing the resources link—it does indeed look like a fantastic resource.
Denis