This is the first time I’ve felt like I might be able to contribute to the AI alignment problem (so maybe you did oversell it!).
I feel like I have a pretty good understanding of everything in this post. I started finding a lot more concepts and vocabulary I didn’t understand when I read about the (counter)examples in the LessWrong post—to the point where asking questions/doing further research started to seem a bit overwhelming.
It’s possible that this is an indicator that I wouldn’t be a good fit for the contest, or that I need to just suck it up and dig in. Being unsure of which makes me hesitant to take the time to do the latter.
I can’t help but think that some kind of real-time walkthrough with the ability to ask questions would be helpful to me, and perhaps others.
I don’t know if anyone with a full understanding would have enough time for something like that, or if there would be enough other people interested in doing it to make it worthwhile.
I’m imagining it would take the form of something like a couple hour class over Zoom.
The biggest concern I can think of is that it would waste a qualified facilitator’s very valuable time. The expected value would go up with the number of interested attendees, though.
Maybe it could be contingent on enough people signing up?
I am sure that if you join the AI Alignment Slack [1], Rob Miles’s Discord server [2], or ask questions on LW, you will find people willing to answer.
Finding a dedicated tutor might be harder, but it should be possible if you can compensate them for their time. The Bountied Rationality Facebook group [3] might be a good place to ask.
[1] https://eahub.org/profile/jj-hepburn/
[2] https://www.patreon.com/posts/patreon-discord-41901653
[3] https://m.facebook.com/groups/bountiedrationality/about/
Thanks for the suggestions!