Thanks for the critique!
In addition to four videos on his own channel, Robert Miles also published three videos on Computerphile during the last 12 months. He also publishes the Alignment Newsletter podcast. So there’s at least some additional output. There’s probably more I don’t know of.
you could find someone with a similar talent level (explaining fairly basic concepts)
Personally, I think this would be very difficult. Robert Miles’ content seems to have been received positively by the AI safety community, but science communication in general is notoriously difficult, and I’d expect most YouTubers to routinely distort and oversimplify important concepts, such that I’d worry such content would do more harm than good. In contrast, Robert Miles seems sufficiently nuanced.
(Disclosure: I work at EA Funds.)
Yes. Also, regarding this issue:
you could find someone with a similar talent level … who could produce many more videos
It seems that the Long-Term Future Fund isn’t actively searching for people to do specific tasks, if I understand the post correctly. Instead, it reviews applications that come to it. (It’s more labour-intensive to do an active search.) That means it can be warranted to fund an applicant even if there could be better candidates for the same task somewhere out there. (Minor edits.)
Thanks for the understanding responses, Jonas and Linch. Again, I should clarify: I don’t know where I stand here, but I’m still not entirely convinced.
So, we have four videos in the last year on his channel, plus three videos on Computerphile, giving seven videos. If I remember correctly, the Alignment Newsletter podcast is just a reading of Shah’s newsletter, which may be useful but I don’t think requires a lot of effort.
I should reiterate that I think what Miles does is not easy. I may also be severely underestimating the time it takes to make a YouTube video!