I don’t know; I doubt it’s a problem where throwing money at it is the right answer. In any case, it’s unclear to me whether doing this would actually be net positive. I imagine it would be quite controversial, even among EAs who are into longtermism. I just shared the idea because I thought it was interesting, not because I necessarily thought it was good.
Yeah, I agree that money is not the bottleneck. I think the strongest bottleneck is decision quality on whether this is a good idea, and a secondary bottleneck is whether our Hollywood contacts are good enough to make this happen conditional upon us believing it’s actually a good idea.
Do you have a story for why this could be a bad idea?
Having popular presentations of our ideas in an unnuanced form may either a) give the impression that our ideas are bad/silly/unnuanced, or b) make them seem low-status, akin to how a lot of AI safety efforts are/were rounded off as “Terminator” scenarios.