Hi Eugene!
This seems like the kind of thing Tyler Cowen would support. He runs a program called Emergent Ventures, which funds the ambitions of bright, talented young people like yourself. Here’s the application link. Best of luck with everything.
Wim
Hi Karl,
Thanks for your post. You noted that fewer of the audience seemed convinced by the end of the debate. I attended that debate, and the page where participants were supposed to vote on which side they were on was not loading. So, I expect far fewer people voted at the end of the debate than at the beginning. The sample of people who did vote at the end, therefore, may very well not be representative. All to say: I wouldn’t weight the final vote too heavily!
Hey, thanks for the comment! And sorry about the late reply.
I agree that there is potential to shift the aims of status games towards, as you put it, “conspicuous, effective giving.” I think this would have great consequences overall, though there could be optics risks. (e.g. may not look good in the eyes of leftists, which could matter in some cases? EA might’ve already bitten that bullet though.)
You make a great point about the second order effects of steering these dynamics for good. I hadn’t thought of it, but if you can change demand, you’re totally right that incentives around supply change too.
Perhaps advertising would be an effective (though surface level) top-down steering approach. I’ll let you know if I come across any others!
Hey, thanks! I’m glad you liked the post!
New Cause Area: Steering and Stabilising Status Games
Will MacAskill on Tyler Cowen’s Podcast
Thanks a lot for your thoughtful feedback!
I share the hesitancy around promoting arguments that don’t seem robust. To keep the form short, I only tried to communicate the thrust of the arguments. There are stronger and more detailed versions of most of them, which I plan on using. In the cases you pointed to:
Some existential risks could definitely happen rather painlessly. But some could also happen painfully, so while the argument is perhaps not all encompassing, I think it still stands. Nevertheless, I’ll change it to something more like “you and everyone you know and love will die prematurely.”
Other intelligent life is definitely a possibility, but even if it’s a reality, I think we can still consider ourselves cosmically significant. I’ll use a less objectionable version of this argument like “… destroy what could be the universe’s only chance…”
I got the co-benefits argument from this paper, which lists a bunch of co-benefits of GCR work, one of which I could swap in for the “better healthcare infrastructure” bit. I’ll try to get a few more opinions on this.
In any case, thanks again for your comment—I hadn’t considered some of the objections you pointed out!
Which of these arguments for x-risk do you think we should test?
Great! Thank you.
Anyone know how we could access the stats on this video (e.g. how many people clicked through the links on the banner)? These might be useful for gauging how well suited EA ideas are to this media format and how much we should invest in creating similar content going forward!
Thank you! I’m particularly interested in the effectiveness of civil disobedience in influencing institutional decision-making. If anyone took notes on this discussion and would be willing to share, I’d be incredibly grateful! If you plan on centering future events on this topic and are able to make them virtually accessible without too much hassle, I would love to attend them along with some of my friends. Of course, totally understand if not!
Will there be any way to access this event virtually/will it be recorded?
Your description of retreats matches my experience almost disconcertingly; it even described things I didn’t realize I took away from the retreat I went to. I felt like the only one who had those experiences. Thanks for writing this up. I hope things work out for you!