Fantastic stuff Aaron. Even as someone who has followed EA forum/newsletters/blogs for 2-3 years, there were quite a few things I didn’t know about. Thanks!
Excellent criticisms thanks. I do, occasionally, have ideas like this that seem worth posting, but that I am unlikely to ever get around to if I wait until I have time to give them the full research treatment. I suppose it would make sense to note that in the original post.
It also occurs to me that, while the idea that organ sales should be legal seems like a pretty mainstream view in EA circles, there are likely some members of the forum who disagree with it. Perhaps that could be someone’s reason for downvoting.
Yes that makes sense for sure. Thanks for the feedback.
Would the people who down-voted this be willing to give a brief note about why?
Or maybe a better idea. A “black market non-profit” that vets donors and purchases kidneys in order to start donation chains with them (rather than benefit a specific person). As long as it operates, it saves a lot of lives. If it gets into legal trouble, it’s a pretty good story to get press coverage, and to help shift public opinion.
You mean because it was $1 or because it’s not usually enforced? The $1 part doesn’t seem essential. Could make it $100k.
Yes, I had considered doing another post, but I guess I was thinking probably not. It would certainly be diminishing returns, yeah. I, and I’m sure many others, wouldn’t want to start seeming spammy about it. Maybe after a couple of years.
@Davidmanheim I am not sure if they are still doing it, but I know they were doing a survey. It is linked to in the Google Doc of their summary that I linked to above.
Thanks David! Great suggestions. I am just back from a trip, but I will dig in more deeply this week and make some revisions and reply to you in more depth—either here or via email. Thanks again!
Good idea. I will definitely do that.
I see what you’re saying, but I (intentionally) didn’t ask about future plans so I’m not sure if that part works. While I definitely don’t want it to be high pressure, I do want to somehow emphasize the importance of tracking future actions. I wonder if this strikes a balance.
Thank you for taking the time to share what you have done. In order to help us understand the longer-term impact of the book, would you mind if we followed up with another short survey a year from now? If that’s alright, please enter your email address below—it will not be shared with anyone, or used for any other purpose.
I really appreciate you taking the time to back-and-forth with me about this!
Thanks Aaron. Glad to hear you might try it.
Good point about that survey question. I was really trying to get across the importance of tracking future actions, but I agree that it could come across as off-putting. What do you think of this?
Thank you for taking the time to tell us about what you’ve done. We’d love to hear about what you may still do as well. If you think you might make future donations or changes in line with the principles of Effective Altruism, can we send you a single, short follow-up survey a year from now? If so, please enter your email address below—it will not be shared with anyone, or used for any other purpose.
It seems like there are 3 possible outcomes.
1) AI risk is associated with and covered primarily by one side of the political spectrum.
2) AI risk is associated with and covered by both sides more or less evenly.
3) AI risk is associated with and covered in a neutral way.
Intuitively, 3 seems like the best-case scenario, but that horse may already have left the barn (as it seems to with most causes).
1 probably seems bad—though I believe Rob did point out that if it becomes a part of one party’s platform, then maybe it’s easier to implement policy when that party is in power. Obviously, that’s a bit of a gamble.
2 seems like the best remaining option then, but obviously with some risks—perhaps along the lines of what you are hinting at. I don’t see why there would need to be a right wing cause to associate it with though. I mean, if both sides are covering it, the coverage could turn into a back and forth, pro/con on the most controversial aspects of it (a less collegial version of the discussion you linked to), which also seems not that great. Perhaps having both sides be philanthropy-funded and not dependent on generating controversy for advertising/clicks could help with that.
Enthusiastic +1 here. I’d also be willing to contribute to the bounty if there were an easy way to do that.
What I am wondering is whether dung beetles are a particularly hot research topic among animals/insects. They are pretty cool, but would this graph look less extreme with something like antelope or sea urchin in there—or with the average number of research papers on a particular animal?
Though I guess X-Risk is more or less an entire field, so maybe the more apt comparison is with papers about animals generally, or biology papers—in which case I’m sure it looks much worse.
Great discussion here. I’m trying to imagine how most people consume these articles. Linked from the Vox home page? Shared on Facebook or Twitter? Do they realize they aren’t just a standard Vox article? Some probably barely know what Vox is. Certainly, we are all aware of the connection to EA, but I bet most readers are pretty oblivious.
In that case, maybe these tangentially related or unrelated articles don’t matter too much. Conversely, the better articles may spark an interest that leads a few people towards finding out more about EA and becoming involved.
I agree generally with your criticisms. It’s not particularly surprising given the frequency with which they publish and the variation in quality of Vox’s reporting in general.
I would say your advice to read and check the overall soundness of the message before sharing could probably be broadly applied—and strikes me as a bit self-evident. Do you feel like these poor-quality FP articles are getting shared widely? Do you have reason to believe they are being shared without the sharers reading them?