[Question] Curious whether GWWC takes existential-risk probabilities into account when calculating the impact of recurring donors.

With the apparent increase in AI-risk discussion, I was wondering whether anything like this exists. My rough imagining would be something like a ~1/6 x-risk penalty applied to some lifetime donors' projected giving (per Ord's estimate in The Precipice of the existential risk this century), though I don't mean to say this should be the approach, or that the number is still representative.
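To make the idea concrete, here is a minimal sketch of what such a penalty could look like arithmetically. This is purely illustrative and not GWWC's actual methodology: it assumes a constant annual hazard rate that compounds to a 1/6 risk over 100 years, then weights each year's donation by the probability the world survives to that year. All function names and the $1,000/year, 40-year figures are hypothetical.

```python
# Illustrative sketch only -- not GWWC's method.
# Assumes a constant annual hazard implied by a per-century x-risk estimate.

def annual_survival_prob(century_risk: float) -> float:
    """Annual survival probability from a constant hazard that
    compounds to `century_risk` over 100 years."""
    return (1.0 - century_risk) ** (1.0 / 100.0)

def expected_lifetime_donations(annual_donation: float,
                                years: int,
                                century_risk: float) -> float:
    """Expected total donated over `years`, weighting each year's
    donation by the probability of surviving to that year."""
    p = annual_survival_prob(century_risk)
    return sum(annual_donation * p ** t for t in range(1, years + 1))

# Hypothetical donor: $1,000/year for 40 years, Ord's ~1/6 century risk.
naive = 1000.0 * 40
adjusted = expected_lifetime_donations(1000.0, 40, 1.0 / 6.0)
print(round(naive), round(adjusted))
```

Under these assumptions the penalty over a 40-year pledge is modest (a few percent), since most of the 1/6 risk falls later in the century; the effect would be larger under a front-loaded hazard.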

I don’t mean this as an out-of-the-blue criticism—mostly I'm just curious whether and how x-risk might be taken into account, since I'm beginning to think this way about my own life.
