Background in cognitive science. I run a workshop (open to non-EA people) aimed at teaching methods for managing strong disagreements. Also community building.
Interested in cyborgism and AIS via debate.
https://typhoon-salesman-018.notion.site/Date-me-doc-be69be79fb2c42ed8cd4d939b78a6869?pvs=4
I’ll give it a try!
Open question: would it be useful to frame this as “impact insurance” for people in impactful careers?
As in:
-I expect this goal to be good for the world (say, ~100 WELLBYs in expectation)
-If I don’t achieve this goal, then I definitely owe something that’s at least comparably good for the world (say, ~$200 to PureEarth, since I’ve chosen WELLBYs)
(or maybe at least half as good)
I think using it this way could help people who genuinely hesitate between impactful work and impactful donations.
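If it helps, the “floor” this kind of pledge creates can be sketched numerically. The 100 WELLBYs and $200 figures are from the example above; the dollars-per-WELLBY conversion rate is a made-up assumption purely for illustration, not a real cost-effectiveness estimate:

```python
# Sketch of the "impact insurance" arithmetic.
# GOAL_WELLBYS and BACKUP_DONATION_USD come from the example above;
# USD_PER_WELLBY is an ASSUMED, illustrative conversion rate.

GOAL_WELLBYS = 100          # expected good if the goal succeeds
BACKUP_DONATION_USD = 200   # pledged donation to PureEarth on failure
USD_PER_WELLBY = 2.0        # assumed: $2 buys 1 WELLBY (illustrative only)

def expected_wellbys(p_success: float) -> float:
    """Expected impact: goal value if achieved, backup donation value if not."""
    backup_wellbys = BACKUP_DONATION_USD / USD_PER_WELLBY
    return p_success * GOAL_WELLBYS + (1 - p_success) * backup_wellbys

# If the backup donation is calibrated to be "comparably good"
# (here, 200 / 2.0 = 100 WELLBYs), the expected impact stays at the
# goal's value no matter how likely success is.
for p in (0.2, 0.5, 0.9):
    print(p, expected_wellbys(p))
```

The point of the sketch is that once the backup donation is calibrated to the goal’s value, the expected impact is insured: it no longer depends on the probability of achieving the goal.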