Yeah, I don’t think that they’re as similar as Anglicanism or Catholicism, I’m just saying that you have to consistently apply your chosen scheme.
But I do in fact think that they’re decently related. EA Outreach wants to increase the number of people trying to do good. 80,000 Hours wants to move altruists into impactful careers (increasing the quality of altruistic activities). GPP wants to get altruists working on higher-value projects. CFAR wants altruists (and non-altruists) to be equipped with better thinking skills. The common thread is that they all want to increase the number of altruists, build their capacities, and improve the projects they target.
Makes sense, but it points to “meta” being an unusually broad and unspecific description.