I’m not au fait with metacharity or rationality—can you explain why rationality should be bundled under metacharity? What is the meta plan behind promotion of rationality (particularly in its specific forms, like the organisation CFAR)?
Is the bundling really as strong as that of Anglican and Catholic under Christian? I guess the point of your analogy may have been that Four Focus Areas of Effective Altruism classed rationality/CFAR under meta EA, which is fair.
Yeah, I don’t think they’re as similar as Anglicanism and Catholicism are—I’m just saying that you have to apply your chosen scheme consistently.
But I do in fact think that they’re decently related. EA Outreach wants to increase the number of people trying to do good. 80,000 Hours wants to move altruists into impactful careers (increasing the quality of altruistic activities). GPP wants to get altruists working on higher-value projects. CFAR wants altruists (and non-altruists) to be equipped with better thinking skills. The common thread is that they all want to increase the number of altruists, build their capacities, and improve the projects they target.
Makes sense, but this points to “meta” being an unusually broad and unspecific description.