Effective altruism’s meta-strategy is about friendliness to (tech) power. All our funding comes from tech billionaires. We recruit at elite colleges. We strongly prioritize good relations with AI labs and the associated big tech companies. EA just isn’t going to be genuinely critical or antagonistic toward the powerful groups we depend on for support and status. Not how EA works.
This doesn’t seem like a bad meta-strategy, fwiw. Surely otherwise EA just gets largely ignored.
See my other comment in the same thread.
You assume that anything that isn’t huge gets ignored. A world of “grand battles” is good for some, not all, and the same goes for a world of “small independent entities”.
Alignment, however, is for the whole of humanity.
So. What do we do with this?
I don’t think this is the only way to do things; there are massive crowdfunding campaigns that work.
I think an entity that never opposes power in any way has its own limitations, and serious ones.
Here’s an example from Russia, where some charities collect money but HAVE to present themselves as pro-government.
In many cases they were criticised, I think justly, for creating more trouble than their charitable work was worth.
For example, some ran TV ads to raise money for children’s cancer treatment.
However, the real problem is that Putin spent the tax revenue and gas profits on his wars, on “internet research” operations, and on personal luxuries.
So these charities, some argue, were used by the government as a “front” to convince people that “medicine is fine, no need to worry”.
Those charities helped only a few people, and some argue that if they hadn’t existed at all, people wouldn’t hold the false belief that “healthcare works fine in Russia”; they would protest, and maybe we could actually get working healthcare.
All because of the charities’ inability to protest against existing power structures.
I think the same applies to alignment: it’s hard to do alignment work when your funding comes from a corporation with a financial interest in “profit first, safety second”.