If we buy this argument, there’s also an important question of how we sequence outreach. Typical startup advice, for example, is to start in a small market of early adopters that you can dominate and then use that dominance to access a wider market.
It seems that EA is an extremely powerful set of ideas for mathematically/philosophically inclined, well-off, ambitious people. It may be that we should continue to target this set of early adopters for the time being and then worry about widespread value change in the future.
Also, a premise in this post is that emotional appeals are better able to achieve value change. That might be true, but it also might be false. I certainly wouldn’t accept it as a given.