When browsing http://intentionalinsights.org/, it doesn’t look like promoting EA is your primary goal. The goals stated at http://intentionalinsights.org/about (nothing directly about charity or altruism) are very different from the ones stated here. Are you planning to radically shift your focus?
Thanks for bringing this up! We only mention altruism briefly in our vision, among other things. I can see how this might be confusing :-)
Here is the reason. Since we are promoting effective giving to non-EAs, our organization is outward-facing to a broad audience, unlike the majority of EA meta-organizations, which are mainly inward-facing to the EA movement. Because we are outward-facing, we need to be careful about stating our goals explicitly: it won’t be very beneficial to tell our non-EA audiences that we are trying to promote EA-themed effective giving idea through using emotional engagement and persuasion tactics on them :-) Instead, we tell our non-EA audiences that we are trying to help them reach their goals, which is the case: we are helping them realize the goals they would actually endorse if they understood how to give in the most impactful manner.
This is why we have a separate EA webpage that outlines our EA orientation, which is not linked from our public-facing website.
I’m not sure this is a good approach, because:
Being dishonest about your goals and half-secretly manipulating people spreads the wrong values and can easily backfire, giving EA a bad reputation.
Topics like http://intentionalinsights.org/how-sure-are-you-about-your-memories don’t seem to promote EA in any way, and yet your resources are spent on them. Because you admitted to being dishonest about your goals with your readers, it makes me doubt whether you are honest about them with us :) In fact, your changes in https://wiki.lesswrong.com/index.php?title=Intentional_Insights&diff=15260&oldid=152533, targeted at a rationalist crowd, also don’t mention charity or altruism. And defending your own material on a wiki in the third person is also not very nice.
I see no reason to promote EA indirectly, because EA is easy to sell to many rational people. People who readily agree with EA are most likely the ones who can be targeted most cost-effectively while the movement is small.
Sorry if any of that came off as harsh. I still think it’s great that you are actually trying something while I just sit at home and criticise everyone :)
Thanks for raising these points, and no worries about sounding critical! If you have these concerns, other people do too, and it’s important to have a transparent dialogue about them :-)
1) First, let’s be very clear and specific about our terminology. I think the word “dishonest” does not serve us well here. Let’s taboo it and talk about what we actually do. As I stated above, what we do is help people realize the goals they would pursue if they knew the best methods of reaching them. Namely, typical people have a desire to help others, but they don’t necessarily know the best way to do that. They fall into attention bias, they don’t appreciate the salience of the drowning child problem, and they give to whatever charity has the best marketing. That’s why our mission states “We empower people to refine and reach their goals,” and the refining part is about helping people figure out what their goals actually are. In other words, we help them achieve their longer-term goals.
2A) Yup, Intentional Insights promotes both effective giving and effective decision-making. That’s mentioned briefly above and described here in more depth. Doing so helps improve the capacity of EAs who engage with our content, and contributes to the flourishing of non-EAs.
2B) Ugh, the wiki thing was pretty ugly. The full story is here: we had a hater try to wipe the InIn wiki entry. The dispute was settled, but the part about promoting EA was deleted in the settlement. I’m not happy with the outcome, but it’s the best we could get.
2C) You can see my goals through my actions: I have invested a lot of time, effort, and money into promoting effective giving, freely and of my own volition. There was nothing forcing me to do so, and no specific benefit I was getting from it. Both from a social-status perspective and from a financial perspective, I’m fine with my situation as a professor at Ohio State: I get social respect and good job benefits. So my only gain from promoting effective giving is other people giving effectively, and my only reason for engaging with the EA movement and taking the GWWC and TLYCS pledges is my passion for helping people flourish :-)
3) Actually, not promoting effective giving leaves a lot of money on the table, as I argue here.
1A) In my experience, typical people don’t have strong desires to help people far away. They just don’t care about them nearly as much as about themselves and their relatives, especially children. When talking with such people, it never seems to me that they are confused; it always seems that they simply have different values. Actually, their values make more sense from an evolutionary-psychology point of view. So if you ask a person, “Being effective at altruism (towards people or animals that might be far away and that you won’t necessarily meet) is one of the goals in your life, right?” and they disagree (or agree to look good but then don’t act on it), IMO that person most likely has different core values, which are usually very hard to change. If I am right, little will be donated by the audience you gain by omitting that altruism is your goal. By omitting it, you may also fail to attract some people who are interested in altruism and could be targeted more productively.
I’m not sure that people who, e.g., donate to cancer charities because they recently lost a relative to cancer are usually confused. It could also be different values to some degree. IMO, that could be a more productive target audience.
1B) If I were a non-EA fan of InIn and, after a Google search, found a sentence like “it won’t be very beneficial to tell our non-EA audiences that we are trying to promote EA-themed effective giving idea through using emotional engagement and persuasion tactics on them”, I would probably feel angry, manipulated, and treated as someone of a lower intellectual class. I’m not sure what percentage of people would feel the same way. If a journalist found such a sentence when writing about InIn, they might see it as an opportunity to start a scandal. Nothing like that can happen when you always say or write everything you think is important enough to be said: no lies, no “not technically lying”, no omissions, just always trying to bring the maps in other brains closer to what you think is reality. This is what I call honesty. Spreading EA ideas seems like an admirable goal to many people, so it’s strange to me that you chose to hide it.
1A) The example of Giving Games seems to contradict your hypothesis, and I find the evidence convincing: http://www.thelifeyoucansave.org/Blog/ID/196/Can-Giving-Games-change-donor-behavior-We-did-an-experiment-to-find-out Essentially, Giving Games help people realize their own values by putting them before a real choice, one that is emotionally engaging and forces them to think through the situation.
1B) This leads to the second point. Speaking to people’s emotions and using stories is not lying, even technically :-) What’s at stake is helping people learn more about their own values. I accept your belief that spreading EA ideas seems like an admirable goal to many people, but it’s not an admirable goal to people who don’t yet know about EA. So we need to be strategic in how we speak to them.
Anyway, I don’t think continuing this conversation further would be productive. I have had these conversations with many folks on this forum; you can read my past posts. We may have different perspectives on the best way to approach the same goals, so let’s just focus on our broader shared values and leave each of us to do the best we can to advance human flourishing.
I just noticed that almost the same thoughts regarding 1A) were expressed in http://effective-altruism.com/ea/rr/the_big_problem_with_how_we_do_outreach/ You don’t have to answer any of this if it isn’t new.
3) I’m not arguing against promoting effective giving; I’m all for that. I’m just thinking about which ways of doing it are the most effective. To your knowledge, how many people have you already convinced to donate to EA charities? How much do you think has been donated because of InIn?
We’re at the early stages of this campaign, and to my personal knowledge, six people.
However, our methods are such that we don’t know our actual impact. For example, one article was shared over 1K times on social media, so it might have convinced thousands; there’s no way to tell right now. As I wrote above, we plan to run RCT studies of our impact once we have the funding to do so :-)