CEO of Rethink Priorities
Marcus_A_Davis
I’m pretty excited to help out. Of course, as pointed out by Ryan, if anyone has any pointers about spreading our reach more effectively on social media, I’m open to hearing them.
I’m working on getting a more useful skill, but for now, if anyone ever needs some audio editing, perhaps for a potential EA podcast, I can do it.
Also, this job board seems relevant, as skills people have that they might not think would be of use are in demand.
I’m interested in whether anyone has any data or experience attempting to introduce people to EA through conventional charitable activities like blood drives, volunteering at a food bank, etc. The idea I’ve been kicking around is basically to start or co-opt a blood drive or a similar event.
While people are engaged in the activity, or before or after, you introduce them to the idea of EA, possibly even using this conventional charitable event as the prelude to a giving game. On the plus side, the people you are speaking with are self-selected for doing charitable acts, so they might be more receptive to EA than a typical audience. On the downside, this group might be self-selected for people who care a lot about personally getting hands-on with charitable work, which typically isn’t among the most effective things you can do.
Can anyone recommend some work on existential threats as a whole? I don’t just mean AI- or technology-related threats, but nuclear war, climate change, etc.
Btw Nick Bostrom’s Superintelligence is already at the top of my reading list, and I know Less Wrong is currently engaged in a reading group on that book.
This is very useful. As someone still very new to this who wants to contribute more, it can be helpful to see in detail what other EAs are doing. I still struggle with not knowing exactly what I can do now and what realistic goals for behavioral and social changes are, particularly in the short term.
More generally, as someone trying to be more productive and efficient Toggl looks promising and I’m going to try it out myself.
Having grown up as one of those people who figured “can’t succeed, don’t try” with regard to large problems, I think this is a really fantastic point that I hadn’t considered expressing this way. I think lots of people who currently think like I did could be swayed if the message could get through to them that they can indeed change the world for the better.
Interesting piece. However, the article conflates psychopathy meaning “people with smaller amygdalas” with psychopathy meaning “people with smaller amygdalas who display anti-social behavior”. The former group is not necessarily in the latter group. For example, you may have a smaller-than-average amygdala and genuinely respond less to the fear and distress of others, but not become a social predator who manipulates people.
And as you point out, it’s not clear how this study relates to EAs. It could be that EAs have relatively normal amygdala size but are disproportionately interested in rationality and ethics and hence recognize the good they can and should be doing in the world.
Hello everyone. I’m Marcus. I’m an audio engineer, but I really got into philosophy during college. Eventually that led me to ethics and effective altruism.
I’m currently learning a more financially beneficial skill so I can earn to give. In the meantime, I intend to do everything I can outside of that to contribute and help spread the word of EA.
Ryan and I were discussing doing that for the different subreddits where a given post here might be of interest. So if it’s a post about medical interventions, posting it in /r/medicine, for example.
Of course, the Internet is a lot bigger than Reddit, so there are probably many venues related to philanthropy, productivity, philosophy, animal rights, medical interventions, etc. that posts here could be relevant to. I’m going to try to do what I can, but I would appreciate guidance toward relevant venues, and potentially help actually doing the work if it proves to be a huge task.