I have thought for years that targeted EA outreach to ‘weirdoes’ on the internet is much better than college clubs. I think it’s much more likely to get aligned, interesting people.
Can you elaborate? What does that mean, specifically?
My take was inspired by seeing this take: https://www.lesswrong.com/posts/FuGfR3jL3sw6r8kB4/richard-ngo-s-shortform?commentId=YbqaALPE3G2wRRCGt
EA’s recruitment MO has been to recruit the best elites it can on the margin, which I agree with due to power laws. However, I disagree about how to measure “elite”. Selecting from people attending Ivy Leagues adversely selects for the kind of person who optimizes for getting into Ivy Leagues. Other people fall into this rabbithole by following links on the internet, and I would rather engage with someone who cares about ideas than someone following the power-seeking gradient. Now, SBF was both an early contributor to Felicifia and someone who went to an elite university, so it’s not that college clubs aren’t drawing from both sets. On the margin, though, these clubs will naturally want to keep recruiting the way they do, say by tabling at their college, and that makes sense for them. But if I were a funder, I would rather support something like paying some NEET running a Discord server to grow that server (depending on the topic, naturally). This does select for less conscientiousness, and my specific story for what to do could be wrong, but I think the overall thrust is right: selectivity should be weirder, and in the age of AI we have better tooling for this kind of selection.
Concrete operationalization: there’s a long tail of search terms, the kind generated by highly thoughtful people, that orgs like CEA could buy ads against, and I would bet they are underspending on those terms. The same goes for what those terms translate to in other languages: doing deeper talent search in other countries and trying to integrate those people into our network. Is anyone buying ads on Baidu for the Chinese equivalent of the word “utilitarianism”? There could be a lot of low-hanging fruit like this that hasn’t been considered.
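The keyword idea above could be prototyped as a simple script before anyone touches an ad platform. This is an illustrative sketch only: the seed terms, long-tail modifiers, and translations below are a tiny hypothetical sample I picked for the example, not a vetted list, and a real campaign would feed the output into the ad platform’s own keyword-planning tools.

```python
# Sketch: assemble a long-tail, multilingual keyword candidate list
# for niche ad campaigns. All seeds/modifiers/translations here are
# placeholder examples, not recommendations.

SEED_TERMS = {
    "utilitarianism": {"zh": "功利主义", "de": "Utilitarismus", "es": "utilitarismo"},
    "moral philosophy": {"zh": "道德哲学", "de": "Moralphilosophie", "es": "filosofía moral"},
}

# Long-tail modifiers a thoughtful searcher might use alongside a seed term.
MODIFIERS = ["objections to", "best books on", "forum"]

def build_keywords(seeds, modifiers):
    """Return (keyword, language) pairs: the bare English seed, English
    long-tail variants, and the bare translated term per target language."""
    rows = []
    for term, translations in seeds.items():
        rows.append((term, "en"))
        for mod in modifiers:
            rows.append((f"{mod} {term}", "en"))
        for lang, translated in translations.items():
            rows.append((translated, lang))
    return rows

keywords = build_keywords(SEED_TERMS, MODIFIERS)
```

The point of the sketch is just that the candidate space is cheap to generate and audit: two seed concepts already yield fourteen keyword/language pairs, and the interesting work is curating which seeds and modifiers actually signal “highly thoughtful searcher”.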
I’m not sure what I think about this recent take about the attention arms race, but I think we share a sense of “changing up how things are advertised”. My point is more about subtle signalling in the information ecology.
It is possible I cached this thought a long time ago and haven’t properly investigated whether the evidence still supports it, or whether we are in fact in a world where most of the outreach portfolio is already being spent the way I’d endorse. Maybe more of the resources are going to these new AI safety YouTube videos rather than uni clubs, and the actual form of my critique should be comparing those videos to some other outreach tactic.
I don’t think I agree with either the idea of recruiting people from elite colleges or the idea of recruiting “Internet weirdoes”. I’m not against inviting in either of those kinds of people, but why target them specifically? I prefer a version of the EA movement that is more wholesome, populist, inclusive, and egalitarian.
I don’t mean populist in the typical political sense used these days of being against institutions, against experts, highly distrustful, framing things as good people vs. bad people, or adopting the “paranoid style”. I mean populist in the sense of believing average, everyday, ordinary people are good, have a lot to contribute, are diverse and heterogeneous, are often talented, wise, intelligent, and moral, and often are full of surprises. A belief in people, in the average person, in the median person, in the diversity of people who are never quite captured by an average or a median.
I don’t like the somewhat more traditional, more institutionalist elitism you sometimes see in EA, and I don’t like the idiosyncratic, anti-institutionalist nerd elitism of the rationalist community, where people seem to think the best people by far, and maybe the only people really worth a damn, are them, or people just like them. I’m a weird person, and I’ve often had to fight to find a place in the world, but I think it’s the wrong lesson to learn to say, “People treated me badly because I was different and acted like I was inferior just because I wasn’t like them… now I finally see the truth… it’s normal people who are inferior and it’s people like me who are better than everyone else!” Good job: God or karma or whatever sent you a trial so you’d have a chance to become more enlightened and learn compassion, and instead you’re repeating the cycle of samsara. Better luck next life.
It’s possible there are all kinds of ways to reach people from different walks of life that would be a good idea. I’m just highly suspicious of any idea that there’s a superior kind of person (one suspiciously similar to whoever is doing the ranking) and that outreach should be focused specifically on that kind of person.
I would totally love somebody to do this; I know of at least one attempt to do something a bit like this a while back, but it wasn’t easy, and I don’t think it went anywhere in the end.
It’s possible my team at 80k would be best placed to try it again, so it’s going back on my longlist, thanks :)