Plausible that this post’s comments section is not the optimal place for some of this. One might argue that each heading should be its own comment, but I’m slightly uncertain what the mods prefer.
Skilling up
“to come to the bay over the summer” and “learn stuff and skill up”
I don’t think the meatspace value prop you outlined constitutes skilling up. That needs more justification.
Aligning some social rewards with one’s desire to believe true things or to help fix broken stuff seems plausibly critical for some people to keep their eye on the ball, or even just to take themselves seriously. Skilling up is not really related to this, except in the sense that you have to put whatever emotional structure powers your motivation behind the grind.
But I’ve been worried that emphasizing quickly legible levers for students early in their exposure to the movement is, in effect, looting scholarship from the effective altruism of the future. The way this post invoked “skilling up” to talk about what it was interested in talking about, which I see as boiling down to networking, really triggered me.
I’d like to register a vote for an extremely boring notion of skilling up, which is: yes, textbooks. Yes, exercises. No, nodding along to a talk 3-10x more advanced than you, following the high level but unable to reproduce it at a low level. No, figuring out how to pattern-match enough to flatter senior researchers at parties. Yes (often, or usually), starting hard projects quickly, before an outside view would think you’ve done enough homework to do them well, but the emphasis is on hard. Gaining friends and status may even be distracting!
Obviously there’s a debate to be had about a scholarship/entrepreneurship tradeoff. My sense is that people think we used to swing too far to the scholarship side, and now there will probably be an overcorrection. (One should also note that few people think getting kickass at stuff looks like locking oneself in a monastery with textbooks, but a longer treatment of the nuances there is out of scope for this comment.)
But respectfully, no: skilling up was not described in the post. Could you elaborate?
I think in Star Wars 8 there was this “sacred texts” situation. You may recall that after listening to Luke rant about the importance of the sacred texts, Yoda challenged whether he had even read them. Luke says “well...”, implying that he had flipped through them but wasn’t intimately familiar with what they were saying. I’m personally bottlenecked in my current goals by not having knowledge that is sitting in books and papers I haven’t read yet! Which says nothing of doing avant-garde or state-of-the-art things. I think this post risks joining a kind of genre of message for young EAs, which is “you get kickass at stuff by exposing yourself to people who are kicking ass”, and I think that’s auxiliary. Yes, you need to construct the emotional substructure that keeps your eye on the ball (which socialization helps with), and yes, you need advice from your elders from time to time. But no: you get kickass at stuff by grinding.
Landscape knowledge
The bay is not necessary for landscape knowledge.
I could’ve easily ended up digging into, say, MIRI’s research, only to realise very late that I actually think their approach is hopeless.
I (who gave an 8-month college try at alignment) managed to form a map of, and opinions about, different threat models and research agendas entirely online. I was at the EA Hotel, but there was only one other alignment person there, and he was doing his own thing, not really interfacing with the funded cells of the movement. The Discord/Slack clusters are really good! AI Safety Camp is great! One widely cited, high-status alignment researcher responded to a cold email with a Calendly link. The internet remains an awesome place.
Clout and reputation
I’m not gonna say clout and reputation are amoral distractions. I think they’re tools, instruments that help everyone do compression and filtering, oneself included. I roughly model grantmakers as smart enough to know that clout is, at best, a proxy for plausible future impact, so I’m not gonna come here and say status games are disaligning the movement from its goals.
But jan_kulveit is 1000% right, and it warrants repeating: networking and kicking ass are different things. What do I think goodharting on clout looks like at a low level? It looks like the social rewards for simply agreeing with the ideology leaving people satisfied, so they never get around to doing hard projects.
Keeping one’s eye on the ball
I conjecture that some people:

- Need consistent social rewards for trying to believe true things or fix broken stuff in order to keep doing those things.
- See superlinearly increasing gains in the extra bandwidth of meatspace, or find a ton of value in the information that medium-bandwidth spaces like Discord delete.
I think the bay can be counterfactually very good by increasing the impact of these people!
But I want the Minneapolis EA chapter to be powerful enough to support people who fail the university entrance exam. I don’t want to leave billion dollar bills (or a billion human lives) on the sidewalk because someone wasn’t legible or well connected at the right time. Keeping all our eggs in one basket seems bad in a myriad of ways.
People who can keep their eye on the ball, and grow as asskickers, without the bay should probably resist the bay, so that we build multipolar power. One argument for this that I haven’t advanced here concerns correlated error: the idea that if we all lived together we might make homogeneous mistakes. But perhaps that’s another post.
Networks and caution
We should be cautious about allowing a set of standards to emerge where being good at parties (in the right city) correlates with generating opportunities.