Re OP’s point 3:
I tried not to imply that OP directly opposes diversity (my comment was initially phrased more harshly, and I changed it before publishing), so I’m sorry that’s still how I came across.
And I don’t really get what you mean by competence differences etc. There’s no single axis of competence that makes people’s posts on the forum more valuable, and similarly no single axis that determines who gets hired by EA orgs.
There might be some common ones. But even then, I think my logic stands: notice that OP talks about EA orgs in particular. That means OP wants to see a higher concentration of posts with views correlated with those of EA org employees. But that entails a lower concentration of posts from people whose views don’t directly align with EA orgs, which would create a cycle that blocks out more diverse views.
Edit: I forgot to add that OP could have phrased this differently, saying that people with productive things to say (which I assume is what they meant by “better takes”) would be busier doing productive work and so have less time to post here. I don’t necessarily buy that, but let’s roll with it. Instead, they chose to focus on EA orgs in particular.
The causal reason I worded it that way is that I wrote this list down very quickly, and I’m in an office with people who work at EA orgs and would write higher-quality posts than average, so that mechanism was salient to me, even if it’s not the only way someone might have better things to do.
I also want to point out that “people who work at EA orgs” doesn’t imply infinite conformity. It just means they fit in at some role at some organization that is trying to maximize good and/or is funded by OpenPhil/FTX (who fund lots of things, including lots of criticism). I frequently hear minority opinions like these:
Biosecurity is more pressing than alignment due to tractability
Chickens are not conscious and can’t suffer
The best way to do alignment research is to develop a videogame as a testbed for multi-agent coordination problems
Alignment research is not as good as people think due to s-risk from near misses
Instead of trying to find AI safety talent at elite universities, we should go to remote villages in India
Hi, thanks for responding!
I should probably have been a bit more charitable in thinking about why you wrote it the way you did.
These might be minority opinions in the sense that they have some delta from the majority opinions, but they still form a tiny cluster in opinion space together with that majority.
You don’t often hear, for example:
People who think there’s no significant risk from AI
People who think extinction is only slightly worse than a global catastrophe that kills 99.99% of the population
People who think charities are usually net negative, including those with high direct impact
Socialists
Or other categories which are about experience rather than views, like:
Psychologists
People who couldn’t afford time off to interview for an EA org
People who grew up in developing countries (the movement seems to have partial success there, but do these people work in EA orgs yet?)
I suspect OP doesn’t want more posts from employees at EA orgs because they are such employees—I understood OP as wanting higher quality posts, wherever they come from.
True, the post does suggest that employees at EA orgs make higher quality posts on average, and that they may have less time to post on the Forum than the average user, but those are empirical matters (and seem plausible to me, anyway).
Edit to add: I generally didn’t get the feeling that OP wholeheartedly supports intervening on any of these possible explanations, or believes that doing so wouldn’t risk other negative consequences (e.g. increased groupthink).
Indeed, this is why I wrote “a higher concentration of posts with views correlated to those of EA org employees”. It doesn’t matter whether there’s causality here—encouraging the correlation is itself a problem.
I agree with your edit more than with the rest of your comment.
It would be uncharitable to interpret “takes” as being about people’s specific views. Instead, it’s about things like the following:
Do I learn something from talking to this person? When I dig deeper into the reasons why they believe what they believe, do I find myself surprised by good arguments or depth of thought, or something of the like? Or does it just seem like they’re parroting something or are ideologically clouded and can’t seem to reason well? Do they seem interested in truth-seeking, intellectually honest, etc? Do they seem to have “good judgment” or do they make arguments where it feels like the conclusions don’t even follow from their premises and they’re just generally off about the way things work? [There are tons of other factors that go into this; I’m just gesturing at some of the things.]
Regarding competence: there’s no single axis, but that doesn’t mean the concept isn’t meaningful. Lots of concepts work like that – they’re fuzzy but still meaningful.
To be fair, some things might be less about competence and more about not having equally “high standards.” For instance, I notice that sometimes people new to EA make posts on some specific topic that are less thorough than a post from 5-10 years ago that long-term EAs would consider “canonical.” And these new posts don’t add new considerations, and sometimes even miss important considerations discussed in the older post. In that case, the new person may still be “competent” in terms of intelligence or even reasoning ability, but they lack a kind of obsessiveness and high standards about what they’re doing (otherwise they’d probably have done more reading about the topic before making a top-level post about it – asking questions is always an option, too!). So, it could also be a cultural thing that’s more about a lack of obsessiveness (“not bothering to read most of what seems relevant”) or high standards, rather than (just) about “competence.”
(And, for what it’s worth, I think it’s totally forgivable to occasionally make posts that weren’t aware of everything that’s previously been written. It would be absurd to expect newcomers to read everything. It just gets weird if most of someone’s posts are “worse than redundant” in that way and if they make lots of such posts and they’re all confidently phrased so you get the impression that the person writing the post is convinced they’ll be changing the minds of lots of EAs.)
Views vs. “Other Things”
I can imagine a world where the things you wrote, like “Do I learn something from talking to that person”, are the sole measure of “posting quality”. I don’t personally think such a world would be desirable (e.g. I’d rather someone who often posts smart things stay off the forum if they promote bigoted views). But I also don’t think that’s the world we’re in.
People cannot separate judgement of these things from judgement of a person’s views, even if they think they can. In practice, forum posts are often judged by the views they express (“capitalism is bad” is frowned upon), and worse, by their style of reasoning (STEM-like arguments and phrasing are much more accepted, and quantification and precision are encouraged even when inappropriate). Object-level engagement is appreciated over other forms, disregarding that it is only sometimes the right way to engage.
As I see it, the vision of a rational, logical, strongly truth-seeking forum is an illusion, and this illusion is used to drive out people with more diverse backgrounds or who come from underrepresented schools of thought.
High Standards
I personally have very high standards. There are many posts I want to write, but I really want them to be thorough and convincing, and to engage with relevant materials. You can see the result—I have written none! Is this actually helpful?
I think there can be value in posts that reiterate old content, perhaps even when they leave out important bits or have problematic logic. I have two reasons:
The forum guides the movement not only through building a common knowledge base, but also through representing the growing community’s views. If, for example, 8 years ago someone had written that it’s acceptable to work for a tobacco company in order to donate to high impact charities—how would you know how many current EAs share that view? The view itself is not an empirical question, and the old post’s karma tells you nothing about this. A new post, letting the community reengage with the ideas, might.
As noted in the OP and elsewhere, EAs love to criticise EA. I’m in favor of that—there are lots of problems, and we need to notice and fix them. Alas, many are noticed but then not fixed. If 8 years ago someone had written about how diversity of experience is important, yet nowadays the movement is still composed almost entirely of people from Western countries, and most community building resources also go there, that means no meaningful action has been taken to fix the problem, so the point needs to be reiterated.