But in many contexts this may not be the case: as I’ve explained, I may profit from reading some discussions, which is a kind of engagement. You’ve omitted that part of my response. Or think of philosophers of science discussing the efficiency of scientific research in, say, a specific scientific domain (in which, as philosophers, they’ve never participated). Knowledge-of doesn’t necessarily have to be knowledge obtained by object-level engagement in the given field.
as I’ve explained, I may profit from reading some discussions, which is a kind of engagement.
OK, sure. But when I look at conversations about near-term issues on this forum I see perfectly good discussion (e.g. http://effective-altruism.com/ea/xo/givewells_charity_recommendations_require_taking/), and nothing that looks bad. And the basic idea that a forum can’t talk productively about a particular cause merely because most of its members reject that cause (even if they do so for poor reasons) is simply unsubstantiated and hard to believe in the first place, on conceptual grounds.
Or think of philosophers of science discussing the efficiency of scientific research in, say, a specific scientific domain (in which, as philosophers, they’ve never participated).
This kind of talk actually has a rather mixed track record. (Source: I’ve studied economics and read what philosophers have written about economic methodology.)
Right, and I agree! But here’s the thing (which I haven’t mentioned so far, so maybe it helps): I think some people just don’t participate in this forum much. For instance, there is a striking gender imbalance (I think more than 70% of users here are men), and while I have no evidence correlating this with near-/far-future issues, I wouldn’t be surprised if it were somewhat related (e.g. there are not many tech-interested non-males in EA). Again, this is just speculation. And perhaps it’s worth a shot to try an environment that will feel safe for those who are put off by AI-related topics/interests/angles.
I think some people just don’t participate in this forum much.
Absofuckinglutely, so let’s not make that problem worse by putting them into their own private Discord. As I said at the start, this is creating the problem that it is trying to solve.
And perhaps it’s worth a shot to try an environment that will feel safe for those who are put off by AI-related topics/interests/angles.
EA needs to adhere to high standards of intellectual rigor, so it can’t fracture and make wanton concessions to people who feel emotional aversion to those with a differing point of view. The thesis that our charitable dollars ought to be given to x-risk instead of AMF is so benign and impersonal that it beggars belief that a reasonable person would feel upset or unsafe upon being exposed to widespread opinion in favor of it. Remember that the “near-term EAs” have been pushing a thesis that is equally alienating to people outside EA. For years, EAs of all stripes have been saying to stop giving money to museums and universities and baseball teams, insisting that we must follow rational arguments and donate to faraway bed-net charities which are mathematically demonstrated to have the greatest impact, and (rightly) expecting outsiders to meet these arguments with rigor and seriousness. For some of these EAs to then turn around and object that they feel “unsafe”, and need a “safe space”, because there is a “bubble” of people who argue from a different point of view on cause prioritization is damningly hypocritical. The whole point of EA is that people are going to tell you that you are wrong about your charitable cause, and you shouldn’t set it in protective concrete like faith or identity.
While I largely agree with your idea, I just don’t understand why you think that a new space would divide people who aren’t on this forum to begin with. Like I said, 70% of users here are men. So how are you gonna attract more non-male participants? This topic may be unrelated, but let’s say we find out that the majority of non-males have preferences that would better align with a different type of venue. Isn’t that a good enough reason to initiate it? Why would that be conflicting, rather than complementary, with this forum?
I just don’t understand why you think that a new space would divide people who aren’t on this forum to begin with
I stated the problems in my original comment.
So how are you gonna attract more non-male participants
The same ways that we attract male participants, but perhaps tailored more towards women.
let’s say we find out that the majority of non-males have preferences that would better align with a different type of venue. Isn’t that a good enough reason to initiate it?
It depends on the “different type of venue.”
Why would that be conflicting, rather than complementary, with this forum?
Because it may entail the problems that I gave in my original comment.
I didn’t reduce it. I only claim that it requires personal experience as a significant part of the picture.