See also answers here mentioning that EA feels “intellectually stale”. A friend says he thinks a lot of impressive people have left the EA movement because of this :(
I feel bad, because I think maybe I was one of the first people to push the “avoid accidental harm” thing.
“Stagnation” was also the fifth most frequently mentioned reason for declining interest in EA over the last 12 months when we asked about this in the 2019 EA Survey, accounting for about 7.4% of responses.
Thanks, David, for that data.
There was some discussion of EA’s intellectual stagnation in this thread (as I say in my comment, I don’t agree that EA is stagnating).
Yeah, I think it’s very difficult to tell whether the trend people take themselves to be perceiving is explained by there having been more low-hanging fruit in EA’s earlier years, which meant people encountered more radical new ideas back then, or whether there has actually been a slowdown in EA’s intellectual productivity. (Similarly, because people tend to encounter a lot of new ideas when they are first getting involved in EA, they may perceive the insights being generated by EA as slowing down.) It’s also hard to tell whether EA is stagnating in a worrying sense, because it’s not clear how much intellectual progress we should expect to see now that some of the low-hanging fruit has already been picked.
That said, I think the positive aspects of EA’s professionalisation (which you point to in your other comment) may explain some of the perceptions described here, perceptions I consider on the whole mistaken. In earlier years, there was a lot of amateur, broad speculation for and against various big questions in EA (e.g. big a priori arguments about AI versus animals, much of it pretty wild and ill-informed). Conversely, we now have a much healthier ecosystem, with people making progress on the myriad narrower, technical problems that need to be solved in order to address those broader questions.
Thanks, David; this is more or less what I was trying to express in my response to Stefan in that thread.
I want to add that “making intellectual progress” has two different benefits. One is the obvious one: figuring out more true things so they can influence our actions to do more good. As you say, we may actually be doing better on that one.
The other is attracting people to the community by making it an intellectually stimulating place. We might be losing the kind of people who answered ‘stagnation’ in the survey above, as they are not able to participate in the professionalised debates, if those even happen in public at all.
On the other hand, this might mean we are no longer deterring people who may have felt they needed to be into intellectual debates to join the EA community. I don’t know what the right trade-off is, but I suspect it’s actually more important not to put the latter group off.
I actually think the principles of deferring to expertise and avoiding accidental harm are good in principle, and we should continue using them. However, in EA the barrier to being seen as an expert is very low: often it’s enough to have written a blog or forum post on something, having invested less than 100 hours in total. For me, an expert is someone who has spent the better part of his or her career working in a field, for example climate policy. While I think the former is still useful as an introduction to a field, the latter form of expertise has been somewhat undervalued in EA.
I guess it depends on which topics you’re referring to, but for many topics, the bar for being seen as an expert within EA seems substantially higher than 100 hours.