I think that these factors might be making it socially harder to be a non-longtermist who engages with the EA community, and that is an important and missing part of the ongoing discussion about EA community norms changing.
This has felt very true for me!
I came across EA way back around 2011 when I was at university, pre-longtermism… EA at that point formalised a lot of my existing thinking/values and I made graduate career decisions in line with 80k advice at the time. I started getting more involved again about a year ago and was surprised to see how things had changed! I’ve been increasingly engaging over the past year (including starting an EA job), but have often felt a strong sense of disconnection, and have heard similar from colleagues and friends who have followed EA for a while.
How has this impacted my interactions? Well this is actually my first comment on any EA Forum post! As an example, I remember reading a post recently about 80k’s updated view on climate change—it was almost entirely focused on whether it was an existential risk. That didn’t seem right to me and I almost wrote a comment, but in the end I felt like I was just coming from such a different perspective that it wasn’t worth it. I knew I hadn’t done much longtermist reading and I felt like I’d get shot down.
Kudos to the EA criticism contest for getting me to engage with this disengagement, look more closely at my gut feeling against longtermism, and work through more ideas and reading. I'm hoping I'll find something useful to share as part of the contest—currently I'm thinking it may be along the lines of trying to express more eloquently what gets missed when we simplify camps into "neartermism vs longtermism". I feel like "neartermist" EA aligns with some values (fairness? reduction of inequality?) that longtermist EA may not, but also that we can do more to evaluate near-term causes (or even just less obviously evaluable longterm causes) with longtermist methods and thinking.
Still a long way to go on this, but if you think I should look at any particular forum posts or reading in this area, please let me know.
I think something really important you raise here is that there are probably significant tensions to explore between the worlds that a neartermist view and a longtermist view each suggest we ought to be trying to build, and that tension seems underexplored in EA. E.g. there is an inherent tension between progress studies and x-risk reduction.