To be frank, I think most of these criticisms are nonsense and I am happy that the EA community is not spending its time engaging with whatever the ‘metaphysical implications of the psychedelic experience’ are.
...
If the EA community has not thought sufficiently about a problem, anyone is very welcome to spend time thinking about it and do a write-up of what they learned… I would even wager that if someone wrote a convincing case for why we should be ‘taking dharma seriously’, then many would start taking it seriously.
These two bits seem fairly contradictory to me.
If you think a position is “nonsense” and you’re “happy that the EA community is not spending its time engaging with” it, is someone actually “very welcome” to do a write-up about it on the EA Forum?
In a world where a convincing case can be written for a weird view, should we really expect EAs to take that view seriously, if they’re starting from your stated position that the view is nonsense and not worth the time to engage with? (Can you describe the process by which a hypothetical weird-but-correct view would see widespread adoption?)
And, who would take the time to try & write up such a case? Milan said he thinks EA “basically can’t hear other flavors of important feedback”, suggesting a sense in which he agrees with your first paragraph—EAs tend to think these views are nonsense and not worth engaging with, therefore there is no point in defending them at length because no one is listening.
I’m reminded of this post, which stated:

We were told by some that our critique is invalid because the community is already very cognitively diverse and in fact welcomes criticism… It was these same people that then tried to prevent this paper from being published.
It doesn’t feel contradictory to me, but I think I see where you’re coming from. I hold the following two beliefs, which may seem contradictory:
1. Many of the aforementioned blindspots seem like nonsense, and I would be surprised if extensive research in any of them would produce much of value.
2. At large, people should form and act on their own beliefs rather than deferring to what is accepted by some authority.
There’s an endless number of things which could turn out to be important. All else equal, EAs should prioritise researching the things which seem the most likely to turn out to be important.
This is why I am happy that the EA community is not spending time engaging with many of these research directions, as I think they’re unlikely to bear fruit. That doesn’t mean I’m unwilling to change my mind if presented with a really good case for their importance!
If someone disagrees with my assessment then I would very much welcome research and write-ups, after which I would not be paying the cost of
“should I (or someone else) prioritise researching psychedelics over this other really important thing?”
but rather
“should I prioritise reading this paper/write-up over the many other potentially less important papers?”
If everyone refused to engage with even a short write-up on the topic, I would agree that there was a problem, and to be fair I think there are some issues with misprioritisation due to poor use of proxies such as “does the field sound too weird” or “is the author high status”. But I think in the vast majority of cases, what happens is simply that the write-up wasn’t sufficiently convincing to justify moving resources away from other important research fields to engage further. This will of course seem like a mistake to the people who are convinced of the topic’s importance, but like the correct action to those who aren’t.