I appreciate the response here, but would flag that this came off, to me, as a bit mean-spirited.
One specific part:
> If you think that not trusting you is good, because you are liable to certain suboptimal mechanisms established early on, then are you acknowledging that your recommendations are suboptimal? Where would you suggest that impact-focused donors in EA look?
1. He said “less trust”, not “not trust at all”. I took that to mean something like, “don’t place absolute reverence in our public messaging.”
2. I’m sure anyone reasonable would acknowledge that their recommendations are less than optimal.
3. “Where would you suggest that impact-focused donors in EA look” → There’s no single authoritative source you should rely on exclusively. You should probably consult a diversity of sources, including OP’s work.
That makes sense, and is probably the solution.