This post uses an alarmist tone to trigger emotions (“the vultures are circling”). I’d like to see more light and less heat. How common is this? What’s the evidence?
People have strong aversions to cheating and corruption, which is largely a good thing—but it can also lead to conversations on such issues getting overly emotional in a way that’s not helpful.
I might be in the minority here, but I liked the style this post was written in, emotive language and all. The language was flowery, but that made it fun to read, and I did not find it alarmist (e.g. it clearly says “this problem has yet to become an actual problem”).
And more importantly, I think the EA Forum is already a daunting place, and it is hard enough for newcomers to post here without facing everyone upvoting criticisms of their tone / writing style / post title. It is not a perfect post (I think there is a very valid critique in what Stefan says: the post could have benefited from linking to some examples / evidence), but not everything here needs to be in perfect EA-speak. Especially stuff from newcomers.
So welcome CitizenTen. Nice to have you here and to hear your views. I want to say I enjoyed reading the post (don’t fully agree tho) and thank you for it. :-)
My bad. Any good ideas for what the title should change to? Also, I’d just like to note that this is not yet very common at all. My evidence is just hearsay, anecdotes, and people that I’ve talked to. So if it was overly alarmist, I’m sorry. That was not my intention. Once again, I’m noting the change in tone in how some people are treating the grants more than anything. Instead of being excited about cause area X and then using the grants as a way to achieve their goals, people are instead excited about cause area X because they can get easy funding. Once again, I don’t think we should be alarmist about this, as funding fewer great (but risky) people would be a failure mode. I just wanted it to be common knowledge that this is happening and is (probably?) going to get worse over time.
My evidence is just hearsay, anecdotes, and people that I’ve talked to
Once again, I don’t think we should be alarmist about this
Well, it’s pretty clear you said that you read this on the internet:
On many internet circles, there’s been a worrying tone. “You should apply for [insert EA grant], all I had to do was pretend to care about x, and I got $$!” Or, “I’m not even an EA, but I can pretend, as getting a 10k grant is a good instrumental goal towards [insert-poor-life-goals-here]” Or, “Did you hear that a 16 year old got x amount of money? That’s ridiculous! I thought EA’s were supposed to be effective!” Or, “All you have to do is mouth the words community building and you get thrown bags of money.”
Your “boulder splash” is annoying. It pushes on the very issue and the adverse effects you claim to worry about. Noise and heat are self-perpetuating. This transition into a higher-funding environment is delicate, and outcomes depend on initial conditions and the liminal states. The reactions and confidence of the community and its nascent leaders are important, and they benefit from a firm hand and the right tone.
I don’t think this is malice, but it’s clumsy and emotionally manipulative.
Yeah. I agree. Pointing to this problem can make the problem worse. It’s a little bit of an info-hazard in that respect? But yeah, I’ll agree it was slightly clumsy. I wanted to tell everyone that this was a thing that was happening, without creating a backlash that would destroy the genuinely valuable parts of what we’re doing. Furthermore, it is genuinely really valuable to have such a high-trust community, and I don’t want that to change. I guess whether or not I succeeded in walking this tightrope is for others to decide.
One of the devices not mentioned in my comment below is the utility of LessWrong as a filter/proxy for values. This can work, but it has a weakness, because institutional literacy and intellectual honesty aren’t at the right aesthetic. You’ve demonstrated that with your post and your comment, which is poetic.
I know someone who has been trying to work on the problem (which isn’t very well elaborated on in your post) with sort of four arms:
Show, not tell, instances of the issue you worry about to increase literacy about it (without getting defunded)
Show, not tell, issues with intellectual honesty and subcultures (without getting defunded)
Set up institutions and mechanisms to solve the issue
Increase literacy about the design of EA and how funding is decided
It turns out this project is pretty hard.
The money thing itself isn’t that hard. “Meta-AI” stuff in business is everywhere. What’s tricky is showing, not telling, and handling the cause-area activism/proxy dynamics and the consequent issues. If you’re trying to stand in multiple cause areas, which is necessary, it’s an absurd situation right now and unfair to work in.
I also thought that the post provided no support for its main claim, which is that people think that EAs are giving money away in a reckless fashion.
Even if people are new, we should not encourage poor epistemic norms.
The claim sounds plausible to me and that’s enough to warrant a post to encourage people to think about this.
:-)
Fair enough—thanks for your gracious response.