I appreciate the impressive epistemic humility it must have taken for one of the original and most prestigious alignment research orgs to decide that prioritising policy and communications work over research might be the best course to follow right now. I would imagine that might be a somewhat painful decision for technical people who have devoted their lives to finding a technical solution. Nice one!
“Although we plan to pursue all three of these priorities, it’s likely that policy and communications will be a higher priority for MIRI than research going forward.”
Strong downvoted. This isn’t a laughing matter.
I understand what it’s like to think of a really funny joke and not want to waste it. But this isn’t an appropriate environment to substitute charisma for substance.
If EA grows by, say, 30% per year, then at any given time there will be a large number of people on the forum who see this, think it’s normal, and upvote it (reinforcing that behavior). Even if professional norms hold strong, it will still make the onboarding process that much harder and more confusing for new people, as they are misled into making serious social-status-damaging faux pas, and that reputation might follow them around in the community for years regardless of how talented or valuable they become.
I assumed Nick was being sincere?
Yes, I was being sincere. I might have missed some meta thing here, as obviously I’m not steeped in AI alignment. Perhaps Trevor intended to reply to another comment but mistakenly replied here?
Oops! I’m off my groove today, sorry. I’m going to go read up on some of the conflict theory vs. mistake theory literature on my backlog in order to figure out what went wrong and how to prevent it (e.g. how human variation and inferential distance causes very strange mistakes due to miscommunication).
I’m confused. The comment reads as sincere to me? What part of it did you think was a joke?
This is why I don’t think the goal should be to grow the movement. Movements that grow by seeking converts usually end up drifting far from their original mission and taking on negative, irrational aspects of the societies they emerge from. Religious and political history provide dozens of examples of this process taking place.
EA should be about quality over quantity, in my opinion, and “social status” is both figuratively and literally worthless in the face of extinction.