I like this comment. To respond to just a small part of it:
> And not just by convincing others of “rationalism” or AI risk worries specifically (though I also don’t understand why the author discounts this type of ‘winning’)
I’ve also only read the excerpt, not the full post. There, the author seems to only exclude/discount as ‘winning’ convincing others of rationalism, not AI risk worries.
I had interpreted this exclusion/discounting as motivated by something like a worry about pyramid schemes. If the only way rationalism made one systematically more likely to ‘win’ was by making one better at convincing others of rationalism, then that ‘win’ wouldn’t provide any real value to the world; it could make the convincers rich and high-status, but only by profiting from something like a pyramid scheme.
This would be similar to someone writing a book or teaching a course on how to get rich quick, when that person appears to have gotten rich quick only by selling those books or courses.
(I think the same thing would maybe be relevant to convincing people of AI risk worries, if those worries were unfounded. But my view is that the worries are well-founded enough to warrant attention.)
But I think that, if rationalism makes people systematically more likely to ‘win’ in other ways as well, then convincing others of rationalism:

- should also be counted as a ‘proper win’
- would be more like someone who is genuinely good at running businesses also making money by writing about their approach to running them, rather than like a pyramid scheme