I strongly think that people are sleeping on massive EV from encouraging others to read HPMOR, or to reread it if it’s been years. It costs nothing to read it, and it likely offers broad and intense upskilling.
I think OP omitted many details of why it might be plausible, and I wouldn’t expect the disagree voters to have any idea about what’s going on there:
To me, the literary value of EA stories lies in thinking through the psychological context of trying to think more clearly, trying to be good, whatever. Building empathy for “how would you derive EA-shaped things and then build on them from within a social reward surface that isn’t explicitly selecting for that?” and “what emotional frictions or mistake classes would I expect?” seems plausibly quite valuable for a ton of people.
Basically every other thing you could say about the value prop of reading Methods is downstream of this! “upskilling” was an intense word choice, but only 95% wrong.
With that in mind, I do think a ton of topics salient to EAs show up within Methods and many of them get thoroughly explored.
Additional thought: if we assume that people can gain skills from reading fiction (or from otherwise engaging with imaginary worlds, such as via films or games), does HPMOR give the best “return on investment” per hour spent? Is it better than reading War and Peace, or watching John Green videos, or playing Life is Strange? My worry is that EAs tend to be biased in favor of it, and therefore we would neglect other options.
(I’m not really expecting anyone to have actual data on this, but I’d be curious to see people bounce the idea around a bit)
Could you clarify what kind of upskilling you expect to come from reading Harry Potter fan fiction?
My “not rigorously thought out perspective” is that if someone has never encountered the idea of rationality or counterfactual thinking, then this might introduce them to it. But I’m guessing that roughly similar benefits could be had from reading a much shorter book that is more directly targeted at teaching these skills (maybe Thinking, Fast and Slow?).
I think it was unhelpful to refer to “Harry Potter fanfiction” here instead of perhaps “a piece of fiction”—I don’t think it’s actually more implausible that a fanfic would be valuable to read than some other kind of fiction, and your comment ended up seeming to me like it was trying to use the dishonest rhetorical strategy of implying without argument that the work is less likely to be valuable to read because it’s a fanfic.
Adjusted for popularity or likelihood of recommendation, you might naively expect fiction that someone is presented with to be more likely to stand the test of time than fan fiction, since the selection effects are quite different.
I think that is a fair and accurate criticism. I do view most fan fiction as fairly low quality, but even if that is true it doesn’t imply that all fan fiction is low quality. And I do think that some fiction can be used for self-improvement purposes.
The literary quality of the fiction (which, incidentally, I thought was terrible when I glanced at it out of curiosity) is fairly irrelevant to whether it helps people be more rational (which I am also skeptical of, but that’s a separate point).
I do suspect that some Bay Area folk would benefit from reading at least one book of “serious” “literary” fiction with zero wizards or spaceships, like Middlemarch or To the Lighthouse, but I might just be being snobby/trying to justify the two years of undergrad courses in Eng Lit I did.
I read To the Lighthouse around the same time I read Methods, and I was annoyed or confused about why I was reading it. And there was a while ten years ago when Satantango and Gravity’s Rainbow were occupying a massive subset of my brain at all times, so I’m not like “too dumb for litfic” or whatever.
Sure. Not everyone has to like every book! I don’t like Don Quixote, which has frequently been claimed to be the greatest novel ever. I loved War and Peace when I was 18, but I’d cooled on its conservatism and misogyny when I last read it, though I still understood why people see it as “great”.
But I do think there is a tendency towards the grandiose and speculative in rationalist/nerd taste and away from the realist*, domestic, psychological, etc., that I (probably pompously) think can be a bit limiting. Analogous to preferring Wagner to Mozart in opera, or prog metal to the Velvet Underground in “serious” rock music. (Also reminds me of Holden Karnofsky’s blog post about being baffled by why Pet Sounds has such a strong reputation among rock critics when it’s so “pop” and allegedly lacking in “complexity”.) I’ve never read Satantango, but Gravity’s Rainbow is verging pretty strongly on sci-fi, and has a very apocalyptic vibe, and a general desire to overwhelm.
Not that I’m slagging it: I really liked Gravity’s Rainbow when I read it, though that’s 18 years ago now, and I’ve hated the other Pynchon I’ve tried since. And I’m not slagging being a massive nerd either. I will never be a huge prog/metal fan, but I have just leapt from replaying Dragon Age: Inquisition to starting Baldur’s Gate III.
*Technically speaking, I think To the Lighthouse is modernism, not realism, as lit profs would classify it. But in this context that really just means “about posh people, not about a war, dense prose, interior monologues”, which isn’t really incompatible with “realism” in any ordinary sense.
I totally agree, but like the Sequences, those books consume energy that is normally spent on work, or at least hobbies, whereas HPMOR is optimized to replace time that would otherwise have been spent on videos, social media, socializing, other novels, etc., and is therefore the best bet I know of to boost EA as a whole.
HPMOR technically isn’t built to be time-efficient; the highlights of the Sequences are better for that. HPMOR is meant to replace other things you do for fun, like reading fun novels or watching TV shows or scrolling social media, with material that offers passive upskilling. In that sense, it is profoundly time-efficient, because it replaces fun time spent not upskilling at all with fun time spent upskilling.
A very large proportion of EA-adjacent people in the Bay Area swear by it as a way to become more competent in a very broad and significant way, but I’m not sure how it compares with other books, like Discworld, which are also intended for slack/leisure time. AFAIK CEA has not even done a survey explicitly asking about the self-improvement caused by HPMOR, let alone a study measuring the benefits of having different kinds of people read it.
We already have plenty of people whose worldview is shaped by Yudkowsky and “the sequences”. We need more people from different backgrounds, who can take a skeptical eye to these texts and point out their flaws and synthesize the good parts into a greater whole.
I get what you’re saying (the Rationalist toolbox should equip you to be skeptical of the sequences themselves), and I think it’s partially true in that rationalists have been highly critical of Yudkowsky lately.
However, I don’t think it’s enough. I think someone who learns similar principles to the sequences from different sources (textbooks, pop-science books, domain-level knowledge) will do a better job at skeptically eyeing the sequences than someone who just read the sequences, for obvious reasons. I am one of those people, and I’ve spotted several issues with the sequences that the rationalists seemingly haven’t.
Yeah, I thought of it from the perspective of “not being told what to think but being told what to think about”. Like, you could say “the most profitable (in karma of a website) strategy is to disagree with a ‘founder’-like figure of that very website”, of course, but indeed if you’ve accepted his frame of the debate then didn’t he “win” in a sense? This seems technically true often (not always!) but I find it uncompelling.
“I’ve spotted several issues with the sequences that the rationalists seemingly haven’t.”
I did an in-depth write-up debunking one claim here (that of a superintelligence inventing general relativity from a blade of grass).
I haven’t gotten around to in-depth write-ups for other things, but here are some brief descriptions of other issues I’ve encountered:
The description of Aumann’s agreement theorem in “defy the data” is false, leaving out important caveats that render his use of it incorrect.
Yud implies that “Einstein’s Arrogance” describes some sort of mystery (and people have cited that article as a reason to be as arrogant as Einstein about speculative forecasts). In fact, Einstein’s arrogance was completely justified by the available evidence and is not surprising at all, in a manner in no way comparable to speculative forecasts.
The implications of the “AI box experiment” have been severely overstated. It does not at all prove that an AGI cannot be boxed. “Rationalists are gullible” fits the evidence provided just as well.
Yudkowsky treats his case for the “many-worlds hypothesis” as a slam dunk that proves the triumph of Bayes, but in fact it is only half done. He presents good arguments against “collapse is real”, but fails to argue that this means many-worlds is the truth, rather than one of the many other interpretations which do not involve a real collapse.
The use of Bayesianism in Rationalism is highly simplified, and often doesn’t actually involve using Bayes’ rule at all. It rarely resembles Bayes as actually applied in science, and is likely to lead to errors in certain situations, like forecasting low-probability events.
Yud’s track record of predictions is fairly bad, but he has a habit of pretending it isn’t by being vague and refusing to make predictions that can actually be checked. In general he displays an embarrassing lack of intellectual humility.
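The low-probability point can be made concrete with a quick Bayes’ rule calculation. This is a minimal sketch with hypothetical numbers (a made-up test with a 0.1% base rate), just to illustrate how intuition that skips the base rate goes wrong exactly where the prior is small:

```python
# Bayes' rule for a low-probability event: a hypothetical test that is
# 99% sensitive and 95% specific for a condition with a 0.1% base rate.
def posterior(prior, sensitivity, specificity):
    """P(condition | positive test) computed via Bayes' rule."""
    # Total probability of a positive result: true positives + false positives.
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

p = posterior(prior=0.001, sensitivity=0.99, specificity=0.95)
print(round(p, 3))  # -> 0.019: even a "99% accurate" positive leaves under 2%
```

Informal “Bayesian” reasoning that never writes out the false-positive term tends to land near 99% here instead of 2%, which is the kind of error the comment above is pointing at.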
I think you’re committing a typical mind fallacy if you think most people would benefit from reading HPMOR as much as you did.
It costs time to read it! Do you happen to know of a 10-minute summary of the key points?
“It likely offers broad and intense upskilling”: what’s the evidence for this?
I disagree that having a worldview shaped by Yudkowsky wouldn’t correlate with the skeptical eye you describe.
Where did you write these down?