I don’t understand the formula that appears after “For each of our variables, we define a relative version:”, could you clarify? Then it says “Remember that rVresident...” but I can’t “remember”, since there’s no definition of it earlier (only of rV). A definition of rVresident appears later [but it incorporates new concepts R and W that aren’t defined very clearly (what’s a “resource”, and what exactly does “value after controlling for resources” mean, for those of us who are not statisticians? Well, you start to elaborate on W, but… by this point I’m confused enough to find the discussion harder to follow.)]
I have a thought about EA hotel which this analysis likely doesn’t capture: the general intuition that EAs should be taken care of—that we should “take care of our own”.
Today I read that research by Holt-Lunstad “shows that being disconnected [lonely] poses comparable danger to smoking 15 cigarettes a day, and is more predictive of early death than the effects of air pollution or physical inactivity.” While I’m not exactly lonely*, I have no EA friends (my city is not an EA hub), and my productivity is extremely low as I’m (1) currently unemployed and (2) have virtually stopped working on altruistic projects due to a lack of emotional support and a loss of faith that I can succeed**. I may soon get a job and will then earn-at-least-partly-to-give (probable donations: $15,000/yr, perhaps more later), but this is not as fulfilling as a project would be, or as fulfilling as EA friendships would be. I’ve tried job hunting in the Bay Area where I might have been able to be near EAs, but was turned down by a few companies and gave up; besides, the idea of spending roughly 100% of the additional income I would earn in the Bay Area on rent… it’s repugnant.
By extension, I believe that for some EAs, EA hotels could offer improvements to mental health and future good-doing potential that aren’t otherwise available. Intuitively, it seems like the EA community ought to be able to take care of its own adherents. One simple justification is that poorer mental health limits the amount of good done by each EA; another is that if the EA movement can’t take care of its non-central members, it will be more difficult to grow and spread the movement; e.g. a reputation for loneliness among EAs would suggest to others that they shouldn’t become an EA, and EAs who are lonely are less likely to encourage others to become EAs.
Since creating more EAs presumably creates more good in the world—especially as we can anticipate exponential growth—the question of how to create more EAs is valuable to ponder. While EA hotels (and similar projects) are not a solution by themselves, they may be an important component of such growth. So the EA hotel is one of my favorite ideas and if I were in the UK I might be living there now.
* I live with my best friend, who doesn’t think at all like an EA/rationalist. I’m also married to a non-EA, but the Canadian government keeps us separated (thanks IRCC). But this touches on a related issue—I plan to have a child, and as long as we don’t have a rationalist/EA Sunday School system to teach our values, I’m curious whether growing up inside or close to an EA hotel would work as a substitute. Seems worth a try!
** As I’m writing software, the value of the project is a highly nonlinear function of the input effort: it requires a lot of manpower before it becomes valuable, i.e. reaches a minimum viable product. Working on it has become harder in terms of willpower requirement over the years.
E stands for expected value. rVresident = E[Vresident] / E[Vhire], i.e. the expected value of the resident relative to a hire. In the first formula you refer to, there is also the equivalent for the counterfactual resident (rVcresident) and the counterfactual hire (rVchire).
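To make the ratio concrete, here is a minimal sketch of the relative-value formula rVresident = E[Vresident] / E[Vhire]. The function name and the numeric "units of value" below are hypothetical, invented purely for illustration; the original post defines the variables abstractly.

```python
def relative_value(expected_v_resident: float, expected_v_hire: float) -> float:
    """Expected value of a hotel resident relative to an equivalent hire.

    rVresident = E[V_resident] / E[V_hire]
    A result of 1.0 means the resident and the hire are expected to
    produce equal value; below 1.0, the resident produces less.
    """
    return expected_v_resident / expected_v_hire


# Made-up numbers: suppose a resident's work is expected to produce
# 40 "units" of value and an equivalent hire's work 50 units.
rv = relative_value(40.0, 50.0)
print(rv)  # 0.8 — the resident is expected to produce 80% of a hire's value
```

The same division applies unchanged to the counterfactual variants (rVcresident, rVchire); only the expectations plugged in differ.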
Of note: this newer post argues persuasively for the hotel in a different way than OP or me.