Looks like this was either fixed, or I was on mobile before and am now on desktop
David M
I copied this related Facebook comment by Kerry Vaughan from 6th September 2018 (from this public thread):
> (This post represents my views and not necessarily the views of everyone at CEA) [for whom Kerry worked at the time]
> [...] I think there are some biases in how the community allocates social status which incentivize people to do things that aren’t their comparative advantage.
> If you want to be cool in EA there are a few things you can do: (1) make sure you’re up to date on whatever the current EA consensus is on relevant topics; (2) work on whatever is the Hot New Thing in EA; and (3) have skills in some philosophical or technical area. Because most people care a lot about social acceptance, people will tend to do the things that are socially incentivized.
> This can cause too many EAs to try to become the shape necessary to work on AI-Safety or clean meat or biosecurity even if that’s not their comparative advantage. In the past these dynamics caused people to make themselves fit the shape of earning to give, research, and movement building (or feeling useless because they couldn’t). In the future, it will probably be something else entirely. And this isn’t just something people are doing on their own—at times it’s been actively encouraged by official EA advice.
> The problem is that following the social incentives in EA sometimes encourages people to have less impact instead of more. Following social incentives (1) disincentivizes people from actually evaluating the ideas for themselves and discourages healthy skepticism about whatever the intellectual consensus happens to be. (2) means that EAs are consistently trying to go into poorly-understood, ill-defined areas with poor feedback loops instead of working in established areas where we know how to generate impact or where they have a comparative advantage. (3) means that we tend to value people who do research more than people who do other types of work (e.g. operations, ETG).
> My view is that we should be praising people who’ve thought hard about the relevant issues and happen to have come to different conclusions than other people in EA. We should be praising people who know themselves, know what their skills are, know what they’re motivated to do, and are working on projects that they’re well-suited for. We should be praising people who run events, work a job and donate, or do accounting for an EA org, as well as people who think about abstract philosophy or computer science.
> CEA and others have taken some steps to help address this problem. Last year’s EA Global theme—Doing Good Together—was designed to highlight the ideas of comparative advantage, of seeing our individual work in the context of the larger movement and of not becoming a community of 1,000 shitty AI Safety researchers. We worked with 80K to communicate the importance of operations management (https://80000hours.org/articles/operations-management/) and CEA ran a retreat specifically for people interested in ops. We also supported the EA Summit because we felt that it was aiming to address some of these issues.
> Yet, there’s more work to be done. If we want to have a major impact on any cause we need to deploy the resources we have as effectively as possible. That means helping people in the community actually figure out their comparative advantage instead of distorting themselves to fit the Hot New Thing. It also means praising people who have found their comparative advantage whatever that happens to be.
So excited for you!
The GWWC podcast includes several ‘member story’ episodes. https://www.givingwhatwecan.org/podcast
The EA UK newsletter often includes stories.
From the May newsletter:
- Zeke—Story of a career/mental health failure
- Aaron Gertler—Life in a Day: The film that opened my heart to effective altruism
- Amber Dawn talking to Daniel Wu on being on the EA fringes, healthcare entrepreneurship and trying out different career paths
@DavidNash can probably send you the back catalogue.
For tagging, how do I tag someone whose username contains a space? I want to be able to tag ‘Amber Dawn’ without tagging ‘Amber’.
Several serious posts are drowned out on April 1st each year. I had half intended to write a round-up of these to help them avoid being drowned out, but didn’t get around to it before the work week; now I’m requesting that the EA Forum team consider doing this. In future years (assuming your timelines are that long), I would also be in favour of having a separate section for April Fools’ posts (like the community section), even though this dampens the humour.
Isn’t that a bit self-aggrandising? I prefer “aspiring EA-adjacent”
This is a great suggestion for how to make billionaires’ money go further.
We love to see / hear it!
Can you confirm that you will be using text-to-speech AI voices that have been trained on Rob’s actual voice and the other guests’ and hosts’ voices? Kelsey Piper’s new blog, Planned Obsolescence, does this, and it works very well (with a delicious undertone of uncanny valley). I think it would be great for authenticity, and for strengthening the parasocial bond between audience and creator.
If this proves too costly, I would suggest it’s still worth just getting the hosts and guests to read out the transcripts themselves, because the benefits of voice authenticity are hard to overstate.
A six-month pause? No, longer is needed.
In the post Will said:
> Unfortunately, I think that’s a minimum of 2 months, and I’m still sufficiently unsure on timing that I don’t want to make any promises on that front. I’m sorry about that: I’m aware that this will be very frustrating for you; it’s frustrating for me, too.
Up/downvoting a post shouldn’t be possible within 30 seconds of opening it (for posts that aren’t very short), to prevent voting based on the title alone; or such early votes should be weighted less.
Pleased to hear this:
> This year I plan to investigate ways to increase demographic diversity among the people we reach, and indeed it’s possible that our work here could really do quite a lot to move the needle in a positive direction for the community as a whole.
This is the first I’ve heard of Swapcard’s spreadsheet mode; that sounds potentially very useful!
I couldn’t easily find info about it; do you have a link?
19 hours later, the posts have dropped off the front page.
On mobile so can’t upload a screenshot, but I have one
(Making this the official errata thread) The deadline on the form and the deadline in the forum post are different from one another.
Yeah, I’ll note (because the memory might slip away) that my initial reaction to the TIME article paragraph about Owen was:
- horror/disgust
- hope that the person was not as central as implied in the text
- (get distracted by my own work/life and allow the news to slip into the background of my mind, and allow the hope to transform into an implicit feeling that the person was, hopefully, not as central as Owen was)
- have an unjustified implicit belief that the person is not core to EA
- find out that that was wrong <-- I am here, and the only reason I can detect my previous implicit beliefs is from the current feeling of surprisal
As a snapshot of the landscape 1 year on, post-FTX:
80,000 Hours lists 62 roles under the skill sets ‘software engineering’ (50) and ‘information security’ (18) when I use the filter to exclude ‘career development’ roles.
This sounds like a wealth of roles, but note that the great majority (45) are in AI (global health and development is the distant second-place runner-up, at 6); and the great majority are in the Bay Area (35; London is second with 5).
Of course, this isn’t a perfectly fair test, as I just did the quick thing of using filters on the 80K job board rather than checking all the organisations as Sebastian did last year.