The website design also seems a bit off to me
Intro to ML Safety virtual program: 12 June – 14 August
Skill up in ML for AI safety with the Intro to ML Safety course (Spring 2023)
Yeah, I think Wise could actually just work on its own
It also appears that the link to ELK in this section is incorrect
Making use of an AI’s internal state, not just its outputs. For example, giving positive reinforcement to an AI when it seems likely to be “honest” based on an examination of its internal state (and negative reinforcement when it seems likely not to be). Eliciting Latent Knowledge provides some sketches of how this might look.
The link to ELK in this bullet point is broken.
It’s not currently clear how to find training procedures that train “giving non-deceptive answers to questions” as opposed to “giving answers to questions that appear non-deceptive to the most sophisticated human arbiters” (more at Eliciting Latent Knowledge).
It may be intended to point here: https://www.alignmentforum.org/posts/qHCDysDnvhteW7kRd/arc-s-first-technical-report-eliciting-latent-knowledge
This is cool, thanks for writing it!
Transcript of a talk on The non-identity problem by Derek Parfit at EAGxOxford 2016
I also recommend https://www.athenago.com for full-time remote executive assistants
For more on criterion of rightness vs decision procedure
I don’t think my particular VAs have more capacity, but I believe Virtalent has other VAs ready to match with clients.
It is unclear to me whether I’ve just gotten lucky. But with Virtalent you can switch VAs, and the minimum commitment is very low, which is why I think the best strategy is just to try.
I like the term “Summit”
Advice on how to get a remote personal/executive assistant
Hey Theo—I’m James from the Global Challenges Project :)
Thanks so much for taking the time to write this—we need to think hard about how to do movement building right, and it’s great for people like you to flag what you think is going wrong and what you see as pushing people away.
Here’s my attempt to respond to your worries with my thoughts on what’s happening!
First of all, just to check my understanding, this is my attempt to summarise the main points in your post:
My summary of your main points
We’re missing out on great people as a result of how community building is going at student groups. A stronger version of this claim would be that current CB may be selecting against people who could most contribute to current talent bottlenecks. You mention 4 patterns that are pushing people away:
EA comes across as totalising and too demanding, which pushes away people who could nevertheless contribute to pressing cause areas. (Part 1.1)
Organisers come across as trying to push particular conclusions to complex questions in a way that is disingenuous and also epistemically unjustified. (Part 1.2)
EA comes across as cult-like; primarily through appearing to try too hard to be persuasive, pattern-matching to religious groups, and coming across as disingenuously friendly (Part 1.3, your experience)
There aren’t as many ways for neartermist-interested EAs to get involved in the community, despite them being able to contribute to EA cause areas (Part 1.4)
My understanding is that you find patterns (2) and (3) especially concerning. So to elaborate on them, you’re worried about:
EA outreach is over-optimising on persuasion/conversion in a way that makes epistemically rigorous and skeptical people extremely averse to EA outreach. You feel like student group leaders are trying to persuade people into certain conclusions rather than letting people decide for themselves.
EA student group leaders are generally unaware and out of the loop about how poorly they are coming across to other people.
EA student group leaders are often themselves pretty new to EA, yet are getting funded to do EA outreach. This is bad because they won’t really know how best to do outreach due to being so new.
You think these worrying patterns are being driven by an upstream strategic mistake: over-optimising for a metric of “highly engaged EAs”. This is a poor choice of metric because:
A large fraction of people who could excel in an EA career won’t get engaged in EA quickly, but will be slow to arrive at EA conclusions due to their desire to reason carefully and skeptically. Thus you worry that these people will be ignored by EA outreach because they don’t come across as a “highly engaged EA”.
You then suggest some possible changes that student group leaders could make (here I’m just focusing on changes that SG leaders could do):
Don’t think in terms of producing “highly engaged EAs”; in general beware of over-optimising on getting people who quickly agree with EA ideas.
Try and get outside perspectives on whether what you’re doing might be off-putting to others.
Actively seek out criticisms and opinions of people who might have been put off by EA.
Seek to improve your epistemics; do the hard and virtuous thing of being open to criticism even though it’s naturally aversive.
Beware social dynamics that incentivise people to agree with conclusions in return for social approval.
Sorry that was such a long summary (and if I missed out key parts, please do let me know)! I think you’re making many great points.
Here are some of my thoughts in reply:
My thoughts in reply
Over-optimising on HEAs
I agree with all of your specific pieces of advice in your final section. I think they’re great heuristics that every person doing EA outreach should try and adopt.
My overall impression is that many student group leaders also agree with the direction of your advice, but find it hard to implement in real life because in general it’s hard to do stuff right. My impression is that most student group leaders are super overstretched, have lots of university work going on, and are only able to spend several hours per week doing EA outreach work (and generally find it stressful and difficult to stay on top of things).
I think the core failure mode of “getting the people who initially express the most interest/agreement in EA” does go on, but I think that what drives it is a general tendency to do what’s easier (which is true of any activity) rather than over-optimising on an explicit metric. Since group leaders are so time-constrained, it’s easier for them to talk to and engage with people who already agree, because they don’t have the time or patience to grapple with people who initially disagree.
If group leaders were feeling a lot of pressure to get HEAs from funding bodies, this would be super bad. I’m not sure to what extent this is really going on: CEA’s HEA metric is kinda vague, and I haven’t got the impression from group leaders I’ve talked to that people are trying to optimise super hard on it (would love to hear contrary anecdotes). In general I find most student groups to be small, somewhat chaotically run, and so not very good at optimising for anything in particular.
If this claim is true, then I think that would be an argument for investing more resources into student groups to get them to a state where they have more capacity to make better decisions and spend time engaging with Alice-types.
Here are some of my thoughts on EA coming across as cult-like:
I agree that EA can come off as weird and cult-like at times. I think this is because: (i) there’s a lot of focus on outreach, (ii) EA is an intense idea that people take very seriously in their lives.
I think it’s such a shame that EA comes across this way. At its core I think it’s because it’s so unusual to have communities that are this serious about things. To give a personal anecdote, when I was at university I felt pretty disillusioned and distant from my peers. I felt that things were so messed up in the world, and it made me sad that many of my friends didn’t seem to notice or care. When I first met EAs I found it so inspiring how serious they were about taking personal responsibility for making the world better, no matter what society’s default expectations were.
When I was first getting into EA I was really fervent about doing outreach, and I think I did a pretty bad job. It seemed so important to me that everyone should agree with EA ideas because of the huge amount of suffering going on in the world. I found it confusing and disheartening when many of those I talked to simply didn’t agree with EA, or seemed to agree but then didn’t do anything about it. I would argue back in an unconvincing way, which made little progress. Because EA conclusions seemed obvious to me, I didn’t get how people didn’t immediately agree too.
With all that in mind, here is a quick guess of additional heuristics (beyond your suggestions) that student leaders could bear in mind:
It’s not your job to make someone an EA: I think a better framing is to view your responsibility as making sure that people have the opportunity to hear about and engage with EA ideas. But at the end of the day, if they don’t agree it’s not your job to make them agree. There’s a somewhat paradoxical subtlety to it—through coming to peace with the fact that some people won’t agree, you can better approach conversations with a genuine desire to help people make up their own minds.
Look at things from an outsider’s perspective: I don’t have immediate thoughts on tactical decisions like how to use CRMs (although I do find the business jargon quite ugh) or book giveaways. It seems to me that there are good ways and bad ways to do these sorts of things. But your suggestion of checking in with non-EAs about whether they’d find it weird seems great, so I just wanted to reiterate it here!
Embrace the virtue of patience: I think it’s important to approach EA outreach conversations with a virtue of patience. It can be difficult to embrace, because for EAs outreach feels so high-stakes and valuable. But if you don’t have patience, you’ll be tempted to do outreach in a hurry, which leads to sloppy epistemics or, at worst, deceitfulness. A patient EA carefully explains ideas; a hurried EA aims to persuade.
I think it would be a shame if we lost the good qualities of EA that make it so unique in the world—that it’s a community of people who are unusually serious about doing the most good they can in their lives. But I think we can do better as a community at not coming across as cult-like, by being more balanced in our outreach efforts, being mindful of the bad effects of aiming for persuasion, and coming to peace with the idea that some people just won’t be that into EA and that’s okay (and that doesn’t make them a bad person).
Other strategy suggestions which I think could improve the status quo:
I’d be excited to see more EA adjacent and cause specific outreach. I think having lots of different brands and sub-communities broadens the appeal of EA ideas to different sorts of audiences, and lets people get involved in EA stuff to different extents (so that EA isn’t as all-or-nothing). I’d be keen to see people restart effective giving groups, rationality groups, EA outreach focused on entrepreneurs, and cause specific groups like animal welfare and AI alignment.
Thanks again for taking the time to write the post—it seems like it’s generated great discussion and that it’s something a lot of people agree with :)
UK lawyers https://ignition.law/
Cool! Have you considered turning those notes into a post? Could be a great way for more people to see them
I’m not familiar with CIOs unfortunately, so I don’t know :(
Great suggestions. I didn’t know that Osome would also do company formations, thanks for the tip. I just listed the process I used.
Also excited to check out Starling and Free Agent. Thanks for the recommendations.
I agree that this post would be substantially enhanced with summaries of key responsibilities. I’d love for you to contribute to that and perhaps we can update this post and add you as a coauthor?
I’m probably not going to work on this post myself more, as I just wanted to spend a few minutes writing a quick post. But if you feel excited about it I think it could be valuable for you to draft extensions to this post :)
USA recommendations
I recommend Stripe Atlas for setting up a US for-profit entity: https://stripe.com/atlas. It costs $500 for them to set up your entity.
I also recommend https://mercury.com/ for US banking
I thought the team behind the EAGx designs were really great and I loved them. Have you considered reaching out to them to make designs for your store?