Apply for Red Team Challenge [May 7 - June 4]
Summary
Red Team Challenge is a programme that brings small teams together to “red team” important ideas within effective altruism. The inaugural challenge will run from May 7 to June 4, 2022.
This programme will provide training in “red teaming” best practices and then form small teams of 2-4 people to critique a particular claim and publish the results. It was directly inspired by this post (so please shower them in counterfactual karma).
There are three distinct parts:
Workshop
A 2-3 hour workshop focused on learning red teaming best practices.
Red Team
A 4 week period where teams of 2-4 people work to independently “red team” a specific idea.
Publication
A small panel of experienced EA researchers will judge each team’s submission, provide feedback, and select a winning team.
Our goal is to help aspiring researchers test their fit and hone their reasoning ability while producing a concrete output which is useful to the EA community at large.
Express interest in joining the judging panel here.
Apply here by 15th April.
What is “red teaming”?
A red team is an independent group that challenges a particular organisation, idea or claim in order to improve it. Red teaming is the practice of using red teams.
Within effective altruism, “red teaming” refers to attempts to identify problems or errors in popular or prestigious views held by members of this community, such as views about the value of different causes or organizations. Related concepts include minimal-trust investigations, epistemic spot-checks, and hypothetical apostasy.
Red Team Challenge seeks to promote this practice within effective altruism. This will help break up groupthink, identify ways in which the EA community could be erring, and generally foster strong epistemic norms.
Goal
The goal of this programme is for participants to:
Test their fit for a career in EA-aligned research or an adjacent path
Build skills relevant to becoming a researcher
Working independently & autonomously, deconstructing ideas, identifying weaknesses in arguments, reasoning transparently, etc.
Publish a piece of low-stakes research to build career capital
An additional goal is to foster strong epistemic norms within the EA community by scrutinising new and existing ideas.
Structure of the programme
The programme has three distinct parts:
Workshop
A 2-3 hour workshop focused on learning & applying red teaming best practices.
Red Team
A 4 week period where teams of 2-4 people work to independently “red team” a specific idea. The total time spent per person is estimated to be ~16 hours.
TFG will provide a list of suggested ideas, but participants can red team any idea of their choosing.
Groups can choose to participate together as a team; however, we expect that the majority of participants will enter without an existing team.
Publication
We encourage all teams to publish their submissions publicly on the EA Forum.
A small panel of experienced EA researchers will then judge each submission and select a winning team.
Who should apply
Anyone who’s interested.
Whether you’re deeply familiar with effective altruism, have just finished reading Doing Good Better, currently work as a researcher, or have never used Google before, this programme is open to everyone.
However, we’ve specifically designed this programme with the following group in mind:
Engaged with EA for at least ~30 hours (e.g. have participated in an Introductory EA Program).
However, we’re also excited to see applications from people who have engaged with EA concepts for 300+ hours.
Interested in pursuing a career in EA-aligned research
It’s okay if you’re not yet committed to research—Red Team Challenge will help you test your fit for research while building relevant skills.
Apply now
To apply for Red Team Challenge, please fill out this short application form by 15th April.
If you’re interested in helping judge the red teams which are produced, please fill out this short expression of interest form.
This is brilliant! Sounds like a lot of fun and a great learning opportunity.
Before I pass this on to my local group, I want to ask—is it possible to apply as a team? I imagine that this might make it more appealing to some people to register with teammates they already know and share the same language.
Yep, it’s possible to apply as a team (of up to 4 people). Though we expect that most people will apply as individuals and that we’ll assign them to a team.
It’s also possible to apply with an idea that you’d like to red team. Though, again, we expect most people to apply without an idea and will provide a suggested list for participants.
Noting my excitement that you picked up on the idea and will actually make this happen!
The structure you lay out sounds good.
Regarding the winning team, will there be financial rewards? I’d give it >70% that someone would fund at least a ~$1000 award for the best team.
Thanks Simon! Currently, we don’t plan to provide a financial reward to the winning team (though I must admit, we haven’t given this much thought). It’s a good point though & we’ll consider it further in the coming weeks.
If anyone reading this is interested in funding an award for the winning team, please do get in touch.
Potentially of use in running a short workshop is the literature on the effectiveness of pedagogical techniques. The highest-quality systematic review I could find pointed to four techniques showing robust effect sizes across many contexts and instantiations. They are:
Deliberate practice
Cuing elaboration of context
Regular low stakes quizzing
Teaching the material to others
This sounds awesome! Thank you for running it! Do you expect to have additional runs of this in the future?
I imagine we’ll continue to run Red Team Challenge somewhere between 1 and 4 times per year moving forward (though this largely depends on how well the first iteration goes).
Do you have any examples of suggested ideas to red team? No worries if not, just wanted to get a sense of what the suggested list will be like.
I’m not sure if this is just happening for me, but the form seems to have some glitchy parameters that make it impossible to submit, such as a ‘character count which needs to be below 0’ for question 12, and question 7 doesn’t tell you why you can’t submit the answer even when it is below 300 words.
Thanks for flagging this Kaleem, it should now be fixed. Let me know if you’re still having issues :)
It’s fixed, thanks!