Apply for Red Team Challenge [May 7 - June 4]
Summary
Red Team Challenge is a programme that brings small teams together to “red team” important ideas within effective altruism. The inaugural challenge will run from May 7 to June 4, 2022.
This programme will provide training in “red teaming” best practices and then form small teams of 2-4 people to critique a particular claim and publish the results. It was directly inspired by this post (so please shower its authors in counterfactual karma).
There are three distinct parts:
Workshop
A 2-3 hour workshop focused on learning red teaming best practices.
Red Team
A 4 week period where teams of 2-4 people work to independently “red team” a specific idea.
Publication
A small panel of experienced EA researchers will judge each team’s submission, provide feedback, and select a winning team.
Our goal is to help aspiring researchers test their fit and hone their reasoning ability while producing a concrete output which is useful to the EA community at large.
Express interest in joining the judging panel here.
Apply here by 15th April.
What is “red teaming”?
A red team is an independent group that challenges a particular organisation, idea or claim in order to improve it. Red teaming is the practice of using red teams.
Within effective altruism, “red teaming” refers to attempts to identify problems or errors in popular or prestigious views held by members of this community, such as views about the value of different causes or organisations. Related concepts include minimal-trust investigations, epistemic spot-checks, and hypothetical apostasy.
Red Team Challenge seeks to promote this practice within effective altruism. This will help break up groupthink, identify ways in which the EA community could be erring, and generally foster strong epistemic norms.
Goal
The goal of this programme is for participants to:
Test their fit for a career in EA-aligned research or an adjacent path
Build skills relevant to becoming a researcher
For example: working independently and autonomously, deconstructing ideas, identifying weaknesses in arguments, and reasoning transparently.
Publish a piece of low-stakes research to build career capital
An additional goal is to foster strong epistemic norms within the EA community by scrutinising new and existing ideas.
Structure of the programme
The programme has three distinct parts:
Workshop
A 2-3 hour workshop focused on learning & applying red teaming best practices.
Red Team
A 4 week period where teams of 2-4 people work to independently “red team” a specific idea. The total time spent per person is estimated to be ~16 hours.
TFG will provide a list of suggested ideas, but participants are free to red team any idea of their choosing.
Groups can choose to participate together as a team; however, we expect that the majority of participants will enter without an existing team.
Publication
We encourage all teams to publish their submissions publicly on the EA Forum.
A small panel of experienced EA researchers will then judge each submission and select a winning team.
Who should apply
Anyone who’s interested.
Whether you’re deeply familiar with effective altruism, have just finished reading Doing Good Better, currently work as a researcher, or have never used Google before, this programme is open to everyone.
However, we’ve specifically designed this programme with the following group in mind:
Engaged with EA for at least ~30 hours (e.g. have participated in an Introductory EA Program).
However, we’re also excited to see applications from people who have engaged with EA concepts for 300+ hours.
Interested in pursuing a career in EA-aligned research
It’s okay if you’re not yet committed to research—Red Team Challenge will help you test your fit for research while building relevant skills.
Apply now
To apply for Red Team Challenge, please fill out this short application form by 15th April.
If you’re interested in helping judge the red teams which are produced, please fill out this short expression of interest form.