A red team is an independent group that challenges an organization or movement in order to improve it. Red teaming is the practice of using red teams.
History of the term
The term “red teaming” appears to have originated in the United States military. A common exercise was to pit an offensive “red team”, representing the enemy, against a defensive “blue team”, representing the U.S. The purpose of the exercise was to identify vulnerabilities and develop effective countermeasures.[1] The term was later extended to cover related practices in other fields, including information security and intelligence analysis.
Red teaming in effective altruism
Within effective altruism, “red teaming” refers to attempts to identify problems or errors in popular or prestigious views held by members of the community, such as views about the value of different causes or organizations.[2]
Related concepts include minimal-trust investigations,[3] epistemic spot-checks,[4] and hypothetical apostasy.[5]
Further reading
Räuker, Max et al. (2022) Idea: Red-teaming fellowships, Effective Altruism Forum, February 2.
Vaintrob, Lizka & Fin Moorhouse (2022) Resource for criticisms and red teaming, Effective Altruism Forum, June 1.
Zhang, Linchuan (2021) Red teaming papers as an EA training exercise?, Effective Altruism Forum, June 22.
Related entries
criticism of effective altruism | epistemology | epistemic deference | tabletop exercises
1. Johnson, Rowland (2015) How your red team penetration testers can help improve your blue team, SC Magazine, August 18.
2. Räuker, Max et al. (2022) Idea: Red-teaming fellowships, Effective Altruism Forum, February 2.
3. Karnofsky, Holden (2021) Minimal-trust investigations, Effective Altruism Forum, November 23.
4. Ravid, Yoav (2020) Epistemic spot check, LessWrong Wiki, August 7.
5. Bostrom, Nick (2009) Write your hypothetical apostasy, Overcoming Bias, February 21.