Hi Darius!
I appreciate that you’ve raised this issue and provided a reasonably
thorough discussion of it. I would like to highlight a bunch of
aspects based on my experience editing Wikipedia as well as studying
its culture in some depth. While the paid editing phase and the
subsequent fallout inform my views partly, these are actually based
on several years of experience before (and some after) that incident.
While none of what I say falsifies what you wrote, it is in tension
with some of your tone and emphasis. So in some ways these
observations are critical of your post.
How much reverence does Wikipedia’s process deserve?
I think that, if your goal is to spend a lot of time editing
Wikipedia, it’s really important to study Wikipedia’s policies—both
the de jure ones and the de facto ones. That’s because the
policies are not completely intuitive, and the enforcement can often
be swift and unfriendly—giving you little chance to recover once
you get on the bad side of it.
So in that sense, it is important to respect and understand
Wikipedia’s policies.
But that is not the same as being reverent toward the policies and enforcement mechanisms. I think your post has some of that reverence, as well as a “just world” belief that the policies and their enforcement are sensible and just, and align with effective altruist ideals. For instance, you write:
Therefore, anyone considering making contributions to Wikipedia
should become familiar with its rules, and in particular adhere to
the requirement not to approach editing as an advocacy tool. This is
important both because trying to paint an overly favourable picture
of EA-related topics will, as Brian notes above, likely backfire,
and because observing such a requirement is in line with EA’s
commitment to intellectual honesty and moral cooperation. Wikipedia
is one of the world’s greatest altruistic projects—their
contributors share many of our core values, and we should respect
their norms and efforts to maintain Wikipedia’s high quality.
and:
Don’t feel like you need to have read all articles about Wikipedia
rules and norms before you can start to edit. While reading them
upfront may help you avoid some frustrating experiences later, the
biggest failure mode is getting overwhelmed and being discouraged
from ever taking the first step on your editing journey. Most of
Wikipedia’s rules and norms are commonsensical, and you are bound to
become familiar with them as you gather editing experience.
In contrast, my take on understanding the Wikipedia system is that it bears many resemblances to other legal and bureaucratic systems: many of the rules make sense in theory, and have good rationales, but their application is often really bad. Going in with a positive “just world” belief in Wikipedia seems like a recipe for falling down rather hard at the first incident. I think the best approach is to be well-prepared in terms of understanding the dynamics and the kinds of attacks you may endure, so that once you do get in there you have no false expectations, and if you do get into a fight you can bow out and stay cool without feeling rattled.
You’ve linked to Gwern’s inclusionism article already; a few other links I recommend: Wikipedia Bureaucracy (continued), Robert Walker’s answer on frustrating aspects of being a Wikipedia editor, and Gwern’s piece on dark side editing.
On that note, what kind of preparation is necessary?
Based on my experience editing Wikipedia, and seeing my edited
articles spend several years surviving, growing, getting deleted, or
shrinking—all of which have happened to me—I can say it’s
important to be prepared when editing Wikipedia in a few ways:
Prepare for your work getting deleted or maimed: On a process
level, this means keeping off-Wikipedia backups (Issa and I
implemented technical solutions to back up the content of articles
we were editing automatically, in addition to manual syncing we did
at every edit; a minimal sketch of that kind of backup script follows
this list). During a mass deletion episode following the paid
editing, we almost lost the content of several articles, but were
fortunately able to retrieve it. At an emotional level, it means
accepting the possibility that stuff you spent a lot of time
writing can, sometimes immediately and sometimes after years,
randomly get deleted or maimed beyond recognition. And even if
reasons are proffered for the maiming or deletion, you are unlikely
to consider them good reasons.
Prepare to be attacked or questioned in ways you might find
ridiculous: This may not happen to you for years, and then may
suddenly happen even if you are on your best behavior—because
somebody somewhere notices something. While there are a number of
strategies to reduce the probability of this happening (don’t get
into fights, avoid editing controversial stuff, avoid overtly
promotional or low-quality edits) they are no guarantee. And if you
have a large corpus of edits, once somebody is suspicious of you,
they can go after your whole body of work. The emotional and
psychological preparation for that—and the background knowledge
of it so that you can make an informed decision to edit
Wikipedia—is important.
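To make the backup point concrete, here is a minimal sketch of the general approach: it is an illustration, not our exact setup, and the article list and file layout are placeholders. It pulls the current wikitext of each listed page through the MediaWiki API and writes a timestamped copy to disk.

```python
# Minimal sketch: back up the current wikitext of a few Wikipedia articles.
# ARTICLES and BACKUP_DIR are illustrative placeholders, not a recommendation.
import datetime
import pathlib

import requests

API_URL = "https://en.wikipedia.org/w/api.php"
ARTICLES = ["Effective altruism", "GiveWell"]  # pages you want to keep copies of
BACKUP_DIR = pathlib.Path("wikipedia_backups")


def fetch_wikitext(title: str) -> str:
    """Return the current wikitext of a page via the MediaWiki API."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",
        "titles": title,
        "format": "json",
        "formatversion": "2",
    }
    resp = requests.get(API_URL, params=params, timeout=30)
    resp.raise_for_status()
    page = resp.json()["query"]["pages"][0]
    return page["revisions"][0]["slots"]["main"]["content"]


def main() -> None:
    BACKUP_DIR.mkdir(exist_ok=True)
    stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%SZ")
    for title in ARTICLES:
        text = fetch_wikitext(title)
        safe_name = title.replace(" ", "_").replace("/", "_")
        out_path = BACKUP_DIR / f"{safe_name}.{stamp}.wiki"
        out_path.write_text(text, encoding="utf-8")
        print(f"Saved {title} ({len(text)} characters) to {out_path}")


if __name__ == "__main__":
    main()
```

Run from a cron job, or manually after each substantive edit, something like this gives you dated offline copies you can restore from if a page is later deleted or gutted.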
A few specific tripping points for effective altruist projects to edit Wikipedia
When you do get into trouble on Wikipedia, keep in mind these likely truths about the other side (though this could vary a lot from situation to situation, and you could well get lucky enough for these not to apply to you):
The bulk of the people will be highly suspicious of you.
Those opposing you probably have a lot more time than you do and a
better ability to navigate Wikipedia’s channels.
They will not be impressed by your efforts to defend yourself, even
against points you consider clearly illogical.
Efforts to point to noble goals (e.g. effective altruism) or measurement tools (e.g. pageviews) will make them more suspicious of you, as these will be taken as evidence of a conflict of interest.
Your efforts to recruit people through off-Wikipedia channels (e.g.,
this EA Forum post) may make matters worse, as it might lead to
accusations of canvassing.
Being mindful of your feelings will not be a priority for them.
What kind of Wikipedia editing might still be safe and okay to do?
This will vary from person to person. I think the following are likely
to be okay for anybody altruistically inclined but moderately
risk-averse:
Drive-by fixes to spelling, grammar, punctuation, formatting, broken
links, etc.: Once you have acquired basic familiarity with Wikipedia
editing, making these fixes when you notice issues is quick and easy.
Substantive edits or even new page creations where you have fairly
high confidence that your edits will pass under the radar of zealous
attackers (this tends to work well for obscure but protected topics;
some academic areas such as in higher mathematics could be like
this).
Substantive edits or even new page creations where, even if the edit
gets reverted or the page deleted, the output you create (in terms
of update to your state of mind, or the off-Wikipedia copy of the
updated content) makes it worthwhile.
A positive note to end on
I will end with a wholehearted commendation of the spirit of your
post; as I see it, this is about being prosocial in a broad sense,
“giving back” to a great resource, and finding opportunities to
benefit multiple communities and work in a collaborative fashion with
different groups to create more for the world. I generally favor
producing public output while learning new topics; where the format
and goals allow it, this could be Wikipedia pages! Issa Rice has even
documented this “paper trail” approach I
follow.
PS: I thank Issa Rice for some of the links
and thoughts that I’ve included in this comment as well as for
reviewing my draft of the comment. Responsibility for errors and
omissions is fully mine; I did not incorporate all of Issa’s feedback.
One downside you don’t mention: having a Wikipedia article can be a liability when editors are malicious, for all the same reasons it is a benefit when it is high-quality, namely its popularity and mutability. A zealous attacker or deletionist destroying your article for jollies is bad, but at least it merely undoes your contribution and you can mirror it; an article being hijacked (which is what a real attacker will do) can cause you much more damage than you would ever have gained, as it creates a new reality which will echo everywhere.
My (unfortunately very longstanding) example of this is the WP article on cryonics: you will note that the article is surprisingly short for a topic on which so much could be said, and reads like it’s been barely touched in half a decade. Strikingly, while it has almost no room for any information on minor topics like how cryonics works or how current cryonics orgs operate or the background on why it should be possible in principle or remarkable research findings like the progress on bringing pigs back from the dead, the introduction, and an entire section, instead harp on how corporations go bankrupt, how it is unlikely that a corporation today will be around in a century, how ancient pre-1973 cryonics companies have all gone bankrupt, and so on. These claims are mostly true, but you will then search the article in vain for any mention that the myriad of cryonics bankruptcies alluded to amounts to something like 2 or 3 companies; that cryonics for the past 50 years isn’t done solely by corporations precisely because of that: once it became apparent that cryonics was going to need to be a long-term thing and families couldn’t be trusted to pay, they were structured as trusts (the one throwaway comma mentioning trusts is actively misleading by implying that they are optional and unusual, rather than the status quo); and that there have been few or no bankruptcies or known defrostings since. All attempts to get any of this basic information into the article are blocked by editors. Anyone who comes away with an extremely negative opinion of cryonics can’t be blamed when so much is omitted to put it in the worst possible light. You would have to be favorably disposed to cryonics already to be reading this article and critically thinking to yourself, “did cryonicists really learn nothing from the failures? how do cryonicists deal with these criticisms when they are so obvious, it doesn’t seem to say? if the cryonics orgs go bankrupt so often, why doesn’t it name any of the many bankruptcies in the 49 years between 1973 and 2022, and how are any of these orgs still around?” etc.
More recently, the Scott Alexander/NYT fuss: long-time WP editor & ex-LWer David Gerard finally got himself outright topic-banned from the SA WP article when he overreached by boasting on Twitter about how he was feeding claims to the NYT journalist so the journalist could print them in his article in some form and Gerard could then cite them in the WP article (and safe to say, any of the context or butt-covering caveats in the journalist’s version would be sanded away and simplified in the WP version to the most damaging possible reading, which would then be defended as obviously relevant and clearly WP:V to an unimpeachable WP:RS). Gerard and activists also have a similar ‘citogenesis’ game going with Rational Wiki and friendly academics laundering into WP proper: make allegations there, watch them eventually show up in a publication of some sort, however tangential, and now you can add to the target article “X has been described as a [extremist / white supremacist / racist / fringe figure / crackpot] by [the SPLC / extremism researchers / the NYT / experts / the WHO]<ref></ref>”. Which will be true—there will in fact be a sentence, maybe even two or three about it, in the ref. And there the negative statements will stay forever if they have anything to say about it (which they do), while everything else positive in the article dies the death of a thousand cuts. This can then be extended: do they have publications in some periodicals? Well, extremist periodicals are hardly WP:RSes now, are they, and shouldn’t be cited (WP:NAZI)… Scott’s WP article may not be too bad right now, but one is unlikely to be so lucky as to get such crystal-clear admissions of bad-faith editing, and a large audience of interested editors going beyond the usual suspects of self-selected activist-editors, who are unwilling to make excuses for the behavior; and despite all that, who knows how the article will read a year or a decade from now?