EA forum content might be declining in quality. Here are some possible mechanisms.
Note: This was originally posted as a shortform with the first 8 points, and I added more based on the replies to that shortform.
1. Newer EAs have worse takes on average, because the current processes of recruitment and outreach produce a worse distribution than the old ones.
2. Newer EAs are too junior to have good takes yet. It’s just that the growth rate has increased, so there’s a higher proportion of them.
3. People who have better thoughts get hired at EA orgs [edit: or have other better things to do] and are too busy to post. There is an anticorrelation between the amount of time someone has to post on EA Forum and the expected quality of their post.
4. Controversial content, rather than good content, gets the most engagement.
5. Although we want more object-level discussion, everyone can weigh in on meta/community stuff, whereas they only know about their own cause areas. Therefore community content, especially shallow criticism, gets upvoted more. There could be a similar effect for posts by well-known EA figures.
6. Contests like the criticism contest decrease average quality, because the type of person who would enter a contest to win money on average has worse takes than the type of person who has genuine deep criticism. There were 232 posts for the criticism contest and 158 for the Cause Exploration Prizes, which combined is more top-level posts than the entire forum had in any month except August 2022.
7. EA Forum is turning into a place primarily optimized for people to feel welcome and talk about EA, rather than for impact.
8. All of this is exacerbated as the most careful and rational thinkers flee somewhere else, expecting that they won’t get good-quality engagement on EA Forum.
9. (pointed out by Larks) “We also seem to get a fair number of posts that make basically the same point as an earlier article, but the author presumably either didn’t read the earlier one or wanted to re-iterate it.”
10. (pointed out by ThomasW) “There are many people who have very high bars for how good something should be to post on the forum. Thus the forum becomes dominated by a few people (often people who aren’t aware of or care about forum norms) who have much lower bars to posting.”
11. (pointed out by John_Maxwell) “Forum leadership encouraging people to be less intimidated and write more off-the-cuff posts—see e.g. this or this.”
12. (pointed out by HaydnBelfield) “There is a lot more posted on the forum, mostly from newer/more junior people. It could well be the case that the average quality of posts has gone down. However, I’m not so sure that the quality of the best posts has gone down, and I’m not so sure that there are fewer of the best posts every month. Nevertheless, spotting the signal from the noise has become harder.”
13. (I thought of this since last week) The appearance of quality decline is an illusion; people judge quality relative to their own understanding, which tends to increase. Thus even though quality stays constant, any given person’s perception of quality decreases.
14. (edited to add) Stagnation: EA Forum content is mostly drawn from the same distribution, and many of the good thoughts have already been said. Contributing factors may be people not reading/building on previous posts (see also (9)), and a lack of diversity in e.g. career specialties.
I’m not convinced quality has been declining, but I’m open to the possibility, and it’s hard to judge.
Might be useful to ask EA Forum moderators if they can offer any data on metrics across time (e.g. the last few years), such as:
- overall number of EA Forum members
- number who participate at least once a month
- average ratio of upvotes to downvotes
- average number of upvoted comments per long post
We could also just run a short poll of EA Forum users to ask about perceptions of quality.
Poll: EA Forum content is declining in quality (specifically, the content that appears on the frontpage is lower quality than frontpage content from a year ago).
(Use agree/disagree voting to vote on this proposition. This is not a perfect poll but it gives some indication)
I’m one of the people who submitted a post right before the deadline of the criticism contest. FWIW I think number 6 is off base. In my case, the deadline felt like a Schelling point. My post was long and kind of technical, and I didn’t have any expectation of getting money—though having a fake deadline was very helpful and I would probably not have written it without the contest. I don’t think that any of the posts that got prizes were written with an expectation of making a profit. They all looked like an investment of multiple hours by talented people who could have made much more money (at least in expectation) by doing something else. In order for someone profit-motivated to take advantage of this they would have to be involved and knowledgeable enough to write a competitive post, and unable to make money in essentially any other way. This seems like an unlikely combination, but if it does exist then I’d assume that supporting such people financially is an additional benefit rather than a problem.
I like hypothesis generation, and I particularly like that in this post a few of the points are mutually exclusive (like numbers 7 and 10), which should happen in a hypothesis generation post. However this list, as well as the topic, feels lazy to me, in the sense of needing much more specificity in order to generate more light than heat.
I think my main issue is the extremely vague use of “quality” here. It’s ok to use vague terms when a concept is hard to define, but in this case it feels like there are more useful ways to narrow it down. For example you could say “the average post seems less informative/well-researched” or “the average poster seems less experienced/qualified”, or “I learned more from the old forum than the new one” (I think especially a focus on your experience would make the issue more precise, and open up new options such as “posts became less fun once I learned all the basics and new people who are just learning them became less interesting to me”). I would like to see a hypothesis generation post that focuses much more on the specific ways that posts are “worse” (and generates hypotheses on what they are) rather than on reasons for this to be the case. I suspect that once a concrete question is asked, the potential reasons will become more concrete and testable.
Another issue is that I think a lot of the points are more properly “reasons that posts on a forum can be bad” rather than issues with current vs. old posts, and I have trouble believing that these issues were absent or less severe in the past. This would also be solved by trying to make the complaint specific.
I agree that this list is “lazy”, and I’d be excited about someone doing a better analysis.
Pablo made a survey for the first 8 points, and people seem to agree most with 1 (newer EAs have worse takes on average) and 5 (meta/community stuff gets more attention), with mixed opinions about the rest.
I suspect a strong selection bias, since the survey was posted in a comment on your shortform, which would mostly be read by people who know you.
Edit: I suggest reading Thomas’ reply before voting on this comment.
Of the 15 people other than me who commented on the shortform, I only remember ever meeting 4. I would guess that for shortforms in general most of the attention comes from the feed.
There’s still a selection bias even if they don’t know you. People who read your shortform are probably not representative of the EA community. Maybe you only care about engaged, long-time EAs, but significant selection bias still seems probable for that case imo.
FWIW, I’m finding the forum less useful and enjoyable than before and I’m less motivated to contribute. I think the total number of posts has gone up, whereas the number of posts I want to read is about the same or a bit lower.
When I log on I see lots of (1) recycled topics, that is, things that have been discussed many times before (admittedly new users can’t really be blamed for this, but still) and (2) hot(ter) takes, where people are sharing something without having really thought or researched it. Clearly, there is some overlap between (1) and (2).
We could have something like the LW/AI Alignment Forum system, where some posts, commenters and comments get approved to a separate and more exclusive forum. Kind of icky, though.
If we were going to go down the multi-forum path, a less icky option would be to have both forums be open, but one explicitly for more in-depth, more advanced, more aggregative or more whatever posts, and moderation that moved ones that didn’t meet the bar back to the more beginner-friendly forum.
Or as the forum currently is, we could just add tags that capture whatever it is the OP is trying to capture - ‘beginner-friendly’, ‘not beginner-friendly’ or whatever.
If that’s not enough, I’d imagine there’s some middle ground UX that we could implement.
I would be pretty excited about this!
Ironically, I don’t think the comment I made that Thomas included above rises to my usual standard for my posts. You can see in the thread there are some issues with it, and if I were going to make a post about it I’d probably collect some data. I tend to have a lower standard for making comments.
It might be possible and useful to quantify the decline in forum quality (measurement is hard, but it seems plausible to use engagement from promising or established users, and certain voting patterns might be a marker of quality).
I think the forum team should basically create/find content, for example by inviting guest writers. The resulting posts would be good in themselves and in turn might draw in high-quality discussion. This would be particularly useful for generating high-quality object-level discussion.
(14) seems fairly likely to me; that was one of my original hypotheses upon reading your shortform (I meant to comment then but got stuck on some technicalities).
I strongly downvoted this post.
It reads mostly like “There are too many people I disagree with coming into the forum, and that’s a bad thing.”
It is very, very elitist, both in the phrases you chose and in the actual arguments. Especially point 3: you want the distribution of styles and opinions (what you think is “quality of thought”) to be as close as possible to that of people already employed by EA organisations, which would mean blocking diversification as much as possible.
You also assume things about the EA community (or maybe the expected impact of things), which I’m entirely not sure are right, like:
that “we” want more object-level content on the forum (rather than, say, people doing object-level work mostly in their jobs). This one could actually be measured, though
that “rational thinkers” are important for the forum. I’d agree, for example, that some limited applications of rationality are wanted—but I do not expect people to be, or even try to be, rational.
“Quality of person” sounds bad to me too. I also find it weird that someone already gave the same feedback on the shortform and the OP didn’t change it.
The other wordings seem fine to me. I understand that not everyone would want to phrase things that way, but we need some kind of language to express differences in quality of people’s contributions. Less direct wordings wouldn’t be, in my opinion, obviously better. Maybe they come across as kinder, but the sort of rephrasings I’m now envisioning can easily seem a bit fake/artificial in the sense that it’s clear to anyone what’s being communicated. If someone thought my “takes” were bad, I’d rather they tell me that in clear language instead of saying something that sounds stilted and has me infer that they also don’t expect me to be capable of hearing criticism.
(I might feel differently in a context where I care a lot about what others think of me as a person, like if I was among friends or roommates. By contrast, most people on the EA forums are “loose acquaintances” in a context that’s more about “figuring things out” or “getting things done” than it’s about being together in a community. In that context, friendliness and respect still remain important, but it isn’t per se unfriendly [and sometimes it’s even a positive mark of respect] to say things one considers to be true and important.)
Based on the OP’s own beliefs, they don’t primarily “want the distribution of styles and opinions to be as close as possible to that of people already employed by EA organisations.” The OP’s view is “competence differences exist and paying attention to them is important for making the world better.” (I think this view is obviously correct.) Therefore, the driver in their hypothesis about people working at EA orgs was obviously an assumption like “EA orgs that try to hire the best candidates succeed more often than average.” Somehow, you make it sound like the OP has some kind of partiality bias or anti-diversity stance when all they did was voice a hypothesis that makes sense on the view “competence differences exist and paying attention to them is important for making the world better.” I think that’s super unfair.
Thanks for pointing this out. I just edited the wording.
Re OP’s point 3:
I tried not to imply that OP directly opposes diversity (my comment was initially phrased more harshly and I changed it before publishing) - so I’m sorry that’s still how I came across.
And I don’t really get what you mean by competence differences etc. There’s no single axis of competence that makes people’s posts on the forum more valuable, and similarly no single axis for getting hired by EA orgs.
There might be some common ones. But even then, I think my logic stands: notice that OP talks about EA orgs in particular. Meaning OP does want to see a higher concentration of posts with views correlated to those of EA org employees. But that means a lower concentration of posts from people whose views don’t directly align with EA orgs, which would cause a cycle of blocking more diverse views.
Edit: I forgot to add, OP could have phrased this differently, saying that people with productive things to say (which I assume is what they may have meant by “better takes”) would be busier doing productive work and have less time to post here. Which I don’t necessarily buy, but let’s roll with it. Instead, they chose to focus on EA orgs in particular.
The causal reason I worded it that way is that I wrote down this list very quickly, and I’m in an office with people who work at EA orgs and would write higher quality posts than average, so it was salient, even if it’s not the only mechanism for having better things to do.
I also want to point out that “people who work at EA orgs” doesn’t imply infinite conformity. It just means they fit in at some role at some organization that is trying to maximize good and/or is funded by OpenPhil/FTX (who fund lots of things, including lots of criticism). I frequently hear minority opinions like these:
- Biosecurity is more pressing than alignment due to tractability
- Chickens are not conscious and can’t suffer
- The best way to do alignment research is to develop a videogame as a testbed for multi-agent coordination problems
- Alignment research is not as good as people think due to s-risk from near misses
- Instead of trying to find AI safety talent at elite universities, we should go to remote villages in India
Hi, thanks for responding!
I should probably have been a bit more charitable in thinking why you wrote it specifically like this.
These might be minority opinions in the sense that they have some delta from the majority opinions, but they still form a tiny cluster in opinion space together with that majority.
You don’t often hear, for example:
- People who think there’s no significant risk from AI
- People who think extinction is only slightly worse than a global catastrophe that kills 99.99% of the population
- People who think charities are usually net negative, including those with high direct impact
Or other categories which are about experience rather than views, like:
- People who couldn’t afford time off to interview for an EA org
- People who grew up in developing countries (the movement seems to have partial success there, but do these people work in EA orgs yet?)
I suspect OP doesn’t want more posts from employees at EA orgs because they are such employees—I understood OP as wanting higher quality posts, wherever they come from.
True, the post does suggest that employees at EA orgs make higher quality posts on average, and that they may have less time to post on the Forum than the average user, but those are empirical matters (and seem plausible to me, anyway).
Edit to add: I generally didn’t get the feeling that OP wholeheartedly supports intervening on any or each of these possible explanations, or that doing so wouldn’t risk other negative consequences (e.g. increased groupthink).
Indeed, this is why I wrote “a higher concentration of posts with views correlated to those of EA org employees”. It doesn’t matter whether there’s causality here—encouraging the correlation is itself a problem.
I agree with your edit more than with the rest of your comment.
It would be uncharitable to interpret “takes” to be about people’s specific views. Instead, it’s about things like the following:
Do I learn something from talking to this person? When I dig deeper into the reasons why they believe what they believe, do I find myself surprised by good arguments or depth of thought, or something of the like? Or does it just seem like they’re parroting something or are ideologically clouded and can’t seem to reason well? Do they seem interested in truth-seeking, intellectually honest, etc? Do they seem to have “good judgment” or do they make arguments where it feels like the conclusions don’t even follow from their premises and they’re just generally off about the way things work? [There are tons of other factors that go into this; I’m just gesturing at some of the things.]
Regarding competence, there’s no single axis but that doesn’t mean the concept isn’t meaningful. Lots of concepts work like that – they’re fuzzy but still meaningful.
To be fair, some things might be less about competence and more about not having equally “high standards.” For instance, I notice that sometimes people new to EA make posts on some specific topic that are less thorough than some post from 5-10 years ago that long-term EAs would consider “canonical.” And these new posts don’t add new considerations or even miss important considerations discussed in the older post. In that case, the new person may still be “competent” in terms of intelligence or even reasoning ability, but they would lack a kind of obsessiveness and high standards about what they’re doing (otherwise they’d probably have done more reading about the topic they were going to make a top-level post about – instead of asking questions, which is always an option!). So, it could also be a cultural thing that’s more about lack of obsessiveness (“not bothering to read most of what seems relevant”) or high standards, rather than (just) about “competence.”
(And, for what it’s worth, I think it’s totally forgivable to occasionally make posts that weren’t aware of everything that’s previously been written. It would be absurd to expect newcomers to read everything. It just gets weird if most of someone’s posts are “worse than redundant” in that way and if they make lots of such posts and they’re all confidently phrased so you get the impression that the person writing the post is convinced they’ll be changing the minds of lots of EAs.)
Views vs. “Other Things”
I can imagine a world where the things you wrote, like “Do I learn something from talking to that person”, are the sole measure of “posting quality”. I don’t personally think such a world is favorable (e.g. I’d rather someone who often posts smart things stay off the forum if they promote bigoted views). But I also don’t think that’s the world we’re in.
People cannot separate judgement of these things from judgement of a person’s views, even if they think they can. In practice, forum posts are often judged by the views they express (“capitalism is bad” is frowned upon), and even worse, by their style of reasoning (STEM-like arguments and phrasing are much more accepted, and quantification and precision are encouraged even when inappropriate). Object-level engagement is appreciated over other forms, disregarding that it is only sometimes right to engage this way.
As I see it, the vision of a rational, logical, strongly truth-seeking forum is an illusion, and this illusion is used to drive out people with more diverse backgrounds or who come from underrepresented schools of thought.
I personally have very high standards. There are many posts I want to write, but I really want them to be thorough and convincing, and to engage with relevant materials. You can see the result—I have written none! Is this actually helpful?
I think there can be value in posts that reiterate old content, perhaps even when they leave out important bits or have problematic logic. I have two reasons:
The forum guides the movement not only through building a common knowledge base, but also through representing the growing community’s views. If, for example, 8 years ago someone had written that it’s acceptable to work for a tobacco company in order to donate to high impact charities—how would you know how many current EAs share that view? The view itself is not an empirical question, and the old post’s karma tells you nothing about this. A new post, letting the community reengage with the ideas, might.
As noted in the OP and elsewhere, EAs love to criticise EA. I’m in favor of that—there are lots of problems, and we need to notice and fix them. Alas, many are noticed but then not fixed. If 8 years ago someone had written about how diversity of experience is important, but nowadays the movement is still composed almost entirely of people from Western countries, and most community building resources also go there—it means no meaningful action is being taken to fix the problem, so it needs to be reiterated.
Thanks for the explanation. I read their comment quickly and didn’t even consider Guy’s interpretation of what Thomas said, and at first believed that Guy’s paragraph on point #3 was an entire non-sequitur.
(apologies for the overly meta comment)
Sorry for not explaining myself well enough. But I still stand behind my interpretation. Does my new comment help?
I have mixed feelings because I understand what the post is getting at but think this is a good example of a person writing their thoughts without considering how others will perceive them. E.g. there is no need to say ‘quality of person’ to get the point across, but doing so might make more sense if the author’s mental process is simply ‘writing down, as accurately as possible, what I believe’ and there’s no flagging of how a message might be received.
This problem seems common to me in the rationality community. Not meaning to dig at Thomas in particular, only to point it out, since I think it could reduce the diversity of the EA community along important lines.
I agree with this post’s overall premise that, generally speaking, the quality of engagement on the forum, through posts or comments, has decreased, though I am not (yet) convinced that some of the points made by the author as evidence for this are completely accurate. What follows are some comments on what a reduced average post quality on the forum could mean, along with a closing comment or two.
If it is true that certain aspects of EAF posts have gotten worse over time, it’s worth examining exactly which aspects of comments and posts have degraded, and I think, in this regard, this post will be / has been helpful. Point 13 does claim that the degradation of the average quality of the forum’s content may be an illusion, but HaydnBelfield’s comment that “spotting the signal from the noise has become harder” seems to be stronger evidence that the average quality has indeed decreased, which is important as this means that people’s time is being wasted sorting through posts. This can be solved partially by subscribing only to the forum posters who produce the best content, but this won’t work for people who are new to the forum and produce high quality content, so other interventions or approaches should be explored.
While the question of how “quality” engagement on the forum should be measured is not discussed in this post, I imagine that in most people’s minds engagement “quality” on this site is probably some function of the proportion of different types of posts (e.g., linkposts, criticisms, meta-EA, analyses, summaries, etc.), the proportion of the types of content of posts (e.g., community building, organizational updates, AI safety, animal welfare, etc.), and the epistemics of each post (this last point might be possible to integrate with the first). The way people engage with the forum, the forum’s optics, and how much impact is generated as a result of the forum existing are all affected by the average quality of the forum’s posts, so there seems to be a lot at stake.
I don’t have a novel solution for improving the quality of the forum’s posts and comments. Presently, downvotes can be used to disincentivize certain content, comments can be used to point out epistemic flaws to the author of a post and to generally improve the epistemics of a discussion, high quality posters can create more high quality posts to alter the proportion of posts that are high quality, and forum moderators can disincentivize poor epistemic practices. In the present state of the forum, diffusing or organizing contest posts might make it easier to locate high quality posts. Additionally, having one additional layer of moderated review for posts created by users with less than some threshold of karma might go a long way in increasing the average quality of posts made on the forum (e.g., that the Metaculus moderation team reviews its questions seems to help maintain the epistemic baseline of the site—of course, there are counterexamples, but in terms of average quality, extra review usually works, though it is somewhat expensive).
The forum metrics listed in Miller’s comment seem useful as well for getting a more detailed description of how engagement has changed over the years as the number of forum posters has changed.
As for the points themselves, I will comment that I think point (4) should be fleshed out in more detail—what are some examples here? Also, I think points 7 and 11 can be merged, and that more attention should be diverted to this. There can be inadvertent and countervalue consequences of welcoming the reduction in strength of people’s conversational filters on the forum. As such, moderators should consider these things more deeply (they may have weighed the pros and cons of taking actions to incentivize more engagement on the forum, and determined that this is best for the long-term potential of the forum and EA more generally, but I do not know of the existence of such efforts).
Thank you Thomas Kwa for contributing this take to the forum; I think it could lead to an increase in some people’s threshold for posting and might lead to forum figures searching for ways to organize similar posts (e.g., creating a means to organize contest spam) and move the average post quality upwards.
If you state an opinion, it’s thought that opinion should be scrupulously challenged.
If you state a feeling you had, especially a difficult or conflicted one, it’s thought that it should be welcomed and certainly not challenged.
Individually, these attitudes make sense, but together I would expect that they will make Forum posts much more focused on emotional reactions than careful and unemotional pieces.
To clarify, I want both and think emotional reactions can be very important. But at least once, I’ve seen a detailed but unemotional post buried under a less well thought through post describing someone’s emotional reaction to a similar issue. Perhaps we should be welcoming of posts that try hard to do careful and rational analysis, even if they seem/are misguided or unsuccessful.
This. Emotional, controversial reactions are highly interesting, but they are the least useful posts.
I’d invert that heuristic and promote boring posts over controversial or emotional posts.
I think it’s valuable to have a low-barrier way for people to engage with EA without having years of experience and spending hours on writing a high-quality post. Do you have any ideas for how to avoid the trade-off between quality and accessibility?
Personally, I find the ‘frontpage’ and ‘curated’ filters to work pretty well. Regardless of the average quality of posts, as long as the absolute number of high-quality posts doesn’t decrease (and I see no reason why that would happen), filters like these should be able to keep a more curated experience intact, no?
2nd paragraph was my first thought as well. If the highlighted posts that most people see remain high quality, who cares if the average quality goes down?
I wrote this in 2013, might be of interest to those concerned:
On this forum, different people have different vote strengths. However, the metric that gets used is karma instead of registration date. If you believe that registration date is a better metric than karma, what’s your case for that?
I don’t have a view on that, but it would be cool if it was available as a forum setting (”Weight votes by account age”) and some people might like it better that way.
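To make the difference between the two schemes concrete, here is a toy sketch. Both formulas are made up for illustration; neither is the forum’s actual vote-strength algorithm:

```python
from datetime import date

def vote_strength_by_karma(karma: int) -> int:
    # Hypothetical karma-based rule: votes count double past 1,000 karma.
    return 2 if karma >= 1000 else 1

def vote_strength_by_account_age(registered: date, today: date) -> int:
    # Hypothetical age-based rule: one extra point per full year
    # since registration, capped at 3.
    years = (today - registered).days // 365
    return min(1 + years, 3)

# A long-time lurker with low karma scores differently under each scheme:
print(vote_strength_by_karma(50))                                         # 1
print(vote_strength_by_account_age(date(2015, 1, 1), date(2022, 12, 1)))  # 3
```

The comparison shows the trade-off being discussed: karma weighting rewards visible participation, while account-age weighting rewards tenure regardless of contribution.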
The software on which this forum runs is created by Lightcone Infrastructure. It’s possible to convince them to add new features, but I would expect that requires more than saying “it would be cool if”.
I was thinking of reasons why I feel like I get less value from EA Forum. But this is not the same as reasons EAF might be declining in quality. So the original list would miss more insidious (to me) mechanisms by which EAF could actually be getting worse. For example I often read something like “EA Forum keeps accumulating more culture/jargon; this is questionably useful, but posts not using the EA dialect are received increasingly poorly.” There are probably more that I can’t think of, and it’s harder for me to judge these...
The added point 14 doesn’t have the problems I talked about in my other comment (rather the opposite). But contrary to your point about contests, I think the OpenPhil Cause Exploration Prize has helped to improve this! It produced many dozens of high-quality, object-level posts which were novel, interesting, productive and hope-inspiring.
12 sounds right. None of the other mechanisms obviously suggest that you’d expect the absolute number of high quality posts to decline (or even not to grow). I would echo that it’s not clear the average quality is diminishing either, but the forum filtering UI might not be keeping up with the level of input.
I’d be curious to learn about CEA’s content moderation strategy re: preparing for significant participation growth on the Forum after What We Owe the Future has been published. The Forum’s user experience is going to change, and other communities have already experienced this. We shouldn’t be taken by surprise.
I’d be interested to know if people thought this carried over to other EA Forum-adjacent spaces too, like EA Twitter and LessWrong. My impression is that content might be slightly worse there too, but maybe not to the same extent as on the forum. It seems like we might expect some of these mechanisms to translate to these spaces too, but not all of them, which would be useful for trying to determine which of these factors are actually important.
EA Twitter has been getting better, honestly, we’re having a great time
“Better” could mean lots of things here. Including: more entertaining; higher quality discussion; more engagement; it’s surpassed a ‘critical mass’ of people to sustain a regular group of posters and a community; better memes; more intellectually diverse; higher frequency of high quality takes; the best takes are higher quality; more welcoming and accessible conversations etc.
The aims of EA Twitter are different to the forum. But I think the most important metrics are the “quantity of discussion” ones.
My impression is that:
- There are more “high quality takes” on EA Twitter now than a year ago (mostly due to more people being on it and people posting more frequently).
- The “noise:quality ratio” is pretty bad on EA Twitter. Most of the space seems dominated by shit posting and in-group memes to me.
Obvs, shit posting is fine if that’s what you want. But I think it’s useful to be clear what you mean when you say “better”. If someone was looking for high quality discussion about important ideas in the world, I would personally not recommend them EA Twitter.
I think Twitter is a suboptimal place to do this; the whole platform has been optimised for the wrong things, and I’ve decided not to use Twitter as a result. The behavioural changes it causes are subtle but real. For instance, it causes you over time to reduce your own bar for posting things, post more frequently, comment more frequently, be more prone to checking notifications and getting distracted from other tasks, and so on. And it becomes easier to lose track of the main task of “actually making progress on hard problem X” in favour of “I’m bored and want social interaction, let me use discussion of problem X to get that.”
An interesting related question (and complement to (3)) is to what extent do higher quality conversations move on to other online spaces (as opposed to “being too busy to post” or taking their communications entirely offline).
For example, I’d be interested in if people have thoughts on whether the quality of conversations on LessWrong, Alignment Forum, the forecasting sites, etc., has gotten better or worse in the last few years.