To me it feels easier to participate in discussions on Twitter than on (e.g.) the EA Forum, even though you’re allowed to post a forum comment with fewer than 280 characters. This makes me a little worried that people feel intimidated about offering “quick takes” here because most comments are pretty long. I think people should feel free to offer feedback more detailed than an upvote/downvote without investing a lot of time in a long comment.
Not from the podcast, but here's a talk Rob gave in 2015 about potential arguments against growing the EA community: https://www.youtube.com/watch?v=TH4_ikhAGz0
EAs are probably more likely than the general public to keep money they intend to donate invested in stocks, since that’s a pretty common bit of financial advice floating around the community. So the large drop in stock prices in the past few weeks (and possible future drops) may affect EA giving more than giving as a whole.
How far do you think we are from completely filling the need for malaria nets, and what are the barriers left to achieving that goal?
What are your high-level goals for improving AI law and policy? And how do you think your work at OpenAI contributes to those goals?
Seems like its mission sits somewhere between GiveWell’s and Charity Navigator’s. GiveWell studies a few charities to find the very highest-impact ones according to its criteria. Charity Navigator attempts to rate every charity, but does so purely on procedural considerations like overhead. ImpactMatters is much broader and shallower than GiveWell, but unlike Charity Navigator it does try to tell you what actually happens as a result of your donation.
I think I would be more likely to share my donations this way compared to sharing them myself, because it would feel easier and less braggadocious (I currently do not really advertise my donations).
Among other things, I feel a sense of pride and accomplishment when I do good, the way I imagine that someone who cares about, say, the size of their house feels when they think about how big their house is.
Absolutely, EAs shouldn’t be toxic, inaccurate, or uncharitable on Twitter or anywhere else. But I’ve seen a few examples of people effectively communicating about EA issues on Twitter, such as Julia Galef and Kelsey Piper, at a level of fidelity and niceness far above the average for that website. On the other hand, they are briefer, more flippant, and spend more time responding to critics outside the community than they would on other platforms.
Yep, though I think it takes a while to learn how to tweet, whom to follow, and whom to tweet at before you can get a consistently good experience on Twitter and avoid the nastiness and misunderstandings it’s infamous for.
There’s a bit of an extended universe of Vox writers, economists, and “neoliberals” who are interested in EA and sometimes tweet about it, and I think it could be valuable to add some people who are more knowledgeable about EA into the mix.
On point 4, I wonder if more EAs should use Twitter. There are certainly many options to do more “ruthless” communication there, and it might be a good way to spread and popularize ideas. In any case it’s a pretty concrete example of where fidelity vs. popularity and niceness vs. aggressive promotion trade off.
This all seems to assume that there is only one “observer” in the human mind, so that if you don’t feel or perceive a process, then that process is not felt or perceived by anyone. Have you ruled out the possibility of sentient subroutines within human minds?
Sadly, Jiwoon passed away last year.
Some links if you haven’t seen them yet:
https://reducing-suffering.org/advanced-tips-on-personal-finance/
https://80000hours.org/2013/06/how-to-create-a-donor-advised-fund/
I don’t use a DAF, but I’ve considered it in the past. In my view, the chief advantage is that you can claim the tax deduction when you deposit money into the DAF, before you actually make the donation. DAFs are also exempt from capital gains taxes, though you can avoid capital gains taxes anyway by donating appreciated assets directly to a charity, provided the organization will accept them (I’m not sure how universal this is). DAFs also charge fees, which can be fairly expensive but are cheaper than capital gains taxes in expectation.
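As a rough illustration of that fees-vs-capital-gains trade-off, here’s a small Python sketch. Every number in it (fee rate, market return, tax rate, holding period, amount) is a hypothetical assumption chosen for illustration, not a claim about any particular DAF provider or tax situation.

```python
# Illustrative comparison: invest money earmarked for giving inside a DAF
# vs. in an ordinary taxable account, then donate after a few years.
# All parameters below are assumptions for the sake of the example.

years_held = 5          # assumed time between setting money aside and granting it
annual_fee = 0.006      # assumed DAF administrative fee per year
annual_return = 0.05    # assumed market return while invested
gains_tax = 0.15        # assumed long-term capital gains rate
contribution = 10_000   # assumed amount set aside for giving

# Inside a DAF: growth is untaxed, but the sponsor's fee is deducted each year.
daf_value = contribution * ((1 + annual_return) * (1 - annual_fee)) ** years_held

# Taxable account: same growth, then capital gains tax on the appreciation when
# the assets are sold to donate cash (ignoring the option of donating the
# appreciated shares directly, which avoids this tax if the charity accepts them).
taxable_value = contribution * (1 + annual_return) ** years_held
taxable_after_tax = taxable_value - (taxable_value - contribution) * gains_tax

print(f"DAF after {years_held} years:        ${daf_value:,.0f}")
print(f"Taxable account, after tax: ${taxable_after_tax:,.0f}")
```

With these made-up numbers the two come out close, which is the point: whether the DAF's annual fee beats the one-time capital gains hit depends on the fee, the tax rate, and how long the money sits before being granted.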
Open Phil would be a good candidate for this, though that’s a difficult proposition due to its sheer size. It’s a somewhat odd situation that Open Phil moves huge amounts of money around, much of it without any comment from the EA community.
I wonder if the lack of tax deductibility and the non-conventional fundraising platform (GoFundMe) nudge people into not donating or donating less than they would to a more respectable-seeming charity.
(As a tangent, there’s a donation swap opportunity for the EA Hotel that most people are probably not aware of).
Speaking as someone with an undergrad degree in math, I would have found a non-technical summary of this post helpful, so I expect that applies even more to many other forum readers.
For one of the work tests I did for Open Phil, the instruction sheet specifically asked that the work test not be shared with anyone. That might have been intended as a temporary restriction, I’m not sure, but I’m not planning on sharing it unless I hear otherwise.
Agreed. I don’t see any “poor journalism” in any of the pieces mentioned. A few of them would be “poor intervention reports” if we chose to judge them by that standard.
From what I understand, since Three Gorges is a gravity dam, meaning it uses the weight of the dam to hold back water rather than its tensile strength, a failure or collapse would not necessarily be catastrophic: if some portion falls, the rest will stay standing. That means there’s a distribution of severity within failures/collapses; it’s not just a binary outcome.