Is it not possible that, if it became public knowledge that grants were insured against clawback, lawyers would try harder to get them? If the money is already spent and it’s a bunch of broke individuals, it may not be worth the expense of trying to claw it back. I guess that would just be something Bill would have to account for.
I agree with most of this—clusters probably not very accurate, divisive religious terminology, him identifying with one of the camps while describing them.
Can you elaborate a bit more on why you think binary labels are harmful for further progress? Would you say they always are? How much of your objection here is these particular labels and how Scott defines them, and how much of it is that you don’t think the shape can be usefully divided into two clusters?
I find that, on topics that I understand well, I often object intuitively to labels on the grounds that they aren’t very accurate, or don’t describe enough nuance, but for topics I am not expert on, I sometimes find it useful to be able to gesture at the general shape of things.
I guess I’m still interested in possible paths to understanding AI risk that don’t require accepting some of the “weirder” arguments up front, but might eventually get you there.
Agreed. I find Kevin to be an excellent communicator on the subject. There are a few other posts on the forum with podcasts and videos featuring him, easily found by searching on his name, for those who are interested in further content.
I noticed when I was setting up the recurring donations, that it is preselected to donate $15 to Every.org. If you don’t want to do that, you have to slide the slider to the left, down to $0.
Also, if you don’t want to continue with the recurring donations after the match, be sure to set a reminder to disable it in December.
@WilliamKiely, you might want to consider adding either of these points to the original post.
I agree, but I think it has to be a consideration when trying to market something widely these days. That said, my general impression is that it’s less of an issue with food than in other areas.
I am rooting for this so hard (for selfish as much as altruistic reasons). I passed it on to the only person I know who funds alt-protein stuff, though I imagine he would have seen it anyway. I am not sure what else I can do to help. If you ever need small-scale help with a website (e.g., “we need to update this info quickly” or “add this small feature,” but probably not “we need to design a whole website”), DM me and I will make it happen.
I do also really like the idea that this is going for more highbrow, rather than fast food, which is so crowded with alt-protein options these days (don’t need another realistic burger or chicken nuggets, thanks).
And offsets too FWIW. Something about avoiding doing something bad makes me feel like a good person, in a way that doing something bad and then making up for it by doing something good just doesn’t.
I’m not sure if the two events are just too far apart in time, or if my EA/rational side just kicks in and I can’t feel good about donating to offset a particular thing instead of to the most effective thing. Or maybe I just can’t emotionally get over my sense of “can’t undo the bad thing”.
The second paragraph really hits on the nose how I feel, without my ever having been able to put it into words—regarding both eating fewer animal products and recycling.
I noticed a typo in the transcript that is pretty confusing. Probably important to fix, since this article is being used in several curricula for AI alignment.
It currently reads: “power is useful for loss of objectives”
It should read: “power is useful for lots of objectives”
Great to see someone else doing one of these :)
Thank you for the correction!
(I don’t think this is done already but downvote or comment if so)
Some optional/additional way to weight karma as a percentage of total users (active users? readers?) so that sorting all posts by karma doesn’t show only newer posts at the top, with older popular posts buried way down among newer, less popular ones.
RSS feeds for tags (I’d be surprised if anyone else wants this, but maybe?)
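To make the karma-weighting idea above concrete, here is a minimal sketch of the kind of normalization I have in mind (Python; `active_users_at_posting` is a hypothetical stat I made up for illustration — I don’t know what the forum software actually tracks):

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    karma: int
    active_users_at_posting: int  # hypothetical metric; could also be readers, etc.

def normalized_karma(post: Post) -> float:
    """Karma as a fraction of the user base active when the post went up."""
    return post.karma / max(post.active_users_at_posting, 1)

# An older post with modest raw karma can outrank a newer post
# that earned more karma from a much larger user base.
posts = [
    Post("old classic", karma=120, active_users_at_posting=400),
    Post("new hit", karma=150, active_users_at_posting=2000),
]
ranked = sorted(posts, key=normalized_karma, reverse=True)
```

With these made-up numbers, “old classic” (120/400 = 0.30) sorts above “new hit” (150/2000 = 0.075), which is the behavior I’m asking for.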
Credit this post for bringing the LW post to my attention.
To answer my own question, in case someone ends up here in the future, wondering the same thing, there are some options to do this.