Can you share any of those tools?
Mikolaj Kniejski
[Question] Is this golfer repulsive?
RE: democracy preservation. This area doesn’t seem to be as neglected as, for instance, AI welfare. There are multiple organizations covering this general area (though they are not focused on AI specifically).
I modeled how much LW karma crossposts get based on EA karma and topic
I am creating a comparative analysis of cross-posted posts on LW and EAF. Make your bets!
I will pull all the posts that were posted on both LW and EAF and compare how different topics get different amounts of karma and comments (and maybe the sentiment of comments) as a proxy for how interested people are and how much they agree with different claims. Make your bets and see if it tells you anything new! I suspect that LW users are much more interested in AI safety and less in veganism. They care less about animals and are more skeptical of utility maximization. Career-related posts will fare better on EAF, while rationality posts (rationality as an art) will do better on LW. Productivity posts will get more engagement on EAF.
It won’t be possible to check all of these bets, since the number of cross-posted posts isn’t that big and they are limited to specific topics.
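A minimal sketch of the comparison I have in mind. The topics and karma numbers below are entirely made up for illustration; the real records would have to be pulled from the two forums. The per-topic mean of log(LW karma / EAF karma) gives a simple symmetric score: positive means a topic tends to do better on LW, negative means better on EAF.

```python
from collections import defaultdict
from math import log

# Hypothetical cross-posted records (all topics and numbers invented).
crossposts = [
    {"topic": "ai-safety",   "lw_karma": 120, "eaf_karma": 40},
    {"topic": "ai-safety",   "lw_karma": 80,  "eaf_karma": 30},
    {"topic": "animals",     "lw_karma": 10,  "eaf_karma": 60},
    {"topic": "careers",     "lw_karma": 15,  "eaf_karma": 90},
    {"topic": "rationality", "lw_karma": 95,  "eaf_karma": 25},
]

def mean_log_karma_ratio(posts):
    """Mean of log(LW karma / EAF karma) per topic.

    The log ratio is symmetric around 0 (doing 3x better on LW scores
    the opposite of doing 3x better on EAF) and damps outliers a bit.
    """
    ratios = defaultdict(list)
    for p in posts:
        ratios[p["topic"]].append(log(p["lw_karma"] / p["eaf_karma"]))
    return {t: sum(v) / len(v) for t, v in ratios.items()}

scores = mean_log_karma_ratio(crossposts)
```

On the toy data above, "ai-safety" and "rationality" come out positive (LW-leaning) while "animals" and "careers" come out negative (EAF-leaning), matching the bets in the previous paragraph.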
Do you take your friendships seriously? Do you care about your friends? People often care about their friends, and for non-moral reasons. I’ve been thinking about how to live, have friends, and so on while holding an impartial perspective toward all beings. If you are impartial, you are supposed to act as if you value X equally for everyone, where X is whatever value your moral system promotes.
But I care about my friends and family much more than about other people. I could tell myself that I need them to stay productive, so that in the end I can take better care of everyone, but I feel awful about that reasoning. Is this a reason to drop impartiality?
Either I have to drop impartiality or drop moral tyranny.
By moral tyranny I mean something like there being only one moral value, or one set of rules, that I am supposed to live by and maximize. Under moral tyranny, e.g., art should only be created for the sake of the moral value X.
If I drop moral tyranny I can keep a neat moral system, but it would reduce how helpful that system is for making decisions.
Can you clarify your view on suffering for me? Are you saying that suffering is undesirable simply because we made it so? I would say there is something more to it, since all animals try to avoid it, not only humans. Humans mostly try to avoid it, and when they don’t, they sometimes come up with elaborate ideas about how to justify suffering, e.g. calling it a catalyst for self-development.
I have experienced people being very honest with me, but that might be because I mostly interact with AI safety people.
I guess a solution here could be color-coded badges, e.g. a red one that says “Please don’t photograph me, and if you accidentally included me in a picture, don’t share it.”
I really like the website. It has a nice design.
I would try to talk to people at EAG or EAGx and ask for total honesty. I would say that most people might think it’s low impact.
I spent about 15 minutes coming up with some reasons why people might be reluctant to support you: in general, EAs don’t focus that much on climate change. Of all the things you can do about climate change, planting trees doesn’t seem to be the most effective. You can read about it here. Also, there are already a number of organizations doing this (and they also plant roughly one tree per dollar).
I think that badges with names at EAGx and EAG events are a bad idea. There are some people who would rather not be connected to the EA movement, such as some animal advocates or AI safety people. I feel like I’m speculating here, but I imagine a scenario like this:
Some people take a picture at EAG
The picture gets posted online
The badge and the person are in that picture, somewhere in the description/comments something says EAG/EA/AI safety or something similar
Some people find it at some point, or other people notice it and connect things
Some political opponents of that person make everyone aware that this person has connections to the EA brand (think of the upcoming movie and FTX) or that the person receives money from certain sources.
The only use cases for names on badges I can see are that you can:
Have people recognize you right away, so you don’t need to tell everyone your name
People can take a picture of your badge to keep in touch with you later.
Security can verify that you are the person on the badge
I see people using badges for the first two things from time to time, but I don’t think it’s a huge use case. Some alternatives for the third use case:
Badges with pictures, which make it even easier to verify that the badge holder is the person on the badge. This is nice but introduces a lot of friction
Just show your ticket to security
I think there should at least be an option to have badges that don’t have names, and that it should be normalized to have badges like that. It’s not obvious to some people that they can cover their badge. Other options include:
Optional name badges (let people choose)
First names only
Pseudonyms/handles
Color-coded privacy preferences
Does the promise have a website? Do you happen to know how many people have taken the promise so far? By the way, there is a Notion page where EAs can sign up to host others, so it’s kind of like the promise already.
“why do i find myself less involved in EA?”
You go into more detail later and answer other questions, like what caused some of your reactions to EA-related things, but an interesting thing here is that you are looking for the cause of something that isn’t actually the case.
> it feels like looking at the world through an EA frame blinds myself to things that i actually do care about, and blinds myself to the fact that i’m blinding myself.
I can strongly relate; I had the same experience. I think it’s due to a Christian upbringing or some kind of need for external validation. I think many people don’t experience that, so I wouldn’t say it’s an inherently EA thing; it’s more about the attitude.
I’m a huge fan of self-hosting and, even better, of writing simple and ugly apps. In my dream world, every org would have its resident IT guy who would just code an app with all the features they need.
Do people enjoy using Slack? I hate Slack and think it has bad ergonomics. I’m in about 10 workspaces and logging into them is horrible. There is no voice chat. I’m not getting notifications (and I dread the thought of setting them up correctly; I just assume that if someone really wants to get in touch with me immediately, they will find a way). I’m pretty sure it would be hard to create a tool better than Slack (one could create a much better tool for a narrower use case, but it would be hard to cover all of Slack’s features), but let’s assume I could. Would it be worth it? Do you find Slack awful as well, or is it only me?
Not if the AI increases intelligence via speed-up or other methods that don’t change its goals.
I realized that the concept of utility as a uniform, singular value is pretty off-putting to me. I consider myself someone who is inherently aesthetic and needs to place myself in a broader context of society, style, and so on. I require a lot of different experiences; in some way, I need more than just happiness to reach a state of fulfillment. I need the aesthetic experience of beauty, the experience of calmness, the anxiety of looking for answers, the joy of building and designing.
The richness of everyday experience might be reducible to two dimensions, positive and negative feelings, but this really doesn’t capture what a fulfilling human life is.
> All the EA-committed dollars in the world are a tiny drop in the ocean of the world’s problems and it takes really incredible talent to leverage those dollars in a way that would be more effective than adding to them.
This seems false to me. I agree that earning to give should be highly rewarded and so on, but I don’t think that, for example, launching an effective giving organization requires an incredible amount of talent. There have been many launched recently, either by CE or local groups (I was part of the team that launched one in Denmark). Recently, EAIF said that they are not funding-constrained, and there are a lot of projects being funded on Manifund. It looks more like funders are looking for new projects to fund. So either most of the funders are wrong in their assessment and should just grant to existing opportunities, or there is still room for new projects.
If anything, my experience was that the bar for direct work is way lower than I expected, and part of the reason I thought otherwise was comments like this one.
> The short version of the argument is that excessive praise for ‘direct work’ has caused a lot of people who fail to secure direct work to feel un-valued and bounce off EA.
Interesting! Is there any data that supports this?
What is the difference between net negative and negative in expectation?