Also, let me add something from a private message exchange, building on my previous comment about love seeking to benefit another rather than using them to benefit yourself. Let's get a little meta: this whole EA movement seeks to benefit humanity, and especially when it seeks to benefit humans far in the future, that is really, really selfless. EA people today will work hard to benefit someone they don't know 400 years from now. Wow, that is about as lovingly unselfish as it gets. In other words, this is a massively love-based movement, even though it doesn't see itself in the mirror that way.

When I searched the topic tags you can add to your post, I went through all 740 of them, and the word "love" is not found there. From a meta perspective, it says a lot when love is your core motivation yet you never even speak of love. Psychologists might have some interesting comments on that.

I have a more urgent one: process begets process. If I want to model being open about your shit to people, I should be open about my shit in front of them. You don't see many beer-bellied personal trainers. If my whole huge goal is to align AI so it doesn't destroy humanity, and I'm motivated by love to do that, and I agree that pretty much all humans just really want to love and be loved, because that's just how our brains are wired... wouldn't it kinda seem obvious that trying to print/copy a human brain into a digital version might include the very thing at the core driving actual human brains? Namely, love.

And essentially, as the last line of my post says: if AI had love for humans as its motivation, if it were the new "man's best friend," all would be well. EA/Longtermism would be a triumphant success.