AI safety had a health crisis
Getting back on track
sergeivolodin
I don’t get this: I posted it as a comment, and the post itself went to something like −2 while the comment was at +3 or so… Causality? Correlation? Shall I delete it? It was just a joke, folks. So strange that people don’t want to discuss dating…
And now it’s +8
Maybe I’m just seeing things that are not there, oh la la
I wrote something about empathy and strength, vibes and rationality; maybe that helps. (The “left brain / right brain” framing is not taken well by modern neuroscientists, so treat it as an abstraction rather than a direct mapping.)
I feel and think I need both. With only the left brain I forget about the now (like now, not “what my actions would give” or “what happened in the past when similar actions were taken”). And I don’t see some of the options available if I forget the now. With only the right brain I forget about systems that sometimes help. The left brain implements models; the right brain tunes and “reloads” them when they become too bloated, too contradictory, and in need of new axioms. Those spiritual vibes (instantiated, for example, in music styles) can be seen as different “cultures” of axiom-building. That’s why people who recover from trauma (like me) talk about spirituality and vibes: the old axioms ran into contradictions and couldn’t explain, predict, or suggest actions to adapt to the environment. So it’s time for music and new axioms.
LRLRLR
Such an oh la la. I feel and think that the EA Forum needs more jokes, or it’s all L and no R.
Sorry, couldn’t resist. Glanced at the spreadsheet:
Gender Man Man Man Man Man Man Man Man Man Man Man Man Woman Man
To increase the efficiency of altruism of social animals that humans represent and obtain belonging and love in an effective, optimal and fully rational way that has low risks and has proven to work since the Roman empire and has perfect forecasts in terms of QALYs created via studies of Roman culture by renowned EA scientists, and make EAs more productive, I propose a new cause area: questioning gay tendencies in the Effective Gay therapy in order to increase personal fit for participation in EA, boost morale, release dopamine and maximize productivity and finally obtain optimal effectiveness of longtermist cause areas and complete, coherent and extrapolated rationality, uninterrupted by unnecessary risky behaviours like self-search or outdated mechanisms like human emotion.
Be Gay For EA
Be Gay For Future Generations
William said today: be gay for me. Come to our events. Please read the instructions first.
(joke yes yes. or maybe… hmm...)
Question: would an impactful but not cool/popular/elegant topic interest you? What’s your balance between coolness and impactfulness?
I have failed to do any meaningful work on recommender systems alignment. We launched an association; YouTube acknowledged the problem with disinformation when we talked to them privately (about COVID coming from Russia, for example), but said they would not do anything, with or without us. We worked alone, and I was the only developer. I burned out to the point of being angry and alienating the people around me (I understand what Timnit Gebru went through, because Russia, my home country, is an aggressor country, and there is a war in Tigray, in Ethiopia, her home country). I sent many angry/confusing emails that made perfect sense to me at the time… I went through homelessness and unemployment after internships at CHAI Berkeley and Google and a degree from a prestigious European university. I felt really bad for not being able to explain the importance of the problem and stop Putin before it was too late… Our colleagues’ papers on the topic were silenced by their employers. Now I’m slowly recovering and feel I want to write about all of that: some sort of guide / personal experience on aligning real systems and organizations, and on how real change comes really, really hard.
Would be awesome to know more. I personally feel EA could be a bit more down-to-earth about doing actual things in the world and saving actual people :)
So I see the downvotes, as expected. I don’t get it:
is it that people don’t want answers?
or maybe they like AI races?
Can people get real on this forum? Like, there are discussions about some ethical theory, infinite ethics or something. Yet, right now, today, something fishy is going on. How can there be a future without the present?
I ask for answers here.