I’m curious about what the thing you call EigenKarma is. Is it that people with more karma have weightier votes, or is it something with a global eigenvector?
plex
I have put some thought into the privacy aspect, and there are ways to make it non-trivial or even fairly difficult to extract someone’s trust graph, but nothing which actually hides it perfectly. That’s why the network would have to be opt-in, and likely would not cover negative votes.
I’d be interested to hear the unpacked version of your worries about “gatekeeping, groupthink and polarisation”.
Topically, this might be a useful part of a strategy to help the EA forum to stay focused on the most valuable things, if people had the option to sync their own vote history with the EigenKarma Network and use EKN lookup scores to influence the display and prioritization of posts on the front page. We’d be keen to collaborate with the EAF team to make this happen, if the community is excited.
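For anyone wondering what “something with a global eigenvector” could look like in practice, here is a minimal PageRank-style sketch of eigenvector trust over a vote graph. This is purely illustrative and not the actual EKN implementation; the function name, the matrix layout, and the damping constant are all my own assumptions.

```python
import numpy as np

def eigen_trust(votes: np.ndarray, damping: float = 0.85, iters: int = 100) -> np.ndarray:
    """Global trust scores from a vote graph, PageRank-style (illustrative sketch).

    votes[i, j] = total vote weight user i has given user j.
    Returns the dominant eigenvector of the damped trust matrix,
    normalized to sum to 1.
    """
    n = votes.shape[0]
    # Normalize each user's outgoing votes so everyone hands out one
    # unit of trust, no matter how many votes they cast.
    out_totals = votes.sum(axis=1, keepdims=True)
    out_totals[out_totals == 0] = 1.0  # users who cast no votes pass on nothing
    m = (votes / out_totals).T  # column i = where user i's trust flows
    scores = np.full(n, 1.0 / n)
    for _ in range(iters):
        # Damping mixes in a uniform prior so trust can't get trapped
        # in a small clique of mutual upvoters.
        scores = (1 - damping) / n + damping * (m @ scores)
    return scores / scores.sum()
```

In a toy graph where two users upvote a third, the third user ends up with the highest global score, and their trust flows onward to whoever they in turn vote for; this transitivity is what distinguishes eigenvector karma from simply weighting votes by the voter’s own karma.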
Introduction to AI Alignment Ecosystem
Awesome! I’m glad to see these two communities connecting, I think there’s a lot of potential for cross-pollination. You might be interested in this thread, and this post. I’d love to see these ideas popularized in EA, as I think people who have flexibility and agency can achieve great things.
The tool looks great, one little suggestion is to merge the cells with the link and the ones to the right of it:
so that the link is clickable :)
My approach to addressing “AI can be technical but it’s not clear how much of that you need to know” (AKA “But I don’t program”) is running monthly Alignment Ecosystem Development Opportunity Calls, where I and others can quickly pitch lots of different technical and non-technical projects to improve the alignment ecosystem.
And for “There’s a lot of jargon and it’s not always well explained” (AKA “Can you explain that again… but like I’m 5”), aisafety.info is trying to solve this, along with upping Rob Miles’s output by bringing research volunteers on board.
Seems worthwhile and good for an extra reason: it allows us to train AIs only on data from before there was significant AI-generated content, which might mitigate some safety concerns around AIs influencing future AIs’ training.
My understanding is that they asked someone to register the domains for things that had an EAF tag entry, which accidentally included some which didn’t really make sense. The full list includes a bunch of names of individuals, which were removed from EA domains.
I think ideally we should reach out to all the individuals and orgs CEA bought domains about and offer them the domains, but it isn’t a super high priority for me.
Thanks! And yeah, it’s an incredibly powerful tool too! It’s a relational database integrated with an awesome editor and scripting engine, but all absurdly easy to use. The homepage is actually carrd.co, though, with coda embedded.
You can add them via the form, or you can DM me or post here to request a bulk addition if there are more than is easy to add by a form. You’ll remain the owner of them, unless you want to ask Ben West for CEA to be the custodian.
Probably a few depending on price, yeah. I’m fairly AI x-risk focused in my own purchases so probably wouldn’t get the full set that would be good (might get a few core EA ones if they’re still available though), but would be happy to list any that anyone else picks up.
ea.domains—Domains Free to a Good Home
Check ea.domains for available domains! Some relevant ones include:
ontological.tech
existential.dev
agenty.org
epistemic.dev
It does seem plausible that it was a miscommunication actually, the original was:
Effective Ventures Foundation (formerly named “Centre for Effective Altruism”) is a charitable company limited by guarantee in England and Wales (company number 07962181), and registered as a charity with the Charity Commission of England and Wales (charity number 1149828).
Effective Ventures Foundation is governed by a board of five trustees (Will MacAskill, Nick Beckstead, Tasha McCauley, Owen Cotton-Barratt, and Claire Zabel) (the “Board”). The Board is responsible for overall management and oversight of the charity, and where appropriate it delegates some of its functions to sub-committees and directors within the charity.
The Centre for Effective Altruism USA Inc. is a tax exempt public charity under section 501(c)(3) of the Internal Revenue Code.
Effective Ventures Foundation is the sole member of the Centre for Effective Altruism USA Inc. This group of entities will be collectively referred to as “Effective Ventures” (“EV”). The EV group may expand in future to include more non-profit and philanthropic organisations.
(The crossed-out text is the intentionally removed bit.) And the new version is:
Effective Ventures Foundation (formerly named “Centre for Effective Altruism”) is a charitable company limited by guarantee in England and Wales (company number 07962181), and registered as a charity with the Charity Commission of England and Wales (charity number 1149828).
The Centre for Effective Altruism USA Inc. is a tax exempt public charity under section 501(c)(3) of the Internal Revenue Code. This group of entities will be collectively referred to as “Effective Ventures” (“EV”). The EV group may expand in future to include more non-profit and philanthropic organisations.
I’d guess someone handed someone who was rushing and not paying attention the task of removing the sentence about Effective Ventures in the middle of the last paragraph, and it got misunderstood as removing both that and the middle paragraph. Delegation is pretty failure-prone.
I’m also quite keen to assume good faith, and would prefer that central EA nodes didn’t feel like they have to spend undue effort and brain-cycles on watching their backs, and were free to optimize for more important things.
It’s pretty weird that the charity page has been edited since this comment to hide the board members. I’d like to see reflexive transparency, not reflexive hiding of information.
https://web.archive.org/web/20221221123432/https://ev.org/charity/
Strong upvote! Did you see this recent comment thread about FIREA (Financial Independence and Retire EA)?
I’d be excited to see a community of EAs focusing on high saving rates for later donation or direct work. Would you be interested in being around on a Discord for that if I set it up?
FIRE is not EA directly, you’re right, but they are competently optimizing for a high saving rate, which in practice is very similar to optimizing for a high donation rate. I’d also feel much more comfortable about EA’s stability if a good portion of highly engaged EAs were financially independent, and could focus on whatever they thought was most valuable without any worry about needing to rely on funding applications.
My read is many of the best projects are started by enthusiasts in their free time, like ALLFED, because there’s no need to worry about proving yourself to funders immediately so you can just focus on doing the most good. And an even larger fraction of the best projects are by groups with enough financial slack that they’re not forced into short-term thinking by finances.
Great! I’d be interested to talk; you can book a call via my Calendly, and I’d also suggest talking with Nicole Janeway, as it seems likely she’ll be coordinating this.
Thanks :)
Awesome! Feel free to loop me in on docs and plans, and I’ll suggest this to a few other people as a possible intervention. Would you like me to connect them with you if they’re interested, with you coordinating?
Also, I joined what appears to be the main public-facing FIRE Discord, might be a good place to make contact or scout for who to talk to?
I have been thinking along similar lines. My current angle is to try and connect EA with the FIRE (Financial Independence Retire Early) community, as I think having as many EAs as possible free to work on whatever they find most valuable would be extremely beneficial.
The FIRE movement seems philosophically somewhat EA-adjacent, in that they’re explicitly optimizing finances. Some of them have, once they’re personally set for life, moved to philanthropy. They also have lots of free time on their hands, and some might be happy to consult with EAs who want to optimize their finances.
The name I’ve been thinking of this under is “Efficient Altruism”, and I think it gets back to EA’s roots.
Happy to have a call if you’re interested in bouncing ideas around.
This seems super useful! Would you be willing to let Rob Miles’s aisafety.info use this as seed content? Our backend is already in Google Docs, so if you moved those files to this drive folder we could rename them to have a question-shaped title and they’d be synced in and kept up to date by our editors, or we could copy these if you’d like to have your original separate.