> Give me the obvious stuff

I expect that the people who read shortforms on the EA Forum aren't the ones who would give useful advice, and I think there are a lot of people who would be happy to advise someone with your skills.
Related to “my own social circle is not worried or knowledgeable about AGI”: might it make sense to spend time networking with people working on AI Safety and getting a feel for the needs and opportunities in the area, e.g. by joining discussion groups?
Still, some random questions on plan A, asked as someone who is worried about AI but not knowledgeable:
Why product work only for meta orgs? A random example that I know you know about: a Senior Software Engineer role at Anthropic, where they were looking for someone to help with some dev tooling. That seems to require product skills / understanding what users actually need. (Not asking about Anthropic in particular, but about non-meta orgs in general.)
What would make it easier to clear the bottleneck of talking to actual users with pain points?
What happened to the idea of internal prediction markets for EA orgs? I think it has potential, and an MVP could be simple enough. For example, I received this proposal for a freelance project a few days ago from a longtermist (non-AI-safety) EA org, which made me update positively towards the general idea:
> we want an app that lets people bet “[edited] bucks” via Slack; when a bet expires, a moderator says whether they won or lost, and this adjusts their balance. If this data were fed into Airtable, I could then build some visualisations etc.
>
> This would involve a Slack bot/app hosted in the cloud and an Airtable integration.
>
> Let me know what you think! I’m super excited about this for helping us hone our decision-making over time, i.e. getting everyone in the habit of betting on outcomes, which apparently is a great way to get around things like the planning fallacy :D Also, the /bet Slack interface seems very low friction and would be very easy for people to interact with.
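To make the MVP concrete, here is a minimal sketch of the betting ledger that could sit behind a /bet slash command. Everything here (the `BetLedger` class, the starting balance, double-the-stake payouts) is an illustrative assumption, not something from the proposal; the real bot would wrap this in a Slack slash-command handler and sync the records to Airtable.

```python
# Hypothetical core logic for a /bet Slack bot. The Slack and Airtable
# integrations are out of scope; this only models stakes and balances.
from dataclasses import dataclass


@dataclass
class Bet:
    user: str
    amount: int
    claim: str
    resolved: bool = False


class BetLedger:
    def __init__(self, starting_balance: int = 100):  # assumed default
        self.starting_balance = starting_balance
        self.balances: dict[str, int] = {}
        self.bets: list[Bet] = []

    def place_bet(self, user: str, amount: int, claim: str) -> int:
        """Handle `/bet <amount> <claim>`: escrow the stake, return a bet id."""
        balance = self.balances.setdefault(user, self.starting_balance)
        if amount <= 0 or amount > balance:
            raise ValueError("stake must be positive and within the balance")
        self.balances[user] = balance - amount
        self.bets.append(Bet(user, amount, claim))
        return len(self.bets) - 1

    def resolve(self, bet_id: int, won: bool) -> None:
        """Moderator call: pay out double the stake on a win, nothing on a loss."""
        bet = self.bets[bet_id]
        if bet.resolved:
            raise ValueError("bet already resolved")
        bet.resolved = True
        if won:
            self.balances[bet.user] += 2 * bet.amount

    def rows_for_airtable(self) -> list[dict]:
        """Flatten state into records a sync job could push to Airtable."""
        return [
            {"user": b.user, "amount": b.amount,
             "claim": b.claim, "resolved": b.resolved}
            for b in self.bets
        ]
```

A cloud-hosted handler would then just parse the slash-command text into `place_bet`, and a moderator command into `resolve`, which matches the low-friction flow described above.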
Not sure if any of this helps, but I’m really excited to see whatever you end up choosing!
> Related to “my own social circle is not worried or knowledgeable about AGI”: might it make sense to spend time networking with people working on AI Safety

I don’t think it would help with the social aspect I’m trying to point at.
> and getting a feel for the needs and opportunities in the area
[...]
> What would make it easier to clear the bottleneck of talking to actual users with pain points?

I think it’s best if one person does the user research, instead of each person like me bothering the AGI researchers (?)

I’m happy to talk to any such person who’ll talk to me and summarize whatever there is for others to follow, if I don’t pick it up myself.
> e.g. by joining discussion groups?

Could be nice, actually.
> Why product work only for meta orgs?

I mean “figure out what AGI researchers need” [which is a “product” task] and help do that [which helps the community, rather than helping the research directly].
> Internal prediction markets

I’m in touch with them and have basically said “yes”, but they want someone full time, and by default I don’t think I’ll be available; I’m still advancing and checking it.