@Kale Ridsdale asked me if I built this specifically with the EA community in mind (and also shared a similar tool called Sway with me). Here was my reply:
I didn’t build this specifically with the EA community in mind, but I realized it would probably appeal to EAs (on average) more than the general public (given their interest in polling tools like Manifold Markets and Metaculus, as well as their interest in AI alignment and civil discourse).
Thanks for sharing Simon Cullen’s Sway! It looks like his motivation is very similar to mine. My initial motivation was that I find people often disagree with someone on one issue and then assume they disagree with them on everything. I’m hoping that being able to see some areas of agreement will help create mutual respect and better social cohesion.
After building it, I realized it might be hard to gain initial traction, since the value of a social platform comes from having lots of people on it, meaning the first few users would struggle to get value. So in addition to letting you compare your opinions/values with other humans, I pivoted a bit toward letting you compare them with AI, which I thought would be interesting and could also be valuable from an AI alignment/safety perspective (i.e. seeing where your opinions/values diverge from AI’s). I thought this would be of particular interest to many people in the EA community.
Have you looked into liquid democracy?