I am happy to see this. Have you messaged people on the EA and epistemics slack?
Here are some epistemics projects I am excited about:
Polymarket and Nate Silver—It looks to me that forecasting accounted for 1–5% of the reason the Democrats dropped Biden from their ticket. Being able to rapidly see the drop in his percentage chance of winning during the debate, to hold focus on that poor performance over the following weeks, and to see momentum build for other candidates all seemed powerful[1].
X Community Notes—that one of the largest social media platforms in the world has a truth-seeking process with good incentives is great. For all Musk’s faults, he has pushed for this and it is to his credit. I think someone should run a think tank to lobby X and other organisations towards even better truth-seeking.
The Swift Centre—large conflict of interest here, since I forecast for them, but as a forecasting consultancy that manages to stand largely (entirely?) without grant funding, just taking standard business gigs, it is the one I would recommend if I were suggesting epistemics consulting. The Swift Centre is a professional organisation that has worked with DeepMind and the Open Nuclear Network.
Discourse mapping—Many discussions recur without moving us forward. Personally I’m really excited about trying to find consensus positions, freeing attention for more important questions. Here is the site my team mocked up for Control AI, but I think we could have similar discourse mapping for SB 1047 or for different approaches to AI safety.
The Forum’s AI Welfare Week—I enjoyed a week of focus on a single topic. I reckon if we did about 10 of these we might really start to get somewhere. Perhaps with clustering of participants into groups based on their positions on some initial spectra.
Sage’s Fatebook.io—a tool for quickly making and tracking forecasts. It is the only tool I’ve found where, when I show it to non-forecasting business people, they say “oh, what’s that, can I use it?”. I think Sage should charge for it and push it as a standard SaaS product.[2]
And a quick note:
An example of a potential project here: a consultancy that provides organisations with support in improving their epistemics.
I think the obvious question here should be “how would you know such a consultancy has good epistemics?”
As a personal note, I’ve been building epistemic tools for years, e.g. estimaker.app, or casting around for forecasting questions to write on. The FTXFF was pretty supportive of this stuff, but since its fall I’ve not felt like big EA finds my work particularly interesting or worthy of support. Many of the people I see doing interesting tinkering work like this end up moving to AI Safety.
Not that powerful and positively impactful are the same thing, but here people who said Biden was too old should be glad he is gone, by their own lights.
Though maybe we let Adam finish his honeymoon first. Congratulations to the happy couple!
curious where you’re getting this from?
I made it up[1].
But, as I say in the following sentences, it seems plausible to me that without betting markets keeping the numbers accessible and Silver continuing to push on them, it would have taken longer for the initial crash to become visible, it could have faded from the news, and it could have been hard to see that other candidates were gaining momentum.
All of these changes would seem to increase the chance of Biden staying in, which was on a knife edge for a long time.
https://nathanpmyoung.substack.com/p/forecasting-is-mostly-vibes-so-is
thanks for the response!
looks like the link in the footnotes is private. maybe there’s a public version you could share?
re: the rest — makes sense. 1%-5% doesn’t seem crazy to me, i think i would’ve “made up” 0.5%-2%, and these aren’t way off.
How about now https://nathanpmyoung.substack.com/p/forecasting-is-mostly-vibes-so-is
works! thx
… there is an EA and epistemics slack?? (cool!) if it’s free for anyone to join, would you be able to send me an access link or somesuch?
Invited! Others who are somewhat active in the space should feel free to ping me for invites.