Co-Founder and CTO at Empiricast
yhoiseth
Another way to approach the goal of improved decision-making is to use other mechanisms to improve predictions. We’re trying a simpler variant at Empiricast. It’s similar to Metaculus, but (currently) for internal use in organizations.
An idea for Task Y: Mentoring people a bit younger than oneself.
Tyler Cowen writes in The high-return activity of raising others’ aspirations:
At critical moments in time, you can raise the aspirations of other people significantly, especially when they are relatively young, simply by suggesting they do something better or more ambitious than what they might have in mind. It costs you relatively little to do this, but the benefit to them, and to the broader world, may be enormous.
This is in fact one of the most valuable things you can do with your time and with your life.
I think many young people today lack good mentors. Their peers are their own age, and the last person you want advice from as a 14-year-old is another 14-year-old. And parents, teachers and other grown-ups may not have the time, inclination, knowledge and/or skills to be very effective mentors. In any case, the age gap is often a bit too large.
A program where EAs systematically mentored, nudged and helped people up to, say, 15 years younger than themselves could (I think) scale and be effective.
[Question] What open source projects should effective altruists contribute to?
[Question] Is any EA organization using or considering using Buterin et al.’s mechanism for matching funds?
Yeah, this and fraud are potential problems. They’re discussed in section 5.2, “Collusion and deterrence” (pages 15 to 19).
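For readers unfamiliar with the mechanism: under the Buterin et al. (“Liberal Radicalism”) rule, each project is funded at the square of the sum of the square roots of its individual contributions, and the matching subsidy is the gap between that amount and the raw contributions, scaled down if the matching pool is too small. A minimal sketch (function and variable names are mine, not from the paper):

```python
from math import sqrt

def clr_match(contribution_lists, budget):
    """Quadratic funding matches for a list of projects.

    contribution_lists: one list of individual contributions per project.
    budget: size of the matching pool; subsidies are scaled to fit it.
    """
    raw = []
    for contributions in contribution_lists:
        total = sum(contributions)
        # Funding level = (sum of square roots of contributions)^2.
        ideal = sum(sqrt(c) for c in contributions) ** 2
        raw.append(max(ideal - total, 0.0))  # subsidy before scaling
    raw_total = sum(raw)
    scale = min(1.0, budget / raw_total) if raw_total > 0 else 0.0
    return [r * scale for r in raw]
```

Note how the formula itself exposes the collusion problem discussed in section 5.2: four contributors giving 1 each attract a subsidy of (4 × √1)² − 4 = 12, while one contributor giving 4 attracts none, so splitting a single donation across fake accounts inflates the match.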
Tool recommendation: Polar personal knowledge repository
Sweet. What’s the forecasting application about?
The RadicalxChange movement is very explicit about engaging artists. To learn about the movement, I recommend this 80,000 Hours episode.
Other than that, a lot of startups have short explainer videos above the fold on their homepage. See for example https://frontapp.com/. Such companies optimize their homepages rigorously, so it’s reasonable to assume that these videos are effective. I can imagine that a lot of EA-related organizations would benefit greatly from such videos.
In general I’ve noticed a pattern (of which the above two linked posts are an example) where 80k posts something like “our posts stating that ‘A is true’ have inadvertently caused many people to believe that A is true, here’s why A is actually false” while leaving up the old posts that say ‘A is true’ (sometimes without even a note that they might be outdated). This is especially bad when the older ‘A is true’ content is linked conveniently from the front page while the more recent updates are buried in blog history.
Do you have examples of this?
Why We Sleep — a tale of institutional failure
What do you mean?
Modular empirical science
Thanks, that’s great! So you are working with Chris Chambers on this?
Great post. Reminds me of Eric Weinstein on excellence vs. genius: https://youtu.be/bsgWSPWX-6A?t=553
[Startup to improve predictions]
I’m currently working on the startup https://www.primeprediction.com/. We aim to help organizations make better decisions by improving their prediction capabilities.
We’re very early stage and are learning more about the problems people face when making predictions and forecasts.
I’ll be happy to answer any questions you may have. I’d also love to hear your feedback, especially about concrete problems you have faced in your line of work for which our product could be relevant.