My main altruistic endeavor involves thinking and writing about ideas that seem important and neglected. Here is a list of the specific risks that I'm trying to manage/mitigate in the course of doing this. What other risks am I overlooking or not paying enough attention to, and what additional mitigations should I be doing?
Risk: Being wrong or overconfident, and thereby distracting people or harming the world with bad ideas.
Mitigation: Think twice about my ideas/arguments. Look for counterarguments/risks/downsides. Try to maintain appropriate uncertainty and convey it in my writings.

Risk: The idea isn't bad, but some people take it too seriously or too far.
Mitigation: Convey my uncertainties. Monitor subsequent discussions and argue against people taking my ideas too seriously or too far.

Risk: Causing differential intellectual progress in an undesirable direction, e.g., speeding up AI capabilities relative to AI safety, or spreading ideas that are more useful for doing harm than doing good.
Mitigation: Check ideas/topics for this risk. Self-censor ideas or switch research topics if the risk seems high.

Risk: Being the first to talk about some idea, but not developing/pursuing it as vigorously as someone else might have if they had been first, thereby causing a net delay in intellectual or social progress.
Mitigation: Not sure what to do about this one. So far I'm not doing anything about it except thinking about it.

Risk: PR/political risks, e.g., talking about something that damages my reputation or relationships, and in the worst case harms people/causes/ideas associated with me.
Mitigation: Keep this in mind, and talk more diplomatically or self-censor when appropriate.
There's also the unilateralist's curse: suppose 20 people independently consider publishing an essay about a dangerous, viral idea. Even if 19 of them correctly judge it to be net-negative and refrain, it only takes the one person who misjudges it as net-positive to publish it anyway, so the most optimistic judgment in the group ends up deciding the outcome.