It seems plausible to me that legal liability issues could be used to slow down AI development, at least in the West. But that doesn’t mean that donating to legal assistance would be a good use of funds. My sense is that there are many plaintiffs armed with plenty of money to fund their own lawsuits, and some of those lawsuits have already happened.
What might be helpful, however, would be amicus briefs from AI alignment, development, or governance organizations, arguing that AI developers should face liability for errors in or misuse of their products. That seems like something EA funders might want to consider.
Actually, there are many plaintiffs I’m in touch with (especially those representing visual artists, writers, and data workers) who need funds to pay for legal advice and to start class-action lawsuits, since they would have to pay court fees if a case is unsuccessful.
amicus briefs from AI alignment, development, or governance organizations, arguing that AI developers should face liability for errors in or misuse of their products.
Sounds like a robustly useful way to raise awareness of the product-liability issues around buggy spaghetti code.