I think calling their opinions “ungrounded speculation” is an entirely valid opinion, although I would personally use the more diplomatic term “insufficiently grounded speculation”. She acknowledges that they have reasons for their speculation, but does not find those reasons to be sufficiently grounded in evidence.
I do not like that her stating her opinions and arguments politely and in good faith is being described as “aggressive”. I think this kind of hostile attitude towards skeptics could be detrimental to the intellectual health of the movement.
As for your alignment thoughts, I have heard the arguments and disagree with them, but I’ll just link to my post on the subject rather than drag it in here.
I think calling their opinions “ungrounded speculation” is an entirely valid opinion, although I would personally use the more diplomatic term “insufficiently grounded speculation”.
I disagree on that. Whether said politely or not, it dismisses another’s views without offering any argument at all. It’s like saying “you’re talking bullshit”. Now, if you do that and then follow up with “because, as I can demonstrate, facts A and B clearly contradict your claim”, then that may be okay. But she didn’t do that.
She could have said things like “I don’t understand your argument”, or “I don’t see evidence for claim X”, or “I don’t believe Y is possible, because …”. Even better would have been to ask: “Can you explain to me why you think an AI could become uncontrollable within the next 20 years?”, and then respond to the arguments.
I think we’ll just have to disagree on this point. I think it’s perfectly fine to (politely) call bullshit, if you think something is bullshit, as long as you follow it up with arguments as to why you think that (which she did, even if you think the arguments were weak). I think EA could benefit from more of a willingness to call out emperors with no clothes.
I think it’s perfectly fine to (politely) call bullshit, if you think something is bullshit, as long as you follow it up with arguments as to why you think that
Agreed.
(which she did, even if you think the arguments were weak)
That’s where we disagree: strong claims (“Two Turing-award winners are talking nonsense when they point out the dangers of the technology they developed”) require strong evidence.