Why expect global consensus to lead to strong action, when it hasn't worked for climate change?
Because the proof (an AI straight up saying "I think I'm going to kill literally every human if I'm ever deployed") is a lot less ambiguous, and the stakes are actually a fair bit higher.
But it still might not work.
There will be contingency plans for that case, but we shouldn't talk about them.