Australian attitudes towards existential risks from AI

"Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war."

Do you support or oppose this statement?