There’s been a bit more buzz around AI this week following an open letter released by the Center for AI Safety and signed by a group of AI experts and other high-profile figures. The open letter contained only one line: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”
Listen to the full interview below.
Originally published by BFM.