5/30/2023: AI Extinction Risk!!??
This morning a group of prominent AI researchers released a joint statement, shown above: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.” Besides those 23 words, there are no other details. I wish more rationale had been given for such a significant statement. What does this mean? If we don’t deal with AI risk, we may die? What are we supposed to do?