AI experts sign doc comparing risk of ‘extinction from AI’ to pandemics, nuclear war

  • 📰 Cointelegraph


Top AI experts sign a statement saying that AI poses a threat to humankind’s existence.

The statement, published by the Center for AI Safety, reads: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.” Signatories include the University of California, Berkeley’s Stuart Russell and the Massachusetts Institute of Technology’s Lex Fridman. Musician Grimes is also a signatory, listed under the “other notable figures” category. While the statement may appear innocuous on the surface, the underlying message is a somewhat controversial one in the AI community.

A seemingly growing number of experts believe that current technologies may, or inevitably will, lead to the development of an AI system capable of posing an existential threat to the human species. Their views, however, are countered by a contingent of experts with diametrically opposed opinions. Meta chief AI scientist Yann LeCun, for example, has argued that “super-human AI is nowhere near the top of the list of existential risks” and that “until we have a basic design for even dog-level AI, discussing how to make it safe is premature.”
