Yahoo Web Search

Search results

  1. Mar 29, 2023, 6:01 PM EDT · By Eliezer Yudkowsky. Yudkowsky is a decision theorist from the U.S. and leads research at the Machine Intelligence Research Institute. He's been working on...

  2. Time op-ed. In a 2023 op-ed for Time magazine, Yudkowsky discussed the risk of artificial intelligence and proposed action that could be taken to limit it, including a total halt on the development of AI, [13] [14] or even "destroy[ing] a rogue datacenter by airstrike". [5]

  3. Apr 5, 2023 · By Zvi Mowshowitz. FLI put out an open letter calling for a 6-month pause in training models more powerful than GPT-4, followed by additional precautionary steps. Then Eliezer Yudkowsky put out a post in Time, which made it clear he did not think that letter went far enough.
  4. Jun 10, 2023 · Last month, hundreds of well-known people in the world of artificial intelligence signed an open letter warning that A.I. could one day destroy humanity. “Mitigating the risk of...

  5. May 19, 2023 · That month, Time magazine published an op-ed from AI researcher Eliezer Yudkowsky warning that the result of building super-intelligent machines would be that “literally everyone on Earth will die.”