Yahoo Web Search

Search results

  1. Mar 29, 2023 · Pausing AI Developments Isn’t Enough. We Need to Shut It All Down | TIME. 11 minute read. Illustration for TIME by Lon Tweeten. By Eliezer Yudkowsky....

  2. Sep 7, 2023 · Co-Founder, Machine Intelligence Research Institute. Illustration by TIME; reference image courtesy of Eliezer Yudkowsky. By Will Henshall. In February, Sam...

  3. Sep 7, 2023 · In March, TIME published an essay by AI safety advocate Eliezer Yudkowsky that prompted discussion in the White House press briefing room about the Biden Administration’s plan on AI. By...

  4. Decision theorist Eliezer Yudkowsky has a simple message: superintelligent AI could probably kill us all. So the question becomes: Is it possible to build powerful artificial minds that are obedient, even benevolent? In a fiery talk, Yudkowsky explores why we need to act immediately to ensure smarter-than-human AI systems don't lead to our extinction.

  5. In a 2023 op-ed for Time magazine, Yudkowsky discussed the risk of artificial intelligence and proposed action that could be taken to limit it, including a total halt on the development of AI, or even "destroy[ing] a rogue datacenter by airstrike".

  6. Jun 10, 2023 · In the early 2000s, a young writer named Eliezer Yudkowsky began warning that A.I. could destroy humanity. His online posts spawned a community of believers.

  7. Apr 5, 2023 · 17 min read. FLI put out an open letter, calling for a 6 month pause in training models more powerful than GPT-4, followed by additional precautionary steps. Then Eliezer Yudkowsky put out a post in Time, which made it clear he did not think that letter went far enough.
