Yahoo Web Search

Search results

  1. Mar 29, 2023 · By Eliezer Yudkowsky, March 29, 2023 6:01 PM EDT. Yudkowsky is a decision theorist from the U.S. and leads research at the Machine Intelligence Research Institute. He's been working on...

  2. May 20, 2024 · "We Need to Shut It All Down": Yudkowsky in TIME. Rob Miles' Reading List. Read the original article in TIME Magazine:...

  3. Jun 10, 2023 · In the early 2000s, a young writer named Eliezer Yudkowsky began warning that A.I. could destroy humanity. His online posts spawned a community of believers. Called rationalists or effective ...

  4. Apr 5, 2023 · FLI put out an open letter calling for a 6-month pause in training models more powerful than GPT-4, followed by additional precautionary steps. Then Eliezer Yudkowsky put out a post in TIME, which made it clear he did not think that letter went far enough.

  5. Decision theorist Eliezer Yudkowsky has a simple message: superintelligent AI could probably kill us all. So the question becomes: Is it possible to build powerful artificial minds that are obedient, even benevolent?
