Search results
Mar 29, 2023, 6:01 PM EDT · By Eliezer Yudkowsky. Yudkowsky is a decision theorist from the U.S. and leads research at the Machine Intelligence Research Institute. He's been working on...
- Eliezer Yudkowsky: The 100 Most Influential People in ... - TIME
By Will Henshall. September 7, 2023 7:00 AM EDT. In...
- Eliezer Yudkowsky | Time
May 20, 2024 · "We Need to Shut It All Down": Yudkowsky in TIME. Read the original article in TIME Magazine:...
- Rob Miles' Reading List
Jun 10, 2023 · In the early 2000s, a young writer named Eliezer Yudkowsky began warning that A.I. could destroy humanity. His online posts spawned a community of believers. Called rationalists or effective ...
Apr 5, 2023 · FLI put out an open letter calling for a 6-month pause in training models more powerful than GPT-4, followed by additional precautionary steps. Then Eliezer Yudkowsky put out a post in Time which made it clear he did not think that letter went far enough.
Decision theorist Eliezer Yudkowsky has a simple message: superintelligent AI could probably kill us all. So the question becomes: Is it possible to build powerful artificial minds that are obedient, even benevolent?