Yahoo Web Search

Search results

  1. Jun 10, 2022 · This post is a summary of Eliezer Yudkowsky’s post “AGI Ruin: A List of Lethalities”. I wrote it because I think the original post is longer and less organized than I would like it to be. The purpose of this post is to summarize the main points in the original post and structure the points in a new layout that I hope will make it easier ...

  2. Mar 29, 2023 · Eliezer Yudkowsky, one of the earliest researchers to analyze the prospect of powerful Artificial Intelligence, now warns that we've entered a bleak scenario

  4. Jun 10, 2022 · I have several times failed to write up a well-organized list of reasons why AGI will kill you. People come in with different ideas about why AGI would be survivable, and want to hear different obviously key points addressed first.

  5. Jun 10, 2023 · In the early 2000s, a young writer named Eliezer Yudkowsky began warning that A.I. could destroy humanity. His online posts spawned a community of believers.

  6. Nov 11, 2021 · Summarizing the above two points, I suspect that I’m in more-or-less the “penultimate epistemic state” on AGI timelines: I don’t know of a project that seems like they’re right on the brink; that would put me in the “final epistemic state” of thinking AGI is imminent.

  7. AGI Ruin: A List of Lethalities. Preamble: (If you're already familiar with all basics and don't want any preamble, skip ahead to Section B for technical difficulties of alignment proper.) I have several times failed to write up a well-organized list of reasons why AGI will kill you.