Yahoo Web Search

Search results

  1. Jun 5, 2022 · The fact that, twenty-one years into my entering this death game, seven years into other EAs noticing the death game, and two years into even normies starting to notice the death game, it is still Eliezer Yudkowsky writing up this list, says that humanity still has only one gamepiece that can do that.

  2. Jun 10, 2022 · Eliezer Yudkowsky argues that artificial general intelligence (AGI) will kill you unless alignment is solved with high probability. He lists several reasons why AGI alignment is lethally difficult and why current approaches are insufficient.

  3. Mar 29, 2023 · Eliezer Yudkowsky, one of the earliest researchers to analyze the prospect of powerful Artificial Intelligence, now warns that we've entered a bleak scenario

  5. Jun 10, 2023 · In the early 2000s, a young writer named Eliezer Yudkowsky began warning that A.I. could destroy humanity. His online posts spawned a community of believers. Called rationalists or...

  6. May 12, 2023 · "Summary of 'AGI Ruin: A List of Lethalities'" by Stephen McAleese. This post is a summary of Eliezer Yudkowsky's post "AGI Ruin: A List of Lethalities". I wrote it because I think the original post is longer and less organized than I would like it to be.

  7. The claimed truth of several of these 'true things' is often backed up by nothing more than Eliezer's best-guess informed-gut-feeling predictions about what future AGI must necessarily be like.