Search results
Aug 27, 2023 · Based on these considerations, I conclude that Eliezer’s view is legitimately crazy. There is, quite literally, no good reason to believe it, and lots of evidence against it. Eliezer just dismisses that evidence, for no good reason, bites a million bullets, and acts like that’s the obvious solution.
Dec 27, 2023 · Written for the LessWrong 2022 Review. In the existential AI safety community, there is an ongoing debate between positions situated differently on some axis which doesn't have a common agreed-upon name, but where Christiano and Yudkowsky can be regarded as representatives of the two directions [1].
Mar 21, 2023 · I recently watched Eliezer Yudkowsky's appearance on the Bankless podcast, where he argued that AI was nigh-certain to end humanity. Since the podcast, some commentators have offered pushback against the doom conclusion.
Aug 24, 2017 · Predictably Wrong - LessWrong. Aug 24, 2017 by Eliezer Yudkowsky. This, the first book of "Rationality: AI to Zombies" (also known as "The Sequences"), begins with cognitive bias. The rest of the book won’t stick to just this topic; bad habits and bad ideas matter, even when they arise from our minds’ contents as opposed to our minds’ structure.