Rationalist community
The rationalist community is a 21st-century movement that formed around a group of internet blogs, primarily LessWrong and Astral Codex Ten (formerly known as Slate Star Codex). The movement initially gained prominence in the San Francisco Bay Area. Its members seek to use rationality to avoid cognitive biases. Common interests include transhumanism, probability, effective altruism, and mitigating existential risk from artificial general intelligence.
Description
Rationalists are concerned with applying science and probability to various topics,[1] with special attention to Bayesian inference.[2] According to Ellen Huet, the rationalist community "aim[s] to keep their thinking unbiased, even when the conclusions are scary".[3]
One of the main interests of the rationalist community is combating existential risk posed by the emergence of an artificial superintelligence.[4][5] Many members of the rationalist community believe that it is one of the only communities that have a chance at saving humanity from extinction.[6][7][8] The stress associated with this consequential responsibility has been a contributing factor to mental health crises among several rationalists.[9][10]
The early rationalist blogs LessWrong and Slate Star Codex attracted a STEM-interested audience that cared about self-improvement, and was suspicious of the humanities and how emotions inhibit rational thinking.[11] The movement attracted the attention of the founder culture of Silicon Valley, leading to many shared cultural shibboleths and obsessions, especially optimism about the ability of intelligent capitalists and technocrats to create widespread prosperity.[12][13]
Writing for The New Atlantis, Tara Burton describes rationalist culture as having a "technocratic focus on ameliorating the human condition through hyper-utilitarian goals",[14] with the "distinctly liberal optimism... that defines so much of Silicon Valley ideology — that intelligent people, using the right epistemic tools, can think better, and save the world by doing so".[15] Burton writes that "Central to the rationalist worldview was the idea that nothing — not social niceties, not fear of political incorrectness, certainly not unwarranted emotion — could, or should, get between human beings and their ability to apprehend the world as it really is".[16]
Bloomberg Businessweek journalist Ellen Huet adds that the rationalist movement "valorizes extremes: seeking rational truth above all else, donating the most money and doing the utmost good for the most important reason. This way of thinking can lend an attractive clarity, but it can also provide cover for destructive or despicable behavior".[17] In particular, several women in the community have made allegations of sexual misconduct, including abuse and harassment, which they describe as pervasive and condoned.[18]
Writing in The New Yorker, Gideon Lewis-Kraus argues that rationalists "have given safe harbor to some genuinely egregious ideas," such as scientific racism and neoreactionary views, and that "the rationalists' general willingness to pursue orderly exchanges on objectionable topics, often with monstrous people, remains not only a point of pride but a constitutive part of the subculture's self-understanding."[19] Though this attitude is based on "the view that vile ideas should be countenanced and refuted rather than left to accrue the status of forbidden knowledge",[19] rationalists also hold the view that other ideas, referred to as information hazards, are dangerous and should be suppressed.[20] Roko's basilisk and the writings of Ziz LaSota are commonly cited information hazards among rationalists.[9]
History
LessWrong was founded in 2009,[21] although the community had previously existed across various blogs on the Internet, including Overcoming Bias (founded 2006). Slate Star Codex was launched in 2013, and its successor blog, Astral Codex Ten, was launched on January 21, 2021.[22][23][24]
Eliezer Yudkowsky created LessWrong and is regarded as a major figure within the movement. From 2010 to 2015 he also published the Harry Potter fanfiction Harry Potter and the Methods of Rationality, which drew readers to LessWrong and the rationalist community.[25][26] The fanfiction was highly popular and remains well regarded within the rationalist community.[27][28] Yudkowsky has used the work to solicit donations for the Center for Applied Rationality, which teaches courses based on it,[29][30] and a 2013 LessWrong survey revealed that a quarter of its users had found the site through the fanfiction.[31]
In the 2010s, the rationalist community emerged as a major force in Silicon Valley.[32][33] Billionaires Elon Musk, Peter Thiel, and Vitalik Buterin have donated to rationalist-associated institutions.[34][35] Bay Area organizations associated with the rationalist community include the Center for Applied Rationality, which teaches the techniques of rationality espoused by rationalists, and the Machine Intelligence Research Institute, which conducts research on AI safety.[36][37][38]
While the movement has online origins, the community is also active and close-knit offline. The community is especially active in the San Francisco Bay Area, where many rationalists live in intentional communities and engage in polyamorous relationships with other rationalists.[39][40][41] Some members and former members of the community have described it as "cult-like".[42][43]
Offshoots and overlapping movements
The borders of the rationalist community are blurry and subject to debate among the community and adjacent groups.[44] Members who have drifted from traditional rationalist beliefs often self-describe as "rationalist-adjacent", "post-rationalist" (also known as "TPOT"[45]) or "EA-adjacent".[46]
Effective altruism and transhumanism
The rationalist community has a large overlap with effective altruism[47][48] and transhumanism.[49] Critics such as computer scientist Timnit Gebru and philosopher Émile P. Torres additionally link rationalists with other philosophies they collectively name TESCREAL: Transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism.[50]
Postrationalists
The postrationalists are a loose group of one-time rationalists who became disillusioned with the rationalist community, which they came to perceive as cultlike[51] and as having lost focus on the less quantifiable elements of a well-lived human life.[14] This community also goes by the acronym TPOT, standing for This Part of Twitter.[52][45] The term postrationalist is also used as a hedge by people in the community who have drifted from its orthodoxy.[46]
Zizians
The Zizians are a splinter group[53] with an ideological emphasis on veganism and anarchism, which became well known in 2025 when members were suspected of involvement in four murders.[54] The Zizians originally formed around the Bay Area rationalist community, but became disillusioned with other rationalist organizations and leaders. The Zizians' accusations against those organizations and leaders included anti-transgender discrimination, misuse of donor funds to pay off a sexual misconduct accuser, and failure to value animal welfare in plans for human-friendly AI.[55]