Rationalist community

Internet community

The rationalist community is a 21st-century philosophical movement that formed around a group of internet blogs, primarily LessWrong and Astral Codex Ten (formerly known as Slate Star Codex). The movement initially gained prominence in the San Francisco Bay Area. Its members seek to use rationality to avoid cognitive biases. Common interests include probability, effective altruism, transhumanism, and mitigating existential risk from artificial general intelligence.

The borders of the rationalist community are blurry and subject to debate among the community and adjacent groups.[1] Members who have drifted from traditional rationalist beliefs often self-describe as "rationalist-adjacent", "post-rationalist" (also known as "ingroup" and "TPOT", an acronym for "this part of Twitter"[2]) or "EA-adjacent".[3]

Description

Rationality

The rationalists are concerned with applying science and probability to various topics,[4] with special attention to Bayesian inference.[5] According to Ellen Huet, the rationalist community "aim[s] to keep their thinking unbiased, even when the conclusions are scary".[6]

The early rationalist blogs LessWrong and Slate Star Codex attracted an audience interested in STEM and self-improvement, suspicious of the humanities, and wary of the ways emotions inhibit rational thinking.[7] The movement attracted the attention of Silicon Valley's founder culture, leading to many shared cultural shibboleths and obsessions, especially optimism about the ability of intelligent capitalists and technocrats to create widespread prosperity.[8][9]

Writing for The New Atlantis, Tara Isabella Burton describes rationalist culture as having a "technocratic focus on ameliorating the human condition through hyper-utilitarian goals",[10] with the "distinctly liberal optimism... that defines so much of Silicon Valley ideology — that intelligent people, using the right epistemic tools, can think better, and save the world by doing so".[11] Burton writes that "Central to the rationalist worldview was the idea that nothing — not social niceties, not fear of political incorrectness, certainly not unwarranted emotion — could, or should, get between human beings and their ability to apprehend the world as it really is".[12]

AI safety

One of the main interests of the rationalist community is combating existential risk posed by the emergence of an artificial superintelligence.[13][14] Many members of the rationalist community believe that it is one of the only communities that has a chance at saving humanity from extinction.[15][16][17] The stress associated with this consequential responsibility has been a contributing factor to mental health crises among several rationalists.[18][19]

Extreme values

Bloomberg Businessweek journalist Ellen Huet adds that the rationalist movement "valorizes extremes: seeking rational truth above all else, donating the most money and doing the utmost good for the most important reason. This way of thinking can lend an attractive clarity, but it can also provide cover for destructive or despicable behavior".[20] In particular, several women in the community have made allegations of sexual misconduct, including abuse and harassment, which they describe as pervasive and condoned.[21]

Writing in The New Yorker, Gideon Lewis-Kraus argues that rationalists "have given safe harbor to some genuinely egregious ideas," such as scientific racism and neoreactionary views, and that "the rationalists' general willingness to pursue orderly exchanges on objectionable topics, often with monstrous people, remains not only a point of pride but a constitutive part of the subculture's self-understanding."[22] Though this attitude is based on "the view that vile ideas should be countenanced and refuted rather than left to accrue the status of forbidden knowledge",[22] rationalists also hold the view that other ideas, referred to as information hazards, are dangerous and should be suppressed.[23] Roko's basilisk and the writings of Ziz LaSota are commonly cited information hazards among rationalists.[18]

Some members and former members of the community have said that aspects of the community and its organizations are cult-like.[24][25][26] In The New York Times, religious scholar Greg Epstein stated: "When you think about the billions at stake and the radical transformation of lives across the world because of the eccentric vision of this group, how much more cult-y does it have to be for this to be a cult? Not much."[27]

Lifestyle

While the movement has online origins, the community is also active and close-knit offline. The community is especially active in the San Francisco Bay Area, where many rationalists live in intentional communities and engage in polyamorous relationships with other rationalists.[28][29][30]

History

LessWrong was founded in 2009,[31] although the community had previously existed on various internet blogs, including Overcoming Bias (founded in 2006). Slate Star Codex was launched in 2013, and its successor blog, Astral Codex Ten, was launched on January 21, 2021.[32][33][34]

Eliezer Yudkowsky created LessWrong and is regarded as a major figure within the movement. From 2010 to 2015 he also published the Harry Potter fanfiction Harry Potter and the Methods of Rationality, which drew readers to LessWrong and the rationalist community.[35][36] The work was highly popular and remains well regarded within the community.[37][38] Yudkowsky has used it to solicit donations for the Center for Applied Rationality, which teaches courses based on it,[39][40] and a 2013 LessWrong survey found that a quarter of respondents had discovered the site through the fanfiction.[41]

In the 2010s, the rationalist community emerged as a major force in Silicon Valley.[42][43] Silicon Valley founders such as Elon Musk, Peter Thiel, Vitalik Buterin, Dustin Moskovitz, and Jaan Tallinn have donated to rationalist-associated institutions or otherwise supported rationalist figures.[44][45][27] The movement has directed hundreds of millions of dollars towards companies, research labs, and think tanks aligned with its objectives, and was influential in the abortive removal of Sam Altman from OpenAI.[27]

Bay Area organizations associated with the rationalist community include the Center for Applied Rationality, which teaches the techniques of rationality espoused by rationalists, and the Machine Intelligence Research Institute, which conducts research on AI safety.[46][47][48]

Overlapping movements and offshoots

The borders of the rationalist community are blurry and subject to debate among the community and adjacent groups.[1] The rationalist community has a large overlap with effective altruism[49][50] and transhumanism.[51] Critics such as computer scientist Timnit Gebru and philosopher Émile P. Torres link rationalists with other philosophies they collectively name TESCREAL: Transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism.[52] Members who have drifted from traditional rationalist beliefs often self-describe as "rationalist-adjacent", "post-rationalist" (also known as "ingroup" and "TPOT", an acronym for "this part of Twitter"[2]) or "EA-adjacent".[3]

Effective altruism

Effective altruism (EA) is a 21st-century philosophical and social movement that advocates impartially calculating benefits and prioritizing causes to provide the greatest good. It is motivated by "using evidence and reason to figure out how to benefit others as much as possible, and taking action on that basis".[53][54] People who pursue the goals of effective altruism, who are sometimes called effective altruists,[55] follow a variety of approaches proposed by the movement, such as donating to selected charities and choosing careers with the aim of maximizing positive impact. The movement gained popularity outside academia, spurring the creation of research centers, advisory organizations, and charities, which collectively have donated several hundred million dollars.[56]

Effective altruists emphasize impartiality and the global equal consideration of interests when choosing beneficiaries. Popular cause priorities within effective altruism include global health and development, social and economic inequality, animal welfare, and risks to the survival of humanity over the long-term future. Only a small portion of all charities are affiliated with effective altruism, except in niche areas such as farmed-animal welfare, AI safety, and biosecurity.[57]

The movement developed during the 2000s, and the name effective altruism was coined in 2011. Philosophers influential to the movement include Peter Singer, Toby Ord, and William MacAskill. What began as a set of evaluation techniques advocated by a diffuse coalition evolved into an identity.[58] Effective altruism has ties to elite universities in the United States and United Kingdom, and became associated with Silicon Valley's technology industry.[59]

The movement received mainstream attention and criticism with the bankruptcy of the cryptocurrency exchange FTX, whose founder, Sam Bankman-Fried, had been a major funder of effective altruism causes prior to late 2022. Some in the movement's San Francisco Bay Area community have also criticized what they described as a culture of sexual misconduct.

Postrationalists

The postrationalists are a loose group of one-time rationalists who became disillusioned with the rationalist community, which they came to perceive as "a little culty [and] dogmatic"[26] and as having lost focus on the less quantifiable elements of a well-lived human life.[10] This community also goes by the acronym TPOT, standing for This Part of Twitter.[60][2] The term postrationalist is also used as a hedge by people associated with the rationalist community who have drifted from its orthodoxy.[3]

Zizians

The Zizians are a splinter group[61] with an ideological emphasis on veganism and anarchism; the group became well known in 2025 for being suspected of involvement in four murders.[62] The Zizians originally formed around the Bay Area rationalist community but became disillusioned with other rationalist organizations and leaders. Among the accusations the Zizians leveled against those organizations and leaders were anti-transgender discrimination, misuse of donor funds to pay off a sexual misconduct accuser, and a failure to value animal welfare in plans for human-friendly AI.[63]

The group has been called radical or cult-like by publications such as The Independent,[64] the Associated Press,[65] SFGate,[66] and Reason.[67] The Boston Globe and The New York Times have compared the Zizians to the Manson Family.[68][69] Similarly, Anna Salamon, the director of the Center for Applied Rationality, compared the Zizian belief system to that of a doomsday cult.[69]
