
LessWrong


LessWrong (sometimes spelled Less Wrong) is a community blog and forum dedicated to improving epistemic and instrumental rationality.

History

In November 2006, the group blog Overcoming Bias was launched, with Robin Hanson and Eliezer Yudkowsky as its primary authors.[1] In early March 2009, Yudkowsky founded LessWrong, repurposing his contributions to Overcoming Bias as the seed content for the new “community blog devoted to refining the art of human rationality.”[2][3] This material was organized into a number of “sequences”, thematic collections of posts to be read in a specific order, which were later published in book form.[4] Shortly thereafter, Scott Alexander joined as a regular contributor.

Around 2013, Yudkowsky shifted his primary focus to writing fan fiction, and Alexander launched his own blog, Slate Star Codex, which became the home for most of his subsequent writing. As a consequence of these and other developments, both the quality and the frequency of posting on LessWrong began to decline.[5] By 2015, activity on the site was a fraction of what it had been in its heyday.[6]

LessWrong was relaunched as LessWrong 2.0 in late 2017, with a new codebase and a full-time, dedicated team.[7][8] The relaunch coincided with the release of Yudkowsky’s Inadequate Equilibria, which was serialized on the site and also published as a book.[9] Since the relaunch, activity has recovered and has remained steady.[6][10]

In 2021, the LessWrong team became Lightcone Infrastructure, broadening its scope to encompass other projects related to the rationality community and the future of humanity.[11]

Funding

As of July 2022, LessWrong and Lightcone Infrastructure have received over $2.3 million in funding from the Survival and Flourishing Fund,[12][13][14] $2 million from the Future Fund,[15] and $760,000 from Open Philanthropy.[16]

Further reading

Monteiro, Chris et al. (2017) History of Less Wrong, LessWrong Wiki, October 20.

External links

LessWrong. Official website.

Donate to LessWrong.

Related entries

AI Alignment Forum | Effective Altruism Forum | Lightcone Infrastructure | rationality community

1. Hanson, Robin (2009) About, Overcoming Bias.
2. Alexander, Scott (2014) Five years and one week of Less Wrong, Slate Star Codex, March 13.
3. Bloom, Ruben (2019) A brief history of LessWrong, LessWrong, May 31.
4. Yudkowsky, Eliezer (2015) Rationality: From AI to Zombies, Berkeley: Machine Intelligence Research Institute.
5. Alexander, Scott (2017) Comment on ‘A history of the rationality community?’, Reddit, August 15.
6.
7. Vaniver (2017) LW 2.0 open beta live, LessWrong, September 20.
8. Bloom, Ruben et al. (2019) Welcome to LessWrong!, LessWrong, June 14.
9. Yudkowsky, Eliezer (2017) Inadequate Equilibria: Where and How Civilizations Get Stuck, Berkeley: Machine Intelligence Research Institute.
10. “the conclusion that the LW community recovered from its previous decline holds” (Sempere, Nuño (2021) Shallow evaluations of longtermist organizations, Effective Altruism Forum, June 24)
11. Habryka, Oliver (2021) The LessWrong Team is now Lightcone Infrastructure, come work with us!, LessWrong, September 30.
12. Survival and Flourishing Fund (2019) SFF-2020-H1 S-process recommendations announcement, Survival and Flourishing Fund.
13. Survival and Flourishing Fund (2020) SFF-2021-H1 S-process recommendations announcement, Survival and Flourishing Fund.
14. Survival and Flourishing Fund (2020) SFF-2021-H2 S-process recommendations announcement, Survival and Flourishing Fund.
15. Future Fund (2022) Our grants and investments: LessWrong, Future Fund.
16. Open Philanthropy (2022) Grants database: LessWrong, Open Philanthropy.

Posts tagged LessWrong

The LessWrong Team is now Lightcone Infrastructure, come work with us!
Habryka · 1 Oct 2021 4:36 UTC · 65 points · 2 comments · 1 min read · www.lesswrong.com

Shallow evaluations of longtermist organizations
NunoSempere · 24 Jun 2021 15:31 UTC · 192 points · 34 comments · 34 min read

[Linkpost] LessOnline (May 31–June 2, Berkeley, CA)
Saul Munn · 28 Mar 2024 23:37 UTC · 26 points · 0 comments · 1 min read · Less.Online

Podcast with Oli Habryka on LessWrong / Lightcone Infrastructure
DanielFilan · 6 Feb 2023 16:42 UTC · 85 points · 14 comments · 1 min read

“Two-factor” voting (“two dimensional”: karma, agreement) for EA forum?
david_reinstein · 25 Jun 2022 11:10 UTC · 81 points · 18 comments · 1 min read · www.lesswrong.com

[Question] I have thousands of copies of HPMOR in Russian. How to use them with the most impact?
MikhailSamin · 27 Dec 2022 11:07 UTC · 39 points · 10 comments · 1 min read

Lesswrong Diaspora survey
elo · 3 Apr 2016 11:25 UTC · 5 points · 5 comments · 1 min read

Increased Availability and Willingness for Deployment of Resources for Effective Altruism and Long-Termism
Evan_Gaensbauer · 29 Dec 2021 20:20 UTC · 46 points · 1 comment · 2 min read

Reading the ethicists 2: Hunting for AI alignment papers
Charlie Steiner · 6 Jun 2022 15:53 UTC · 9 points · 0 comments · 1 min read · www.lesswrong.com

some thoughts on lessOnline (note: early prices end Monday)
Raemon · 10 May 2024 23:21 UTC · 11 points · 0 comments · 5 min read

Low-Commitment Less Wrong Book (EG Article) Club
Jeremy · 10 Feb 2022 15:25 UTC · 39 points · 25 comments · 1 min read

Listen to more EA content with The Nonlinear Library
Kat Woods · 19 Oct 2021 12:24 UTC · 187 points · 94 comments · 8 min read

[Question] What (standalone) LessWrong posts would you recommend to most EA community members?
Vaidehi Agarwalla 🔸 · 9 Feb 2022 0:31 UTC · 67 points · 20 comments · 1 min read

Intervention options for improving the EA-aligned research pipeline
MichaelA🔸 · 28 May 2021 14:26 UTC · 49 points · 27 comments · 12 min read

A summary of every “Highlights from the Sequences” post
Akash · 15 Jul 2022 23:05 UTC · 47 points · 3 comments · 17 min read

LessWrong is now a book, available for pre-order!
jacobjacob · 4 Dec 2020 20:42 UTC · 48 points · 1 comment · 10 min read

LessWrong/EA New Year’s Ultra Party
Vaidehi Agarwalla 🔸 · 18 Dec 2020 5:15 UTC · 74 points · 10 comments · 1 min read

Reasons for and against posting on the EA Forum
MichaelA🔸 · 23 May 2021 11:29 UTC · 32 points · 10 comments · 14 min read

[Question] Forum + LW relationship: What is the effect?
Rockwell · 23 Jan 2023 21:35 UTC · 56 points · 26 comments · 1 min read

What cognitive biases feel like from the inside
EA Handbook · 27 Jul 2022 23:13 UTC · 34 points · 6 comments · 5 min read · www.lesswrong.com

Lightcone Infrastructure/LessWrong is looking for funding
Habryka · 14 Jun 2023 4:45 UTC · 65 points · 10 comments · 1 min read

Learning From Less Wrong: Special Threads, and Making This Forum More Useful
Evan_Gaensbauer · 24 Sep 2014 10:59 UTC · 6 points · 21 comments · 3 min read

LW4EA: Elastic Productivity Tools
Jeremy · 3 Jan 2023 3:18 UTC · −1 points · 5 comments · 1 min read · www.lesswrong.com

New tool for exploring EA Forum and LessWrong—Tree of Tags
Filip Sondej · 27 Oct 2022 17:43 UTC · 43 points · 8 comments · 1 min read

How To Actually Succeed
Jordan Arel · 12 Sep 2022 22:33 UTC · 11 points · 0 comments · 5 min read

[Question] How do Rationalists’ and EAs’ beliefs differ?
aprilsun · 11 Aug 2023 21:04 UTC · 13 points · 9 comments · 1 min read

Looping
Jarred Filmer · 5 Oct 2022 1:47 UTC · 19 points · 4 comments · 1 min read

Issue with AI alignment—diversity of opinions as competetive advantage? (as opposed to echo chambers)
Mars Robertson · 30 Aug 2023 9:44 UTC · 1 point · 5 comments · 2 min read

Como são vieses cognitivos vistos por dentro? [Portuguese: “What cognitive biases feel like from the inside”]
AE Brasil / EA Brazil · 20 Jul 2023 18:48 UTC · 4 points · 0 comments · 4 min read

I Converted Book I of The Sequences Into A Zoomer-Readable Format
Daniel Kirmani · 10 Nov 2022 2:59 UTC · 33 points · 5 comments · 1 min read

LW4EA: Six economics misconceptions of mine which I’ve resolved over the last few years
Jeremy · 30 Aug 2022 15:20 UTC · 8 points · 0 comments · 1 min read · www.lesswrong.com

Estimating EA Growth Rates (MCF memo)
Angelina Li · 25 Oct 2023 8:48 UTC · 133 points · 18 comments · 30 min read

A tool for searching rationalist & EA webs
Daniel_Friedrich · 29 Sep 2023 15:20 UTC · 11 points · 8 comments · 1 min read · ratsearch.blogspot.com