Lightcone needs your funding for LessWrong! Because they had to give the FTX money back
Rationalism meets iNaturalist: ficus all
Casey Newton drinks the kool-aid
Mass resignations at the journal Intelligence as Elsevier actually bothers to act, slightly, for once
Mangione "really wanted to meet my other founding members and start a community based on ideas like rationalism, Stoicism, and effective altruism"
what the hell is Thiel on here (CONTENT WARNING: Piers Morgan)
Son! Are you in there building God from neural networks?
TPOT hits the big time!
The dregs of Dimes Square after the election: the most despondent winners you ever saw
In which some researchers draw a spooky picture and spook themselves
Eliezer Yudkowsky on discovering that the (ALLEGED) CEO shooter was solidly a member of his very own rationalist subculture: "he was crazy and also on drugs and also not a real e/acc"
The Professor Assigns Their Own Book — But Now With a Tech Bubble in the Middle Step
2023 study: how EA uses double meanings for a milk-before-meat strategy
That Time Eliezer Yudkowsky recommended a really creepy sci-fi book to his audience
Does AI startup Extropic actually … do anything?
LessWrong House Style (2023)
In which a Eugenicist Effective Altruist advocates becoming a cuckold for the greater good
Peter Singer introduces the Peter Singer AI to elevate ethical discourse in the digital age
The predictably grievous harms of Effective Altruism
Anthropic protects the welfare of possible future AIs. Present-day humans, not so much