4 Comments
Paul Sas:

Whenever I hear Walker's Why We Sleep mentioned, Alexey Guzey's takedown is my instant reply:

https://guzey.com/books/why-we-sleep/

David Nebinski:

Thanks for writing this!

Heidi Huang:

great write-up!

Kindred Winecoff:

These "notes from before the crash" are going to be pretty funny to look back on. This part is funny right now:

"I start with AI, since it’s central to the modern world"

set against

"There is no LLM use in White House ... Most federal agencies are not using AI models ... highly oppressive and terrible when AIs observe and enforce laws ... Too much democracy could be a problem, per the Founders, and it’s unclear what space this makes for AI."

The Founders produced an apartheid state that collapsed in Civil War because of too little democracy; consequently, the Constitution was amended repeatedly to expand democracy, and things have gone much better since. So it actually is clear what space this makes for "AI" (i.e., probabilistic statistical models) in a representative democracy: very, very little.

Which is why the SF robber barons -- who violate others' intellectual property wantonly while demanding that every jot and tittle of theirs be protected -- want to end democracy.

"The healthy root of ambition was to do meaningful things to help others". Maybe, but the root of *tech* ambition is very clearly power and money. The "non-profit" "open" company led by Sam Altman being valued at $500bn answers the question on its own, but if not then Elon demanding $1tn so he can control a "dark MAGA" robot army would.

If you disagree, we can ask some chatbots whether their companies are motivated by power and money or by charity, and see what they "think." E.g., Gemini says: "The reality is that most major AI development happens within for-profit companies or organizations with a commercial arm, meaning money, market dominance, and power (in the form of control over foundational technology and data) are central drivers."