It’s an impressive chain of thought by Scott Alexander, stretching from the start of agriculture through to superintelligence. Moloch is the name Alexander plucks from Ginsberg to describe the whole family of traps he catalogues. Moloch is civilization, or the tragedy of the commons, or institutions that drive their members into mutual destruction:
A basic principle unites all of the multipolar traps above. In some competition optimizing for X, the opportunity arises to throw some other value under the bus for improved X. Those who take it prosper. Those who don’t take it die out.
All this reminds me strongly of Daniel Quinn, a writer you might place somewhere among primitivism, Deep Green environmentalism, and tribalism. Quinn is one of the writers I most treasure, someone who has reshaped much of how I see the world. But he’s not a natural fellow-traveller for Scott Alexander, whose background is in the hyper-rationalist technophile community around Less Wrong.
One of Quinn’s fundamental ideas is opposition to ‘civilization’. What Quinn calls civilization roughly corresponds to, or perhaps contains, Moloch. It is the set of basic lifestyles and activities we live under, the ones that have outcompeted every rival culture. This civilization is the outcome of a process of natural selection. It has won not by being better for people, but by being better at growing. Quinn takes this all the way back to when farming won out over hunting and gathering, despite the life of an early farmer being much worse than that of a hunter-gatherer.
Alexander traces the same process as Quinn, and then pushes it forward into the future. Humans become less useful to Moloch as technology progresses, meaning that there is less need for Moloch to make any allowance for their wishes:
the current rulers of the universe – call them what you want, Moloch, Gnon, Azathoth, whatever – want us dead, and with us everything we value. Art, science, love, philosophy, consciousness itself, the entire bundle. And since I’m not down with that plan, I think defeating them and taking their place is a pretty high priority.
Alexander’s way out of this is that we should rush to develop a friendly artificial intelligence that can outcompete Moloch on our behalf, reach a position of absolute universal power, and use that power to smack down any other superintelligences that care less about humans.
I can’t say I find that prospect much more reassuring than Quinn’s nods towards neo-tribalism. I’d rather run with a tribe than be subjected to the benevolent dictatorship of an all-conquering machine of loving grace.