Astral Codex Ten Podcast

Jul 30, 2023

Machine Alignment Monday, 7/24/23

Intelligence explosion arguments don’t require Platonism. They just require intelligence to exist in the normal fuzzy way that all concepts exist.

First, I’ll describe the normal way concepts exist. I’ll have succeeded if I convince you that claims using the...


Jul 30, 2023

[This is one of the finalists in the 2023 book review contest, written by an ACX reader who will remain anonymous until after voting is done. I’ll be posting about one of these a week for several months. When you’ve read them all, I’ll ask you to vote for a favorite, so remember which ones you liked]

A book about...


Jul 30, 2023

People are talking about British economic decline.

Not just the decline from bestriding the world in the 19th century to today. A more recent, more profound decline, starting in the early 2000s, when Britain fell off the track of normal developed-economy growth. See for example this graph from We Are In An Unprecedented Era...


Jul 25, 2023

This month’s big news in forecasting: the Forecasting Research Institute has released the results of the Existential Risk Persuasion Tournament (XPT). XPT was supposed to use cutting-edge forecasting techniques to develop consensus estimates of the danger from various global risks like climate change, nuclear...


Jul 20, 2023

Elon Musk has a new AI company, xAI. I appreciate that he seems very concerned about alignment. From his Twitter Spaces discussion:

I think I have been banging the drum on AI safety now for a long time. If I could press pause on AI or advanced AI digital superintelligence, I would. It doesn’t seem like that is...