The AI industry is racing toward a precipice.

The default consequence of the creation of artificial superintelligence (ASI) is human extinction.

Our survival depends on delaying the creation of ASI: starting as soon as we can, and lasting for as long as necessary.