As a layperson and a recent reader on the subject of AI (yes, I’ve been happily hiding under a rock), I have enjoyed, but been concerned by, the numerous topics surrounding AI risk. I appreciate this particular post as it explores some aspects which I can understand and therefore hold with some semblance of rationality. A recent post about ‘clown attacks’ was also deeply interesting. In comparison, the paperclip-maximizer scenario seems completely otherworldly.
Is it possible that humanity might be faced with more mundane risks? My thoughts on this come from a personal perspective, not a professional or academic one: I live in a highly controlled society (China), where my access to many of my online interests is restricted or forbidden by the firewall.
From this minor but direct experience, it seems to me that all a non-aligned AGI would need to do is reduce and then remove our access to information and communication. Healthcare, energy and water supplies, finance, cross-border communications (isolating communities and cultures), knowledge access, and control of manufacturing processes would all cease to operate in the ways needed to support our current population.
Where I live is so dependent upon internet access for almost everything that, if this connection were broken or removed for a few weeks, significant harm would be done. Imagining this as a permanent state of affairs, the consequences seem to me to expand out into the future: we would no longer function as large societies, would be reduced to foraging, and would be out of the technology race. AGI wins, and not a single paper clip in sight.
I guess these mundane risks have been covered elsewhere on LW, and I would greatly appreciate any signposting.
I am not sure what posts might be worth linking to, but I think the next point in your scenario would be that this is a temporary state of affairs. Once large-scale communication, coordination, civilization, and technology are gone and humans are reduced to small surviving bands, the AGI keeps going, and by default it is unlikely to leave humans alone, in peace, in an environment they can survive in, for very long. It’s essentially the chimp/human scenario with humans in the chimps’ position, except that the AGIs don’t even bother to have laws officially protecting human lives and habitats.