It’s not just alignment and capability research you need to watch out for—anything connected to AI could conceivably advance timelines and therefore is inadvisable.
We could take this logic further. Any participation in the global economy could conceivably be connected to AI. Therefore people concerned about AI x-risk should quit their jobs, stop using any services, learn to live off nature, and go live as hermits.
Of course, it’s possible that the local ecosystem may have a connection to the local economy and thus the global economy, so even living off nature may have some influence on AI. Learning to survive on pure sunlight may thus be a crucial alignment research priority.