For anyone looking for an easily sharable deep dive they can use to explain why collapsing birth rates are probably a big deal (assuming we solve AI alignment) and could lead to a system collapse, we put this document together: https://docs.google.com/document/d/1Wx6D3iFcB9eEE4EbQRogI9W-DunauGO3lj8SZDR6vYM If you have any additional content you want us to add, point us in the right direction.
collapsing birth rates are probably a big deal (assuming we solve AI alignment)
Long-term issues are mostly irrelevant given short AI timelines, and not only in the doom branch: aligned AIs would solve such issues just as fast (I see this is explicitly mentioned in the document, and yet). The branch worth worrying about with regard to long-term, AI-unrelated problems is delayed AI.