I’m always reminded of a trademark quip of an old chemistry professor of mine: “Thermodynamics says ‘yes’; kinetics says… ‘maybe’.”
We will run out of resources on Earth eventually, because blah blah second law of thermodynamics something.** Now the question is how fast, and in particular how fast compared to either leaving Earth or blowing ourselves up with nukes or AI. If the answer is “not fast at all”, then we should not really worry.
**(I’m aware that the Earth is not a closed system, but if I understand the physics correctly the system still needs to be stationary, meaning we need to radiate away as much energy as the Sun sends in, in order not to cook ourselves. My memories of black-body radiation and so on are hazy, however, so I’m happy to be corrected.)
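To make the “stationary” point concrete, here is a rough back-of-the-envelope sketch of the energy balance I have in mind (standard textbook values, not anything rigorous; corrections welcome):

```python
# Rough sketch (my assumptions, standard textbook values): Earth's effective
# equilibrium temperature from the Stefan-Boltzmann law, balancing absorbed
# sunlight against outgoing black-body radiation.

SOLAR_CONSTANT = 1361.0   # W/m^2, incoming solar flux at Earth's orbit
ALBEDO = 0.30             # fraction of sunlight reflected straight back to space
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

# Sunlight is intercepted by Earth's cross-sectional disk but re-emitted over
# the whole sphere (4x the disk area), hence the factor of 4.
absorbed_flux = (1 - ALBEDO) * SOLAR_CONSTANT / 4

# Stationarity: emitted = absorbed, i.e. sigma * T^4 = absorbed_flux.
equilibrium_temp = (absorbed_flux / SIGMA) ** 0.25

print(f"Effective equilibrium temperature: {equilibrium_temp:.0f} K")
# ~255 K; the actual surface average (~288 K) is higher because of the
# greenhouse effect. Anything that adds heat or traps outgoing radiation
# shifts this balance until emission matches absorption again.
```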
I’m a bit surprised, in general, by the way people in EA/rationalist-adjacent circles treat concerns about resource usage. Running out of a key resource seems to me an obvious x-risk, whose probability should be estimated and taken into account rather than quickly dismissed. Besides, from a strategic viewpoint, the “peak everything” crowd has a lot of similarities with us: they worry about long-term issues and difficult coordination problems, are treated as lunatics by the mainstream, and generally seem like prime targets for a communication effort on AI risk and other x-risks.