In a recent conversation, someone said the truism about how young people have more years of their life ahead of them and that’s exciting. I replied that everyone has the same number of years of life ahead of them now, because AI timelines. (Everyone = everyone in the conversation, none of whom were above 30)
I’m interested in the question of whether it’s generally helpful or harmful to say awkward truths like that. If anyone is reading this and wants to comment, I’d appreciate thoughts.
I’ve been going with the compromise position of “saying it while laughing such that it’s unclear whether you’re joking or not” :-P
The people who know me know I’m not joking, I think. For people who don’t know me well enough to realize this, I typically don’t make these comments.
I sometimes kinda have this attitude that this whole situation is just completely hilarious and absurd, i.e. that I believe what I believe about the singularity and apocalypse and whatnot, but that the world keeps spinning and these ideas have basically zero impact. And it makes me laugh. So when I shrug and say “I’m not saving enough for retirement; oh well, by then probably we’ll all be dead or living in a radical post-work utopia”, I’m not just laughing because it’s ambiguously a joke, I’m also laughing because this kind of thing reminds me of how ridiculous this all is. :-P
What if things foom later than you’re expecting—say during retirement?
What if anti-aging enters the scene and retirement can last much, much longer before the foom?
Tbc my professional opinion is that people should continue to save for retirement :-P
I mean, I don’t have as much retirement savings as the experts say I should at my age … but does anyone? Oh well...
Stating “truths” is persuasion, unless they’re expected to be treated as hypotheses with the potential to evoke curiosity. Treating claims that way is charity: continuous progress on understanding the circumstances that produce claims you don’t agree with, a key skill for actually changing your mind. By default, charity is dysfunctional in popular culture, so non-adversarial use of factual claims that won’t become evident in short order depends on knowing that your interlocutor practices charity. Non-awkward factual claims are actually more insidious, since the risk of succeeding at unjustified persuasion is higher. So in a regular conversation there is a place for arguments, not for “truths”, awkward or not. In this instance, that means turning the conversation to the topic of AI timelines.
I don’t think there are awkward arguments here in the sense of treading a social taboo minefield, so there is no problem with that. But making the arguments in person is work on something that, at this point, happens automatically via what’s already written up online, and it’s more efficient to put effort into growing what’s available online than into doing anything in person, unless there is a plausible path to influencing someone who might have high impact down the line.
It’s fine to say that if you want the conversation to become a discussion of AI timelines. Maybe you do! But not every conversation needs to be about AI timelines.
I’ve stopped bringing up the awkward truths around my current friends. I started to feel like I was using too much of my built-up esoteric social capital on things they were not going to accept (or at least not want to accept). How can I blame them? If somebody else told me there was some random field where a select few interested people will be deciding the fate of all of humanity for the rest of time, and I had no interest in that field, I would want to be skeptical of it as well. Especially if they were to throw out figures like 15–25 years from now (my current timelines) as when humanity’s reign over the earth will end because of this field.
I found that when I stopped bringing it up, conversations were lighter and more fun. I’ve accepted that we will just be screwing around talking about personal issues and the issues du jour, and I don’t mind it. The truth is a bitter pill to get down, and if they have no interest in helping AI research, it’s probably best they don’t live their lives worrying about things they won’t be able to change. So for me at least, not bringing up some of those awkward truths led to improvements in my personal life.
Depends on the audience and what they’ll do with the reminder. But that goes for the original statement as well (which remains true: there’s enough uncertainty about AI timelines and their impact on individual human lives that younger people still have more years of expected life, i.e., life averaged across possible futures).
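A minimal sketch of that expected-value point, with entirely made-up numbers (the probability, the date, and the life expectancy below are assumptions for illustration, not anyone’s actual estimates):

```python
# Hypothetical numbers, purely to illustrate "expected years of life"
# averaged across possible futures.
P_SHORT_TIMELINE = 0.5   # assumed chance that lives end ~20 years from now
YEARS_UNTIL_THEN = 20    # assumed years until that point
LIFE_EXPECTANCY = 85     # assumed lifespan in the business-as-usual future

def expected_remaining_years(age: int) -> float:
    """Remaining years averaged over the two hypothetical futures."""
    short_branch = P_SHORT_TIMELINE * YEARS_UNTIL_THEN
    normal_branch = (1 - P_SHORT_TIMELINE) * max(LIFE_EXPECTANCY - age, 0)
    return short_branch + normal_branch

for age in (25, 30, 60):
    print(age, expected_remaining_years(age))
# 25 -> 40.0, 30 -> 37.5, 60 -> 22.5
```

Under these made-up numbers, the younger person still expects more remaining years; short timelines shrink the age gap but don’t eliminate it.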
Whether it makes sense to tell someone an awkward truth often depends more on the person than on the truth.
Truths in general:
This is especially true when the truth isn’t in the words, but is something you’re trying to point at with them.
Awkward truths:
What makes something an awkward truth is the person anyway, so your statement seems tautological.