Very interesting. I have transhumanist beliefs that I claim to hold. My actions imply that I believe that I believe, if I understand this properly.
A prime example would be how I tend to my health. There are simple rational steps I can take to increase my odds of living long enough to hit pay dirt. I take okay care of myself, but could do better. Much better.
Cryonics may be another example. More research is required on my part, but a non-zero last stab is arguably better than nothing. I am not enrolled. It feels a bit like Pascal’s Wager to me, though perhaps it is a more valid form of the argument. Hoping for a scientific miracle seems essentially different from hoping for a magical miracle. Scientific miracles abound: artificial hearts, cochlear implants, understanding our origins, providing succor to imbalanced minds, the list goes on. Magical miracles… not so much.
Heck, I could stop forgetting to floss daily! (There seem to be strong correlations between gum disease and heart disease.)
I anticipate as if there will be no radical life extension available within my lifetime, but I will argue for the possibility and even likelihood. Do I have this correct as a type of belief in belief?
Pretty much. Though it might just be a case of urges not lining up with goals.
In both cases, you profess “I should floss every day” and do not actually floss every day. If it’s belief in belief, you might not even acknowledge the incongruence. If it’s merely akrasia, you almost certainly will.
It can be even simpler than that. You can sincerely desire to change such that you floss every day, and express that desire with your mouth, “I should floss every day,” and yet find yourself unable to physically establish the new habit in your routine. You know you should, and yet you have human failings that prevent you from achieving what you want. Still, if you had a button that said “Edit my mind such that I am compelled to floss daily as part of my morning routine unless interrupted by serious emergency, and not by mere inconvenience or forgetfulness,” you would be pushing that button.
On the other hand, I may or may not want to live forever, depending on how Fun Theory resolves. I am more interested in accruing maximum hedons over my lifespan. Living to 2000 eating gruel as an ascetic and accruing only 50 hedons in those 2000 years is not a gain for me over an Elvis Presley-style crash and burn in 50 years ending with 2000 hedons. The only way you can tempt me into immortality is a strong promise of a massive hedon payoff, with enough of an acceleration curve to pave the way with tangible returns at each tradeoff you’d have me make. I’m willing to eat healthier if you make the hedons accrue as I do it, rather than only incrementally after the fact. If living increasingly longer requires sacrificing increasingly many hedons, I’m going to have to estimate the integral of hedons per year over time to see how it pays out. And if I can’t see tangible returns on my efforts, I probably won’t be willing to put in the work. A local maximum feels satisfying if you can’t taste the curve to the higher local maximum, and I’m not all that interested in climbing down the hill while satisfied.
Give me a second-order derivative I can feel increasing quickly, though, and I will climb down that hill.
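To make that tradeoff concrete, here is a minimal sketch of the bookkeeping, assuming lifetime value is just the integral of a hedon-rate function h(t) (a label I’m introducing for illustration, not anything standard) and plugging in the toy numbers above:

```latex
% A minimal sketch, assuming lifetime value is the integral of an
% (assumed) hedon rate h(t); the numbers are the toy figures from above.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Total lifetime hedons over a lifespan $T$:
\[
  H(T) = \int_0^T h(t)\,dt .
\]

The two toy cases: $H(2000) = 50$ hedons for the ascetic versus
$H(50) = 2000$ hedons for the crash-and-burn, so on totals alone the
short, intense life wins by a factor of $40$.

The ``second-order derivative I can feel'' condition is acceleration of
the cumulative curve, $H''(t) = h'(t) > 0$: the hedon rate itself is
rising, so each longevity tradeoff pays tangible returns along the way
rather than only at the end.

\end{document}
```

The shape of h(t) is exactly what Fun Theory leaves unresolved, which is why the sketch can only compare totals, not tell you which hill is worth climbing down.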
That’s helpful input, thanks. After reading the link and searching the wiki I suspect that it is more likely an akrasia/urges v. goals sort of thing based upon my reaction to noticing the inconsistency. I felt a need to bring my actions in line with my professed beliefs.