I’m guessing TAI doesn’t stand for “International Atomic Time”, and maybe has something to do with “AI”, as it seems artificial intelligence has really captured folks’ imagination. =]
It seems like there are more pressing things to be scared of than AI getting super smart (which almost by default seems to imply “and Evil”), but we (humans) don’t really seem to care that much about these pressing issues, as I guess they’re kinda boring at this point, and we want something exciting.
If we had an unlimited amount of energy and focus, maybe it wouldn’t matter, but as you kind of ponder here: how do we get people to stay on target? The less time there is, the more people we need working to address the issue (see Leaded Gas[1], or CFCs and the Ozone Layer, etc.), but there are a lot of problems that a lot of people think are important, and we’re generally fragmented.
I guess I don’t really have any answers, other than the obvious (leaded gas is gone, the ozone layer is recovering), but I can’t help wishing we were more logical than emotional about what we worked towards.
Also, FWIW, I don’t know that we know that we can’t change the past, or whether the universe is deterministic, or all kinds of weird ideas like “are we in a simulation right now / are we the AI”, etc., which are pretty hardcore axioms to still have “undecided”, so to speak! I better stop here before my imagination really runs wild…
Leaded pipes, though, not so much: they’re still around even though we could have replaced them, and every year we say we will. But I digress.
TAI = Transformative AI
I think you’re missing too many prerequisites to follow this post, and that you’re looking for something more introductory.
LOL! Yeah I thought TAI meant
TAI: Threat Artificial Intelligence
The acronym was the only thing I had trouble following, the rest is pretty old hat.
Unless folks think “crunch time” is something new having only to do with “the singularity” so to speak?
If you’re serious about finding out if “crunch time” exists[1] or not, as it were, perhaps looking at existing examples might shed some light on it?
even if only in regards to AGI
Agree with Jim, and suggest starting with some Rob Miles videos. The Computerphile ones, and those on his main channel, are a good intro.
I’m familiar with AGI and the concepts herein (why the OP likes the proposed definition of CT better than PONR); it was just a curious post, what with having “decisions in the past cannot be changed” and “does X concept exist” and all.
I think maybe we shouldn’t muddy the waters more than we already have with “AI” by saying “maybe crunch time isn’t a thing? Or it’s relative?”. (AGI is probably a better term for what was meant here. Or was it? Are we talking about losing millions of call-center jobs to “AI” (not AGI) and how that will impact the economy? I’m not sure that’s transformatively up there with the agricultural and industrial revolutions, as automation seems industrial-ish. But I digress.)
I mean, yeah, time is relative, and doesn’t “actually” exist, but if indeed we live in a causal universe (up for debate) then indeed “crunch time” exists, even if it’s fuzzy by nature, as lots of things contribute to making Stuff Happen. (The butterfly effect, chaos theory, game theory, &c.)
“The avalanche has already started. It is too late for the pebbles to vote.”
- Ambassador Kosh