Suppose I believe the Second Coming involves the Lord giving a speech on Capitol Hill. One thing I might care about is how long until that happens. The fact that lots of people disagree about when the Second Coming will be doesn't mean the Lord will give His speech soon.

Similarly, the thing that I define as AGI involves AIs building Dyson spheres. The fact that other people disagree about when AGI will arrive doesn't mean I should expect Dyson spheres soon.
The amount of contention says something about whether an event occurred according to the average interpretation. Whether it occurred according to your specific interpretation depends on how close that interpretation is to the average one.
You can’t increase the probability of getting a million dollars by personally choosing to define a contentious event as you getting a million dollars.
My response to this is to focus on when a Dyson Swarm is being built rather than on AGI, because that term is easier to define uncontroversially.
And a large portion of the disagreements here fundamentally revolves around an inability to coordinate on what a given word means. From an epistemic perspective that doesn't matter at all, but it does matter from a utility/coordination perspective, since coordination is required for a lot of human feats.