Say a city has an inadequate budget. There is a movement for raising property taxes and a movement against raising them.
Assume you are a senior partner at the city's largest law firm, with a lot of friends (the kind of person who gets any voice at all in this kind of politics).
You could join the anti movement and ask for outright tax rates of zero for some class, such as senior citizens or families with young children. The point is that you are trying to change the movement from "generally lower taxes" to "much lower taxes for a subgroup I am championing." Simply asking for less tax is pulling with the group, not sideways.
You could join the pro movement and ask for a variation on Georgism: put the entire increase into a tax on the land itself. Outraged owners of empty downtown lots come out to stop you. Simply asking for more taxes is pulling with the group, not sideways.
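To make the empty-lot outrage concrete, here is a toy calculation. All rates and parcel values are invented for illustration; the point is only the direction of the shift, not any real proposal.

```python
# Toy illustration (hypothetical rates and values): why shifting a
# property-tax increase onto land value hits empty-lot owners hardest.

def property_tax(land, improvements, rate):
    """Conventional property tax: levied on land plus buildings."""
    return (land + improvements) * rate

def land_value_tax(land, rate):
    """Georgist-style tax: levied on the land value only."""
    return land * rate

# Two downtown parcels with the same (assumed) land value.
empty_lot = {"land": 1_000_000, "improvements": 0}
office    = {"land": 1_000_000, "improvements": 9_000_000}

# A 1% conventional rate vs. a 5% land-only rate (both made-up numbers).
for name, p in [("empty lot", empty_lot), ("office", office)]:
    before = property_tax(p["land"], p["improvements"], 0.01)
    after = land_value_tax(p["land"], 0.05)
    print(f"{name}: {before:,.0f} -> {after:,.0f}")
```

Under these assumed numbers the office tower's bill falls while the empty lot's bill quintuples, which is exactly why the lot owners show up to the council meeting.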
Joining a group that wants to restrict AI progress and asking for a 30-year ban on any improvements, or a ban on all technology and a reversion to the stone age, is just pulling with the group. Proposing something like a blockchain record of GPU cluster utilization (making it harder for covert ASIs to exist undetected) would be adding a new dimension consistent with the group's goals.
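The core mechanism behind such a record is just a tamper-evident append-only log. Here is a minimal sketch of the hash-chain idea, using only the standard library; the cluster names and fields are made up, and a real scheme would obviously need distributed replication and attestation on top of this.

```python
import hashlib
import json

def append_record(chain, record):
    """Append a utilization record, linking it to the previous entry's
    hash so later tampering with any earlier entry is detectable."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev": prev_hash, "record": record}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Recompute every link; returns False if any entry was altered."""
    prev = "0" * 64
    for entry in chain:
        body = {"prev": entry["prev"], "record": entry["record"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True

chain = []
append_record(chain, {"cluster": "dc-1", "hour": 0, "gpu_util": 0.62})
append_record(chain, {"cluster": "dc-1", "hour": 1, "gpu_util": 0.97})
print(verify(chain))   # True: the log is internally consistent

chain[0]["record"]["gpu_util"] = 0.10   # quietly rewrite history
print(verify(chain))   # False: the edit breaks every later link
```

The relevant property for the policy argument: anyone holding a copy of the latest hash can later prove whether the published utilization history was rewritten, which is what makes large covert compute usage harder to hide.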
Joining a pro-AI group and asking for massive subsidies for chip manufacturers would be pulling with the group's goals. Asking the NIH to start a series of research grants on human tissue with cat fur genes edited in would be adding a new dimension consistent with the group's goals.
I wonder what kind of "tug" has the most effect. Do you ask for something reasonable, within the Overton window of what a rational person might request, or demand the earth and expect to get a small concession in reality?
Part of the problem here is that you can't be reasonable. You can't join the anti-property-tax group and point out that the city actually is underfunded compared to other cities, even if that's the reality. You can't join the anti-AI group and argue, with evidence, that GPT-4 is so far below human level that even 10 times the compute would still leave it deeply subhuman. (I'm not claiming the latter, just that bio anchors say it's plausible, and "last 10 percent" problems can generally take disproportionately more effort and compute than the first 90 percent.)