If, for some conscious activations, the process of consolidating is itself causing “one idea to win… sometimes the wrong one”, then trying consolidation on “the feelings about the management of consolidation and its results” seems like it could “meta-consolidate” into a coherently “bad” result.
...
It could be that the author of the original post was only emitting their “help, I’m turning evil from a standard technique in psychotherapy that I might or might not be using slightly wrongly!” post in a brief period of temporary pro-social self-awareness.
If we are beyond the reach of god, then there’s nothing in math or physics that would make a spin glass process implemented in belief-holding metamaterials always have to point “up” at the end of the annealing process. It could point down, at the end, instead (a toy sketch of this follows below).
This is part of why I’m somewhat opposed to hasty emotional consolidation (which seems to me like it rhymes with the logical fallacy of hasty generalization).
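As a toy illustration of the spin-glass point: the sketch below is only my own minimal example, not anything from the original post; the model, the parameter values, and the name `anneal_spin_glass` are invented for the illustration. It anneals a tiny Ising-style spin glass with random couplings and reports the net “direction” of the frozen state. The dynamics only reward local coherence, so different random histories settle into states whose overall sign can come out either way.

```python
import math
import random

def anneal_spin_glass(n=40, steps=20000, seed=0):
    """Anneal a tiny Ising-style spin glass and return its net magnetization."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    # Random symmetric couplings: a spin glass, not a ferromagnet, so there is
    # no built-in bias saying that "up" is the better direction to end at.
    J = {(i, j): rng.gauss(0, 1) for i in range(n) for j in range(i + 1, n)}

    def local_field(i):
        return sum(J[(min(i, j), max(i, j))] * spins[j] for j in range(n) if j != i)

    for step in range(steps):
        T = max(0.01, 3.0 * (1 - step / steps))   # temperature cools toward zero
        i = rng.randrange(n)
        delta_e = 2 * spins[i] * local_field(i)   # energy cost of flipping spin i
        if delta_e < 0 or rng.random() < math.exp(-delta_e / T):
            spins[i] = -spins[i]                  # accept the flip (Metropolis rule)

    return sum(spins) / n                         # net "direction" of the frozen state

if __name__ == "__main__":
    # Different random histories consolidate into coherent low-energy states
    # whose overall sign can come out positive or negative.
    for seed in range(5):
        print(f"seed {seed}: final magnetization = {anneal_spin_glass(seed=seed):+.2f}")
```

Read as an analogy only: annealing (consolidation) reliably produces some coherent, locally stable configuration, but nothing in the process itself guarantees the configuration it lands on is the “good” one.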
Can you give an example of how this meta-consolidation into a bad result would happen? Do you have examples of it? I think the only way the process of consolidating can cause one idea to win in the way described is through suppression of a higher-level concern. At some point, as you keep going meta, there’s nowhere left to suppress it.