Not about Gendlin, but following the trail of relating chunks to other things: I wonder if propaganda or cult indoctrination can be described as a malicious chunking process.
I’ve weighed in against taking the numbers literally elsewhere, but following this thread I suddenly wondered whether the real work of using few words isn’t delivering the chunk, but rather screening out any alternative chunk. If what we are interested in is common knowledge, the challenge isn’t getting people to develop a chunk per se; rather, everyone has to agree on exactly which chunk everyone else is using. That sounds much more like the work of a filter than of a generator.
When I thought about it in those terms, it occurred to me that it is perfectly possible to drive this in any direction at all; we aren’t even meaningfully constrained by reality. This feels obvious in retrospect—there’ve been lots of times when common knowledge was utterly wrong—but doing that on purpose never occurred to me.
So now it feels like what cults do, and why they sound so weird to everyone outside them, is deliberately create a different set of chunks for ordinary things, precisely so that the chunks are different. Once that is done, the availability heuristic will sustain communication on that basis, and the artificially induced inferential distance will tend to isolate members from anyone outside the group.