Decision-making, for dissidents

However much of a traditionalist a dissident may be, he would do well to have books such as Thinking, Fast and Slow in his library.

While the social sciences are considered the citadel of leftism – certainly as far as the staffing of university departments and the application of social science are concerned, that is true – they provide some valuable insights for the right. Although the right leans on traditional models of character, behaviour and morality, scientific research sometimes provides empirical backing for the right’s instinctive understanding of human action. Such evidence is not why the right holds such precepts to be true – and in some senses (particularly that of religion), the deliberate flouting of evidence is celebrated and even considered necessary – but there should be room for consideration of how empirical data can support a belief system.

Here is a summary of Nobel Prize-winning psychologist-economist Daniel Kahneman’s Thinking, Fast and Slow (2011), as it relates to psychology and decision making. This will be of particular value for counter-establishment elites and dissidents of all inclinations.

Priming: Priming is when people are set up to respond to stimuli and clusters of associated ideas, mainly through positive associations and social approval. Even familiarity without understanding improves positive reactions to triggers. Responses are instinctive and can be cultivated by sites of authority. Multiculturalism is not a natural disposition of people; people tend to be averse to difference, so priming is essential to maintain the little support there is for the construct of multiculturalism. Look at the nationalism of the average (Han) Chinese or Russian national. They are not genetically predisposed towards nationalism; the official culture of these states primes citizens towards unity and national pride. Therefore, should multicultural priming be removed and nationalist priming instituted, the people of (say) the UK will follow.

Instinct: Instinct trumps cognition most of the time. As Jonathan Haidt’s moral psychology research confirms, cognition tends to provide post facto justifications for personal judgements based on instinct, prejudice and social pressure. Intellectual arguments only work when a subject is already searching for a reason to change position. Especially when cognitive checking requires effort and time, people tend to be lazy and rely on instincts.

Influence: Television greatly influences low-information individuals. People rely on impressions regarding the trustworthiness and competence of public figures. They tend not to check the facts when their impressions of reality align with their instincts and social acceptance. In other words, if one controls the narrative, then the reality of an elite leadership’s success or failure is of relatively low importance. The present tends to influence a person’s overall judgement: satiated people view the world more benignly and generously than hungry or anxious people do. Bread and circuses, indeed.

Socialisation: People tend to conform when they believe they are being watched. This is known in religious societies, where an omniscient god observes and punishes the oath-breaker and the sinner, leading to greater socialisation, conformity and rule-adherence. Note how the presence of the cardboard cut-out of a policeman in a store reduces incidents of shoplifting even though there is no increased security there.

Vividness: Language plays an important role in swaying people’s decisions. A statement phrased in a manner that is clear, vivid, memorable and emotional has a greater impact upon the listener than the same proposition phrased less effectively, even if the content of the statements is factually identical. Language often sways the listener more than the content of the statement does. The framing of factual statistics can make a noticeable difference to the way data is received. That information comes as news to no one, but it is well to be reminded that such observations have been analysed systematically.

Invalidity: Related to the influence of socialisation on decisions, the absence of good grounding for beliefs is no bar to a person (perhaps even a majority of a population) holding those beliefs. Even knowledge of the probable invalidity of data regarding a certain belief does not shake the confidence of the belief-holder. “The amount of evidence and its quality do not count for much, because poor evidence can make a very good story. For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous – and it is also essential.” (Daniel Kahneman, Thinking, Fast and Slow, Penguin, 2011, p.209.)

Halo effect: The positive inferences we make from our predisposition (for example towards a person) heavily colour our responses. If a speaker is handsome, tall, confident and well spoken, we instinctively trust him and subject his statements to a lower level of scrutiny and criticism. Image management and canny presentation can carry a faulty argument, if the audience is made amenable.

Obstinacy: Although this is not a term Kahneman uses in his book, it is a glib summary of one aspect of his findings. Even when educated, intelligent people were given data-backed explanations of true but unexpected human behaviour, they did not absorb this into their understanding. Although they could recite the information and use it in arguments, it did not change the way they thought or acted. Thus, even educated people fail to act on the fruits of their education if those do not fit their instinctive model of the world. The persistence of an apparently truthful model edged out an evidentially based (but unexpected) actually truthful model in the minds of even self-aware and supposedly logical individuals, in a way we could call the (persistent, subconscious) obstinacy of suppositions.

Unexperts: In the area of political predictions, experts (on aggregate) fare poorly. “The experts performed worse than they would have if they had simply assigned equal probabilities to each of the three potential outcomes. In other words, people who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options. Even in the region they knew best, experts were not significantly better than nonspecialists.” (Daniel Kahneman, Thinking, Fast and Slow, Penguin, 2011, p.219.) For dissidents, this is heartening. Experts – who are detached from the subject of their specialisation – predicting the failure of dissident movements do not necessarily know anything worth knowing. Setting aside the possibility of experts deliberately manipulating predictions for tactical purposes, we find that group think and socialisation turn political experts into unexperts.

Although these summaries are highly simplified, they are not misleading; readers are advised to consult the book for thorough treatments. Besides offering these insights into decision-making, Kahneman’s book will help readers temper (or at least understand) errors in statistical prediction and intuition generally. Such correctives are of value to anyone. So, however much of a traditionalist a dissident may be, he would do well to have books such as Thinking, Fast and Slow in his library.

Alexander Adams

Alexander Adams is an artist and critic. Alongside Bournbrook Magazine, he is a regular contributor to The Jackdaw, The Critic and The Salisbury Review.
