So much the worse for the facts – decision-making and figures of authority

People have a tendency to assume that anything said by figures of authority must automatically be correct, at least when they are talking about their own fields. This, on its face, seems to be a logical assumption. After all, an expert has studied his field far more thoroughly than a layman, and with his greater knowledge he should be more capable of drawing conclusions.

Alas, this is not correct, as history has often shown. Plato and Aristotle taught that the Earth is the center of the universe; the Roman philosopher Lucretius taught that heavier objects fall faster than lighter objects; the United States Deputy Secretary of Defense said in 1998 that “The Y2K problem is the electronic equivalent of the El Niño and there will be nasty surprises around the globe.” Albert Einstein said that “Human beings only use 10% of their brains.”

The educational system, as it is, is based on indoctrination – people are taught not how to think, but what to think. While exceptions exist, as a rule any thought that does not conform to the prevailing dogma is discouraged, ridiculed or outright forbidden. This means that questioning assumptions and checking “facts” is an uphill battle, one that most people are loath to fight. Consequently, most people – “experts” included – find it easier to “go with the flow” and accept commonly held views as facts, bowing to authority and disregarding the evidence of their own eyes. These “facts” are the same ones they were taught in school or through the media. That is to say, they are commonly accepted because it is in somebody’s interest for them to be accepted, not necessarily because they are correct. Students accept “facts” based on the authority of a writer or a teacher, not based on evidence; because it is easier, because it has to be done, and because they entered education believing they would be properly informed.

Even disregarding this, a legitimate authority speaking within the area of their expertise is not necessarily correct. No one knows everything, and humans have a tendency to fill in gaps in knowledge with related knowledge, experiences or their own personality. Consequently, conclusions can always be incorrect. Scientists, always held to be impartial, are just as susceptible to the standard human vice of closing one’s eyes to the evidence as anybody else. They are just as motivated by emotion as the next person. As a result, scientists who contradict the prevailing dogma face isolation, ridicule or outright attacks. In the 1950s United States, the scientist Wilhelm Reich was jailed and his books burned because he published research contradicting the prevailing dogma (his name certainly didn’t help matters either). In fact, the real problem of the West is scientism, a belief that other disciplines are “worthwhile only insofar as they conform their techniques of investigation to those of the physical and biological sciences”, which in practice means numbers. However, the human sciences, for example, cannot achieve their goals if limited to numbers, because humans are not computers but living beings; they act based on emotions, experiences and expectations, not just on what is mathematically logical or more profitable. Scientism is opposed to science, and is in fact a very militant religion.

Further, in many cases authorities are not in positions where they can be trusted to be objective (this fallacy is called “appeal to biased authority”), despite having both access to the raw data and the knowledge and expertise to interpret them. To take a military example, military personnel are commonly accepted to be an authority on weapons systems. However, due to the nature of the procurement system – particularly in the United States – generals have a vested personal interest in seeing expensive weapons procurement projects succeed, regardless of the actual performance of such weapons, as doing so secures them lucrative positions in the weapons industry after retirement. Even without such an incentive, the fact remains that air forces are bureaucracies, and generals are top-notch bureaucrats. Bureaucrats tend to feel threatened by anything out of the box that disrupts their carefully set-up routine. Generals are also in a position to exert pressure on lower-ranking personnel and force them to support the “party line”. Consequently, statements by military personnel, especially generals, about the performance of weapon systems should not be accepted at face value, as they are likely to be intentionally false due to personal goals and concerns, or simply due to wishful thinking (“A doctor who treats himself has a fool for a patient.”). In one of many examples of such pressure, John Boyd was court-martialled for proving that the original Sidewinder could be dodged even with no countermeasures. Robin Olds was ordered to cease BFM training in order to prevent mishaps; and even later on, when Suter got approval to institute the Red Flag exercises, these were initially done well but quickly became riddled with bureaucratic safety measures, such as a 500 ft altitude limit. Many performance parameters will also be classified, which means that military personnel cannot reveal them even if they want to and no general is breathing down their necks.

 

Another form of appeal to authority is the appeal to the masses: “the majority believes this to be true, so it is true” (the bandwagon fallacy). However, the majority is often not correct. While even the ancient Greeks knew that the Earth is round, and the scientific community in Europe never actually believed the flat Earth myth, the majority of people believed the Earth to be flat. Humans are psychologically predisposed to accept views that are held by authority, held by the majority, or encountered first on a subject, without much logical consideration and fact-checking. Failing to do so creates considerable psychological distress, in good part due to peer pressure. High-status individuals also make it more likely that a subject will agree with an obviously false conclusion, despite the subject normally being able to see clearly that the answer is incorrect. This can create a ripple effect and cause the vast majority of people, experts included, to agree on an incorrect assumption or position.

A related issue, especially problematic in highly structured organizations (such as the military) due to their reliance on interdependence, respect for authority and the chain of command, is “groupthink”. In this case, a desire for the smooth functioning of a system results in dysfunctional, and often irrational, decision-making outcomes. Minimizing conflict is seen as an imperative, leading to the intentional suppression of critical evaluation and dissenting viewpoints. This leads to an unwillingness to question authority, and to a loss of individual creativity and independent thinking. The result is an illusion of invulnerability and an irrational overestimation of the group’s ability to make a correct decision. It also means underestimating the ability of outsiders to notice flaws in the group’s decision-making cycle. Oftentimes the process is subconscious, caused by education and patterns of thinking, and is thus not noticed by members of the group. Groupthink led to the US Navy’s illusion of invincibility and thus directly to the Pearl Harbor fiasco. While Japanese preparations to attack the US were well known within the US military, nobody seriously considered the possibility of an attack against Hawaii, due to overestimation of the fleet’s ability to defend against air attack, underestimation of Japanese technological adaptability and an illusion of safety due to the distances involved. Again, groupthink typically starts with figures of authority and makes its way down the ranks.

Consensus science, a related “scientific” approach, is itself a fallacy. It is little more than institutionalized groupthink, used to push ideas forward without proper debate. In that way, science is turned into politics. Whereas politics requires consensus, science only requires a person to be verifiably correct. As Michael Crichton puts it: “Historically, the claim of consensus has been the first refuge of scoundrels; it is a way to avoid debate by claiming that the matter is already settled.” This is true for any field and any topic. Scientific consensus has often been wrong: claims that fevers were contagious were rejected by the scientific consensus in 1795, 1843 and 1849. In the 1920s, the scientific consensus was that pellagra was contagious, despite proof that it is caused by poor diet. Continental drift was denied by scientists for fifty years, until 1961, despite being obvious to a ten-year-old schoolchild looking at a map. Such examples are countless, yet people still rely on “most people agree on X” and “most [insert authority figures] agree on X” to counter logical arguments. Science has no shortage of charlatanry – weather science is incapable of predicting the weather twelve hours ahead, yet it claims to be able to predict weather patterns hundreds of years into the future. Modern science tends to act like a religion of the Middle Ages or early modern era, attacking anyone who disagrees with the scientific dogma not through arguments but through raw force, condemning dissenters for heresy.

People, both the masses and authorities, often refuse all evidence and logic if it goes against their political beliefs. Reason is often just a way of justifying decisions made on an emotional basis, and when emotions and facts disagree – so much the worse for the facts. But even when a person genuinely wants to be objective, emotions and prejudices are often used – even unintentionally, or unconsciously – to fill in the ever-present gaps in knowledge. This is true for all people, regardless of their background or education.

Even when it is reasonable to believe an authority, more direct evidence always takes precedence. For example, aircraft turn rates can be compared by looking at wing loading, g limits and aerodynamic configuration. Done correctly, such a comparison carries more weight than the claims of experts. If, however, turn rates have actually been measured and the data is available, that data takes precedence over everything else.
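As a rough illustration of what such an indirect comparison looks like, here is a minimal sketch in Python using the standard level-turn relations ω = g·√(n² − 1)/V for turn rate and r = V²/(g·√(n² − 1)) for turn radius. The two aircraft and their figures below are hypothetical, chosen purely to show the arithmetic; measured flight-test data, where available, still takes precedence.

```python
import math

G = 9.80665  # standard gravitational acceleration, m/s^2


def turn_rate_deg_s(load_factor: float, tas_ms: float) -> float:
    """Level-turn rate in deg/s: omega = g * sqrt(n^2 - 1) / V."""
    return math.degrees(G * math.sqrt(load_factor ** 2 - 1.0) / tas_ms)


def turn_radius_m(load_factor: float, tas_ms: float) -> float:
    """Level-turn radius in metres: r = V^2 / (g * sqrt(n^2 - 1))."""
    return tas_ms ** 2 / (G * math.sqrt(load_factor ** 2 - 1.0))


# Hypothetical aircraft, not real data -- used only to show how the comparison works.
aircraft = {
    "Fighter A": {"sustained_g": 5.0, "tas_ms": 220.0},
    "Fighter B": {"sustained_g": 6.5, "tas_ms": 200.0},
}

for name, d in aircraft.items():
    print(f"{name}: {turn_rate_deg_s(d['sustained_g'], d['tas_ms']):.1f} deg/s sustained, "
          f"{turn_radius_m(d['sustained_g'], d['tas_ms']):.0f} m radius")
```

The point is not the particular numbers but the method: from a few verifiable parameters, anyone can cross-check an expert’s claim about relative turn performance.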

 

On the other side of the coin, the fact that a person is arguing from a position of no authority does not mean that the argument is invalid. Oftentimes, people from within the system are incapable of seeing the forest for the trees they are surrounded by, and it takes someone from outside to make obvious observations about the system as a whole. Further, people outside the system are less likely to have a personal interest in either side of the argument, making it less likely that their observations will be biased. Of course, in any case the quality of observations depends on the quality of the data available, meaning that a person from outside the system has to do more work on “connecting the dots” in order to have a sufficient basis for their conclusions than a person who works within the system.

 

All of this does not mean that the claims of experts can simply be disregarded. It does mean that they should be judged on the basis of the evidence available, not on the basis of who said them, and that the evidence should be carefully scrutinized. If such evidence is not available, then the authority figure itself should be evaluated, especially in terms of any possible causes of bias, such as connections to groups that stand to profit from the authority figure’s claims. If such connections exist – as they do between, for example, the USAF and the US defense industry, and indeed between most of the world’s militaries and their respective countries’ defense industries – then the authority figure immediately loses relevance. The authority figure should also be making statements about a field they have knowledge in; otherwise their statements are no more valid than those of laymen.

The above is actual practice in courts. While expert testimony is accepted, it is not accepted on its own. Rather, after an expert gives testimony, the facts and methodology used to reach the conclusion are very carefully scrutinized. If they are found unsound, the expert’s opinion is rejected, regardless of the expert’s credentials.

 

The reason people have a habit of taking for granted anything stated by an expert or another figure of authority is nothing but good old intellectual laziness. Thinking is, whatever people may believe, not easy, as it requires finding large amounts of data and sorting and analyzing them before drawing conclusions. Relying on experts’ opinions is just a shortcut to avoid the work of finding evidence and connecting the dots. It is also often used when a person is incapable of finding evidence and thus uses an appeal to authority as a substitute. The same intellectual laziness is the reason why people, when discussing weapons systems, focus on the hardware specifications of weapons and ignore their meaning in actual combat usage, as well as the overwhelming importance of the human factor, that is, the user’s abilities and competence and his interaction with the weapon. This is especially prevalent in assessments of weapons’ performance in combat, where the human factor is oftentimes ignored and the results of one side’s superiority in training, organization and adaptability are attributed to its (oftentimes nonexistent) superiority in hardware; see the invasion of France in 1940 or both Gulf Wars for classic examples of this fallacious approach.

 

Further reading:
http://stephenschneider.stanford.edu/Publications/PDF_Papers/Crichton2003.pdf
http://www.alternet.org/media/most-depressing-discovery-about-brain-ever

13 thoughts on “So much the worse for the facts – decision-making and figures of authority”

  1. The most important thing I ever learned in school was how to learn.

    Many folks in the general population are just not very smart, or are very closed-minded, but there is also the problem of not having the time to verify everything, so you just accept many facts.


    1. Agreed. Today’s censorship is quite a bit different from what people first think when someone mentions the word – instead of denying information, it is about inducing information overload. And it is frighteningly effective.


      1. The more humanity connects into a network of self-censoring opinion, the more it will behave like a school of fish: all at the same time, in the same direction. Yet history (and physics) shows that it is the free electrons that have always made the difference, not the nucleus.


  2. By any chance, are you familiar with the work of Bob Altemeyer?

    Anyways, his work was on authoritarians and “socially dominant people”.

    Authoritarians are characterized by:
    1. Authoritarian submission to perceived legitimate authority
    2. Conventionality towards perceived social norms and ideas set by leadership
    3. Extreme hostility towards people who are different and a very high tolerance for hypocrisy from leadership

    Not a surprise, but such people also struggle with thinking logically. They are motivated by fear, self-righteousness, and a notable lack of critical thinking. They prefer to repeat the party line. Authoritarians are dogmatic and feel empowered in their groups.

    Anyways, if you are interested, Bob Altemeyer has made his work free for everyone:

    Click to access TheAuthoritarians.pdf

    Apparently, in the Western world, what would be known as “right wing conservative” (especially in the Anglo world) correlates with high authoritarian tendencies, while in Eastern Europe, strong believers in Communism were more likely to be authoritarian.

    Although not perfect, I’d consider it to be one of the most important contributions of the 20th century and very worthy of a read.


    1. I’ll read it when I get time, thanks for the link. Anyway, it would seem that most people are authoritarians. This “authoritarianism” is in fact an aspect of social conformism, where acceptance is more important than facts. We all feel more comfortable among people who hold beliefs similar to the ones we hold, and many are willing to go to great lengths to remain in that comfort zone. Anyone not conforming to group beliefs or norms is thus seen as a threat. This has no connection to whether the belief is factually true, and as a result, such people will assault and/or ridicule anyone who holds different views, regardless of the proof they have. It is because of this behavior that the term “conspiracy theorist” gained negative connotations, despite actually being just a technical term for people who believe that very few or no major social events are accidental, and are instead planned.

      Sorry for the late reply, I was busy these days.


    2. Yeah it’s no problem.

      I’ve been busy myself as of late.

      Actually, that reminds me of the legal system and the use of biased eyewitnesses – in the American justice system, this is quite common.


        1. A conviction should never be based on an eyewitness alone, but the reality is that most crimes don’t leave forensic evidence behind, so eyewitnesses (if available) become the most important source of evidence.

        Not perfect but nothing is.


        2. Yes, with a life sentence a mistake can be at least partly fixed even at a later date. No such possibility exists with the death penalty.


    3. I took a look at Dr. Robert Altemeyer’s book about authoritarian personalities. It is clearly his own take on The Authoritarian Personality, a book written in the 1950s by Jewish intellectuals from the radical Marxist Frankfurt School. Both Altemeyer’s book and The Authoritarian Personality are blatant attempts to pathologize traditional European family structure.

      Wikipedia says about The Authoritarian Personality:

      “The Authoritarian Personality ‘invented a set of criteria by which to define personality traits, ranked these traits and their intensity in any given person on what it called the “F scale” (F for fascist).’ … Authoritarianism was measured by the F-scale. The ‘F’ was short for ‘pre-fascist personality.’ Another major hypothesis of the book is that the authoritarian syndrome is predisposed to right-wing ideology and therefore receptive to fascist governments.”

      …The impetus of The Authoritarian Personality was the Holocaust… Adorno had been a member of the “Frankfurt School”, a predominantly Jewish group of philosophers and Marxist theorists who fled Germany when Hitler shut down their Institute for Social Research. Adorno et al. were thus motivated by a desire to identify and measure factors that were believed to contribute to antisemitic and fascist traits. The book was part of a “Studies in Prejudice” series sponsored by the American Jewish Committee’s Department of Scientific Research.

      … Some observers have criticized what they saw as a strongly politicized agenda to The Authoritarian Personality. Social critic Christopher Lasch argued that by equating mental health with left-wing politics and associating right-wing politics with an invented “authoritarian” pathology, the book’s goal was to eliminate antisemitism by “subjecting the American people to what amounted to collective psychotherapy—by treating them as inmates of an insane asylum.”

      https://en.wikipedia.org/wiki/The_Authoritarian_Personality

      Altemeyer’s book seems to be the result of many years of struggling with the questions raised by The Authoritarian Personality. But there is no evidence that he departed from the main premise, namely that traditional European family structures are pathological and strongly disposed to violence against Jews. For example, he writes about Whites:

      “Right-wing authoritarians are prejudiced compared to other people. That does not mean they think that Jews can’t be trusted at all, that all Black people are naturally violent, or that every Japanese is cruel.”

      He spends quite a bit of time discussing religious fundamentalists as authoritarian, but the criticisms are all directed against Christians, and none against Jews. He makes only one offhand reference to Judaism. That is very odd because Judaism is one of the most authoritarian of all religions.

      For a far more insightful analysis, please read this article on the Pathologization of Gentile Group Allegiances. It makes clear the Jewish connection:

      “The first generation of the Frankfurt School were all Jews by ethnic background and the Institute of Social Research itself was funded by a Jewish millionaire, Felix Weil. Weil’s efforts as a “patron of the left” were extraordinarily successful: By the early 1930s the University of Frankfurt had become a bastion of the academic left and “the place where all the thinking of interest in the area of social theory was concentrated”. During this period sociology was referred to as a “Jewish science,” and the Nazis came to view Frankfurt itself as a “New Jerusalem on the Franconian Jordan.”

      “Horkheimer and Adorno (authors of the Authoritarian Personality) propose that modern fascism is basically the same as traditional Christianity because both involve opposition to and subjugation of nature. While Judaism remained a “natural religion” concerned with national life and self-preservation, Christianity turned toward domination and a rejection of all that is natural. In an argument reminiscent of Freud’s argument in Moses and Monotheism, religious anti-Semitism then arises because of hatred of those “who did not make the dull sacrifice of reason. . . . The adherents of the religion of the Father are hated by those who support the religion of the Son—hated as those who know better” http://www.kevinmacdonald.net/chap5.pdf

      This subject is important for our time. The leftist criticisms of Trump and his followers are the same ones found in Altemeyer’s attack on traditional European peoples. By reading Altemeyer, you will understand exactly why they call Trump a Nazi fascist.


  3. Hi, will your blog be covering issues related to small arms as well? There’s a lot of groupthink regarding them too. You could start with what you think of the “Modular Handgun System” program, for instance.

