In the article referenced below, “Rapid-Fire Reasoning: Research Could Help Military Leaders Make Better Decisions Under Pressure,” Dr. Dennis Folds discussed that when people process information, they develop unconscious strategies – or biases – that simplify their decisions. The research indicated that nine different kinds of biases can lead to errors in judgment when people are dealing with a lot of information. Interestingly, the error rate was not as high as researchers expected for individuals under time pressure.
The key finding, in my opinion, was this: “Also, the study revealed that subjects who were trained to spot conditions that lead to decision-making biases were better at detecting ‘false-alarm opportunities.’”
So in FF, practice thinking about the data and determine where you are weak. What are the conditions where a decision seems too good, or comes to you too quickly? “I am picking this guy off the waiver wire!” Ask beforehand: “Is this a false alarm?” Write it out, walk away, and read it again. Are you in a bias trap?
In experiments, Dr. Folds considered previous research on seven specific biases that affect individuals who must wrestle with large amounts of data:
Absence of evidence. Missing relevant information is not properly considered.

Availability. Recent events or well-known conjecture provide convenient explanations.
Over-sensitivity to consistency. People give more weight to multiple reports of information, even if the data came from the same source.
Persistence of discredited information. Information once deemed relevant continues to influence even after it has been discredited.
Randomness. People perceive a causal relationship when two or more events share some similarity, although the events aren't related.
Sample size. Evidence from small samples is seen as having the same significance as larger samples.
Vividness. When people perceive information directly, it has greater impact than information they receive secondhand -- even if the secondhand information has more substance.
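To make the sample-size bias concrete, here is a minimal simulation sketch. The player, catch rate, target counts, and “elite” threshold are all made up for illustration; the point is only that a hot small sample fools you far more often than a full season of data:

```python
import random

random.seed(1)

TRUE_CATCH_RATE = 0.60  # hypothetical receiver's real, long-run catch rate

def observed_rate(n_targets):
    """Simulate n_targets targets and return the observed catch rate."""
    catches = sum(random.random() < TRUE_CATCH_RATE for _ in range(n_targets))
    return catches / n_targets

def looks_elite(n_targets, trials=10_000, threshold=0.75):
    """Fraction of simulated seasons where the observed rate looks 'elite'."""
    return sum(observed_rate(n_targets) >= threshold for _ in range(trials)) / trials

small = looks_elite(8)    # roughly two games' worth of targets
large = looks_elite(120)  # roughly a full season of targets

print(f"Looks elite on 8 targets:   {small:.1%}")
print(f"Looks elite on 120 targets: {large:.1%}")
```

A merely decent player will clear the “elite” bar a sizable fraction of the time over 8 targets, and almost never over 120. Evidence from small samples simply does not carry the same weight.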
In addition, Dr. Folds discovered two new biases that can hinder the quality of rapid decisions:
Superficial similarity. Evidence is considered relevant because of some superficial attribute, such as a key word in a message title. For example, a hostage situation might have been reported earlier, and then another message shows up in the inbox with the word "hostage" in its header, although the message's actual content has nothing to do with hostages.
Think of all the phrases I and others use in blogs and written pieces. Sensational headlines get your attention. I use those as well. BUT I EXPECT READERS TO BE CRITICAL OF WHAT I SAY. You look at the data and you decide!
Sensationalist appeal. Items containing exaggerated claims or threats influence a decision-maker even when there is no substance to the content. Information presented in vivid and concrete detail often has unwarranted impact, and people tend to disregard abstract or statistical information that may have greater evidential value.
So in fantasy football language: a guy says he drafted one of the top RBs first last year and won his league. That anecdote can have as much impact as large-scale data showing that, on average, that approach wins you little more than drafting a top WR first.
Case histories and personal anecdotes will have greater impact than more informative but abstract aggregate or statistical data. Somebody on a podcast said this or that, versus looking at a table of data. Hearing it or watching it on TV is valued because it is vivid. That is sensationalist appeal. We are currently in the time of year when you hear or read that Player Blank Blank is looking/playing/running well.
Nisbett and Ross labeled this the "man-who" syndrome and provided the following illustrations:

"But I know a man who smoked three packs of cigarettes a day and lived to be ninety-nine."

"I've never been to Turkey but just last month I met a man who had, and he found it."

"I know someone who won a FF league using only 2 RBs all year," etc.
Absence of Evidence. A principal characteristic of intelligence analysis is that key information is often lacking. Analytical problems are selected on the basis of their importance and the perceived needs of the consumers, without much regard for availability of information. Analysts have to do the best they can with what they have, somehow taking into account the fact that much relevant information is known to be missing.
From the Journal of Experimental Psychology: Human Perception and Performance, 1978, Vol. 4, No. 2, 330-344: "Fault Trees: Sensitivity of Estimated Failure Probabilities to Problem Representation," by Baruch Fischhoff, Paul Slovic, and Sarah Lichtenstein, Decision Research, A Branch of Perceptronics, Eugene, Oregon.
Abstract. These authors use fault trees to represent problem situations by organizing "things that could go wrong" into functional categories. Such trees are essential devices for analyzing and evaluating the fallibility of complex systems. They follow many different formats, sometimes by design, other times inadvertently.
The present study examined the effects of varying three aspects of fault tree structure on the evaluation of a fault tree for the event "a car fails to start." The fault trees studied had four to eight branches, including "battery charge insufficient," "fuel system defective," and "all other problems."
Major results were as follows:
(a) People were quite insensitive to what had been left out of a fault tree,
(b) increasing the amount of detail for the tree as a whole or just for some of its branches produced small effects on perceptions, and
(c) the perceived importance of a particular branch was increased by presenting it in pieces (i.e., as two separate component branches). Insensitivity to omissions was found with both college student subjects and experienced garage mechanics.
As an antidote for this problem, analysts should identify explicitly those relevant variables on which information is lacking, consider alternative hypotheses concerning the status of these variables, and then modify their judgment and especially confidence in their judgment accordingly.
I suggest using this information to create “Fault Trees” for Fantasy Football analysis.
1) Pick the critical issue – a pivot player, for example.
2) Underneath, list all the alternatives to that player's success. His faults!
3) Determine the strength of the evidence under each branch!
4) Is there evidence, or a lack of evidence?
5) Then judge the faults, maybe assigning a % confidence. Also note where the absence of evidence is present. At least you know that area of knowledge is weak!
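The steps above can be sketched as a small data structure. Everything specific here (the pivot-player issue, the branch names, the percentages) is hypothetical filler, not analysis; the structure is what matters:

```python
from dataclasses import dataclass, field

@dataclass
class Fault:
    """One branch of the fault tree: a way the pick could go wrong."""
    name: str
    confidence: float   # 0-100: judged % confidence that this fault bites
    has_evidence: bool  # False = absence of evidence -> flag the weak spot

@dataclass
class FaultTree:
    issue: str                          # step 1: the critical issue
    faults: list = field(default_factory=list)

    def add(self, name, confidence, has_evidence):
        # steps 2-4: list each fault with its evidence strength
        self.faults.append(Fault(name, confidence, has_evidence))

    def report(self):
        # step 5: judge the faults, loudest first, flagging missing evidence
        print(f"Critical issue: {self.issue}")
        for f in sorted(self.faults, key=lambda f: -f.confidence):
            flag = "" if f.has_evidence else "  <-- ABSENCE OF EVIDENCE"
            print(f"  {f.confidence:5.1f}%  {f.name}{flag}")

# Hypothetical pivot-player example (numbers are illustrative only)
tree = FaultTree("Draft pivot RB in round 3")
tree.add("Loses carries to backfield committee", 40, True)
tree.add("Offensive line regresses", 25, True)
tree.add("Soft-tissue injury history recurs", 20, False)
tree.add("All other problems", 15, False)
tree.report()
```

Note the "all other problems" catch-all branch at the bottom: Fischhoff's study found people are insensitive to what a tree leaves out, so deliberately keeping that branch visible is part of the antidote.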
Fault trees can also be applied to other scenarios, such as injuries: if Eddie Lacy goes down, what are the fault tree branches for Starks, etc.?