Blackwell Handbook of Judgment and Decision Making by Koehler and Harvey

If we are at all concerned with practical reasoning, then we should become interested in the field of judgment and decision making.
 
p.5"The field of judgment and decision making is mainly concerned with action and with judgments that are useful in decision making about action. Thus the field should tell us a great deal about practical reasoning."
 
People often make decisions without thinking about the heuristic they are using. A good heuristic should be applicable to new situations and should produce good results.
 
p.77,78"Research suggests that people hardly ever make conscious decisions about which heuristic to use, but that they quickly and unconsciously tend to adapt heuristics to changing environments, provided there is feedback (Payne et al., 1993)... A good heuristic needs to be robust. Robustness is the ability to make predictions about the future or new events"
 
Amazingly, in order to make a decision we have to ignore information - some of it is not diagnostic, or may mislead us. We should focus on information that generalizes.
 
p.80"The general lesson is that in judgments under uncertainty, one has to ignore information in order to make good predictions. The art is to ignore the right kind. Heuristics that promote simplicity, such as using the best reason that allows one to make a decision and ignore the rest, have a good chance of focusing on the information that generalizes."
 
What we choose to focus our attention on is very important when making a decision. We should gather information according to a useful rule, then analyze it with another. Our focus should be on 1) gathering useful information, 2) reducing this information to knowledge that can be applied to the decision, and 3) evaluating the choices and the likely course of action after each choice is made.
 
p.112-113"A core idea of the information-processing approach is that conscious attention is the scarce resource for decision makers (Simon, 1978). Thus, people are generally highly selective about what information is attended to and how it is used.  Understanding what drives selective attention in decision making is a critical task for decision research... Importantly, people may be unaware that their attention has been focused on certain aspects of the task environment, and that their decisions consequently have been influenced... If attention is the scarce resource of the decision maker, then helping individuals manage attention is critical for improving decisions... The distinction between the cost of processing an item of information and the cost of acquiring information is related to the idea of attention as the scarce resource... An increase in the cognitive (or emotional) cost of processing an item of information, like the cost of acquiring an item of information, will lead to greater use of simplification mechanisms that minimize information processing."
 
It seems that our positional evaluation function should have some way to accurately perceive the relationships in the environment - at least if we want some ability to predict the future.
 
p.220"The ability to accurately perceive relationships in the environment is an essential component of adaptive behavior, as it allows powers of explanation, control, and prediction."

The challenge is to discover the valid component of intuitive judgment amid the extensive noise that surrounds it. At a recent examination, my dentist pulled up my x-rays and spent a few minutes looking at them before he began. The x-rays (and his notes) gave him the diagnostic information he needed. There is an equivalent of an x-ray for every problem that exists - the trick seems to be finding it.

p.281"When constructing models, in order to combine the multiple factors that matter, people often harbor simplistic notions about cause and effect (Tversky & Kahneman, 1980; Einhorn & Hogarth, 1986). They may ignore feedback loops or secondary interactions (Sterman, 1989, 1999). Surprisingly, intuitive predictions involving multiple variables are usually outperformed by linear regression models based on those very judgments (known as bootstrapping). Combining human and mechanical predictions typically beats either alone (Blattberg & Hoch, 1990; Hoch, 1994, 2001). The challenge is to discover the valid component of intuitive judgment amid the extensive noise that surrounds it (Whitecotton, Sanders, & Norris, 1998). Recent approaches in statistics have tried to address the LH case [LH stands for Low uncertainty - high complexity] using vector autoregression (Tiao, 2001)."

Perhaps we can use some type of 'influence diagram' in the evaluation function of our computer chess program. It seems we would benefit from a general-purpose way to predict how 'fully engaged' each chess piece is in the game. This, of course, would depend on how each piece can interact with the other pieces in the future.
 
p.283"Tools to Improve Prediction [section title] Low uncertainty- high complexity (LH case) In this case, the number of variables involved, and their relationships, will require a more systematic analysis. LH cases are typically amenable to some form of deterministic modeling. A simple approach would be to start with assumption analysis (Mason & Mitroff, 1981) since this may identify gaps or distortions in the fabric of presumed interrelationship. Next, an influence diagram could be constructed and/or an entire system dynamic model (see Sterman, 2000) to fully appreciate causal linkages over time (Einhorn & Hogarth, 1986). Another approach to consider is discovery-driven planning (McGarth & MacMillan, 1995), in which one works backward to see what assumptions and interim achievements would be required to achieve a particular end state. This technique will also help identify key drivers that should be monitored closely so that timely adjustments can be made in the forecasts if needed. In addition to these more holistic approaches, more focused techniques exist to improve the quality of judgments when matters get complex. Reason generation, role-playing, imagining unusual outcomes (outliers), comparisons against past cases, and panel techniques like Delphi polling can all improve judgments about input variables or causal relationships (see Armstrong, 2001)."
 
Experts are often able to predict the future and to position themselves for the future that will one day become the present. An expert can be anyone who does a particular job on a regular basis and is able to notice the cause-and-effect relationships that occur predictably. For example, I usually have Sunday brunch at a particular restaurant, and I have noticed that the bartender/waiter is very efficient at predicting what each regular patron will order and at anticipating when their glass is empty. Experts often maintain tables of statistics (look at a sports page) - diagnostic information to explain why a team won or lost, or to predict who will win the game next week. A football coach runs a series of practices and anticipates what areas need improvement so that the team will be prepared for the upcoming game. Coaches will often predict how a player will perform in a game based on their performance in simulated games during weekly practice. Perhaps we can come up with ways to "test" the future performance of the pieces on the chessboard in order to determine (predict, or estimate) the winnability of a chess game.
 
p.301,302"Experts can use their detailed mental models, coupled with their understanding of the current state of the situation, to construct simulations of how the situation is going to develop in the future, and thereby generate predictions and expectations... Experts spend relatively more time analyzing the situation than deliberating about a course of action, whereas non-experts show the reverse trend.. The richer mental models of experts enable them to identify atypicalities and therefore adjust the story they are developing to explain events... Expert decision makers tend to use their mental models to fill gaps with assumptions, to mentally simulate and project into the future, to formulate information seeking tactics."
 
Perhaps we should look to game theory for additional insight into the problem.
 
p.486"Game theory is a branch of mathematics that provides a framework for modeling and predicting behavior in social situations of cooperation, coordination, and conflict."
 
In a game situation, we must attempt to predict what our opponent will do. We can also consider what we would do in our opponent's situation, or attempt to bound what our opponent can do. Strategic planning seems to mean forming a set of scenarios of the possible responses to our moves, informed in part by our insight into what is likely to happen. We are therefore concerned with our potential for action, since the exact positions we will occupy cannot be predicted with absolute certainty a few moves into the future. Our choice of moves should account for 1) what can be predicted with certainty, 2) what we are reasonably sure of, and 3) what might happen that we did not anticipate in advance.
 
p.496"Strategic thinking in games requires players to form beliefs about what opponents will do."
 
Experts appear to know how to acquire meaningful information, and how to interpret it and use it to make effective decisions. Experts appear to know how to create and carry out diagnostic tests, the results of which give them a level of understanding or insight into a certain matter. The results of these tests help the expert make their decision, or help them focus their attention in a continued search for a solution. Imagine a teacher creating a midterm or a final exam. The teacher will try to create a test which demonstrates the students' knowledge of a certain subject, and will use the results to assign a grade for the semester. There is a good likelihood that the students who know more will score higher on the test. It appears that if you want to estimate a certain performance capacity, the design and execution of diagnostic tests is one way to do it. I once had the opportunity to work with an engineer who tutored students for the Scholastic Aptitude Test (SAT) in his spare time. The SAT is so important in getting into a good college that wealthy parents will pay tutors to help their children improve their scores; you can also buy inexpensive workbooks or attend group classes. For students who specially prepared for the test, the SAT might not accurately predict anything more than their ability to take an SAT test. Diagnostic tests can fail if the object of the test has the ability to predict and prepare for the test in advance.
 
I wonder what diagnostic tests we can create and use in our evaluation function to determine how well each chess piece is contributing to the game. How complicated would such a test have to be in order to be reasonably accurate? Can we use it to help us search for the lines of best play in a chess game? Under what conditions would our diagnostic test fail, and how likely would those conditions be? Under what conditions can we stop searching for a solution, confident that our diagnostic test has shown that progress down a certain path is unlikely to pay off? Can we create a heuristic which allows our computer program to obtain insight similar to that of a good human chess player? How well would such a program perform in a competitive environment? So many questions.
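
One candidate answer to the first question, assuming python-chess: treat the drop in a simple board measure when a piece is lifted off the board as a diagnostic of that piece's contribution. The "evaluation" used here is just total attacked squares - a deliberate placeholder, not a real evaluation function:

    # A sketch of a piece-contribution diagnostic, assuming python-chess:
    # measure how much of the side's total scope disappears when the piece
    # is removed from the board.

    import chess

    def side_scope(board, color):
        """Count attacked squares, summed over all pieces of `color`."""
        return sum(len(board.attacks(sq))
                   for sq in chess.SQUARES
                   if (p := board.piece_at(sq)) is not None and p.color == color)

    def contribution(board, square):
        """Diagnostic: how much of its side's scope does this piece account for?"""
        piece = board.piece_at(square)
        if piece is None:
            return 0
        with_piece = side_scope(board, piece.color)
        removed = board.copy()
        removed.remove_piece_at(square)
        return with_piece - side_scope(removed, piece.color)

    if __name__ == "__main__":
        board = chess.Board()
        for san in ("e4", "e5", "Nf3", "Nc6"):
            board.push_san(san)
        for name in ("f3", "c6", "d1"):
            print(name, contribution(board, chess.parse_square(name)))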
