p.2 Developmental evaluation as a distinct niche emerged in response to one of my client's questions and needs.
p.5 Complexity theory shows that great changes can emerge from small actions.
p.5 evaluation is ultimately about reality testing, getting real about what's going on, what's being achieved - examining both what's working and what's not working.
p.6 Conventional prescriptions for controlling complexity
- In the face of complexity, the first task is to identify clear, specific, and measurable goals. Clear direction and measurable goals cut right through complexity.
- Everything seems complex until you do a logic model. Sort out the complexities into a sequence of concrete actions that are clear, sequential, and logical.
- At its most effective and useful, evaluation makes the uncertain certain, the ambiguous unambiguous, the unknown known, the unpredictable predictable, and the complex simple.
- Accountability requires that programs manage and control complexity. Evaluation makes that possible.
p.9 What brings me to complexity is its utility for understanding certain evaluation challenges. Complexity concepts can be used to identify and frame a set of intervention circumstances that are amenable to a particular situationally appropriate evaluation response, what I am calling here developmental evaluation. This makes dealing with complexity a defining characteristic of developmental evaluation's niche. Principles for operating in complex adaptive systems inform the practice of developmental evaluation. The controversies and challenges that come with complexity ideas will also, and inevitably, afflict developmental evaluation.
p.9 You are entering here the world of uncertain beginnings, muddled middles, and unpredictable endings that ripple on and on without end.
p.10 Developmental evaluation requires what distinguished and experienced evaluator Jon Morell has called "agile evaluators," those who learn to expect the unexpected and adapt with agility and flexibility, including changing the evaluation design, reconfiguring program theory, and responding to emergent stakeholder needs.
p.10 I don't find that it takes a lot of effort to convince people that the world is complex.
p.13 Developmental evaluation is meant to communicate that there is an option in, and approach to, conducting evaluations that specifically supports development.
p.13 Utilization-focused evaluation begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration for how everything that is done, from beginning to end, will affect use. Use concerns how real people in the real world apply evaluation findings
p.14 Utilization-focused evaluation does not advocate any particular evaluation content, model, method, theory, or even use. Rather, it is a process for helping primary intended users select the most appropriate content, model, methods, theory, and uses for their particular situation. Situational responsiveness guides the interactive process between evaluator and primary intended users.
p.15 There is no one best way to conduct an evaluation... The point here is that every evaluation involves the challenge of matching the evaluation process and approach to the circumstances, resources, time lines, data demands, politics, intended users, and purposes of a particular situation.
p.16 developmental evaluation is not appropriate for every situation. This book will detail when it is appropriate.
p.18-19 Ecologists studying the health and resilience of forests have found that these complex ecological systems adapt... through four phases that make up a recurring adaptive cycle: release... reorganization/exploration... exploitation... and conservation... This cycling through the phases, with major transitions from one phase to another, can be observed not only in healthy ecosystems, but also in social systems. However, if adaptation doesn't occur from one phase to another, the health of the system, or the organization, is threatened.
p.19 Developmental evaluation is especially well suited for the reorganization/exploration phase.
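A minimal way to picture the adaptive cycle described above is as a simple state machine. The sketch below (Python, illustrative only) encodes the four phases and the note that developmental evaluation is especially well suited to the reorganization/exploration phase; the names and structure are placeholders, not anything from Patton's text.

```python
from enum import Enum

class Phase(Enum):
    """Phases of the recurring adaptive cycle (after Gunderson & Holling, 2002)."""
    RELEASE = "release"
    REORGANIZATION = "reorganization/exploration"
    EXPLOITATION = "exploitation"
    CONSERVATION = "conservation"

# The cycle repeats: release -> reorganization -> exploitation -> conservation -> release.
NEXT_PHASE = {
    Phase.RELEASE: Phase.REORGANIZATION,
    Phase.REORGANIZATION: Phase.EXPLOITATION,
    Phase.EXPLOITATION: Phase.CONSERVATION,
    Phase.CONSERVATION: Phase.RELEASE,
}

def suits_developmental_evaluation(phase: Phase) -> bool:
    """Per the note above, developmental evaluation fits the reorganization/exploration phase."""
    return phase is Phase.REORGANIZATION

# Walk one full cycle starting from release.
phase = Phase.RELEASE
for _ in range(4):
    print(phase.value, "| developmental evaluation especially suited:",
          suits_developmental_evaluation(phase))
    phase = NEXT_PHASE[phase]
```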
p.19 Developmental evaluation does not rely on any particular evaluation method, design, or tool.
p.21-22 As the book unfolds, I'll be making the case that developmental evaluation is particularly appropriate for, but needs to be matched to, five different complex situations and developmental purposes.
- Ongoing development in adapting a project, program, strategy, policy, or other innovative initiative to new conditions in complex dynamic systems...
- Adapting effective general principles to a new context as ideas and innovations are taken from elsewhere and developed within a new setting...
- Developing a rapid response in the face of a sudden major change or a crisis, like a natural disaster or financial meltdown, exploring real-time solutions...
- Preformative development of a potentially scalable innovation to the point where it is ready for traditional formative and summative evaluation...
- Major systems change and cross-scale developmental evaluation, providing feedback about how major systems change is unfolding, evidence of emergent tipping points, and/or how an innovation is or may need to be changed and adapted as it is taken to scale, that is, as its principles are shared and disseminated in an effort to have broader impact...
p.22 These five different uses of developmental evaluation match different situations. They provide different lenses through which to understand and engage in evaluating social innovations under conditions of complexity.
p.29 I offered different advice. I advised developing a set of principles that would guide program development but not to overplan. "It won't make much difference what you plan," I insisted. "It won't be right... learn what works, and make corrections as you go..."
p.29 "Ready, fire, aim" is the essence of what I was advising... open the program without more planning... experiment, pay attention to what happens, be ferocious about getting feedback, and learn by doing. Such advice runs counter to the conventional wisdom that extensive planning (aiming) should precede action. But detailed planning only works where you have a high degree of control and know what the critical factors are.
p.32 Poverty is complex. [JLJ - The visible results of poverty are simple - people do not have the means or resources to support themselves. It is trying to understand the dynamics of poverty - to develop successful programs to move people out of poverty - that is complex]
p.32 Wisdom from those who have gone before can alert innovators to what to watch for, and can accelerate the climb up the learning curve... but some things just have to be relearned and adapted in the context of a new effort.
p.32 Advance anticipation of what might happen is a tricky business at best and in general we're not very good at it (Morell, 2010), especially under conditions of complexity.
p.34 Dealing with boundaries is a common and constant issue... for all kinds of social innovations that start out with a narrow focus and find that changing what they've targeted morphs into changing other systems that affect what they've targeted. Boundary management is an ongoing challenge for staff... Boundary decisions - We'll do this and not do that - aren't easy to establish and maintain. Boundaries develop and evolve.
p.36 Let me summarize what I hope is becoming clear: developmental evaluation guides action and adaptation in initiatives facing high uncertainty. Where predictability and control are relatively low, goals, strategies, and what gets done can be emergent and changing rather than predetermined and fixed. Continuous development occurs in response to dynamic conditions and attention to rapid feedback about what's working and what's not working. Developmental evaluation supports innovation by bringing data to bear to inform and guide ongoing decision making as part of innovative processes.
This is especially true for social innovations
p.39-40 Developmental evaluation... involves exploring the parameters of an innovation and, as it takes shape, changing the intervention as needed (and if needed), adapting it to changed circumstances, and altering tactics based on emergent conditions... Developmental evaluation is designed to be congruent with and nurture developmental, emergent, innovative, and transformative processes.
p.41 developmentally oriented social innovators... They're constantly experimenting, adapting and developing what they do in response to program participants' feedback, changing conditions, new insights, and emerging challenges all around them... They assume a world of multiple causes, diversity of outcomes, inconsistency of interventions, interactive effects at every level... They expect to be forever developing and changing - and they want an evaluation approach that supports development and change. That's why they resonate to developmental evaluation.
p.41 "Development," as I'll be using the word throughout this book, means making something different at a level and in a way that actually changes the intervention to some significant degree. Such developments are driven by, in response to, and interact with the volatile environment and innovative dynamics that emerge in complex systems. Social innovators change what they're doing because they are attuned to and responsive as they experience new understandings. They are attentive to revelations from program participants as interactions deepen and trust builds... They are quick to spot changes in the world around them that have implications for their own more narrow arena of action... The commitment to adapt doesn't carry a judgment that what was done before was inadequate or less effective. Change is not necessarily improvement. Change is adaptation.
p.42 I've suggested that detailed planning isn't very useful where knowledge is limited and the environment is turbulent.
p.42 Developmental evaluation supports knowledge assessment and strategic systems thinking and helps ensure that social innovators stay attuned and responsive to emerging realities.
p.49 Developmental evaluation... expects that some of what is planned will go unrealized, some will be implemented roughly as expected, and some new things will emerge.
p.52 Developmental evaluation helps social innovators adapt to dynamic conditions, explore possibilities to see what works and what doesn't work, make sense of successes and learn from failures.
p.54 Developmental evaluation is not a linear process, nor is learning about it
p.64 There is, indeed, a rhythm and flow to the interactions between program development and developmental evaluation.
p.69 Doing what makes sense applies to any utilization-focused evaluation, not just to developmental evaluation.
p.73 The point here is that program implementation and evaluation became integrated and synchronized. As a result, the evaluation framework helped guide program implementation and constituted a framework for program planning and reporting that provided focus to staff activities, an example of what is called "process use" of evaluation in which the way the evaluation is conducted affects the program as much as the findings do (Patton, 2008c, chap. 5).
p.75 Developmental evaluation... It's a mindset of inquiry into how to bring data to bear on what's unfolding so as to guide and develop that unfolding.
p.75 Ten Key Points about Developmental Evaluation
- Thinking about what is useful and sensible for evaluation can open the door and establish the foundation for developmental evaluation.
- Developmental evaluation can include both internal and external approaches to evaluation.
- Developmental evaluation can produce not just findings about progress but materials useful for program development...
- Watching for and being open to what emerges is central to developmental evaluation.
- Developmental evaluation requires timely engagement and rapid feedback.
- Evaluation can become the engine for program development.
- Ongoing program development and evaluation can become mutually reinforcing, a way of doing business, and a way of thinking.
- Project leadership and support for doing developmental evaluation is a sine qua non (without which there is nothing).
- Competent evaluators are essential for successful developmental evaluation.
- Developmental evaluation produces more than improvements; it supports program development.
p.80 Developments are grounded in and emerge from reactions to situations.
p.80-81 [David Brooks]
Most successful people also have a phenomenal ability to consciously focus their attention... Control of attention is the ultimate individual power. People who can do that are not prisoners of the stimuli around them. They can choose from the patterns in the world and lengthen their time horizons. [This individual power leads to others. It leads to self control...] It leads to resilience... It leads to creativity. Individuals who can focus attention have the ability to hold a subject or problem in their mind long enough to see it anew. (2008, p.A37) [JLJ - fixed break in citation between attention and Control, added additional material from cited Brooks article in New York Times]
p.84 Developmental evaluation is particularly appropriate for a specific kind of situation: complexity. Understanding complexity and its implications for evaluation is critical to recognizing those situations for which developmental evaluation is well suited.
p.84-85 To facilitate situation recognition, it is useful to have a heuristic framework, some way of "cutting to the chase" by knowing what factors are important to consider when we encounter a new situation. Heuristics are shortcuts that tell us what's important to pay attention to. We can't look at everything... Heuristics direct us in making sense of things. They frame and inform decisions. Indeed, they make choices and action possible (Gigerenzer, Todd, & ABC Research Group, 1999; Kahneman & Tversky, 2000).
p.85 Developmental evaluation informs fast action and quick reactions by social innovators. First, we have to decide if we're in a situation that is appropriate for developmental evaluation, that is, a complex situation, where the pace of actions, reactions, and interactions matter greatly.
p.90 Complex situations are characterized by high uncertainty and high social conflict... The outcomes of interventions aimed at solving problems under conditions of complexity are unpredictable.
p.91 All perception of truth is the detection of an analogy. - Henry David Thoreau (1817-1862)
p.92 In complicated situations cause and effect is knowable as patterns are established through research and observations over time, but the many variables involved make prediction and control more precarious. In complex situations, cause and effect is unknown and unknowable until after the effect has emerged, at which point some retrospective tracing and patterning may be possible.
p.97 Developmental evaluation of innovations involves ongoing observation, assessment, and feedback about how things are unfolding, what's working and what's not, and what's emerging, toward what outcomes.
p.97-98 there is no one best way to conduct an evaluation... How difficult can it be to design an evaluation to fit the program's situation? Well, how difficult is it to play chess? ...To become more sophisticated and intentional about situational analysis in evaluation, one needs a framework to decide what to pay attention to because you can't track everything.
p.99 What happens when we're faced with complexity? The evidence from social and behavioral science is that when faced with complex choices and multiple situations, we fall back on a set of rules and standard operating procedures that predetermine what we will do and effectively short-circuit situational adaptability.
p.100 a fundamental question: How can evaluators prepare themselves to deal with... a huge variety of situations? The research on decision making says we can't systematically consider every possible variable... What we need is a framework for making sense of situations, for telling us what factors deserve priority based on research and desired results. Such a framework... should offer questions to force us to think about and analyze the situation.
p.105 Evaluation design (complex situation): developmental evaluation, tracking what emerges and develops over time.
p.106 Wise executives tailor their approach to fit the complexity of the circumstances they face. - David Snowden and Mary Boone (2007, p. 68)
p.107 Complex: cause and effect is contingent on contextual and dynamic conditions, and therefore unknowable; patterns are unpredictable in advance. Practice is emergent and contingent. A leader's or manager's decision/action sequence should be:
Probe → Sense → Respond
p.108 Wise evaluators tailor their approach to fit the complexity of the circumstances they face.
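The simple/complicated/complex distinctions noted above (pp. 92, 105-108) can be condensed into a small lookup for situation recognition. The sketch below is one reading of those notes, not a prescribed algorithm; the decision sequences for simple and complicated situations follow Snowden and Boone (2007), the source Patton draws on, and the evaluation-emphasis wording is an assumption.

```python
# Illustrative mapping of situation type to decision sequence and evaluation emphasis.
SITUATIONS = {
    "simple": {
        "cause_and_effect": "known; best practices apply",
        "decision_sequence": ["sense", "categorize", "respond"],
        "evaluation_emphasis": "conventional goals-based evaluation against clear, measurable goals",
    },
    "complicated": {
        "cause_and_effect": "knowable through research and observation over time",
        "decision_sequence": ["sense", "analyze", "respond"],
        "evaluation_emphasis": "expert analysis; prediction and control remain precarious",
    },
    "complex": {
        "cause_and_effect": "unknowable until after effects have emerged",
        "decision_sequence": ["probe", "sense", "respond"],
        "evaluation_emphasis": "developmental evaluation: track what emerges and develops over time",
    },
}

def recommend(situation: str) -> str:
    """Return a one-line recommendation for a given situation type."""
    s = SITUATIONS[situation]
    return f"{situation}: {' -> '.join(s['decision_sequence'])} ({s['evaluation_emphasis']})"

print(recommend("complex"))
```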
p.117 Thinking about the intervention as a systems change intervention rather than just an individual treatment intervention has implications for how even the individual treatment is conceptualized and evaluated.
p.119 This kind of mapping and systems thinking offers windows through which one can watch for complex effects. A developmental evaluation is attuned to both linear and nonlinear relationships, both intended and unintended interactions and outcomes, and both hypothesized and unpredicted results... The sources of nonlinearity, emergence, and unpredictability are deeply enmeshed in the complex web of relationships that we all experience.
p.119-120 All kinds of ripple effects may occur... the effects... can ripple through one's... networks. That's why the systems-thinking developmental evaluator needs a map not only of the individual... network of influences and relationships, but also of ways to gauge the effects of and provide feedback about the contextual factors that may affect the intervention as manifest in organizational norms and larger societal values ... To make such a map manageable, begin with only the most basic and critical influences and their relationships.
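One way to keep such an influence map manageable is to treat it as a small directed graph and grow it only as critical relationships emerge. The sketch below is a hypothetical illustration; the node names are placeholders, not drawn from Patton's examples.

```python
# A minimal influence map as a directed graph (adjacency lists). Per the note above,
# begin with only the most basic and critical influences and add relationships as they emerge.
influence_map = {
    "program participants": ["peer network"],
    "peer network": ["family norms", "program participants"],  # ripple effects can loop back
    "family norms": ["community values"],
    "organizational norms": ["program participants"],
    "community values": ["organizational norms"],
}

def ripple(start: str, graph: dict) -> list:
    """Trace which parts of the map a change at `start` could ripple through."""
    seen, stack = [], [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.append(node)
            stack.extend(graph.get(node, []))
    return seen

print(ripple("program participants", influence_map))
```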
p.122-123 The basic premise here is that evaluation in complex adaptive systems is more likely to be useful if the evaluation is informed by complexity concepts and understandings.
p.126 Emergence as a core complexity construct tells us that innovators can't determine in advance what will happen, so evaluators can't determine in advance what to measure. We have to be watching for whatever emerges. We have to expect the unexpected. We have to be prepared... The idea of emergence in complex adaptive systems alerts the observer/evaluator to watch for patterns of self-organization among interacting agents... patterns of interaction emerge and the whole of the interactions becomes greater than the separate parts.
p.127 intended and predictable elements of the intervention will be central to the evaluation design. At the same time, understanding a situation as complex invites evaluators to go beyond the usual token nod toward unanticipated consequences... Taking emergence seriously means engaging in real fieldwork that probes for what's emerging and its significance, meaning, and implications. Taking emergence seriously in the face of uncertainty means "anticipating surprise and responding to the inevitable" unintended consequences of both innovations and evaluations (Morell, 2010). This begins by freeing one's mind from the constraints and blinders of narrow goals-focused evaluation to be open to look for unanticipated impacts and surprises.
p.131 Adaptation is at the center of complex adaptive systems. Interacting elements and agents respond and adapt to each other, and to their environment, so that what emerges is a function of ongoing adaptation both among interacting elements and the responsive relationships interacting agents have with their environment. Innovators adapt. Evaluators of innovative programs will have to follow those adaptations and adapt the evaluation design accordingly. Thus, developmental evaluations must also be adaptive.
p.131-132 Adaptive management is a systematic, iterative process for making decisions in the face of uncertainty, reduced control, and low predictability, through ongoing system monitoring. The process essentially involves learning by doing and observing... when facing complexity, probe first, then sense, then respond. Probing is the doing.
p.133 Under conditions of complexity, processes and outcomes are unpredictable, uncontrollable, and unknowable in advance. Uncertainties flow from turbulence in the environment, both evolutionary and transformational changes in systems, and the limits of knowledge. The predictability of program outcomes is heavily dependent on the state of research knowledge about how to produce desired outcomes.
p.146 This chapter has reviewed a set of six interdependent complexity concepts that undergird developmental evaluation: nonlinearity, emergence, adaptation, uncertainty, dynamic and dynamical interactions, and coevolution. These are sensitizing concepts... while the specific manifestations of social phenomena vary by time, space, and circumstance, the sensitizing concept is a container for capturing, holding, and examining these manifestations to better understand patterns and implications.
p.152-153 innovation scholar Thomas Homer-Dixon (2006) has summarized the future succinctly:
In coming decades our resource and environmental problems will become progressively harder to solve; our companies, organizations, and societies will therefore have to become steadily more complex to produce good solutions; and the solutions they produce - whether technological or institutional - will have to be more complex too. (p. 251)
Developmental evaluation operates in that complex space.
p.153 Developmental evaluation offers a middle path: navigating, sorting out, making sense of, and adapting effective principles to local context under conditions of complexity, when it's not clear what should be done because of inadequate knowledge, the large number of interdependent factors that have to be taken into account, the complex adaptive nature of the system where innovation is occurring, and disagreements among various stakeholders about how to proceed and where to place priorities.
p.153 As these top-down and bottom-up forces intersect, developmental evaluation helps find a way through the labyrinth of adaptation.
p.167 Effective principles have to be interpreted and adapted to context... Principles provide guidance for action in the face of complexity... Developmental evaluation can assist innovators in identifying, applying, and adapting effective principles.
p.181 Here is where the top-down forces of change and the bottom-up grassroots initiatives intersect. Developmental evaluation captures and guides the interactive developments that emerge... Those developments generate and reinforce effective principles... thereby contributing to large-scale change even as they support and nurture ongoing local adaptation
p.184 What I hope is emerging is a sense of the kinds of situations for which developmental evaluation is especially appropriate. This requires situation recognition
p.193 the McConnell Foundation has adopted the adaptive cycle as its theory of change for funding innovation initiatives across Canada (Pearson, 2007) and developmental evaluation as the approach most attuned to the adaptive cycle (Gamble, 2008).
p.193 the characteristics of complexity that are especially relevant for developmental evaluation
- Nonlinearity, in which small initial actions can reverberate in unexpected and unpredictable ways to have huge impacts...
- Emergence, in which patterns emerge from self-organization among interacting agents... emergence occurs as each agent or element pursues its own path but as that path intersects with other paths, and the agent interacts with other agents, also pursuing their own paths, patterns of interaction cohere, becoming greater than the separate parts. What emerges can be beyond, outside of, and oblivious to any notion of shared intentionality...
- Adaptive. Uncertain. Dynamical. Coevolutionary.
p.194 [Mills]
Issues have to do with matters that transcend the local environment of the individual and the range of his inner life. They have to do with the organization of many such milieux into the institutions of an historical society as a whole... An issue, in fact, often involves a crisis in institutional arrangements. (Mills, 1959, pp. 8-9)
p.194 I resonate to complexity as a framework in part because it helps me make sense of my own personal and professional journey.
p.194 The issue of how to adapt evaluation to conditions of complexity does involve a crisis in institutional arrangements. Engaging that crisis is what brought recovering sociologists Frances Westley and Michael Patton together to consider the implications of the adaptive cycle for evaluation.
p.196-197 The premise of system resilience is that it is made manifest in an adaptive cycle. The very notion of a cycle connotes that change processes manifest repeating phases of growth, decline, reorganization, and new growth, repeating the cycle, what in economic terms are periods of boom and bust... Understanding and taking into account the adaptive cycle is important because it draws our attention to the realities of complex, dynamical systems.
p.198 ecosystem resilience "emphasizes conditions far from equilibrium steady state, where instabilities can flip a system into another regime of behavior (i.e., to another stability domain). In this case resilience is measured by the magnitude of the disturbance that can be absorbed before the system changes its structure by changing the variables and processes that control behavior" ([Gunderson and Holling 2002] pp. 27-28).
p.199 The themes of ecosystem resilience are nimbleness, agility, adaptability, responsiveness, and responding to turbulence and uncertainties... Social innovators... manifest an ecosystem resilience mindset, as does, then, the developmental evaluator supporting and facilitating social innovation. [JLJ - direct application to game theory. This one work by Patton has saved me years of effort - definitely a desert island kind of book - one that I would like to have in a small collection if I had to choose among a large number.]
p.205 As promising innovations emerge and attract resources, the transition from exploration to exploitation occurs.
p.206 Developmental evaluation is especially useful during the alpha phase of reorganization, exploration, and innovation. This is when social innovators try out new ideas, experiment, and learn by doing. Most of what's tried won't work; some will. Developmental evaluation helps innovators know the difference, moving on from dead ends and further exploring what looks promising. Identifying dead ends during exploration and innovation doesn't involve the rigorous evidence and high-stakes judgment of summative evaluation.
p.207 In highly turbulent environments and complex situations, developmental evaluation may be ongoing in assisting and supporting social innovators who adapt their interventions as they encounter the nonlinear dynamics of complexity.
p.209 Failure to adapt to changed conditions and emergent trends brings a slower but no less certain demise of long-established programs.
p.213 Maladaptation leads to system collapse when, says Holling, a system's potential, diversity, and resilience have been eradicated (Gunderson & Holling, 2002, p. 95).
p.213 The French philosopher Emile Chartier might have been describing the rigidity trap when he mused: "Nothing is more dangerous than an idea when it is the only one you have."
p.215 What indicators and benchmarks need to be tracked by... innovators to know which path is emerging?
p.227 The developmental evaluator inquires into developments, tracks developments, facilitates interpretation of developments and their significance, and engages with... making judgments about what is being developed, how it is being developed, the consequences and impacts of what has been developed, and the next stages of development.
p.227 Asking "questions that matter" can be thought of as "a tool for working in complex situations" (Parsons & Jessup, 2009).
p.248 Values inform decisions about which way to go at inevitable forks in the road. When a problem arises that challenges the innovator in ways not foreseen by strategy, then values provide guidance for reconciling tensions.
p.248 Environmental activist and author Wendell Berry emphasizes the fundamental importance of values, especially in the face of uncertainty, because, he posits, there will never be enough certain knowledge to guide action. Values, then, are a way to deal with the unknowability - the inherent ignorance - of the human condition.
p.248-249 [Wendell Berry]
There are kinds and degrees of ignorance that are remediable, of course, and we have no excuse for not learning all we can. Within limits, we can learn and think; we can read, hear, and see; we can remember... But... our ignorance ultimately is irremediable... Some problems are unsolvable and some questions unanswerable... Do what we will, we are never going to be free of mortality, partiality, fallibility, and error. The extent of our knowledge will always be, at the same time, the measure of the extent of our ignorance.
Because ignorance is thus a part of our creaturely definition, we need an appropriate way: a way of ignorance
p.256 evaluations typically begin by establishing a baseline.
p.270-271 The sensitizing concepts that guide innovations provide a powerful focus for reflective practice and developmental evaluation... Sensitizing concepts in the social sciences include loosely operationalized notions... that can provide some initial direction to a study as one inquires into how the concept is given meaning in a particular place or set of circumstances... while the specific manifestations of social phenomena vary by time, space, and circumstance, the sensitizing concept is a container for capturing, holding, and examining these manifestations to better understand patterns and implications... Evaluators commonly use sensitizing concepts to inform their understanding of situations... A sensitizing concept raises consciousness about something and alerts us to watch out for it within a specific context.
p.284 Tom Schwandt has eloquently made the case that evaluation is ultimately more about reasoning than about data and methods. We use methods to generate data, but the data have to be interpreted and given meaning, and that involves reasoning.
p.284 Abductive inference... is reasoning that begins with an observation that something has occurred, what evaluators would call an "outcome," and works backwards to track the pathway that led to that observed outcome. Abductive inference is common in interpreting forensic evidence and has been made popular by crime scene investigation television shows. [JLJ - Perhaps we can use abductive inference to make a machine "play" a complex game of strategy. When we construct our diagnostic test of the adaptive capacity to mobilize coercion, we reason from the outcome, an "evaluation" of an endpoint position in the game tree, that this position/assessment is "typical" of what we would expect to see "down the road", based on the fact that
- promising moves were used to drive our "search" efforts - possibly better described as a strategic simulation of coevolution
- diverse move choices were available in positions deemed to be not the most promising, and therefore explored less
- we ought not to be surprised, because we sensitized our efforts to see premonitions of effects, using our time accordingly.
'Premonitions' - rich details of dynamic effects of driving forces and critical success factors - drove our efforts, which, we reason, should be "good enough", in the end, to form a "stance" in a complex strategic game, where we will eventually face positions unknown at present time.]
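A minimal sketch of the search idea in the bracketed note above: let a "promisingness" heuristic decide which positions get attention, spend less effort on less promising lines, and keep the pathway to each endpoint evaluation so the reasoning can run backwards from outcome to path. The Game interface assumed here (legal_moves, apply, evaluate, promisingness) is hypothetical, not anything from Patton or the annotator.

```python
import heapq

def explore(game, root, budget=1000):
    """Best-first exploration: the most promising positions are expanded first."""
    frontier = [(-game.promisingness(root), 0, root, [])]  # (priority, tiebreak, position, path)
    best = None
    counter = 0
    while frontier and budget > 0:
        _, _, pos, path = heapq.heappop(frontier)
        budget -= 1
        score = game.evaluate(pos)          # the "endpoint evaluation"
        if best is None or score > best[0]:
            best = (score, path)            # remember the pathway that led to this outcome
        for move in game.legal_moves(pos):
            counter += 1                    # unique tiebreak so positions are never compared
            child = game.apply(pos, move)
            heapq.heappush(frontier,
                           (-game.promisingness(child), counter, child, path + [move]))
    return best  # (best endpoint evaluation, the move pathway we can trace back through)
```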
p.286 Philosopher Atocha Aliseda (2006) calls abduction "the logic of discovery" and distinguishes it from the logic of confirmation or proof (pp. 12-18). This follows from distinguishing well-structured from ill-structured problems, with the logic of proof applied to well-structured problems (simple problems in the framework presented in Chapter 4) while the logic of discovery is appropriate for ill-structured problems, which is the realm of complexity in this book.
Walton (2004) presents a method for evaluating abductive arguments built around a dialogue process of discovery. The dialogue is back and forth between possibilities (hypotheses) and explanations, with observations (data) mediating the dialogue.
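A rough sketch of that back-and-forth, with observations mediating between candidate hypotheses and their explanations. The scoring rule (counting the observations a hypothesis can explain) is an assumption for illustration, not Walton's method.

```python
def abductive_dialogue(observations, hypotheses, explains):
    """Rank candidate hypotheses by how many observations each can explain.

    `explains(hypothesis, observation)` is a caller-supplied judgment call.
    """
    ranked = sorted(
        hypotheses,
        key=lambda h: sum(1 for o in observations if explains(h, o)),
        reverse=True,
    )
    return ranked  # the leading candidate is the current "best explanation"

# Illustrative use with toy data:
obs = ["attendance rose", "staff turnover fell"]
hyps = ["new outreach worked", "seasonal effect"]
print(abductive_dialogue(obs, hyps, lambda h, o: h == "new outreach worked"))
```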
p.288 As bricoleurs developmental evaluators can use any methods or tools deemed appropriate for the inquiry at hand... "It's all about persistently asking questions and pursuing credible answers in time to be used. Questioning is the ultimate method. Questions are the quintessential tools." ...Pick methods and tools that are appropriate to the situation and context, that will provide meaningful, credible, practical, and useful answers for the primary intended users given the primary intended purpose of the evaluation.
p.296 Prospective evaluation looks ahead... Retrospective evaluation, in contrast, looks back at what has already occurred.
p.305 The distinguishing characteristic of developmental evaluation is contributing to something that's being developed. That's the purpose: development... Developmental evaluation is a process of engagement.
p.307 Given that change is constant and its direction uncertain in a complex dynamic system, developmental evaluation supports programs and innovators to adapt as they face these different challenges, whether one at a time, more than one at a time, or in some sequence or cycle. The purpose of making the distinctions is to better match the developmental evaluation design to the nature of the complex situational challenges that pertain at any given time within a particular context.
p.314 It is the paradox of decision making that effective action is born of reaction. Only when organizations and people take in information from those around them and the environment and react to changing conditions can they act to reduce uncertainty and increase discretionary flexibility... Action emerges through reaction and interaction and leads to adaptation.