Evaluation is dead. Long live evaluative thinking!

Monday 11:00am

Duncan Rintoul and Lyn Alderman provided case studies in this interactive session, facilitated by Jess Dart.

Click here for the full abstract.

Links referred to in the presentations

Evaluative thinking literature review
(Archibald 2013, blog post)

Evaluative thinking for successful educational innovation
(Earl and Timperley 2015, OECD paper)

Defining and teaching evaluative thinking: Insights from research on critical thinking
(Buckley et al. 2016, AJE journal article)

Constructive dissent: Making an asset of our differences and disagreements
(Dick and Patty, n.d., free PDF)

BSN404 Evaluative Thinking and Practice

Lyn Alderman’s master’s course at QUT, in the Master of Business (Philanthropy and Nonprofit Studies).

Cognitive bias cheat sheet

(Better Humans blog post by Buster Benson, 2 Sep 2016)

Notes from discussion exercise:
"What’s the role of evaluative thinking at this stage of the policy / program cycle?"

Needs assessment
  • Whose needs? Consult with beneficiaries

  • Understanding who stakeholders are

  • Environmental scan / content analysis of program documents – current situation

  • Understanding political imperatives and context

  • Appreciating why/what evaluation is needed

  • Assessing whether needs have changed over time

 
Options assessment
  • The ‘s’ in ‘options’ is important – the first idea might fail, and your first idea is rarely your best

  • Options surface and distil thinking about theory of change

  • Making clear there are choices

  • Creating capacity for change later

  • Setting criteria

  • Being inquisitive

  • Suspending judgement – no pre-conceived ideas

  • Tapping into emotional intelligence

  • Realist perspective – context

 
Program design
  • Clarifying goals / objectives – what am I trying to achieve?

  • Clarifying ‘how’ the program works is essential to evaluative thinking

  • Using program logic and theory of change to co-design a program before it is fully funded / locked in

 
M&E Framework design
  • The outcome map is only version 1.0 – continuous collective review produces the real version

  • Built-in action research – plan for it, so all know it can change

  • All stakeholders need to be part of it

  • Build the data collection capacity of people working in the field

  • Evaluator has a clear concept – an action plan for the research

 
Monitoring and formative evaluation
  • Working backwards, e.g. clarifying objectives

  • Framing questions – what do ‘program people’ need to know and measure

  • Analysis of data and sense-making

  • Designing data collection instruments, suitable methods for the audience

  • Getting agreement on what to measure

  • Capability building in evaluative thinking (see below)

  • Educating senior management about the value of evaluation – in particular formative evaluation

  • Monitoring for compliance and later evaluation

  • Converting learning to improvement

 
Summative / lapsing program evaluation
  • Improving and influencing future impact and its measurement

  • Triangulation

  • Difference between description and true insight

  • Debate and discussion about who will benefit

  • Challenging the evaluation questions, improving them to maximise value of the evaluation / usefulness

  • Producing something helpful in terms of generalisable learning or new initiatives

  • Moving beyond ‘did it work?’ to ‘what did we learn?’, ‘why?’ and ‘for whom?’

  • Looking at contribution, attribution and impact

 
Policy and program adjustments
  • Applying options

  • Assessments

  • Decisions

  • Ongoing

  • Flexibility

  • Enquiry

  • Use of evidence

 
So what?  What next?
  • Meaning making

  • Links to developmental evaluation work

  • Bridging between evaluation and design

  • Adaptation-type work, feeding into policy

  • Influencing policy – thinking about pathways for utilisation

  • Translating evaluation into action internally (in government)

 
Capacity building
  • Capability in each part of the cycle is equally important

  • Make it sexy, easy, accessible

  • Show the use, relevance

  • Start where they are at – their needs, their language

  • Start with a dialogue, assess needs and gaps

  • Understand how adults learn, use familiar examples

  • Need a theoretical framework for the pathway between capacity building efforts and behaviour change, e.g.

    • Kirkpatrick model

    • The Guskey framework is like Kirkpatrick’s, but includes a layer on how receptive the workplace is to change – i.e. organisational change, not just individual practice change

    • Preskill – multidisciplinary model for evaluation capability building  

  • Tailor the approach to risk – the organisation’s appetite to own the change

 
Misc
  • Role modelling

  • Assessment of who has influence

  • Community of practice

  • Changing mindsets, leading to culture change

  • Clarity about what can be expected / not expected from evaluation