Evaluating complexity and managing complex evaluations 

Wednesday 1:30pm

Vanessa Hood facilitated this panel session featuring 

  • Kate Gilbert (Health and Human Services, Victoria)

  • Jess Kenway (Bluebird Consultants)

  • Stefan Kaufman (EPA Victoria).

There were around 60 participants from across Australasia. The room included a mix of evaluation practitioners and commissioners from government, the private sector, NGOs and academia.

Summary of points raised in group discussions

Tips for... planning complex evaluations and evaluations of complex programs 

 

Top 4

  • Clarify the use of terms across disciplines and sectors

  • Map assumptions and risks

  • Dedicate resources and time to designing your M&E plan upfront

  • Clarify overarching principles

Other 

  • Plan for the resources you have, not the resources you wish for

  • Who pays for prolonged planning time?

  • Develop a narrative for what the program hopes to achieve

  • Allow for iterative learning processes / stages

  • Involve stakeholders

  • Don’t be too prescriptive: clarify end-of-project outcomes and detail only the first year, then reset

Tips for... managing complex evaluations and evaluations of complex programs

Top 3

  • Go early with findings

  • Review M&E plans extensively (annually)

  • Communicate copiously

Other 

  • Keep stakeholders involved at different layers

  • Dissemination on its own is too passive

  • Evaluation plans must be able to evolve, as the program will change

  • Monitor progressive impacts and tailor action plans and the risk matrix accordingly

  • Commence at a point where the program is somewhat settled

  • A feedback system is needed to respond to emerging findings

Tips for... optimising the use of findings from complex evaluations and evaluations of complex programs

Top 2

  • Differentiate products for different audiences; human-centred design

  • Create a safe space for learning

 

Other 

  • Use a knowledge broker (best case)

  • Make sure the evaluation design and questions are what is wanted in the first place

  • Share early findings for formative purposes, e.g. adaptation or re-design midstream

  • Identify and nurture internal champions

  • Establish a learning partnership and break down silos from the beginning

  • Think about who needs to know what, and build on their ideas about how best to do the evaluation

  • Budget for infographics

  • Be open to sharing intellectual property (IP)

  • Contextualise evaluation findings within the broader organisational, business and process context