Pre-conference workshop program: Tuesday 18 September 2018

>>> DOWNLOAD a printable conference workshop program

View Sunday pre-conference workshop program here.


8–9am REGISTRATION
9am–12:30pm WORKSHOP PROGRAM: First half of full-day workshops; Morning half-day workshops
12:30–1:30pm LUNCH
1:30–5pm WORKSHOP PROGRAM: Second half of full-day workshops; Afternoon half-day workshops

WORKSHOP DETAILS

Categories:
A. Foundational evaluation skills and capabilities
B. New tools, approaches and ways of thinking
C. Advanced evaluation topics


Empowerment evaluation

presented by David Fetterman   FULL DAY  |  CATEGORY: B

Empowerment evaluation builds program capacity and fosters program improvement. It teaches people to help themselves by learning how to evaluate their own programs. Key concepts include the critical friend, cycles of reflection and action, and a community of learners. Principles guiding empowerment evaluation range from improvement to capacity building and accountability. The basic steps of empowerment evaluation are: 1) establishing a mission or unifying purpose; 2) taking stock – creating a baseline to measure growth and improvement; and 3) planning for the future – establishing goals and strategies to achieve objectives, as well as credible evidence to monitor change. A dashboard is used to compare annual goals with quarterly progress. In an empowerment evaluation, the role of the evaluator is that of coach or facilitator, since the group is in charge of the evaluation itself. The workshop will also highlight how empowerment evaluation produces measurable outcomes, with case examples ranging from high-tech companies such as Google and Hewlett-Packard to work in rural Arkansas and squatter settlements in South Africa. Employing lecture, activities, demonstration and discussion, the workshop will introduce you to the theory, concepts, principles and steps of empowerment evaluation, as well as the technological tools to facilitate the approach.
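
By way of concrete illustration (not workshop material), here is a minimal Python sketch of the kind of goal-versus-progress comparison such a dashboard supports; all goal names and numbers are hypothetical:

```python
# Minimal dashboard sketch: quarterly progress tallied against annual goals.
# Goal names and figures are hypothetical, for illustration only.
goals = {
    "Parent workshops delivered": {"annual_goal": 40, "quarterly": [8, 11, 9]},
    "Staff trained in self-evaluation": {"annual_goal": 12, "quarterly": [3, 3, 4]},
}

for name, g in goals.items():
    achieved = sum(g["quarterly"])
    pct = 100 * achieved / g["annual_goal"]
    print(f"{name}: {achieved} of {g['annual_goal']} ({pct:.0f}% of annual goal)")
```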

David Fetterman is President and CEO of Fetterman & Associates, an international evaluation consulting firm. He has 25 years of experience at Stanford University, serving as a School of Education faculty member, School of Medicine director of evaluation, and senior member of Stanford administration. Fetterman concurrently serves as a faculty member at Pacifica Graduate Institute,  the University of Charleston, and San Jose State University.  He is also a co-director of the Arkansas Evaluation Center. Previously, Dr. Fetterman was a professor and research director at the California Institute of Integral Studies, Principal Research Scientist at the American Institutes for Research, and a senior associate at RMC Research Corporation.

David is a past president of the American Evaluation Association. He received both the Paul Lazarsfeld Award for Outstanding Contributions to Evaluation Theory and the Myrdal Award for Cumulative Contributions to Evaluation Practice. Fetterman also received the American Educational Research Association Research on Evaluation Distinguished Scholar Award and the Mensa Award for Research Excellence.

David is the founder of empowerment evaluation. He has published 17 books, including Collaborative, Participatory, and Empowerment Evaluation: Stakeholder Involvement Approaches (with Rodríguez-Campos and Ann Zukoski), Empowerment Evaluation: Knowledge and Tools for Self-assessment, Evaluation Capacity Building, and Accountability (with Kaftarian and Wandersman), Empowerment Evaluation in the Digital Villages: Hewlett-Packard’s $15 Million Race Toward Social Justice, Empowerment Evaluation Principles in Practice (with Abraham Wandersman), Foundations of Empowerment Evaluation, and Ethnography: Step by Step.

> register


Making it stick 2 – Impactful evaluation reporting and beyond

presented by Samantha Abbato  FULL DAY  |  CATEGORY: B

This workshop goes outside the box of evaluation reporting, drawing on technological advances and a transdisciplinary environment to maximise evaluation communication and use. It is designed for professionals who need to communicate evaluation findings and want to do so with greater effect, and will benefit evaluators and other professionals at an intermediate level of evaluation report crafting.

This workshop is for evaluators and other professionals who would like to:

  • Improve their skills in professional writing for impact.
  • Use technological advances in digital methods to move beyond the traditional written report to incorporating visual approaches such as video.
  • Incorporate design thinking to communicate with stakeholders.
  • Go beyond infographics to a suite of innovative technological approaches for evaluation communication with graphic design and systems thinking.
  • Use advances in data dashboard capability to keep the evaluation conversation going.

The workshop will be interactive, involving the sharing of experiences as well as hands-on activities. Out-of-the-box case studies from the presenter's experience as part of an interdisciplinary team, spanning videography, graphic design, virtual reality, systems thinking and organisational psychology, will be discussed.

Participants will be provided opportunities to apply new skills to their own work.

Building from the ‘Making it Stick’ introductory workshop, four main areas will be covered.

  1. Professional writing for impact: Reduce content by 25% or more and increase the precision, clarity, persuasiveness and usability of your reports.
  2. Video reporting: Use the technology we all have at our fingertips (e.g. smartphones), videography and digital evaluation interviews.
  3. Beyond infographics: Innovative visual tools such as online systems-thinking graphics, design thinking and graphic design.
  4. Data dashboards: Practical communication tools that keep the conversation going and maximise continued stakeholder engagement and motivation.

Samantha Abbato is an evaluation consultant and director of Visual Insights, a Pictures and Stories approach to evaluation. As an independent evaluation consultant for the past 14 years, working with more than 50 NGO and government organisations in Queensland, New South Wales, the Northern Territory and England, she regularly applies innovative transdisciplinary approaches to evaluation and reporting. Her evaluation work is based on an extensive quantitative and qualitative academic background that includes a PhD in epidemiology (UC Berkeley), an MPH in biostatistics, and four years of applied academic training in the qualitative methods of medical anthropology (UC Berkeley) applied to a thesis and publication in Aboriginal and Torres Strait Islander health. She is a specialist in health and community sector evaluation with extensive experience in qualitative and quantitative evaluation approaches.

With a passion for communication, maximising evaluation use, and a transdisciplinary approach to evaluation, Samantha offers a wealth of case studies of unboxing evaluation reporting and increasing its impact through transdisciplinary approaches. Her most recent evaluation work includes evaluation and M&E reporting using systems thinking, videography, graphic communication, design thinking and virtual reality. She has also completed more than 100 evaluation and research reports for a range of government and non-government organisations and community stakeholders, including several peer-reviewed publications in 2018 incorporating transdisciplinary approaches.

> register


Introduction to program logic and theory of change

presented by Carina Calzoni   FULL DAY  |  CATEGORY: A

This workshop introduces the program logic / theory of change concept and lays out a step-by-step process for creating a logic / theory of change model. A program logic / theory of change focuses not just on what change a project is trying to achieve and how, but also on who will be changing. The course includes discussion of how program logic / theory of change can be used for program design and how it can provide the structure for monitoring and evaluation plans.
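
For readers new to the concept, a program logic typically maps a causal chain from inputs through activities and outputs to outcomes and impact. A toy Python sketch follows; the project and all entries are hypothetical, not a template from the workshop:

```python
# Toy program logic for a hypothetical behaviour change project.
program_logic = {
    "inputs":     ["funding", "project staff"],
    "activities": ["community workshops", "peer mentoring"],
    "outputs":    ["120 residents attend workshops"],
    "outcomes":   ["participants adopt water-saving behaviours"],  # the 'who' that changes
    "impact":     ["reduced household water use"],
}

for stage, entries in program_logic.items():
    print(f"{stage:>10}: {', '.join(entries)}")
```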

The course will commence with an overview of program logic / theory of change and a hands-on introduction to developing a simple hypothetical logic model. Following a more detailed overview of the various approaches to program logic development and their relative strengths, participants will be introduced to a structured process for developing a logic / theory of change, using a hypothetical behaviour change project.

The course will conclude with a bridging session that outlines the process for using program logic / theory of change to develop meaningful targets, monitoring systems and well-targeted evaluation plans. The training will include a mix of expert presentation, small group work, and question and answer sessions.

Carina Calzoni is passionate about program design, monitoring and evaluation. She has nearly 20 years of professional evaluation experience within government and consulting to governments and not-for-profit organisations across a wide range of sectors and levels of complexity. She has an in-depth understanding of public policy and program design and management, and a deep appreciation for a utilisation-focused approach to evaluation in this context.

Carina has a Masters in Evaluation as well as qualifications in Public Policy and Applied Science, which give her the breadth of skills and knowledge to work adaptively across a range of specialist fields. She has been involved in a large number of complex evaluations involving both qualitative and quantitative methods and program planning processes across a wide range of sectors including agriculture, natural resource management, regional development, education, health, mental health and social enterprises. She is also an experienced monitoring and evaluation trainer and facilitator.

> register


Out of the program box: Evaluating place-based and systems change approaches

presented by Jess Dart    FULL DAY  |  CATEGORY: B

The world in which 'program evaluation' was born and crafted is shifting. To address 'wicked' challenges such as entrenched place-based disadvantage or climate change, no one person, sector or discipline can hope to achieve lasting change alone. There is an increasing call to move beyond traditional programmatic and sectoral approaches. Joined-up, long-term 'systems change' and 'place-based' initiatives are becoming more commonplace and are receiving attention from government, philanthropy and the not-for-profit sector. Unsurprisingly, these non-programmatic approaches do not lend themselves to being evaluated using traditional program evaluation.

To kick this workshop off, we provide an overview of systems change and place-based approaches and then invite participants to explore the evaluation challenges for this type of work. Drawing on both international literature and practice, we then explore emerging thinking and practical tools for more agile, adaptive and developmental evaluation that seem more suitable to these settings. We will share lessons about how to plan and phase evaluation in this context (learn about planning sprints, evaluation zones, loops and the cube!), as well as tips about applying evaluation in the initial, middle and final years of a place-based initiative.

This workshop is pitched at the intermediate level.

Jess Dart is the founder and CEO of Clear Horizon – an Australian-based specialist evaluation company. Recipient of the 2018 award for outstanding contribution to evaluation from the Australasian Evaluation Society (AES), Dr Jess Dart is a recognised leader with over 25 years' experience in strategy, design and evaluation for organisations in Australia and overseas. She specialises in the evaluation of complex and emergent programs and approaches. As part of her PhD she tested the 'Most Significant Change Technique' in an Australian context and later co-authored the User Guide with Rick Davies, which has now been translated into 12 languages.

Jess is passionate about supporting communities to thrive, and believes that evaluation has a place in supporting communities to achieve their shared agendas. Evaluation, when done well, can help communities uncover what works and what does not, as well as tell the story of change and create momentum and clarity.

> register


Moving from evaluation to valuation

presented by Taimur Siddiqi, Arjun Ravi   FULL DAY  |  CATEGORY: C

For evaluation to help strengthen the lives of individuals and communities, it has to provide the right information to the right people at the right time. All organisations, from the largest multinationals through government departments to volunteer-run NGOs, have to constantly make investment decisions based on what they hope is the best information to deliver the greatest value for money, however that is defined. Most organisations do not have the resources to undertake detailed economic analyses and instead commission evaluations that assess value for money in non-monetary terms. While understandable, this often means 'intangible' outcomes such as improving quality of life, reducing recidivism or developing social networks are overlooked or relegated in decision making. It is essential that evaluation practice responds to this so that, where appropriate, the full range of social, environmental and economic outcomes can be valued. One option is to blend traditional economic valuation techniques with in-depth stakeholder engagement using Social Return on Investment (SROI). SROI is a leading, principles-based methodology for measuring and valuing the impact of programs, policies and organisations. By enabling tangible and intangible costs and benefits to be represented in a common unit (money), SROI allows evaluators and funded organisations to speak the language of funders while still 'valuing what matters'.
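
As a rough illustration of the arithmetic behind an SROI ratio (a sketch only: all figures, the benefit duration and the discount rate are hypothetical, and a real analysis would also adjust for factors such as deadweight, attribution and drop-off):

```python
# Sketch of a basic SROI ratio: discounted value of outcomes / investment.
investment = 100_000.0     # total value of inputs (hypothetical)
annual_benefit = 45_000.0  # outcomes valued via financial proxies, per year (hypothetical)
years = 3                  # assumed benefit duration
discount_rate = 0.05       # assumed discount rate

# Present value of the benefit stream over the assumed duration
pv_benefits = sum(annual_benefit / (1 + discount_rate) ** t
                  for t in range(1, years + 1))
ratio = pv_benefits / investment
print(f"Present value of benefits: ${pv_benefits:,.0f}")
print(f"SROI ratio: {ratio:.2f} : 1")  # social value created per $1 invested
```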

This interactive workshop will focus on how to apply an SROI analysis to value outcomes as part of ongoing monitoring and evaluation (M&E) activities, using M&E data. It will also encourage participants to consider the benefits and challenges of valuing outcomes. It will be based on peer learning, with a series of cooperative learning exercises and opportunities for group discussion. Participants will be asked to bring their own examples and will be provided with a take-home Excel-based template and resources to assist them with their analyses.

The learning outcomes are: 

  • Apply SROI techniques in your M&E work
  • Use SROI principles to more critically interpret programs and make more effective funding decisions
  • Identify appropriate financial proxies for valuing outcomes
  • Interpret SROI ratios

This workshop enables participants to address the following AES domains of competence:

  • Culture, Stakeholders and Context
  • Research Methods and Systematic Inquiry
  • Evaluation Activities

The workshop is delivered by a Social Value International accredited practitioner and trainer, and is designed for those with intermediate evaluation experience and an interest in quantifying and valuing the impact of programs.

Taimur Siddiqi is an experienced practitioner and trainer of the Social Return on Investment (SROI) methodology, having completed several SROI projects and delivered accredited training to over 100 individuals. Taimur has applied SROI valuation techniques in his role as an evaluation consultant working with a range of public sector, corporate and not-for-profit organisations. His project work has included a peer-reviewed SROI analysis of Indigenous financial counselling; a cost-effectiveness evaluation of a child protection community legal pilot; and an extensive SROI research project for Foodbank Australia. Prior to co-founding The Incus Group, he spent four years with Net Balance, where he eventually managed the $2 million Social Impact business line. Taimur holds a Bachelor of Science and a Master of Environment from the University of Melbourne.

Arjun Ravi is director of The Incus Group and an experienced impact measurement and evaluation professional with extensive experience advising clients in the corporate, not-for-profit and government sectors. Prior to co-founding The Incus Group in 2015, he spent five years with Net Balance and later EY Australia, building and establishing the social sustainability practice. His impact measurement experience has involved leading innovative engagements to measure, value and manage the often “intangible” impacts of organisations' operations and investments. He is accredited by Social Value International as a Social Return on Investment (SROI) practitioner, is a licensed trainer of the SROI methodology, and has trained hundreds of diverse participants worldwide. Project highlights include working with Save the Children to forecast the social return of a suite of their child protection initiatives, and determining the social value of providing improved transportation access for persons with a disability in order to advocate for greater funding. Arjun holds a Bachelor of Finance from the University of Illinois and a Master of International Development from Monash University.

> register


Questionnaire design: Asking the right questions

presented by Jasper Odgers, Klas Johansson   HALF DAY (MORNING)  |  CATEGORY: A

This applied workshop is a practical forum for learning the fundamentals of good survey design through practice. It aligns with AES professional learning competency 4, 'research methods and systematic inquiry'. It is designed for people who need to collect standardised satisfaction and outcomes data from clients or stakeholders as part of their professional practice, but have not had previous experience designing questionnaires. It is also suitable for funders of evaluation and research who need to understand what constitutes good practice when asked to review survey instruments as part of managing an evaluation contract.

The workshop covers what is needed to make a good survey – from deciding whether a survey is the right way to ask your questions, through approaches to sampling, scale design, question wording and options for distribution, to implications for analysis and reporting techniques.

The learning objectives are for participants to:

  • Understand situations suited to different survey methods
  • Identify appropriate survey methods for your project
  • Understand approaches to sampling and their strengths and weaknesses
  • Understand implications of design choices for analysis
  • Have a basic understanding of reliability and validity
  • Understand different types of scales and when to use them
  • Design appropriate scales
  • Identify and avoid common question design errors
  • Understand processes to refine questionnaires
  • Consider survey distribution channels and their strengths

The workshop uses applied techniques to support adult learning – outlining the theory, then bringing it to life through participant discussion of how it fits with their projects, and through practice-based examples (e.g. correcting poorly worded questions and picking up problems with scales). We ask participants to provide survey instruments or information about their project ahead of time, so we can tailor delivery to their context. Participants also receive a workbook with more detailed notes to take away and reflect on, including a list of common traps to avoid in question construction. This allows more time for practical skills-building exercises in the workshop.
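
A common check of scale reliability, one of the topics listed above, is Cronbach's alpha, which can be computed in a few lines. A minimal sketch (the responses below are hypothetical and purely illustrative, not workshop material):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for a respondents-by-items matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to a four-item, five-point satisfaction scale
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 3, 3, 2],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```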

Jasper Odgers has been studying and working in quantitative research and data analysis for the past eight years. He manages online surveys, quantitative data analysis and data visualisation for ARTD's reporting. He recently managed the Queensland Government's renting reform consultation, which collected over 200,000 responses. Using digital engagement strategies to increase response rates and factor analysis to fine-tune instruments, Jasper has successfully delivered stakeholder surveys for NSW and Commonwealth departments over the last five years.

Klas Johansson has extensive experience in research and evaluation and specialist expertise in the design, testing and implementation of monitoring systems. He specialises in research-oriented data collection systems and is a leader in the field of online monitoring and reporting for grants programs. With his background in evaluation and extensive experience working with a range of stakeholders to collect and report performance and monitoring data, Klas is adept at balancing the theoretical, technical and practical requirements for data systems. Klas has designed and implemented monitoring systems for Commonwealth Government and NSW Government programs, including grants programs, in a range of policy sectors. Klas was the recipient of the 2010 Best Evaluation Policy and System Award from the Australasian Evaluation Society for a monitoring and reporting system tracking the outcomes for children with a disability transitioning from school.

> register


Applying implementation science to evaluation: An introduction to implementation evaluation

presented by Jessica Hateley-Browne, Vanessa Rose   HALF DAY (MORNING)  |  CATEGORY: A

Implementation science is increasingly being used by government and other agencies to enhance the effectiveness of programs, and these organisations are looking to evaluators to measure the effectiveness of their implementation efforts. The purpose of this workshop is to provide policy-makers, program managers and program evaluators with an introduction to key implementation science concepts, and to evaluation frameworks, methodologies and tools for measuring implementation effectiveness.

By the end of the workshop participants will be able to:

  • Understand key implementation science concepts and strategies relevant to evaluations
  • Understand frameworks and methodologies for implementation evaluation and differentiate these from process evaluation
  • Apply these frameworks to common evaluation situations
  • Gain awareness of the range of implementation evaluation measurement tools
  • Consider how these concepts and frameworks could apply in their work

The presenters will follow a common teaching and learning cycle:

  • Connect (e.g. engaging with participants and their learning needs)
  • Activate (e.g. introducing new content)
  • Demonstrate (e.g. participants applying new content to scenarios) 
  • Consolidate (e.g. participants reflect on what was learned during the session and how it will be used in the workplace)

Consistent with the principles of adult learning, particular emphasis will be placed on participants' reflections on how this content can apply to their work.

Jessica Hateley-Browne, PhD, is a researcher with a background in health psychology. She has more than 10 years of experience in applied behavioural research, particularly in the health services and population health fields. She has held roles in academic and applied research centres, and in a government agency. Jessica has worked on a variety of large-scale trials and evaluation projects, and has expertise in using mixed-methods and hybrid designs in research that seeks to address health and social program and policy challenges. She recently completed a first-of-its-kind implementation evaluation of evidence-based programs in the child and family services sector in Victoria. Jessica is committed to high-quality and creative knowledge translation, and is passionate about contributing to and utilising the best evidence to inform policy and practice.

Vanessa Rose, PhD, is a research psychologist with substantial experience developing and evaluating evidence-based programs implemented within health and human services. She previously led a program of work at the University of NSW focused on improving the lives of children, families and communities facing adversity, establishing a community-based research centre to bridge the gap between research practice and research users. Vanessa has also led work in NSW government developing an evidence-informed framework to establish shared outcomes across agencies and improve the wellbeing of vulnerable populations. Her background includes child psychological assessment, managing a community health centre, and teaching undergraduate and postgraduate students in health inequalities, health promotion and research methodology. In her role with CEI, Vanessa is responsible for a portfolio of evaluation projects. These include large-scale, whole-sector evaluations using robust methodologies to measure outcomes and assist government and other policy agencies in decision-making for future investment.

> register