At OPEN, we collaborate with organisations to conduct high-quality, independent evaluations while building their internal capability for long-term impact. In evaluating the MERLE Program, we took a hands-on, developmental approach, supporting the team in strengthening program delivery, refining implementation, and embedding sustainable evaluation skills.
Dr Mandy Charman, Senior Manager - OPEN
Sarah Ryan, Senior Project Officer - OPEN
This case study provides an overview of the evaluation process we conducted for the MERLE program. The MERLE program, delivered by South East Community Links, was a two-year pilot initiative funded by the Australian government. It aimed to support 60 young people from multicultural backgrounds at risk of school disengagement or youth justice involvement. By offering one-on-one support, group work, outreach activities, and connections to services, the program sought to improve young people’s confidence, well-being, and life skills to help them achieve their potential. Delivered by two youth workers across five schools in Melbourne’s south-eastern suburbs, the program used a flexible, multi-modal approach to meet participants where they were.
The program’s goals focused on improving young people’s confidence, self-esteem, and well-being while building pathways for positive community engagement, educational engagement and attainment, and improved employment readiness. These goals were linked to broader outcomes such as strengthening personal and peer networks, promoting community safety, and increasing access to services that support health and well-being.
To learn more about the MERLE program and its outcomes, see our MERLE Program Snapshot.
Traditionally, evaluations are often treated as end-point activities—conducted after a program concludes to report back to funders or stakeholders. While this approach has its place, it can miss opportunities to embed evaluation into the program cycle in ways that improve its effectiveness and long-term value.
At OPEN, we focus on maximising the value of evaluation by integrating it throughout program delivery. This approach provides actionable insights for ongoing improvement while simultaneously building the organisation’s internal capability to use data for decision-making. We call this approach Everyday Evaluation. We prioritise evaluation capability building, ensuring the processes and tools developed during the evaluation are reusable, scalable, and embedded into the organisation’s operations moving forward.
Our Everyday Evaluation model blends the best of both worlds: delivering high-quality, independent evaluations while equipping organisations with the tools and skills needed to sustain their own evaluation efforts over time. This approach reduces long-term costs and maximises value by enabling teams to use evaluations not just for compliance but also for continuous learning, program improvement, and advocacy.
We also help organisations become evaluation ready. Through consultation and tailored advice, we support teams to build internal capability, ensuring they have the systems and skills to conduct quality evaluations in-house. Where a formal, independent evaluation is needed, this groundwork also helps teams engage an external evaluation provider effectively, all while keeping costs low.
This snapshot highlights the key components of a capability-building, developmental evaluation. It outlines what organisations should expect from an external evaluator, including core elements essential for any evaluation and additional complementary components to enhance long-term capability and value.
At OPEN, we specialise in partnering with organisations to evaluate their programs in ways that deliver high-quality, independent evaluations and build long-term internal capability. For the evaluation of MERLE, we adopted a hands-on, developmental evaluation approach that supported the organisation’s team to strengthen program delivery, refine implementation, and embed evaluation skills. Below is an overview of the process:
1. Inception and Discovery Workshops – Establishing the purpose of the evaluation and its audience
We kicked off the evaluation by facilitating inception workshops with the program team. These sessions established a shared understanding of the program’s goals, delivery methods, and target outcomes. We also clarified the evaluation’s purpose and audience, aligned data collection needs with reporting requirements, and addressed potential challenges, setting the stage for a collaborative and efficient evaluation process.
2. Program Logic Workshops – Establishing the purpose of the program and its target cohort
The program’s initial logic model was refined during targeted workshops. We worked with the team to map out the program’s activities, clearly define the target cohort, and articulate expected outcomes over time. As a pilot program, activities required clearer articulation to support consistent implementation by youth workers and effective communication with partner schools. This refinement also helped program staff and schools work with a shared understanding of the outcomes and timeframes, aligning activities to these goals for greater clarity and efficiency.
3. Monitoring & Evaluation (M&E) Plan/Framework Development
We developed a bespoke M&E framework tailored to MERLE’s needs. This framework was designed to integrate with existing systems, align with reporting requirements, and respect ethical considerations, particularly when working with vulnerable cohorts. Processes for informed consent, risk management, and safeguarding participant data were built into the framework to ensure responsible and transparent evaluation practices.
4. Developing or Adapting Data Collection Tools
To capture meaningful data, we developed and adapted tools such as the Personal Wellbeing Index (PWI) tool. This tool tracked participant progress over time in areas like confidence, emotional well-being, social connections, and engagement in education. The tools were designed to fill gaps in existing data collection systems while aligning with the broader evaluation framework.
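For readers curious about how an instrument like the PWI produces a trackable score: it is typically scored by averaging several life-domain ratings (each on a 0–10 scale) and rescaling to 0–100. The sketch below is purely illustrative; the domain names and helper function are our own assumptions, not the MERLE program’s actual tooling.

```python
def pwi_score(domain_ratings):
    """Convert PWI-style domain ratings (each 0-10) to a 0-100 score.

    The score is the mean of the domain ratings rescaled to a
    0-100 range. This helper and the domain names below are
    illustrative only, not the program's actual data tools.
    """
    if not domain_ratings:
        raise ValueError("at least one domain rating is required")
    for rating in domain_ratings.values():
        if not 0 <= rating <= 10:
            raise ValueError("ratings must be between 0 and 10")
    return sum(domain_ratings.values()) / len(domain_ratings) * 10

# Example: one participant's intake ratings (made-up values)
ratings = {
    "standard_of_living": 6,
    "health": 7,
    "achieving_in_life": 5,
    "relationships": 8,
    "safety": 7,
    "community_connection": 4,
    "future_security": 5,
}
print(pwi_score(ratings))  # 60.0
```

Scoring participants the same way at intake, mid-term, and exit is what allows progress in areas like confidence and social connection to be compared over time.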
5. Monthly Coaching and Implementation Support
We conducted monthly coaching calls to support youth workers and program staff in implementing the evaluation framework. Through the mid-term evaluation and monthly coaching, we identified challenges stemming from inconsistencies in the pilot’s implementation. We supported the team in addressing these issues by:
6. Mid-Term Evaluation: Interviews, Surveys, and Findings Workshops
The mid-term evaluation involved collecting data through interviews, surveys, and feedback from key stakeholders. The insights gathered informed an emerging findings workshop, where we presented key themes and recommendations. This process helped the team identify areas for improvement and adjust their approach to strengthen program outcomes in real time.
7. Final Evaluation and Communication
Building on mid-term findings, the final evaluation assessed the program’s overall impact. We delivered the findings of the evaluation in different forms to maximise its value for different audiences and organisational needs. This included:
The evaluation of MERLE demonstrated the program’s ability to deliver meaningful improvements in young people’s confidence, well-being, and community engagement. Importantly, the process also strengthened the organisation’s capacity to evaluate and improve its programs independently. By building in-house expertise and embedding a culture of evidence-informed decision-making, the program team was better positioned to adapt to challenges, advocate for resources, and scale their work effectively.
Let us help you use data and evidence to demonstrate your outcomes and impact, and strengthen your organisation’s potential for future growth.
At OPEN, we offer start-to-finish program evaluations or tailored support to help organisations become “evaluation ready.” Whether you need:
We can provide the expertise and tools you need to demonstrate your program’s impact, improve delivery, and secure funding. Let us help.
Want to learn more? OPEN can help!
Get assistance!
OPEN Team
open@cfecfw.asn.au