About the Organisation

Anglicare Victoria delivers more than 120 services from over 40 locations across Victoria. They focus on transforming the futures of children and young people, including those in foster and residential care. Their work is based on three guiding pillars: prevent, protect and empower. For this project, Anglicare partnered with a researcher from the Monash Centre for Health Research and Implementation (MCHRI), part of the Monash University School of Public Health and Preventive Medicine. MCHRI aims to create, synthesise, implement and translate clinical, health care and public health knowledge, underpinned by cross-sector end-user engagement, to deliver health impact.

We Spoke With

David Poynter
General Manager Business Development and Research Based Models, Anglicare Victoria

Bengianni Pizzirani
Research Fellow, Monash Centre for Health Research and Implementation (MCHRI), Monash University

At a Glance

Anglicare Victoria is seeking to confirm the value of Rapid Response, a short-term intensive prevention program for families at imminent risk of having a child placed in out of home care.

In a partnership with Monash University, they are conducting a longitudinal evaluation of the outcomes for children receiving this intervention.

The Challenge

Based on a limited pilot study, Anglicare Victoria saw promising results from the Rapid Response program. The program provides ‘last chance’ crisis intervention, offering targeted, intensive support for families, designed to avoid the removal of children to out of home care.

Anglicare needed solid evidence to confirm the program’s effectiveness and value for widespread use. To this end, the Monash Centre for Health Research and Implementation designed an evaluation with:

  • a range of regions (Bendigo, Southern, Northern Metro, Western and Outer East) and up to 150 participants
  • a plan for a matched comparison group – a group of those eligible for but not receiving the intervention (due to limits on the program’s capacity)
  • assessment of implementation as well as effectiveness and costs
  • data drawn from families, staff and management, including follow-up of families eight weeks after the program has concluded
  • an embedded researcher model with the main researcher working closely in an integrated team with practitioners – to support systematic data collection and build team capability and learning throughout the project.

The evaluation is designed to provide a detailed account of what works, for whom, why, and in what circumstances.

Lessons Learned

  • Academic models are not always suitable — Strict academic models (for example, randomised controlled trials) are difficult to translate to the practice context.
  • Two-way partnership — A genuine partnership approach, where research is practice-led, builds an effective bridge between research and practice.
  • Continuous improvement — Continuous improvement in practice is a key goal of NGOs and their staff and is enhanced by incorporating simple evaluation techniques routinely in daily work.
  • Systems thinking — Effects on the whole organisational system, rather than just the isolated program, should be considered when assessing outcomes of an intervention.
  • Rigorous and transparent data — Research is most useful when data and measures are rigorously defined, and failures as well as successes are recorded and shared.

The Journey

The project is ongoing and due to conclude during 2020, so final results are not yet available. However, a number of observations can be made about its design and conduct so far:

Evaluation begins from day one, and practitioners are a vital part of the research effort

It is important to de-mystify evaluation for practitioners and to stress that simple data collection is useful, and that some evaluation always beats no evaluation.

Practitioners in the Rapid Response project were involved in regular discussions, which helped to build their skills, interest and confidence, allowed researchers to draw on their experience, and enabled learning to be shared as the project progressed.

In this project, having an evaluation practice lead has been critical, with the researcher acting as a resource and critical friend.

Data precision is crucial

The project included only cases where the family met strict eligibility criteria and the program had been delivered exactly to plan. This rigour is important so that clear conclusions can be drawn about effects and the treatment can be replicated by others in future.

Relaxing the rules would have included more cases, but data transparency was judged more important. Each case excluded from analysis was mapped, with the reason recorded (for example, if the treatment period extended beyond the nominated four weeks).

The aim is to reliably capture influences and pre-conditions that result in the observed outcomes.

A matched comparison group proved difficult to achieve

A comparison group, while useful, proved not to be achievable in this context.

A fully randomised controlled trial could not be attempted, as it would not be ethical to deliberately deny families potentially helpful interventions.

To create a ‘natural’ comparison group, the project tried to identify appropriate comparisons by asking workers to nominate cases that would have been eligible for Rapid Response but could not receive it because there was no available capacity at the time. In practice, this approach did not work well because the urgency of the time frames (removal of the child must be imminent before the program is considered) was coupled with a long chain of approvals for inclusion in the comparison group. By the time approval could be received, it was usually too late to gather full data for the families that would have qualified.

Implementation is not a linear process

Continuous improvement and iterative learning align with practitioners’ aims. The central purpose of evaluation for most NGO staff is to answer the question ‘How can I be sure what I am doing is working? I want to make a real difference through my work.’

This means that many aspects of delivering the program will be tested, adapted and improved while the trial is progressing. The linear model of inputs, treatments, outputs and outcomes represented at the start of the project is only a simplified snapshot of what will, and should, actually occur.

“A good analogy is designing and building a new car engine. Throughout the process all the parts are subjected to constant testing, looking for how well they work and improvements that could make them better, so that the final engine is perfect.”

David Poynter, General Manager Business Development and Research Based Models, Anglicare Victoria

Evaluating outcomes is complex

The usefulness of the program needs to be considered in the context of the whole organisation (Anglicare’s operations) and the broader system of child protection.

For instance, it is possible that some positive characteristics of this project could usefully be applied to other programs being delivered in the organisation. Similarly, the assessment of children’s levels of safety in the home, undertaken as screening for entry to Rapid Response, can provide critical information to the child protection system about those who most urgently need out of home placement.

The Outcome

Preliminary evidence suggests the Rapid Response program is generating a high level of family engagement, and that the vast majority of participating children are so far able to remain with their parents.

Outcomes have been consistent across different teams, and the program is easy to extend to more locations.

It can also be delivered at low cost compared to the provision of out of home care.

The Future

  • Identifying interventions that can reduce the need to remove a child from their parents has important implications for the child welfare system.
  • As well as improving outcomes for individual families, such interventions can reduce the burden on health and welfare systems that occurs when increasing numbers of children require out of home care.

