Process evaluation questions
Appropriateness
| Evaluation questions | Indicators | Measure | Data sources |
|---|---|---|---|
| What is the evidence of continued need for the program and role for government in delivering this program? (P1) | Evidence that need is not being met by other programs for targeted cohort groups | | |
| | Inability to access MBCPs | | |
| | Diversity of participants based on needs and circumstances | | |
| Have the initiatives been implemented as designed? (P2) | Realisation of delivery activities as outlined in submissions and program logic | | |
| How are the initiatives innovative and contributing to best practice? (P3) | Evidence of innovative program features and contribution to best practice | | |
Effectiveness
| Evaluation questions | Indicators | Measure | Data sources |
|---|---|---|---|
| Are there early positive signs of change that might be attributable to the program? (P4) | Increase in feelings of safety and support among people who experience violence | | |
| | People who use violence report understanding the factors contributing to their behaviour and how it impacts others | | |
| To what extent are the outputs being realised? (P5) | Uptake of programs among people who use violence and people who experience violence | | |
| Have people who use violence and people who experience violence responded positively to the program, including enrolment, attendance/retention and satisfaction? (P6) | Increase in people accessing the programs | | |
| | Increase in referrals | | |
| | Reduction in the number of referrals not taken up for case management and intervention programs | | |
| | People who use violence report satisfaction with the program | | |
| What are the barriers and enablers to effective referral of participants? (P7) | Number of referrals and drivers of this | | |
| What governance and partnership arrangements have been established to support the implementation of the initiatives, and are these appropriate? (P8) | Presence of governance and partnership arrangements and attitudes toward these | | |
| | Frequency and nature of FSV's and DHHS's interaction with service providers | | |
| Do the program workforces have a clear idea of their roles and responsibilities? (P9) | Stakeholders report a clear understanding of their role in program delivery | | |
| What components of the model are perceived to be the most valuable? (P10) | Identification of enablers | | |
| What improvements to the service model could be made to enhance its impact? (P11) | Identification of barriers and improvement opportunities | | |
| Have there been any unintended consequences, and if so, what have these been? (P12) | Identification of unintended consequences | | |
Efficiency
| Evaluation questions | Indicators | Measure | Data sources |
|---|---|---|---|
| Has the department demonstrated efficiency in relation to the establishment and implementation of the program? (P13) | FSV/DHHS resources used to implement the program have not been wasted | | |
Impact evaluation
Appropriateness
| Evaluation questions | Indicators | Measure | Data sources |
|---|---|---|---|
| Are the programs responding to the identified need/problem? (I1) | Increase in perpetrators and women who use force accessing intervention programs and case management, including where they otherwise would not have (uptake) | | |
| | Perpetrators and women who use force report that the program has been appropriate for their needs | | |
| What are the design considerations of the program to support scalability? (I2) | Stakeholder assessment of program scalability | | |
Effectiveness
| Evaluation questions | Indicators | Measure | Data sources |
|---|---|---|---|
| Have the program inputs, activities and outputs led to the desired change mapped out in the program logic? [1] (I3) | Service provider workers challenge violent, threatening and controlling attitudes and behaviours | | |
| | Service provider workers encourage people who use violence to recognise the effects of their violence on others and take responsibility for their behaviour | | |
| | People who use violence report understanding the factors contributing to their behaviour and how it impacts others | | |
| Have program participants and victim/survivors responded positively to the program (enrolment, attendance, completion, satisfaction)? (I4) | As per the process evaluation question, plus: number of enrolments, attendance rates and completion rates | | |
| What are the drivers for effective participant engagement in the programs? Does this differ according to the different cohorts? (I5) | Reasons for the increase in people accessing the programs | | |
| | Reasons for engagement in the program | | |
| What is the impact of the program on victims'/survivors' perceptions of safety? (I6) | Increase in feelings of safety and support among people who experience violence | | |
| What were the barriers and facilitators to the programs being integrated into the broader service system? (I7) | Stakeholders' views on system barriers and facilitators | | |
| What impact has the program had on the management of risk associated with this cohort? (I8) | Providers' use and experience of MARAM (risk assessment framework) | | |
| | Decrease in use of violence by perpetrators and women who use force | | |
| What impact has the program had on referral pathways and information transfer between community services and relevant authorities? (I9) | | | |
| What impact has the program had on the confidence, knowledge and skill of the case management and service delivery workforces in supporting the target cohort in the community? (I10) | Case managers report feeling confident in undertaking their role | | |
| Are key stakeholders, including the program workforces, supportive of the model? (I11) | Stakeholders express support for the model | | |
| What would be the impact of ceasing the program (for example, service impact, jobs, community) and what strategies have been identified to minimise negative impacts? (I12) | Identification of the impact and mitigation strategies | | |
Efficiency
| Evaluation questions | Indicators | Measure | Data sources |
|---|---|---|---|
| Has the program been delivered within its scope, budget and expected timeframe, and in line with appropriate governance and risk management practices? (LP) (I13) | Extent to which the program was delivered with fidelity and within planned scope, budgets and timeframes | | |
| Has the department demonstrated efficiency and economy in relation to the delivery of the program? (LP) (I14) | The program could not have been delivered in less time, or with fewer human or financial resources | | |
| | The number of people who use violence referred to the program is as anticipated | | |
| Does the initial funding allocated reflect the true cost required to deliver the program? (I15) | Cost to deliver the program compared with the original budget | | |
Italicised = lapsing program evaluation guidelines
'Program' refers to both the case management program and the perpetrator intervention trials.
Italicised evaluation questions are those added by Deloitte Access Economics, in addition to the lapsing program guidelines and the questions posed by Family Safety Victoria in the RFP.
[1] This question aligns with the lapsing program evaluation question: What is the evidence of the program’s progress toward its stated objectives and expected outcomes, including alignment between the program, its output, departmental objectives and any government priorities?