Over the past three years, the APS program in Nevada has searched high and low for a Quality Assurance (QA) best-practice handbook for Adult Protective Services (APS), but no such handbook exists. Hundreds of articles outline QA practices for manufacturing and health care, but our searches never yielded much in terms of QA practices specifically for APS. Through developing our own QA program, we have adopted procedures to collaborate, improve, and promote the Nevada APS program. We are happy to share what we have learned with other APS programs.
Commitment to QA
Why develop a QA process? Why is it worth investing time and effort into a QA process? We felt that a QA process would lead to:
- improved services for clients;
- accountability for staff and the program;
- policy and program improvements;
- identification of service gaps for clients and identification of staff training needs.
Create and Implement the Process
We began by defining QA standards and goals and created policies and procedures aligned with them. For example, we created a QA policy that delineated the timeline for supervisors to submit case file reviews, the number of reviews to complete, and the types of cases that should be reviewed. Along the way, QA staff solicited input from program staff on which standards and goals made sense in practical application. Finally, we created QA tools to be used by program and QA staff.
One QA tool is the Case File Review (CFR) Form, which is designed to help maintain consistency statewide. For example, one question asks supervisors to review whether the “Client has been offered information on community resources.” Each question is designed to give supervisors and QA staff the information needed to determine whether the investigator is following policy and procedures appropriately. QA staff created the Case File Review Form Instructions to ensure that supervisors statewide answer the questions consistently. QA staff also spot-check a random selection of submitted CFRs each month.
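The monthly spot check described above amounts to drawing a random sample of the submitted CFRs. A minimal sketch of that selection step, assuming hypothetical CFR identifiers and a hypothetical sample size (the source does not specify how many reviews are spot-checked):

```python
import random

def select_spot_check_sample(cfr_ids, sample_size, seed=None):
    """Randomly select a subset of submitted CFRs for monthly spot-check review."""
    rng = random.Random(seed)
    # Never request more reviews than were submitted this month.
    k = min(sample_size, len(cfr_ids))
    return rng.sample(cfr_ids, k)

# Hypothetical example: pick 5 of this month's 40 submitted case file reviews.
monthly_cfrs = [f"CFR-{n:04d}" for n in range(1, 41)]
sample = select_spot_check_sample(monthly_cfrs, 5, seed=42)
```

Seeding the generator is optional; it simply makes a given month's selection reproducible if the draw ever needs to be audited.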
Figure 1 shows the Quarterly Case Review Meeting Process. We integrated the QA process into the program through small steps. We implemented changes in phases. We organized short- and long-term goals strategically and created a clear timeline of the QA process to reduce stress for program staff.
Compiling and Measuring Results and Turning Case File Review Responses into Meaningful Statistics
We weighed the best tracking system for the QA unit and the APS program, considering cost and availability, how we wanted to present the data, and what fit our program best. Using a spreadsheet to track and trend results, we established how to compile data, measure results, and interpret findings in a way that is meaningful and understandable to the program. Each month we track the answers to every question on the CFR forms submitted by program staff, and the results feed a dashboard that reflects trends. We document the steps of how we track the results and share a presentation file with APS program staff each quarter.
Figure 2 shows an example of the dashboard reflecting quarterly trends pulled from a spreadsheet.
Data to Action
Once we have the information, what do we do with it? We compile the CFR answers statewide into percentages: for each CFR question, the percentage of applicable answers that were "yes." This compiled data is important for implementing needed changes in the APS program. Data tells the story of the great work accomplished by the program, and it can also be used to give positive praise, identify needed training, and pinpoint areas needing improvement.
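The core calculation here, percent of "yes" among applicable answers for each question, can be sketched as follows. The question texts and responses below are hypothetical stand-ins; "n/a" is assumed to mark a question that did not apply to a given case:

```python
def yes_rate(responses):
    """Percent of 'yes' answers among applicable (non-'n/a') responses to one question."""
    applicable = [r for r in responses if r.lower() != "n/a"]
    if not applicable:
        return None  # the question did not apply to any reviewed case
    yes = sum(1 for r in applicable if r.lower() == "yes")
    return round(100 * yes / len(applicable), 1)

# Hypothetical quarter of CFR responses, keyed by question text.
quarter = {
    "Client offered information on community resources?": ["yes", "yes", "no", "n/a", "yes"],
    "Contact made within required time frame?": ["yes", "no", "no", "yes"],
}
dashboard = {question: yes_rate(answers) for question, answers in quarter.items()}
# First question: 3 "yes" out of 4 applicable answers -> 75.0
```

Excluding the "n/a" responses before dividing is what makes the percentages comparable across questions that apply to different numbers of cases.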
And, of course, we share the results! Because establishing time frames for sharing results is important, we created specific timelines to share our results every quarter, or as needed. We also established how results are shared: via email and within a standing meeting, with management, supervisors, and all APS staff. We found it important to establish where results are saved and who has access to them; we save APS QA results in a cloud storage drive that both the program and the QA unit can access.
Action to Change
To increase action to change, QA has adopted a new procedure in which QA staff identify specific areas of improvement for the program. This lessens the stress and effort for the program by narrowing the focus to those specific areas. We also adopted the practice of providing a “shout-out” for an area improved since the previous quarter, along with four areas that need improvement, or a process or practice change that may need to be considered to achieve sustainable change in a particular area.
Figure 3 shows QA quarterly recommendations that reflect key findings and provide strategy recommendations to the program.
Solicit Feedback, Be Open to Revisions, and Adjust Process as Needed
It is important that QA staff seek feedback from program management and staff. The goal is to make continued improvements to the program, and if a QA practice is not working or needs to be changed, QA will adjust as needed. For example, when Elder Protective Services changed to APS, we revised the CFR form and instructions to address the unique needs of the newly served population. We maintain open and honest communication with the APS program, which makes it easier to give and receive feedback.
Understanding the Potential Drawbacks in Developing a QA Process
We evaluated the drawbacks of establishing a QA process. We understand that a QA process can be time-consuming and that there can be pushback from program staff. We are careful to ensure that QA goals stay focused on process improvement and on the final data results. It is important to realize that both QA staff and program staff are accountable for issues uncovered through QA data and findings. The results need to be used to make meaningful changes to the program and its practice. We continually ask ourselves: what do we do now with the data we have uncovered?
Nevada’s experience creating a QA unit and implementing QA practices has been very positive. The practice of meaningfully engaging APS program management, supervisors, and staff has paid off. Staff understand the need for the QA procedures, and supervisors and managers appreciate the information that the QA procedures produce. The information has helped the program identify training needs, provided data to APS leadership to support needed policy changes, and identified client needs. Nevada’s QA process helps to ensure that their APS program is guided by data and is constantly evolving and improving for the sake of its workers and clients.
What did you think of this blog post? Take our five-question satisfaction survey to let us know!