February 20, 2009

Assessment of Mental Health Post Payment Review
Prepared for the Illinois Dept. of Human Services
Division of Mental Health

Parker Dennison & Associates, Ltd.
480.419.4147 Phone
480.247.5100 Fax

RustyD@ParkerDennison.com

www.ParkerDennison.com

Executive Summary

At the direction of the Department of Human Services/Division of Mental Health (DHS/DMH), the Illinois Mental Health Collaborative for Access and Choice (Collaborative) initiated post payment review (PPR) of Medicaid and non-Medicaid fee for service claims in September 2008. In response to growing fee for service billing and consequent Federal audit risk for Medicaid claims, and the heretofore absence of any monitoring of non-Medicaid fee for service claims, the process implemented by the Collaborative represented a significant enhancement to the mental health system's audit risk management. Key changes included:

  • Increased Frequency of PPR-Audits increased from once every three years to once per year, recognizing that three years created the potential for unreasonable audit risk in a fee for service environment.
  • Sample from Medicaid and non-Medicaid Claims-Previously, non-Medicaid funded services, representing a significant portion of funded services, were never reviewed for consistency with applicable rules and standards.
  • Prepare for Extrapolation-At DHS/DMH direction, the PPR sample size has been increased to have validity for purposes of extrapolation should that prove necessary. This is in line with Federal OIG practice where audit findings are extrapolated at the provider level.
  • Enhanced Auditor Credentials-The Collaborative enhanced the credentials of the reviewers to the LPHA level meaning that all reviewers are licensed at the independent practice level, thereby ensuring all audits are viewed through the highest standard of clinical training.

Through December 31, 2008, a total of 3,070 claims had been audited from 32 providers statewide. As expected with any new process, there has been feedback from providers and trade groups regarding the process and results. DHS/DMH requested that Parker Dennison and Associates, Ltd. (Parker Dennison) conduct a detailed assessment of all aspects of the post payment review process. Information sources for the review included focus group feedback from 25 providers statewide; interviews with Regional staff and the Collaborative; review of policies, forms, training materials, and protocols; review of comparable Federal Office of Inspector General (OIG) audits; and review of applicable state and Federal Rules and guidance. Analysis of preliminary audit results data was also completed.

Focus of Post Payment Review

Parker Dennison found the focus of the PPR process to be very appropriate and consistent with elements cited in comparable Federal OIG audit reports from Illinois and other states. Not all elements represented in OIG audit results can be found in the Illinois audit tool and DHS/DMH should consider periodically rotating in additional or different elements. There does appear to be a limited opportunity for DHS/DMH to identify some elements of the audit as 'procedural' (such as "method of communication" on the mental health assessment) that would necessitate correction but not recoupment. Since there is an inconsistent definition of 'procedural' elements as evident in OIG audit reports, DHS/DMH must be cautious in applying this interpretation.

Provider Training & Communication

Materials reviewed indicate that the notice and training offered to providers regarding the revised post payment review were adequate, though not ideal. The training offered regarding PPR was comparable to that offered in many states but was not reflective of best practice. Primary omissions included the lack of detailed interpretive guidelines, the lack of a structured, ongoing 'frequently asked questions' process for providers, and the lack of ongoing aggregate feedback and communication with providers after the audits began.

The Collaborative's Audit Staff Resources

A review of credentials found that all PPR audit staff are credentialed at the independent practice level (LPHAs), and have experience in Illinois community mental health. In addition, two of the auditors are Certified Recovery Support Specialists. Feedback from providers indicated a high degree of satisfaction with the professionalism, courtesy, and clinical knowledge of the auditors. The initial audit-specific training of auditor staff was adequate but could be improved by an expanded inter-rater reliability process, involvement of DHS/DMH Regional staff for cross training purposes, and a competency scoring process inclusive of an observed audit at the conclusion of the training. Ongoing training should include periodic inter-rater reliability review.

Post Payment Review Process

Feedback directly from providers that had been audited under the PPR process was the primary source of information for this portion of the review. Wherever practical, Parker Dennison verified an issue before including it in the findings or recommendations. The majority of concerns regarding the PPR process were related to procedural issues regarding how the audit was conducted. Concerns regarding the disruption to staff and client services associated with the unannounced nature of the PPR review are valid and DHS/DMH should review this policy going forward. Some inconsistencies between reviewers regarding the audit protocols and provider staff participation were noted, as were concerns about the lack of provider training regarding interpretive guidelines. Opportunities for process improvement were noted such as improving the detail of the final audit report to the provider so that they may use it more effectively for their own process improvement and for appeals if necessary.

Analysis of PPR Results

Parker Dennison was able to review and analyze two new reports of PPR results through December 31, 2008. These reports, not yet out of internal beta review but deemed accurate, indicate both the significance of the PPR issue for the Illinois mental health system, and the powerful tool that PPR offers to support system-wide process improvement and the management of audit risk. Additional detail in reporting, and training/distribution to providers are noted as important improvements necessary to enhance ongoing utility of the data. Results from the audits of the first 32 providers indicate a statewide average of 55% of reviewed claims unsubstantiated by documentation. Actual (non-extrapolated) value of these unsubstantiated claims is $101,937 or an approximate average of $3,186 per audited provider. If this pattern of results continued with the remaining providers in the network, and extrapolation was immediately applied, the negative impact on funding would be devastating. However, the data also indicated that the overwhelming majority of these unsubstantiated claims could be addressed by improvement in relatively few elements. For example, correcting just the top two most frequently cited deficiencies would reduce the statewide average for unsubstantiated claims from 55% to 29%, while correcting the top seven elements would reduce it to just 12%. The majority of the most frequently cited elements are clear absent/present issues which would suggest that they should be correctable through focused training and improved internal quality assurance measures at the provider level. Significantly, the most frequently noted deficiencies on the Illinois PPRs are also frequently cited in Federal OIG audit reports for recoupment. Lastly, the data available did not indicate conclusive evidence of inter-rater reliability concerns, though this should be consistently monitored via the Collaborative's internal quality monitoring process and reported to DHS/DMH.

Summary

Overall, Parker Dennison found the DHS/DMH post payment review process as implemented by the Collaborative to be above average, especially considering reviews have been occurring for only five months. Nonetheless, the assessment identified significant improvements that should be considered in order to move the Illinois DHS/DMH model closer to best practice.

In addition, preliminary results from the completed post payment reviews suggest a very significant unsubstantiated claim problem that requires immediate and thoughtful process improvement. Given that virtually all of the elements audited by the Collaborative can be found cited in OIG audit findings from other states, inattention to improvement would likely place Illinois at compounding risk of adverse OIG determinations at some point in the future.

Background and Context

Consistent with the role that the Illinois Medicaid Authority, the Department of Healthcare and Family Services (HFS), has delegated to the Department of Human Services/Division of Mental Health (DHS/DMH), the Bureau of Accreditation, Licensure, and Certification (BALC) has historically conducted certification and post payment review for Medicaid claims on behalf of DHS/DMH. As part of DHS/DMH's preparation for effective stewardship in a fee for service reimbursement environment and in response to increased scrutiny of Medicaid claiming across the country by the Federal Office of Inspector General (OIG), DHS/DMH included an enhanced post payment review process in its procurement of an Administrative Services Organization (ASO) in FY 2008. As part of its contract with DHS/DMH, the selected ASO, ValueOptions doing business as the Illinois Mental Health Collaborative for Access and Choice (the Collaborative), was contracted in part to:

  • Implement DHS/DMH Policy-The Collaborative does not set policy but rather administers policy set by the state authority including DHS/DMH. By contract all criteria, forms, correspondence, and training materials used to administer policy must first be approved by DHS/DMH.
  • Increase Frequency-Increase the frequency of post payment review (PPR) from once every three years to annually for all DHS/DMH contracted providers. This was necessary because three years of claiming in a fee for service environment created an excessive potential audit risk for individual providers and the mental health system if there were inappropriate service claiming practices.
  • Sample from Medicaid and non-Medicaid-Expand PPR to include fee for service claims from both Medicaid and state funded services. Historically, BALC conducted PPR on Medicaid claims only, meaning that a significant portion of funded services was never reviewed for consistency with applicable rules and standards.
  • Prepare for Extrapolation-Increase the PPR sample size to allow DHS/DMH to extrapolate results within a provider should it choose to do so. Federal OIG has historically extrapolated their audit results across all Medicaid claims at the state level but has notified states that it is now extrapolating results at the provider level.
  • Enhanced Credentials-Enhance reviewer credentials to 100% Licensed Practitioners of the Healing Arts (LPHAs). As the Federal OIG has increased its focus on issues of medical necessity and distinguishing between habilitation and rehabilitation, DHS/DMH wanted review staff at the independent license level to conduct its PPR to be in the best position to counsel providers and defend claims as needed.

Beginning in September 2008, the Collaborative formally began conducting post payment reviews of the DHS/DMH contracted network. While early feedback from the provider community including the provider trade groups was positive, DHS/DMH is receiving feedback from providers with questions and concerns regarding the PPR process. Accordingly, DHS/DMH requested that Parker Dennison and Associates, Ltd. (Parker Dennison) conduct an assessment of the DHS/DMH post payment review process as administered by the Collaborative.

Objectives

The scope of Parker Dennison's efforts included:

  • Understand specific provider feedback regarding what is working in the post payment review process, what needs improvement and/or detailed review, and their recommendations;
  • Review the model of post payment review for consistency with best practices for fee for service, Medicaid and state funding models;
  • Review the policies, processes, and tools used to conduct post payment review for consistency with applicable State/Federal Rules or guidelines;
  • Review the Collaborative's internal policies and processes for audit staff training, including ongoing inter-rater reliability;
  • Review the coordination and oversight of the Collaborative's post payment review process by DHS/DMH;
  • Review post payment review results by provider and in aggregate for patterns/trends; and
  • Provide written feedback and recommendations regarding the PPR process.

Method

Two Parker Dennison consultants, Rusty Dennison, MA, MBA, and Lee Ann Slayton, MS, conducted this assessment. Information regarding the post payment review process was gathered through:

  • Statewide teleconference with Parker Dennison consultants and 25 providers representing all five regions;
  • Review of correspondence (including letters and emails) regarding PPR from the provider trade groups and providers;
  • Interviews with DHS/DMH staff including regional staff who have participated in portions of the PPR process with Collaborative staff;
  • Interviews with review staff from the Collaborative;
  • Review of Collaborative written materials including forms and policies/procedures;
  • Review of Collaborative review staff related materials including verification of staff credentials, staff training materials, internal supervision, and inter-rater reliability efforts;
  • Review of communications and training materials for providers regarding the PPR process;
  • Analyses of PPR results by provider/region/aggregate;
  • Examination of all appeals including results; and
  • Review of relevant portions of the Illinois Medicaid Rehabilitation Option State Plan, Illinois Rule 132, Federal guidance where available, and reports from mental health related Federal OIG audits.

This report summarizes the findings and recommendations resulting from the above information sources.

Post Payment Review Overview

Why Post Payment Review-Medicaid

For services paid by the Medicaid program, Federal guidelines require the state's single Medicaid Authority to conduct surveillance to identify and remediate fraud and abuse. The importance of this requirement was enhanced with the passage of the Deficit Reduction Act of 2005 (DRA), which required the Centers for Medicare & Medicaid Services (CMS) to establish the Medicaid Integrity Program.1 Currently funded at $75 million annually, this program was designed to support states in effective efforts to combat fraud and abuse.

The DRA requires Federal CMS to hire contractors to conduct the following activities:

  • Review Medicaid providers' actions to determine if fraud or abuse has occurred;
  • Audit claims for services;
  • Identify overpayments; and
  • Educate providers, beneficiaries, and others with respect to payment integrity and quality of care issues.

The CMS contractors are retained in part to implement the National Medicaid Audit Program, whose objectives are to ensure that paid Medicaid claims are:

  • For services provided and properly documented
  • For services billed properly using the appropriate procedure codes
  • For covered services
  • Reimbursed appropriately according to State policies, rules or regulations

The audit risk potential for Illinois DHS/DMH is significant. DHS/DMH's FY2008 Medicaid fee for service budget is approximately $184 million and is expected to grow in FY2009 due to an increase in the number of Medicaid eligible individuals, state efforts to increase appropriate Medicaid billing, and various Federal initiatives designed to expand Medicaid eligibility and temporarily increase Federal financial participation (FFP) rates.

Why Post Payment Review-Non-Medicaid

In an effort to maintain access to a uniform benefit package for all persons regardless of their Medicaid eligibility status (except for a few specifically defined 'non-Medicaid services'2), DHS/DMH requires that providers adhere to Rule 132 (Illinois' Mental Health Medicaid Rule) for fee for service billed services. In part, this has allowed DHS/DMH to maintain a substantial portion of its community expenditures for individuals not covered by Medicaid, enabling access to earlier intervention. While this position clearly reflects the values of the Division of Mental Health and is consistent with the early intervention tenets of the New Freedom Commission, it is a position increasingly rare among other states. In addition, following trends in other states, the need for non-Medicaid resources is expected to increase due to the economic downturn, while state funding continues to be reduced. Lastly, as Medicaid expenditures increase, DHS/DMH will face increasing pressure to shift non-Medicaid service dollars into paying the state's portion of Medicaid costs (50%).

DHS/DMH's FY2008 non-Medicaid fee for service budget is approximately $72.5 million. While the BALC has historically conducted PPR on Medicaid claims, non-Medicaid claims have not been subject to a uniform program of post payment review in recent years. Therefore, this highly valuable resource that is central to DHS/DMH's vision of a comprehensive mental health system has effectively gone unmonitored. There are no data to suggest that error rates on non-Medicaid claims are less than those on Medicaid claims, and in fact the lack of monitoring arguably created an atmosphere in which providers' internal controls would have been better focused on Medicaid claims.

Post Payment Review Focus

Compliance with Federal/State Regulations

In order to ascertain the focus of audits conducted by the Federal Health and Human Services OIG, Parker Dennison reviewed reports from mental health audits conducted in Illinois as well as other states. Reports included the most recent audit of Medicaid services in Illinois community mental health providers, as well as audits conducted in Iowa (three separate audits related to mental health services), Georgia, and Ohio. All six reports reiterated that the scope of the audits included ensuring compliance with all of the following:

  • The Federal requirements of the State Medicaid Manual;
  • The State requirements of the approved Medicaid State plan;
  • The State Administrative Code or Rule governing implementation of the Medicaid State plan; and
  • Payment rate schedules.

Elements Requiring Recoupment

Audited elements of claims and supporting information varied somewhat from state to state but were nearly uniformly consistent with the elements reviewed in the Illinois audit. The most recent Illinois OIG audit, which was conducted in 2006, cited the following as the scope of review for each claim selected3:

  • Reviewed the supporting documentation including assessments, treatment plans, medication authorizations, and admission and service notes to assess overall compliance with regulatory requirements;
  • Confirmed that services were paid accurately based on correct payment rates and service locations;
  • Verified client eligibility for services;
  • Confirmed that services were furnished by qualified staff at appropriately certified providers; and
  • Determined whether provider documentation supported the provision of services for purposes of direct client care and for diagnosing, treating, preventing, or minimizing client physical or mental health impairments.

Reports from other states noted additional audited elements including:

  • Medical necessity;
  • Habilitation versus rehabilitation service provision;
  • Services targeted for the benefit of the covered person; and
  • Services inclusive of active treatment and direct intervention.

In addition to the consistently reviewed elements above, the scope of audits was detailed in the nature of findings. For example, in the Illinois audit report, findings which were extrapolated for payback included:

  • The provision of services was not documented;
  • The furnished services did not involve direct patient care, or were not for the purpose of diagnosing, treating, or preventing impairment to an individual's physical or mental health;
  • Treatment plans were not signed or reviewed by the appropriate staff;
  • Incorrect service payment rates were used;
  • Treatment plans did not support furnished services;
  • The number of service units was not supported; and
  • Staff were not appropriately designated as required.

Findings

  1. Scope of DHS/DMH PPR is Appropriate-After reviewing the specific elements audited by the Collaborative on behalf of DHS/DMH, it appears that they are well within the scope likely to be audited by the Federal OIG. Though Parker Dennison found the audit tool to be reasonably thorough, it does not contain all elements found to be audited in at least some OIG audit results from other states.
  2. OIG Audits Do Not Seek Recoupment On All Elements-Some OIG audit reports, including Illinois', note audit findings that do not comply with a state's Medicaid Administrative Rule, but are deemed to be 'procedural' and do not require a payback to Federal CMS for that claim. There was no discernable pattern that Parker Dennison could identify regarding which parts of a state's Medicaid Rule OIG considered worthy of a recoupment versus those that were considered 'procedural'. In fact, in at least one instance the same element was viewed as procedural in one state but was on the recoupment list in another.

Recommendations

  1. Sample Additional Elements-Since the PPR tool does not cover all elements that are noted in OIG audit reports of community mental health providers, DHS/DMH may consider periodically rotating in additional or replacement elements (for items with consistent compliance) such as medical necessity, habilitation, active treatment, and family services for the direct benefit of the covered individual. In order to be consistent with OIG review patterns, adverse findings in these elements likely should be considered for recoupment.
  2. Identify Elements Considered Procedural-DHS/DMH could explore designating some Rule 132-based audited elements as 'procedural', meaning they would not require recoupment if found deficient (though correction would still be required). Critical to this recommendation is a careful assessment of risk by DHS/DMH. Typically the 'procedural' element would be a subpart of a given item or an ambiguously defined element. Examples noted by providers that might fit into this category include an 'incorrectly' identified method of communication ("English" instead of "speaking" or "signing"), or the absence of signatures from both a QMHP and an LPHA on a treatment plan when the responsible clinician is an LPHA (a credential inclusive of the QMHP designation). While Parker Dennison is generally supportive of this recommendation, since there is no definitive guidance from CMS or OIG and the pattern evident in OIG audit reports is inconsistent, DHS/DMH should be conservative regarding how many of the specifically defined Rule 132 elements it allows to be designated 'procedural'.

Provider Training

Parker Dennison has found provider training to be a best practice to maximize the overall 'success' of the mental health system, whether that success is measured as favorable audit results, consumer satisfaction, or recovery outcomes. Being understandable, specific, and unambiguous about required processes and rules through training sets the stage for monitoring and detailed feedback to improve the results of the process or compliance with the rules. In Parker Dennison's experience, those mental health systems that have been transparent in their audit processes by specifically sharing audit processes, forms, interpretive guidelines, and individual as well as peer group results, tend to more rapidly improve and maintain overall system performance.

A brief survey of the amount of training done in other states regarding post payment review found a great deal of variability. Approaches generally fell into one of three categories:

  • No discernable training by the state on PPR or on underlying Rules;
  • Training on the underlying Rules but little or no training on the PPR process, tools or interpretive guidelines; or
  • Training on the underlying Rules and on the PPR process, tools and interpretive guidelines.

In most states, provider trade associations were also notably active in organizing training regarding underlying Rules, documentation practices, and compliance in general.

Findings

The PPR-related training provided by DHS/DMH and the Collaborative was adequate and certainly more than many states offer. However, it was not reflective of best practices and can be improved upon.

  1. Training Related to the Underlying Rule Was Provided-DHS/DMH provided training to providers on each of the last two changes to Rule 132. The trainings gave an overview of the Rule, summarized the changes, and provided reference documents or links to documents for additional information.
  2. Training Related to the PPR Process Was Provided-DHS/DMH and the Collaborative offered training in January 2008 that gave a brief look at the upcoming PPR, and again in August 2008 offered more detailed training regarding PPR. The detailed training covered:
  • Timeline;
  • Notice that PPR will continue to be unannounced and in conjunction with BALC whenever possible;
  • Overview of the PPR process;
  • Posting of process/agenda and tools on the Collaborative website;
  • Clarification of roles: BALC, DHS/DMH, Collaborative;
  • Information on fidelity/clinical reviews held in conjunction with PPR;
  • Introduction of staff;
  • Overview of reporting process, including timelines for sending in additional information;
  • Overview of appeals process;
  • Statement that PPR findings will not be extrapolated at this time.
  3. PPR Forms, Protocols, and Tools Were Provided-DHS/DMH and the Collaborative made available to providers the PPR forms, protocols and tools on the Collaborative website. It should be noted that not all of these documents were on the website at the time of the first audits. Documents included:
    FY08 Provider Monitoring Tools

    • ACT Fidelity Instructions
    • ACT Review Tool
    • CST Fidelity Instructions
    • CST Review Tool
    • Clinical Record Review Tool - Percent Version
    • Clinical Record Review Tool - YES/NO Version

    FY09 Provider Monitoring Tools

    • ACT Fidelity Instructions
    • ACT Review Tool
    • CST Fidelity Instructions
    • CST Review Tool
    • Clinical Practice Review Tool
    • External Protocols for Post-Payment and Clinical Practice and Guidance Reviews
    • Mental Health Assessment Required Elements
    • Post-Payment Review Tool
  4. Interpretive Guidelines to the New PPR Were Not Provided-While interpretive guidelines based on Rule 132 have been provided for Rule 132 Certification Reviews, DHS/DMH and the Collaborative did not provide interpretive guidelines to match the PPR review forms/tools. These interpretive guidelines do exist and are used as part of the training for Collaborative audit staff. Making these guidelines available via training would address a significant issue voiced by providers who felt that the Collaborative staff were interpreting elements differently than BALC had done previously.
  5. No FAQ or Aggregate PPR Feedback to Providers-DHS/DMH has a history of providing a forum for Frequently Asked Questions (FAQs) for providers regarding Rule 132, ROCs, and other processes or procedures that come up from time to time. This process has not been implemented for post payment review, so there is no uniform 'clearing house' for questions from the field regarding PPR, nor is there a written record available to providers documenting the answers to questions. Instead, providers have been referred to the Collaborative, DHS/DMH Regional Staff, and/or Central Office Staff, creating an obvious opportunity for inconsistent answers. Though it is still early in the PPR implementation process, DHS/DMH and the Collaborative have not provided aggregate feedback to the providers summarizing patterns of PPR scores, frequent issues/problems, and the associated solutions. This inhibits the process improvement potential that such feedback and ongoing 'training' would otherwise provide.

Recommendations

  1. Offer PPR Training for Providers Again-The PPR training materials from the August 2008 training should be reviewed and updated based on the knowledge gained from field application, and be offered again prior to the beginning of the next fiscal year. These training materials should also be made available on the Collaborative website at all times as reference for provider staff. Information regarding sampling methodology should also be included, especially if DHS/DMH anticipates implementing extrapolation any time in the next fiscal year.
  2. Include Interpretive Guidelines in Training Materials-DHS/DMH should update the interpretive guidelines for the forms and tools audit staff use, make them available to providers, and offer provider training regarding their application for provider staff training and self-auditing.
  3. Maintain a PPR FAQ Process-DHS/DMH and the Collaborative should establish a dedicated, well-publicized PPR inquiry e-mail address for questions from the field. This process and content must also be coordinated and consistent with the existing Rule 132 inquiry process to avoid confusion regarding interpretation issues. Questions should be triaged between DHS/DMH and the Collaborative as part of DHS/DMH's policy setting role and its oversight of the PPR process, and responses should be documented and posted on the Collaborative website. Frequency of these postings will depend on the volume and urgency of the questions.
  4. Provide Aggregate Feedback to Providers-DHS/DMH should make reports available to providers which summarize PPR scoring by provider, region, and statewide. Periodic training calls with providers should be held to review the patterns evident in the reports and provide technical assistance regarding performance improvement, whether that action is internal to providers or system wide.

Collaborative Audit Staff Resources

Audit staff credentials, training, supervision, and performance feedback all contribute to the quality and effectiveness of the post payment review process. In Parker Dennison's experience, PPR best practice is reflected when audit staff are:

  • Clinically trained to at least the QMHP level;
  • Experienced with community mental health;
  • Thoroughly trained including some form of competency review;
  • Supervised regularly including a periodic, measured inter-rater reliability process; and
  • Evaluated on their skill in interfacing with providers, as measured by the process, professionalism, and clarity of their audits.

DHS/DMH has made significant progress toward PPR best practice through its policies as implemented by the Collaborative. Though it is infrequent to find a PPR process staffed, prepared, and supervised at the full best practice level, optimal practice would set these as the goal. Acknowledging that the importance of these practices rises as the financial consequences to the provider increase, it should also be noted that lack of full adherence to all of these practices should in no way be interpreted as singularly invalidating the audit findings.

Findings

  1. Audit Staff Have Exceptional Credentials-DHS/DMH required audit staff from the Collaborative to minimally meet the credentials to qualify as a QMHP. Recognizing the criticality of these positions, the Collaborative, of its own volition, proposed that the audit staff be qualified as Illinois LPHAs, fully licensed at the independent practice level. DHS/DMH accepted this proposal and Parker Dennison verified that all audit staff were qualified as LPHAs. In addition, all auditors have direct experience with community mental health and, currently, two are also credentialed as Certified Recovery Support Specialists.
  2. Training Curriculum for Audit Staff is Adequate but Needs Improvement-The training curriculum for Collaborative audit staff was thorough and inclusive of expected areas. However, the curriculum seemed heavy on didactic training with only a modest amount of experiential training. Fairly detailed interpretive guidelines were provided for the forms and tools, yet some items on the audit form that providers noted as being inconsistently interpreted did not have associated interpretive guidelines, which likely contributes to the problem.
  3. Competency Review and Inter-rater Reliability is Inadequate-In review of the Collaborative's audit staff training program, there was no formally defined competency testing/review process at the conclusion of training, and only a very modest inter-rater reliability review. There was no evidence of ongoing inter-rater reliability review between audit staff once their original training was complete.
  4. Ongoing Supervision of Audit Staff is Adequate but Needs Improvement-All Collaborative audit staff report to the Director of Provider Relations and receive weekly supervision via face-to-face or phone meetings. This is primarily accomplished through a weekly call co-staffed by DHS/DMH and the Collaborative supervisor, in which the audit staff bring up questions and issues they are encountering in the field. While this is useful, there is no formal record of issues and outcomes, and the process does not address comparative performance or data-based feedback (e.g., audit scores, satisfaction survey results).

Recommendations

  1. Maintain LPHA Staffing Credentials-Though audit-specific training and supervision can be improved, the fact that audit staff are LPHAs creates tremendous clinical credibility and positions them to be a significant factor in practice change. LPHAs in this role can also be useful to validate authorizations, target population determinations, LOCUS scores, and other qualitative measures that should be monitored from time to time as part of a system-wide quality management system.
  2. Enhance Audit Staff Training Curriculum-The training curriculum for audit staff should be updated to reflect experience from the field and provider feedback. Specifically, changes should include:
    1. More experiential training with peer review and feedback;
    2. Revised and more detailed interpretive guidelines;
    3. One or more directly observed audits; and
    4. Involvement of DHS/DMH Regional and Central Office staff for cross training and oversight.
  3. Formalize Competency Review and Inter-rater Reliability-Staff who are new to the PPR audit process should successfully complete a documented competency review before initial training is considered complete. The competency review should test specific skills and knowledge necessary to be proficient in the review process. This should include an inter-rater reliability component that measures the auditor's ability to score audit items consistent in interpretation with a minimum standard.
  4. Enhance Supervision of Audit Staff-Group supervision of audit staff is a valuable and time efficient method. It is particularly helpful in offering support among peers and in encouraging common understanding of interpretations. This should be supplemented with additional techniques such as:
    1. Periodic individual supervision, especially to discuss feedback from the field or personal supervision issues;
    2. Direct observation of the audit staff in situ by the supervisor;
    3. Periodic re-scoring of inter-rater reliability; and
    4. Review and discussion of data-based elements such as aggregate scoring variations among auditors and audit satisfaction survey results.

    All supervision should be documented either through minutes of group supervision or individual supervision notes, all of which should be available for periodic review by DHS/DMH as part of its contract oversight of the Collaborative.
  5. Include PPR Survey Results and Inter-rater Reliability in QI Process-The Collaborative currently surveys each provider on their satisfaction at the end of each onsite PPR visit. These surveys should be analyzed, reported, and where indicated, addressed through the Collaborative's internal quality improvement (QI) plan and process. Similarly, as the Collaborative implements a process for ongoing inter-rater reliability for audit staff, those results should flow into the internal QI process. The overall evaluation of these processes should ultimately be reported to DHS/DMH as part of their contract oversight.

PPR Process

This section primarily reflects specific feedback obtained from providers that have experienced a post payment review by the Collaborative. Similar to Parker Dennison's experience in other states, comments and issues raised by providers were mixed. Given that PPR was just implemented approximately five months ago and less than 50% of the DHS/DMH contracted network has had a PPR audit, feedback was more positive than what Parker Dennison would have expected at this stage of the process.

Provider Feedback

Comments have been grouped into categories.

  1. Audit Staff Clinical Knowledge, Professionalism, Courtesy-Multiple comments, letters, and emails noted an overwhelmingly positive impression of audit staff. Even providers who raised issues with the audit process commented that they did not get enough time to benefit as much as they could have from the technical assistance of the audit staff.
  2. Unannounced Audits-The consequences of unannounced audits were a frequent topic of comment. Issues included the time and difficulty of gathering medical records from multiple sites, unplanned disruption of clinician and supervisor time resulting in cancelled or changed consumer appointments, key staff away on vacation, and difficulty in setting up stations for review of electronic medical records.
  3. Sample Size-Several providers, especially those with fewer consumers, noted that the sample size for audit was very large. These small providers reported that up to 60-70% of their client records were needed for audit and that this felt like an unusually large amount. Providers noted that they were given no information regarding how the sample size was determined or its confidence level.
  4. Provider Staff Participation During Audit-Several providers expressed a desire to have some of their staff participate in the audit process. Specific requests for participation included:
    1. Assisting to find missing documentation;
    2. Understanding interpretations;
    3. Broader participation in exit conferences, especially with specific examples; and
    4. Review of detailed audit summary sheet (specific charts and findings).

      It was also noted that there is inconsistency among Collaborative audit staff regarding how they will allow provider staff to participate. Also noted was that the audit protocol posted on the Collaborative's website indicated that provider staff would be asked to assist in finding records or missing components, yet this was not a standard part of the audit process.

  5. Lack of Interpretive Guidelines and Associated Training-Many providers expressed concern regarding the manner in which certain audit elements were interpreted. Specific types of concerns included:
    1. Differing interpretation from previous audits by BALC;
    2. Differing interpretation from DHS/DMH Regional staff guidance;
    3. Differing interpretation between Collaborative staff;
    4. Too literal an interpretation of some elements;
    5. Reviewing at too detailed a level for some elements;
    6. Interpretations that appeared to not be supported by Rule 132; and
    7. No opportunity to review interpretive guidelines in advance and no associated training.
  6. Audit Report Not Specific Enough-Many providers expressed concern that the audit report did not provide enough detail for them to use effectively for internal correction or appeal. It was noted that the current version of the audit report does cite the record and the claim but does not identify the specific problem or deficiency. This is especially problematic for items with several parts, the most frequently noted example being the mental health assessment.
  7. Call for Standardized Mental Health Assessment Form-Since the overwhelming majority of audit findings statewide related to the mental health assessment (see Analysis of PPR Results later in this report), several providers requested that DHS/DMH create a standard format that meets Rule 132 standards and that it be required of all providers. As expected, several other providers indicated they would not be in favor of this because it would conflict with their electronic medical record software.
  8. Early Problems with Appeals Process-Several providers expressed confusion about the appeals process, especially during the first couple of months of audits. They cited the lack of detailed reports, which made it difficult to discern what specifically needed to be addressed in an appeal, and differing information from DHS/DMH Regional staff and Collaborative staff.

Findings

  1. Unannounced Audits-The issues providers reported are inherent consequences of unannounced audits. The original purpose of the unannounced audits was to be consistent with BALC practice, especially for those audits where BALC and Collaborative staff have coordinated to be onsite at the same time in an attempt to respond to past provider requests to minimize provider staff time and disruption.
  2. Sample Size-Prior to the initiation of PPR, at DHS/DMH's request, the Collaborative proposed a sampling methodology that would be valid for purposes of extrapolation should DHS/DMH choose to do so at some point in the future. The Collaborative proposed a sample size of 100 claim lines for each provider that would result in a precision of +/- 10% at the 90% confidence level. This is substantially higher than Federal OIG audit sampling. For example, for the last Illinois OIG audit, there were a total of 2,965,413 qualified claim lines and the OIG auditors sampled just 200 claim lines and used that as the basis for full extrapolation of results.
  3. Provider Staff Participation During Audit-The Collaborative's published audit protocol does indicate that auditors should allow provider staff to help find missing documentation. It is also their protocol to allow provider staff (at the provider's discretion) to attend the exit conference. It has largely not been BALC's past practice to allow provider staff to sit in at all times during record review and the Collaborative has generally followed this practice except with those providers with electronic medical records where provider staff may assist for efficiency.
  4. Lack of Interpretive Guidelines and Associated Training-This has been identified previously in this report including specific recommendations for remediation.
  5. Audit Report Not Specific Enough-The providers' feedback on the audit reports appears to be accurate. The attachment to the audit report does list specific claims, consumer RIN, and a reason code, but it does not describe the specific deficient element(s) for those elements that are multipart. As noted, this was reported most frequently as an issue for providers when the deficiency was part of the mental health assessment. Collaborative auditors do have a detailed worksheet for the mental health assessment review which indicates the subpart(s) and whether each subpart complied, but they do not summarize this data at the detail level and therefore do not report it in the audit summary.

    Unfortunately, they also do not maintain this detail data in a database or retain the hard copy, so they do not have the ability to summarize it at this time.

  6. Call for Standardized Mental Health Assessment Form-Parker Dennison understands that this issue has been discussed and explored with DHS/DMH previously. The use of standardized forms is an issue in many other states and the responses range from states requiring a specific form, to providing a standard form but not requiring its use, to no guidance at all. Valid arguments can be raised on all sides though it should be noted that the best form in the world does not ensure appropriate completion, timely maintenance, or otherwise full compliance with Rules if not used appropriately.
  7. Early Problems with Appeals Process-At the time of this assessment, fiscal year to date there had been only three appeals submitted to the Collaborative. The Collaborative appears to have an appropriate appeal flow, a timely tracking log, and is following through with their responsibilities. At this time, it is difficult to discern whether the early processes were problematic. Since deficiencies are not extrapolated at this time, nor do providers have to pay back actual findings yet (though claims must be backed out of the billing system), it would seem that there is no direct harm to providers from any early confusion that might have existed. It is critical however, that the appeal processes (one for Medicaid and one for non-Medicaid claims) be clear to providers, especially should DHS/DMH choose to extrapolate results at some point in the future.

Recommendations

  1. Unannounced Audits-Parker Dennison recommends that DHS/DMH explore changing the PPR audit protocol so that advance notice (probably in the 2 week range) is given to providers prior to a routine audit. Of course, the state must always retain the right to conduct unannounced audits should it be necessary in their judgment. This policy change may require either a change in BALC's protocol or a separation of the PPR audit from the BALC certification visits.
  2. Sample Size-Sample size should not be a one-size-fits-all model as it is currently applied (100 claim lines per audit regardless of the provider's specific universe of claims)4. Rather, the Collaborative should use a statistical tool to determine a sample size unique to each provider's claim line volume for the review period. This individualized sample size should have a uniform confidence level. This will result in smaller providers having fewer claim lines audited while larger providers will have more claim lines audited. It will also require the Collaborative to schedule onsite time based on the number of claim lines to review at each provider. To be consistent with Federal OIG, the Collaborative should use RAT-STATS 2007, Version 2, which is available at no cost through the Federal Department of Health and Human Services, OIG - Office of Audit Services website: http://www.oig.hhs.gov/organization/oas/ratstats.asp. An illustrative sample size calculation follows these recommendations.
  3. Provider Staff Participation During Audit-DHS/DMH should clarify the policy it wishes the Collaborative to consistently follow regarding this issue and the Collaborative's protocol should be modified accordingly. Efficacy would suggest a balance between having provider staff present at all times and potentially engaging in 'negotiation' of each finding, with the practical need of getting assistance from provider staff to find documentation while onsite and not forcing extra administrative follow up work.
  4. Audit Report Not Specific Enough-Parker Dennison also found that, though adequate, the audit summary reports do lack the necessary detail to be most useful. Deficiency coding should map to all subparts of an element as well as have a line for text entry of additional information. For example, a mental health assessment that did not meet criteria might be coded as "2A" where "2" mapped to 'mental health assessment' and "A" mapped to subpart A. The text line after this could be "indicated English as method of communication". Alternatively, the Collaborative could attach copies of their worksheets for claims found to be deficient. The audit summary report must also indicate for each claim found deficient whether that claim is a Medicaid or non-Medicaid claim since each type has a different time frame for appeal. Additionally, the Collaborative should be profiling all PPR scoring at the detail level and reporting profiles of issues to DHS/DMH so appropriate systemic provider training and performance improvement actions can be taken when indicated. A brief sketch of such a deficiency coding structure also follows these recommendations.
  5. Clarify and Communicate Updated Appeal Process-DHS/DMH should review the appeals process and training materials to ensure that it is clear, concise and current, for both Medicaid and non-Medicaid process. If changes are indicated, the Collaborative should be directed to update materials and either re-train providers and/or communicate the revised policy to providers. This revised policy or a reference to it on a website should also be included with each final audit notice where any claims are found deficient.
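
To make the individualized sample size recommendation above (item 2) concrete, the following short Python sketch applies a standard attribute-sampling formula (normal approximation, worst-case 50% error rate, finite population correction) at the 90% confidence and +/-10% precision parameters referenced in this report. The claim-line volumes shown are hypothetical and the sketch is illustrative only; RAT-STATS, or whatever methodology DHS/DMH ultimately approves, may produce different figures.

    import math

    def sample_size(claim_lines, margin=0.10, z=1.645, p=0.5):
        # Minimum sample needed to estimate an error rate within +/- margin at the
        # confidence level implied by z (1.645 is roughly 90%), assuming the
        # worst-case rate p = 0.5 and applying a finite population correction
        # for the provider's total claim-line volume in the review period.
        n0 = (z ** 2) * p * (1 - p) / margin ** 2
        return math.ceil(n0 / (1 + (n0 - 1) / claim_lines))

    # Hypothetical provider claim-line volumes for a single review period
    for claim_lines in (150, 1000, 25000, 3000000):
        print(claim_lines, sample_size(claim_lines))

Under these assumptions, a provider with 150 claim lines would need roughly 47 lines sampled while very large providers level off near 68, which illustrates both why a uniform 100-line sample burdens small providers disproportionately and how individualized sampling can still preserve a uniform confidence target.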
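
Similarly, the deficiency coding suggested above (item 4) could be represented by a simple lookup structure such as the sketch below. The element and subpart labels reuse the mental health assessment example from this report, but the codes and structure are hypothetical rather than the Collaborative's actual report format.

    # Hypothetical deficiency-code lookup for an enhanced audit summary report.
    # Element numbers, subpart letters, and labels are illustrative only.
    ELEMENTS = {"2": "Mental health assessment contains all required elements"}
    SUBPARTS = {("2", "A"): "Method of communication"}

    def deficiency_line(element, subpart, note, fund_source):
        # One report line: coded element/subpart, free-text detail, and the
        # Medicaid/non-Medicaid flag that determines the applicable appeal time frame.
        return (f"{element}{subpart} - {ELEMENTS[element]} / {SUBPARTS[(element, subpart)]}: "
                f"{note} [{fund_source}]")

    print(deficiency_line("2", "A", "indicated English as method of communication", "Medicaid"))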

Analysis of PPR Results

Parker Dennison was able to review two newly generated reports summarizing PPR results, both spanning 9/1/2008 to 12/31/2008. The reports summarize the audit results by provider, region, and statewide for 32 providers with reviews of a total of 3,070 claims. One report focused on summarizing the unduplicated unsubstantiated claims and their associated dollar value while the other provided a frequency distribution of reasons for unsubstantiated claims by reason code. It should be noted that these reports, though deemed accurate, are still in beta review and have not been distributed to Regions or providers as of this date.

Findings

  1. Reports are Useful for Macro Analysis but Limited-Both reports are very helpful in beginning to understand macro issues within the PPR process. However, there are at least two issues that limit their utility for detailed analysis and overall system risk assessment.
    1. As noted previously, detailed reason codes are not present so it is impossible to identify  specific sub-parts of elements that caused deficiencies. This limits focused training and performance improvement.
    2. Neither report breaks down unsubstantiated claims by Medicaid and non-Medicaid. This limits  the analysis of system risk from Federal OIG audits (that only look at Medicaid claims and are extrapolated) versus that from internal/state audits where there are greater degrees of freedom to set policy on recoupment and extrapolation.
  2. Percentage of Unsubstantiated Claims is Considerable-Though variable by Region, when expressed as a percentage of total claims audited, the statewide average for unsubstantiated claims is 55%, meaning that only 45% of audited claims had complete documentation. See Table 1 for a breakdown by Region. Again, it must be noted that the reports did not break down the data by Medicaid/non-Medicaid, though anecdotally, Collaborative auditors report that the overwhelming percentage of claims audited have been Medicaid.
  3. Non-Extrapolated Dollar Value of Unsubstantiated Claims is Modest-Assuming the value of actual unsubstantiated claims only (i.e., without extrapolation), the statewide grand total of recoupable value is $101,937 or an approximate average of $3,186 per audited provider. Unsubstantiated claim value by provider ranged from a low of $400 to a high of $13,582. Again this is highly variable by Region. Table 1 provides a breakdown by Region.

Table 1

DHS/DMH Region    Unsubstantiated Claims (%)    Grand Total Value of Actual Unsubstantiated Claims
Region 1C (N=7) 51% $29,201
Region 1N (N=6) 44% $25,355
Region 1S (N=2) 23% $3,715
Region 2 (N=6) 27% $6,990
Region 3 (N=1) 92% $4,186
Region 4 (N=6) 71% $16,265
Region 5 (N=4) 89% $16,225
Statewide Average/Total (N=32) 55% $101,937
  4. Extrapolated Dollar Value of Unsubstantiated Claims is Devastating-Specific analysis of the extrapolated dollar value of unsubstantiated claims was not possible because Parker Dennison did not have the total claims amount for the audited period for these providers. In addition, the total value would be significantly affected by the Medicaid/non-Medicaid status of the claims unless DHS/DMH were to establish a policy of uniform extrapolation for both fund sources. For purposes of illustration, however, if one were to assume that 100% of claims reviewed were Medicaid AND the above pattern was valid and remains consistent across the remaining 120 providers, given a Medicaid fee for service budget of $184 million, the recoupment value with full extrapolation would be $101.2 million, 50% of which would be owed to Federal CMS.
  5. Specific Audit Elements Comprise the Majority of Deficiencies-When analyzing a frequency distribution of unsubstantiated claims by reason code, clear patterns emerge at both the Regional and statewide level. Statewide aggregate analysis shows only 2 out of 19 elements that fall below a 90% compliance level, and only 7 out of 19 elements below a 95% compliance level. Detail is provided in Table 2 below.

    Table 2

    Element Label Compliance
    Elements Below 90% Compliance Statewide (N=3,070 claims)
    Q2: MHA contains all required elements 66.87%
    Q10: Note contains description of interaction with consumer including response and progress towards ITP goal 81.14%
    Elements Below 95% Compliance Statewide (N=3,070 claims)
    Q2: MHA contains all required elements 66.87%
    Q10: Note contains description of interaction with consumer including response and progress towards ITP goal 81.14%
    Q3: Tx plan is timely and in effect at time of service 92.67%
    Q5: Service activity documented supports volume of service billed 92.83%
    Q11: There is a service note to match the billing date on the claim 92.05%
    Q12: Note description matches the actual service billed 93.71%
    Q16: Service is authorized by the Individual Treatment Plan 94.07%
  6. Improvement in a Few Elements Would Improve Results Substantially-Since one claim can have more than one deficiency, there have actually been 3,407 citations of deficiencies producing 1,695 unsubstantiated claims, indicating that on average each unsubstantiated claim had approximately two deficiencies. Assuming that pattern continues, correction of Q2 and Q10 above could net an improvement of 798 fewer unsubstantiated claims out of 1,695, or in other words a 47% improvement statewide. Illustrated another way, this would drop the unsubstantiated claim level from 55% to 29% statewide.

    If all seven of the elements listed in Table 2 under the 95% compliance threshold were corrected, following the same analysis the impact would rise to 1,330 fewer unsubstantiated claims out of 1,695, thereby dropping the statewide total of unsubstantiated claims from 55% to 12%. The arithmetic behind both estimates is illustrated in the sketch following these findings.

  7. Frequent Deficiencies are Also Cited in OIG Audits-All of the most frequently noted deficiencies in Table 2 above have previously been cited for recoupment in Federal OIG audits in Illinois and/or other states. While it is possible that DHS/DMH policy has imposed a more stringent interpretation of the guidelines for elements where qualitative judgment is involved, many of these questions are 'present/absent' oriented and should not be affected by qualitative judgment on the part of the audit staff.
  8. Insufficient Data to Determine the Impact of Inter-rater Reliability-Parker Dennison was provided the names of Collaborative staff who were the principal auditors for each provider. There was insufficient data to identify confirmed patterns of scoring between audit staff. Since audit staff have not been exclusively designated to Regions and out of necessity have at times done reviews in locations throughout the state, there also was no discernable correlation between principal auditor and Region. As somewhat of a 'failsafe' measure, appeal levels at present are low, perhaps suggesting that auditor reliability is good. However, as addressed elsewhere in this report, in the absence of a documented and ongoing inter-rater reliability process at the Collaborative, this dimension cannot be ignored.
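
As a cross-check on the figures above, the short Python sketch below re-derives the improvement estimates from the statewide totals and the Table 2 compliance rates, and repeats the full-extrapolation arithmetic cited earlier. It assumes, as the findings do, that deficiencies remain spread at roughly two per unsubstantiated claim and, for the extrapolation line only, that all claims are Medicaid; both are simplifying assumptions for illustration.

    TOTAL_CLAIMS = 3070          # claims audited statewide through 12/31/2008
    UNSUBSTANTIATED = 1695       # unsubstantiated claims
    TOTAL_CITATIONS = 3407       # deficiency citations (a claim can have several)
    COMPLIANCE = {               # Table 2 statewide compliance rates
        "Q2": 0.6687, "Q10": 0.8114, "Q3": 0.9267, "Q5": 0.9283,
        "Q11": 0.9205, "Q12": 0.9371, "Q16": 0.9407,
    }

    per_claim = TOTAL_CITATIONS / UNSUBSTANTIATED   # roughly 2 deficiencies per claim

    def remaining_rate(corrected):
        # Estimated statewide unsubstantiated-claim rate if the listed elements
        # reached full compliance, assuming the deficiencies-per-claim ratio holds.
        citations_removed = sum((1 - COMPLIANCE[q]) * TOTAL_CLAIMS for q in corrected)
        claims_recovered = citations_removed / per_claim
        return (UNSUBSTANTIATED - claims_recovered) / TOTAL_CLAIMS

    print(f"Top two corrected:   {remaining_rate(['Q2', 'Q10']):.0%}")     # about 29%
    print(f"All seven corrected: {remaining_rate(list(COMPLIANCE)):.0%}")  # about 12%

    # Full-extrapolation illustration: 55% of the ~$184M Medicaid fee for service budget
    print(f"Extrapolated exposure: ${0.55 * 184_000_000:,.0f}")            # about $101.2 million

The claim counts implied here (roughly 794 and 1,323) differ slightly from the 798 and 1,330 in the findings because the narrative rounds the deficiencies-per-claim ratio to exactly two; the resulting rates of approximately 29% and 12% match the findings.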

Recommendations

  1. PPR Reports Should be Enhanced and Available to Providers-Existing reports should be enhanced and/or additional reports created and made available to Region staff and providers. The enhancements should include:
    1. Detailed reason codes that identify specific sub-parts of elements that caused deficiencies; and
    2. Summarization by Medicaid and non-Medicaid categories.

It would be helpful to go through these reports the first time on a statewide or Regional webinar to improve the likelihood of a common understanding of their content and utility.

  2. Magnitude of PPR Results Requires Timely Action-Given the analysis contained in this report, by any measure the volume of unsubstantiated claims is a significant problem. Simply stopping PPR or requiring the Collaborative to remove audited elements would not eliminate the problem but rather exacerbate the issue by not correcting an ever-growing OIG audit risk. Therefore, DHS/DMH should work with the Collaborative to develop a specific and timely process improvement plan addressing the most significant of the issues identified in this report or subsequent analysis that may occur. The process improvement plan should be multifaceted and include at least:
    1. Consideration of designating some sub-parts of elements as 'procedural' which would not result in an unsubstantiated claim but would require correction on the part of the provider;
    2. Re-training of providers on PPR using revised training materials and interpretive guidelines;
    3. Competency review for new Collaborative audit staff and an ongoing Inter-rater reliability process for existing staff;
    4. Targeted technical assistance (TA) and training on the few most frequently cited deficiencies;
    5. Written, structured, and monitored performance improvement plans for providers scoring below an established threshold;
    6. Enhanced and timely reporting on PPR so progress may be monitored; and
    7. Clarification of policy by DHS/DMH regarding expectations for providers regarding recoupment, extrapolation, contract impact, and performance improvement requirements.

Summary

Overall, Parker Dennison found the DHS/DMH post payment review process as implemented by the Collaborative to be above average, especially considering reviews have been occurring for only five months. Nonetheless, the assessment identified significant improvements that should be considered in order to move the Illinois DHS/DMH model closer to best practice.

In addition, preliminary results from the completed post payment reviews suggest a very significant unsubstantiated claim problem that requires immediate and thoughtful process improvement. Given that virtually all of the elements audited by the Collaborative can be found cited in OIG audit findings from other states, inattention to improvement would likely place Illinois at compounding risk of adverse OIG determinations at some point in the future.

Given that this is a new and more intensive review process, it is reasonable that there be a DHS/DMH policy and timeline for phasing in recoupment and extrapolation. While few of the requirements are truly "new," since they are referenced to Rule 132, the intensified scrutiny does create a new emphasis to which providers may need some time to adjust.

Footnotes

  1. The following information is summarized from the MICFactSheet508: "Centers for Medicare & Medicaid Services/Medicaid Integrity Group Increasing Medicaid Integrity through Support & Oversight", July 2008.
  2. Includes such services as vocational, outreach, and non-consumer specific education. These services are available to consumers covered by Medicaid as well as those not eligible for Medicaid coverage and are defined as Group C services in the DHS/DMH Reimbursement Guide.
  3. These elements are directly cited from the CMS Report A-05-05-00055, "Review of Medicaid Community Mental Health Provider Services in Illinois," September 29, 2006.
  4. Out of 32 providers, 2 have been sampled materially less than 100 claims.