Policy Innovation Profile

Statewide, All-Payer Financial Incentives Tied to Process, Patient Experience, and Outcomes Measures Lead to Better and Less Variable Hospital Performance



Snapshot

Summary

An adaptation of the Centers for Medicare & Medicaid Services Hospital Value-Based Purchasing Program, Maryland’s Quality-Based Reimbursement Program adjusts individual hospital payment rates each year based on performance on a set of metrics in three areas: adherence to evidence-based care processes in four care domains (heart attack, heart failure, pneumonia, and surgical care), the patient care experience across eight dimensions of care, and overall risk-adjusted mortality rates. Poor-performing hospitals lose up to 0.5 percent of revenues, whereas better performers earn up to a similar amount. The program has generated improvements in process-of-care measures, reduced variations in performance on these measures across hospitals, and served as a catalyst for cross-hospital efforts to boost patient experience scores. It is premature to assess the program’s impact on mortality rates because that component has not yet gone fully into effect.

Evidence Rating

Moderate: The evidence consists of pre- and post-implementation comparisons and comparisons between State and national averages of performance on various process-of-care measures, along with post-implementation activities stimulated by the program to improve hospital performance on patient experience measures.

Developing Organizations

Maryland Health Services Cost Review Commission

Use By Other Organizations

As noted, the CMS Hospital Value-Based Purchasing Program, which went into effect in October 2012, operates very similarly and applies to hospitals in most other states.

Date First Implemented

2009
The first adjustments to payment rates took place in July 2009, based on performance during calendar year 2008.

Problem Addressed

Many hospitalized patients receive suboptimal care and have a less-than-ideal patient experience. For example, many do not receive services that have been proven to produce better outcomes, and many feel that physicians and staff could communicate better and be more responsive. Hospitals’ poor performance stems in part from a lack of financial incentives, as the traditional fee-for-service (FFS) payment system does not reward hospitals that do a good job or penalize those that do not.
  • Suboptimal quality: Many patients receive suboptimal inpatient care. Most hospitals do not routinely follow evidence-based standards that have been shown to improve the quality of care for patients. For example, myocardial infarction patients sometimes do not receive aspirin at admission, daily during their stay, and at discharge. Heart failure patients do not routinely receive an angiotensin-converting enzyme inhibitor or angiotensin II receptor blocker or have an assessment of left ventricular function. Patients with pneumonia may not receive pneumococcal and influenza vaccinations and an antibiotic within 4 to 6 hours of arrival. Surgical patients do not always receive an antibiotic 1 hour before incision or have the drug discontinued 24 hours after surgery. Patients who do not receive these services face an increased risk of hospital-acquired infections, readmissions, and death.1
  • Suboptimal patient experience: Many patients do not have a positive experience in the hospital. For example, before implementation of this program in Maryland, hospitalized patients rated their experience more poorly than patients in many other states on dimensions of care such as cleanliness and quietness, communication, pain management, and responsiveness of hospital staff.
  • Driven in part by misaligned incentives: The traditional FFS payment system used by most public and private payers does not reward hospitals that do a good job in providing high-quality, patient-centered care, or penalize those that do not. To address this problem, the Centers for Medicare & Medicaid Services (CMS) launched the Medicare Hospital Value-Based Purchasing Program in 2012, which provides additional reimbursement to hospitals that perform well on a set of core measures, most of which evaluate hospital adherence to the provision of key evidence-based processes. However, this program does not apply to hospitals in Maryland, an “all-payer” state in which the State government sets payment rates used by all payers to reimburse hospitals for specific inpatient and outpatient services.

What They Did


Pertinent Quality Measures

The program evaluates performance on established process-of-care, patient experience, and risk-adjusted mortality measures; see the Tools and Other Resources section for more details.

Description of the Innovative Activity

An adaptation of the CMS Hospital Value-Based Purchasing Program, Maryland’s Quality-Based Reimbursement (QBR) program adjusts individual hospital payment rates each year based on performance on a set of metrics in three areas: adherence to evidence-based care processes in four care domains (heart attack, heart failure, pneumonia, and surgical care), the patient care experience across eight dimensions of care, and overall risk-adjusted mortality. Worse-performing hospitals lose up to 0.5 percent of revenues, whereas better performers earn up to a similar amount. Key elements of this value-based reimbursement policy are outlined below:
  • Measures and measurement domains: Based largely on the CMS Hospital Value-Based Purchasing Program, Maryland's program measures performance in adhering to evidence-based care processes, the patient experience, and overall risk-adjusted mortality, as outlined below:
    • Evidence-based care processes: The program currently includes 21 of the 22 CMS core measures that evaluate adherence to evidence-based care processes in four care domains (surgery, heart failure, heart attack, pneumonia). (Program leaders decided not to include one CMS measure within the heart attack domain because the measure had no relevance to most hospitals in the state.) Originally, the process measures included in the program did not align closely with the CMS core measures, but over time program leaders have decided to bring the two programs into closer alignment.
    • Patient care experience: In 2012, program leaders decided to begin evaluating the patient care experience, adding measures in the eight dimensions of care evaluated by the Hospital Consumer Assessment of Healthcare Providers and Systems (more commonly referred to as HCAHPS) survey. The eight dimensions include the following: cleanliness and quietness of hospital environment, communication about medicines, communication with doctors and nurses (two separate categories), discharge information, overall rating of the hospital, pain management, and responsiveness of hospital staff.
    • Overall risk-adjusted mortality: Starting in calendar year 2013, the program will evaluate hospitals’ overall risk-adjusted mortality rates, with 2013 serving as the first performance period and rates being adjusted beginning in fiscal year 2015.
  • Calculation and reporting of aggregate performance score: The program calculates an aggregate performance score for each hospital based on an overall clinical process-of-care score (which will account for 25 percent of the aggregate score beginning in 2015), an overall HCAHPS score (37.5 percent), and an overall risk-adjusted mortality rate (also 37.5 percent). (Before full implementation of the mortality measure in fiscal year 2015, the clinical measure accounts for 70 percent of the total score and HCAHPS for 30 percent.) Each year hospitals receive information on the relevant performance benchmarks that will be used to calculate performance within each component of each measurement category. At the end of the year, hospitals receive reports detailing their actual performance versus these benchmarks. More details on the major components of the aggregate performance score are outlined below:
    • Clinical process-of-care score: The clinical process-of-care score is calculated as the percentage of patients who receive the evidence-based care process outlined in each measure, with performance on individual measures being aggregated to arrive at a total “opportunity” score. (Previously, the clinical process-of-care score had a second component—known as an “appropriateness” score—that evaluated the proportion of patients within a care domain receiving “perfect care” on all recommended care processes for their condition. Beginning in fiscal year 2015, this component will no longer factor into the scoring, as program leaders wanted to simplify and better align the program with the CMS program.) To determine the opportunity score, performance for each measure is calculated in two different ways, with the hospital receiving the better of the two scores, as outlined below:
      • Achievement score: This score compares performance on each measure at a point in time with both a benchmark (mean performance among the top 10 percent of hospitals in the state) and a threshold level of performance (the 50th percentile for hospitals in the state). To be consistent with the CMS initiative, program leaders recently decided to exclude “topped-off” measures from this calculation, i.e., those measures on which performance by most hospitals is already quite strong and variations in performance are minimal. (Before this decision, topped-off measures were included, but the benchmarks and thresholds were set at a lower level.) A hospital’s score on a particular measure is determined by how well it performs, with more points awarded for exceeding the benchmark than for exceeding the threshold.
      • Improvement score: This score evaluates the change in a specific hospital’s score since the previous year. Hospitals that improve significantly generally receive a high score in this area. However, if that improvement represents a “bounce back” from poor performance, the hospital does not get credit for improving and the hospital’s achievement score solely determines the clinical process-of-care score. In other words, hospitals that continue to improve year after year receive credit for that improvement. Those that perform well one year, experience a decline the following year, and then bounce back with better performance the year after that do not.
    • HCAHPS (patient experience) score: The HCAHPS or patient experience score is calculated in the same manner that CMS uses, with two separate components, as outlined below:
      • Base performance score: The base performance score, which accounts for 80 percent of the overall HCAHPS score, measures the percentage of “top-box” answers (e.g., “always” or the best-possible score) within each of the eight HCAHPS dimensions of care. Within each dimension, hospitals get credit for the better of an improvement score (based on change in score from the prior baseline period) or an achievement score, which compares performance with the statewide median during the baseline period.
      • Consistency score: The consistency score, which accounts for 20 percent of the overall HCAHPS score, evaluates whether hospitals are meeting achievement thresholds across all eight dimensions of care. Hospitals that meet the achievement thresholds receive the highest possible consistency score.
    • Risk-adjusted mortality rates: This component measures a hospital’s overall risk-adjusted mortality rate and compares it with an expected mortality rate based on the performance of other hospitals with patients of similar severity. Overall mortality rates are aggregated based on specific rates within each case type, using proprietary software developed by the 3M Corporation (3M™ All Patient Refined DRG Grouper Software). As noted, the mortality measure will not begin to have an impact on hospital rate adjustments (see below) until fiscal year 2015, as calendar year 2013 will represent the first performance period.
  • Rate adjustments based on performance: In July of each year, reimbursement rates for individual hospitals are adjusted up or down based on the overall performance score during the most recent calendar year for which data are available. For example, rate adjustments for fiscal year 2014 (which begins July 1, 2013) are based on performance during the 2012 calendar year, with the 2011 calendar year used as the baseline. These adjustments are made in a revenue-neutral manner, with the better performing hospitals receiving net increases in rates funded by reductions in rates for the poorer-performing hospitals. The worst-performing hospital loses up to 0.5 percent of its total inpatient revenue, whereas the best performer receives an equivalent amount. During the 2012 fiscal year, the state reallocated more than $7 million across 46 acute-care hospitals through this program.
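The scoring and payment mechanics described above can be sketched in code. This is an illustrative sketch, not the Commission's actual methodology: the function names, the 10-point scale, the linear interpolation between threshold and benchmark, and the rescaling step are assumptions made for illustration; only the component weights (25/37.5/37.5 once mortality is live, 70/30 before; 80/20 within HCAHPS) and the plus-or-minus 0.5 percent revenue cap come from the text.

```python
# Illustrative sketch of the QBR scoring logic described above.
# Only the component weights and the +/-0.5% revenue cap come from the
# profile; point scales, interpolation, and names are assumptions.

def measure_points(rate, threshold, benchmark, max_points=10):
    """Points for one measure: zero below the threshold, the maximum at or
    above the benchmark, linear in between (a common VBP convention)."""
    if rate >= benchmark:
        return float(max_points)
    if rate < threshold:
        return 0.0
    return max_points * (rate - threshold) / (benchmark - threshold)

def clinical_measure_score(current_rate, prior_rate, threshold, benchmark):
    """A hospital receives the better of its achievement score (vs. the
    statewide threshold/benchmark) and its improvement score (vs. its own
    prior-year rate). The 'bounce back' exclusion is omitted here."""
    achievement = measure_points(current_rate, threshold, benchmark)
    improvement = measure_points(current_rate, prior_rate, benchmark)
    return max(achievement, improvement)

def hcahps_score(base_performance, consistency):
    """Overall HCAHPS score: 80% base performance, 20% consistency."""
    return 0.8 * base_performance + 0.2 * consistency

def aggregate_score(clinical, hcahps, mortality=None):
    """Aggregate score: 25/37.5/37.5 weights once the mortality measure
    is live; 70/30 (clinical/HCAHPS) before that."""
    if mortality is None:
        return 0.70 * clinical + 0.30 * hcahps
    return 0.25 * clinical + 0.375 * hcahps + 0.375 * mortality

def revenue_neutral_adjustments(scores, revenues, max_pct=0.005):
    """Map aggregate scores onto rate adjustments spanning -0.5% to +0.5%
    of revenue, then shift so the net dollar change across hospitals is
    zero (one plausible way to implement revenue neutrality)."""
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {h: 0.0 for h in scores}
    adj = {h: max_pct * (2.0 * (s - lo) / (hi - lo) - 1.0)
           for h, s in scores.items()}
    total = sum(revenues.values())
    net = sum(adj[h] * revenues[h] for h in adj) / total
    return {h: a - net for h, a in adj.items()}
```

For example, a hospital at 85 percent on a measure with an 88 percent threshold earns no achievement points but can still score through improvement over its own prior-year rate of, say, 60 percent, which reflects the "better of the two scores" rule described above.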

Context of the Innovation

Through the Maryland Health Services Cost Review Commission, the state of Maryland operates a prospective payment system that establishes specific payment rates for all inpatient and outpatient services provided by acute-care hospitals in Maryland. These rates apply to all public and private payers, as the state has a Federal waiver that exempts it from the national fee schedules established by Medicare and Medicaid (including any pay-for-performance systems). Because it sets rates for all payers, the Commission has the ability to create consistent payment incentives for hospitals across Maryland, something that few other State governments can do because payment systems and financial incentives typically vary across public and private payers. For a number of years, the Commission has been experimenting with new payment systems to promote higher value care, in some cases acting in advance of and in other cases in tandem with evolving payment programs established by CMS. Historically, most of these efforts focused on controlling the growth of hospital costs, something that Maryland has been successful in doing.3

Recognizing that little attention had been paid to the potential for innovative payment systems to promote higher quality care, Commission leaders decided in the mid-2000s to begin exploring the development of pay-for-performance programs to encourage improvements in this area. Around the same time, CMS leaders were planning two similar types of initiatives for the Medicare program—the Medicare Hospital Value-Based Purchasing Program (which required congressional approval and did not get implemented until the 2012 Federal fiscal year) and a nonpayment policy for hospital-acquired conditions. The QBR Program was modeled after the former program, with refinements made to address Maryland-specific issues and concerns. A few years after the launch of QBR, Commission leaders decided to follow CMS by creating financial incentives to reduce hospital-acquired conditions. To that end, they created the Maryland Hospital-Acquired Conditions Program (MHAC), a "sister" program to QBR that rewards and penalizes hospitals based on their success in preventing common, costly hospital-acquired conditions. This program is featured in another profile, available at: http://www.innovations.ahrq.gov/content.aspx?id=3854.

Did It Work?


Results

The QBR program has generated improvements in performance on all process-of-care measures, reduced variations in performance on these measures across hospitals, and served as a catalyst for cross-hospital efforts to boost HCAHPS scores. It is premature to assess the program’s impact on mortality rates because that component has not yet fully gone into effect.
  • Better performance on process measures: Between 2007 and 2010, performance on all process measures improved, with the average improvement being 7.31 percentage points, slightly more than the 6.86-percentage-point increase that occurred among hospitals across the entire country. The degree of improvement in Maryland exceeded that for the country as a whole for nearly half of the measures, although the difference was statistically significant for only three of them. The biggest difference occurred for the provision of influenza vaccines to pneumonia patients, for which performance improved by an average of 20.5 percentage points in Maryland (from 71.5 to 92.0 percent), well above the 15.1-percentage-point increase in the rest of the Nation (from 78.7 to 93.8 percent). The degree of performance improvement in Maryland lagged that of the rest of the Nation for only two measures: providing smoking cessation counseling to heart failure patients and giving surgery patients an antibiotic at the right time.2
  • Trend toward less variation in performance: As the improvement described above occurred, the difference between the best- and worst-performing hospitals also narrowed, suggesting less variation in performance across facilities.
  • Catalyst for collaborative efforts to boost HCAHPS scores: Thus far, only one set of rate adjustments based on HCAHPS performance has been made, and data do not yet exist to determine if those adjustments have affected performance. However, the baseline data showed that Maryland hospitals significantly lagged their counterparts elsewhere in the Nation, suggesting ample opportunity for improvement. In response, the Maryland Hospital Association launched an HCAHPS learning collaborative to promote quality improvement. The vast majority of hospitals (37 out of 46) quickly agreed to join this voluntary effort, suggesting that the QBR program’s combination of performance measurement and financial incentives has created a strong motivation for hospitals to improve.
  • Premature to assess impact on mortality: The mortality component of the program will not go into effect until fiscal year 2015, so the program has not yet had a chance to influence inpatient mortality rates.

Evidence Rating

Moderate: The evidence consists of pre- and post-implementation comparisons and comparisons between State and national averages of performance on various process-of-care measures, along with post-implementation activities stimulated by the program to improve hospital performance on patient experience measures.

How They Did It


Planning and Development Process

Various committees and workgroups took charge of the planning and development of the QBR program, as outlined below:
  • Multistakeholder workgroup to initiate program: Formed in 2005, an initiation workgroup comprising representatives from the hospital, payer, policy, public health, and quality measurement and improvement communities (including those at the Federal and local level) took charge of developing the program. This group worked over a period of several years to create the initial version of the program, including the measures to be used and the scoring and payment adjustment methodologies. This group disbanded in 2008.
  • Evaluation workgroup to evaluate and refine program: An evaluation workgroup formed in 2008 (the initial baseline year for the QBR program). Made up of experts in research and evaluation along with many of the same representatives from the hospital, payer, policy, and public health communities, this group took charge of evaluating how the program was working and made recommendations for needed changes and additions. This group has also disbanded.
  • QBR/MHAC workgroups to update and align programs: In 2011, a joint QBR/MHAC update workgroup formed. Made up of industry representatives with expertise in quality, this group has taken charge of comparing and aligning the QBR program with the CMS Hospital Value-Based Purchasing Program. A separate joint payment workgroup made up of representatives from the payer and hospital communities who have expertise in finance and payment also formed, taking charge of updating payment-related components of the QBR and MHAC programs, including the appropriate magnitude of revenues at stake and scaling and payment methodologies.

Resources Used and Skills Needed

  • Staffing: Four Commission employees work on the QBR program on a part-time basis, including two associate directors and two analysts. One associate director has a nursing background and expertise in quality and policy issues. The second has a background in clinical quality and analytics; this individual produces the targets, benchmarks, thresholds, and tools to assist hospitals in measuring and improving performance. Overall, the time spent by these four individuals on QBR is roughly equivalent to one full-time employee. At the hospital level, program-related requirements are generally folded into the responsibilities of existing staff. Because Maryland is an all-payer state, hospitals may have a lower overall quality reporting burden than in other states in which multiple payers have incentive programs, each with different requirements.
  • Costs: The program entailed modest upfront costs for the Health Services Cost Review Commission, including purchasing software and conducting staff education and training. On an ongoing basis, program-related costs consist of a share of the compensation costs (salaries and benefits) for the four employees who work on the program. The performance data used in the program must already be reported to the state, to CMS, and to the Joint Commission, and are publicly reported; consequently, data collection does not add to program costs.


Funding Sources

Maryland Health Services Cost Review Commission
The Maryland Health Services Cost Review Commission pays for program-related costs out of its internal operating budget; as noted, the payment adjustment methodology is designed to be revenue neutral.

Tools and Other Resources

More information on the Maryland program is available at: http://hscrc.maryland.gov/init_qi_qbr.cfm.

More information on Medicare’s Hospital Value-Based Purchasing Program is available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html?redirect=/Hospital-Value-Based-Purchasing/.

More information on scoring methodologies used for the HCAHPS component of the Medicare Hospital Value-Based Purchasing Program is available at: http://www.hcahpsonline.org/Executive_Insight/.

More information on the 3M software used as part of this program is available at http://solutions.3m.com/wps/portal/3M/en_US/Health-Information-Systems/HIS/Products-and-Services/Products-List-A-Z/APR-DRG-Software/.

Adoption Considerations


Getting Started with This Innovation

  • Involve key stakeholders in planning and vetting the program: Although it took a long time to put the program in place, the iterative, multistakeholder process proved necessary and worthwhile, allowing key players (e.g., payers, hospitals) to raise their concerns and have them addressed. This process resulted in several modifications to the initial proposal, such as the decision not to use one measure related to heart attack care because it was not relevant to the majority of hospitals.
  • Model and share impact of payment scenarios: Early in the process, the initiation workgroup developed and shared various realistic payment scenarios with key stakeholder groups so that they could see the potential impact of the program.
  • Consider merits of aligning with national initiatives: The QBR program began as a fairly complex initiative that did not align that closely with the CMS Hospital Value-Based Purchasing Program. This lack of alignment was deliberate, as QBR program leaders wanted to develop a program that would resonate and have an impact locally. Over time, however, this complexity became difficult for participants to understand, particularly as the larger CMS program matured. As a result, QBR leaders decided to simplify the program and align it much more closely with the CMS initiative. That being said, not all components are the same—for example, QBR is following CMS's lead by adding mortality to the performance score, but it is not using the same mortality measures as CMS.
  • Ensure appropriate mix of measure types: The QBR program initially included only the evidence-based process measures, because at that time these were the only types of measures that had been widely tested and accepted in the research and provider communities. Moreover, these measures did not require use of sophisticated and sometimes controversial risk-adjustment methodologies. As the science of measurement has improved and other types of measures have become accepted, the QBR program has added outcomes and patient experience measures. Any organization considering adoption of this type of program today should consider starting with the mix of measurement types that QBR now has in place, including process measures (which offer actionable information on how to improve), patient experience measures, and outcomes measures.
  • Start with modest rewards and penalties: Winning the support of key stakeholders may be easier if the potential rewards and penalties start out small, thus giving an opportunity for any issues or challenges to be addressed before the program begins to have a major financial impact on hospitals.

Sustaining This Innovation

  • Consider increasing rewards and penalties over time: To date, the QBR program has not increased the potential money at stake, leaving the maximum reward or penalty at 0.5 percent of revenues. Program leaders have, however, increased the incentive pool in MHAC (QBR’s “sister” program that addresses hospital-acquired conditions), and consideration may be given to raising the money at stake in QBR as well.
  • Provide benchmarks and thresholds as early as possible: Hospital leaders need to understand the performance targets they are being evaluated against as early as possible. In the past, these targets have sometimes not been available until after the performance year started. This issue is being addressed so that the targets can be known ahead of time.
  • Provide actionable tools: Hospitals need actionable reports, tools, and other resources to help them understand and improve performance, including detailed information that pinpoints specific process-of-care and patient experience domains or measures where performance may be lagging. At present, data-timing issues have prevented the QBR program from providing hospitals with their performance scores more than once a year. However, the goal is to provide such information more regularly—ideally on a quarterly basis (as is done with the MHAC program).

Use By Other Organizations

As noted, the CMS Hospital Value-Based Purchasing Program, which went into effect in October 2012, operates very similarly and applies to hospitals in most other states.

More Information


Contact the Innovator

Dianne Feeney
Associate Director, Quality Initiatives
State of Maryland Department of Health and Mental Hygiene
Health Services Cost Review Commission
4160 Patterson Avenue
Baltimore, MD 21215-2299
(410) 764-2582
E-mail: dianne.feeney@maryland.gov

Sule Calikoglu, PhD

Associate Director of Performance Measurement
State of Maryland Department of Health and Mental Hygiene
Health Services Cost Review Commission
4160 Patterson Avenue
Baltimore, MD 21215
(410) 764-2522
E-mail: sule.calikoglu@maryland.gov

Innovator Disclosures

Ms. Feeney and Dr. Calikoglu reported having no financial interests or business/professional affiliations relevant to the work described in the profile.

References/Related Articles

Calikoglu S, Murray R, Feeney D. Hospital pay-for-performance programs in Maryland produced strong results, including reduced hospital-acquired conditions. Health Aff. 2012;31(12):2649-58. [PubMed]

More detailed information on this program is available at: http://www.hscrc.state.md.us/init_qi_qbr.cfm.

Footnotes

1 Centers for Medicare & Medicaid Services. Quality Measures Compendium V.2.0: Medicaid and SCHIP Quality Improvement Compiled by the Division of Quality Evaluation and Health Outcomes, Family and Children's Health Programs Group. 2007.
2 Calikoglu S, Murray R, Feeney D. Hospital pay-for-performance programs in Maryland produced strong results, including reduced hospital-acquired conditions. Health Aff. 2012;31(12):2649-58. [PubMed]
3 Murray R. Setting hospital rates to control costs and boost quality: the Maryland experience. Health Aff. 2009;28(5):1395-405. [PubMed]

Disclaimer: The inclusion of an innovation in the Innovations Exchange does not constitute or imply an endorsement by the U.S. Department of Health and Human Services, the Agency for Healthcare Research and Quality, or Westat of the innovation or of the submitter or developer of the innovation. Read more.

Original publication: July 03, 2013.
Original publication indicates the date the profile was first posted to the Innovations Exchange.

Last updated: July 03, 2013.
Last updated indicates the date the most recent changes to the profile were posted to the Innovations Exchange.