Service Delivery Innovation Profile

Unobtrusive, Actionable Reminders and Performance Feedback Improve Physician Performance on Standardized Quality Measures



Snapshot

Summary

Through its commercial electronic medical record system, a large internal medicine practice provides physicians with unobtrusive reminders related to 16 measures for heart disease, heart failure, diabetes, and preventive services. The practice also makes it easy to address the alerts by ordering appropriate tests or treatment, or by documenting legitimate exceptions. Physicians receive quarterly feedback on individual performance that incorporates documented exceptions, along with monthly lists of individual patients who appear not to be receiving an essential medication. The program improved performance on the vast majority of the 16 targeted measures by increasing provision of recommended care and/or improving documentation of exceptions.

Evidence Rating

Moderate: The evidence consists of comparisons of the rate of performance improvement on 16 standardized measures during the 12-month period before implementation with the rate during the 12-month period after implementation. It also includes comparisons of the percentage of physicians achieving “perfect” (100 percent) or “near-perfect” (above 90 percent) performance on drug-prescribing measures at baseline and a year after program implementation.

Developing Organizations

Northwestern Medical Faculty Foundation; Northwestern University

Use By Other Organizations

NorthShore University HealthSystem in Evanston, IL, and the University of Texas Southwestern Medical Center in Dallas, TX, have implemented a similar approach.

Date First Implemented

February 2008

Patient Population

The program served patients in a large internal medicine practice who qualified for one of the 16 measures. Roughly three-fourths of eligible patients were female, while nearly half were white, one-fourth African American, and 16 percent Hispanic. The vast majority had insurance, typically private coverage (67 percent) or Medicare (26.8 percent).

Problem Addressed

Point-of-care alerts and reminders generated through electronic medical records (EMRs) have met with mixed success in increasing the provision of recommended care. Many physicians become frustrated by and ultimately ignore alerts that seem intrusive and disruptive, fail to incorporate legitimate reasons for not providing the recommended service, and/or make it difficult to address the gap or document a reason for not doing so. In addition, point-of-care systems fail to address care gaps in patients who do not regularly seek care. Additional details on these problems appear below:
  • Disruptive to workflow: Point-of-care reminders often cover only one or a small number of processes recommended in quality measures, thus making their appearance relatively rare and hence disruptive to the physician. In addition, many alerts appear as “popup” screens on the EMR, thus interfering with the doctor’s normal work. Before this program began, the internal medicine practice implementing it tried a few popup reminders that physicians generally disliked and routinely ignored.
  • Inaccurate due to failure to incorporate exceptions: Physicians often ignore alerts because they think that the patient does not qualify for the recommended care. Many legitimate clinical reasons exist for not giving a patient such care (e.g., allergies, patient refusal, terminal illness), and many systems do not integrate information on these legitimate exceptions, thus leading to many “false positives.” Over time, physicians begin to ignore all alerts, assuming them to be inaccurate.1
  • Difficult to address: Even when physicians want to pay attention to alerts, doing so often proves difficult. Systems typically require multiple (sometimes not well-explained) steps for the doctor to order the needed test or treatment. If multiple care gaps exist, doctors may need to address them individually. Finally, many systems, including the one previously used by the practice implementing this program, do not make it easy (or in some cases even possible) for physicians to document a legitimate reason for not providing the service.1
  • Failure to address those who do not come in for care: Point-of-care reminders offer no help in addressing care gaps for patients who miss appointments or who have dropped out of care.1

What They Did


Pertinent Quality Measures

The program targets the 16 quality measures outlined below, most of which are based on national quality measures. The program developer has modified the measure criteria in some instances to make the measures as relevant and actionable as possible to physicians and feasible within the EMR. New measures have been added as the program has matured.

For chronic stable coronary artery disease:

For heart failure:

  • Percentage of adult patients with a current diagnosis of heart failure due to left ventricular dysfunction (LVD) who are currently treated with angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB), who can tolerate therapy and for whom there is no contraindication. (Based on: http://www.qualitymeasures.ahrq.gov/content.aspx?id=34467)
  • Percentage of patients aged 18 years and older with a diagnosis of heart failure with a current or prior left ventricular ejection fraction (LVEF) less than 40% who were prescribed beta-blocker therapy either within a 12 month period when seen in the outpatient setting or at hospital discharge. (Based on: http://www.qualitymeasures.ahrq.gov/content.aspx?id=34935)

For diabetes mellitus:

  • Percentage of adult patients with diabetes mellitus (type 1 and type 2) whose most recent hemoglobin A1c (HbA1c) level is less than 8.0% (controlled). (Based on: http://www.qualitymeasures.ahrq.gov/content.aspx?id=38873)
  • Percent of male patients age 18 and older and female patients age 50 and older whose most recent fasting low-density lipoprotein (LDL) was less than 100 mg/dL (in the last 12 months).
  • Percent of patients 40 years and older who have a current prescription for aspirin or other antithrombotic agent. The practice has since retired this measure due to changing clinical evidence.
  • Percent of eligible patients with a diagnosis of diabetes mellitus having a nephropathy screening test during the past year or who were prescribed angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) therapy.

Preventive services:

  • Breast cancer screening: Percent of women age 50 to 69 screened with mammography in the past 2 years for breast cancer.
  • Cervical cancer screening: Percent of women age 21 to 64 screened for cervical cancer in the past 3 years.
  • Colorectal cancer screening: Percentage of patients age 50 to 75 who meet criteria for colorectal cancer screening who are up-to-date with screening.
  • Pneumonia vaccination status: Percentage of adults age 65 and older who have ever received a pneumococcal vaccination.
  • Osteoporosis: Percentage of female patients aged 65 years and older who have a central DXA measurement ordered or performed at least once since age 60 or pharmacological therapy prescribed within 12 months. (Based on: http://www.qualitymeasures.ahrq.gov/content.aspx?id=10333&search=osteoporosis+screening+or+therap)
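
The profile does not describe how these measures are represented inside the commercial EMR. As a purely illustrative aid, the short Python sketch below shows one way such a measure could be encoded declaratively; the Patient model, field names, and example criteria are assumptions for illustration, not details of the practice's actual system.

from dataclasses import dataclass, field
from typing import Callable, List, Set

@dataclass
class Patient:
    # Minimal, hypothetical patient facts needed to evaluate a measure.
    age: int
    sex: str
    diagnoses: Set[str]        # e.g., {"diabetes_type_2"}
    vaccinations: Set[str]     # e.g., {"pneumococcal"}
    exceptions: Set[str]       # documented exceptions, e.g., {"pneumococcal:patient_refused"}

@dataclass
class QualityMeasure:
    measure_id: str
    description: str
    eligible: Callable[[Patient], bool]    # denominator: who the measure applies to
    satisfied: Callable[[Patient], bool]   # numerator: recommended care documented
    exception_keys: List[str] = field(default_factory=list)

# Example encoding of the pneumococcal vaccination measure listed above.
pneumococcal_vaccination = QualityMeasure(
    measure_id="PNEUMOCOCCAL_VACCINATION",
    description="Adults age 65 and older who have ever received a pneumococcal vaccination",
    eligible=lambda p: p.age >= 65,
    satisfied=lambda p: "pneumococcal" in p.vaccinations,
    exception_keys=["pneumococcal:patient_refused", "pneumococcal:medical_reason"],
)

Keeping the denominator, numerator, and allowed exceptions together in a single definition is one way a rule like this could drive both the point-of-care alerts and the exception-aware performance reports described in the next section.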

Description of the Innovative Activity

Through its commercial EMR, a large internal medicine practice provides physicians with unobtrusive reminders related to 16 standardized measures and makes it easy for them to address the alerts by ordering appropriate tests or treatment or documenting legitimate exceptions. It also provides physicians with accurate quarterly feedback on individual performance that incorporates exceptions, along with monthly lists of individual patients with important medication care gaps. Key program elements are outlined below:
  • Single set of unobtrusive reminders: A single tab on the side of the EMR screen highlights in yellow any of 16 situations where a recommended process of care has not been provided—or desired outcome achieved—and a legitimate exception not documented. (The tab does not appear if no care gaps exist.) The screen does not interfere with the physician’s ability to navigate the medical record, nor does it “pop up” during the visit. Rather, the tab remains visible off to the side throughout the encounter, allowing the physician to view it when he or she desires. The 16 measures (listed previously in the Pertinent Quality Measures section) cover preventive care and care of those with heart disease, heart failure, and diabetes. The practice continues to add new high-priority measures to this system. (A minimal code sketch of this gap-and-exception logic appears after this list.)
  • Support in addressing reminders: The physician uses the system to address identified care gaps at any time during the visit, either by ordering the needed test or treatment or by documenting a legitimate reason for not doing so, as outlined below:
    • Ordering needed test or treatment: As noted, all identified care gaps appear in a single place on the screen. By clicking on the alert, physicians can easily navigate to the relevant portion of the medical record, such as ordering the test or medication or recording receipt of the test or treatment by an outside provider. The process works in one of two ways, as outlined below:
      • Order sets for clear-cut situations: Order sets exist for situations where the likely course of action seems clear, such as patients in need of a routine screening test for colorectal cancer or laboratory test for diabetes. The system automatically checks off the needed test and relevant diagnostic code on the order set. The physician need only review the checked boxes and click a button to confirm.
      • Physician-directed order for other situations: Order sets generally become too complicated in situations where many options exist—such as choosing from the large number of available drugs for heart disease. In these situations, the physician—who generally has a preferred drug in mind—types in the first few letters of the name to identify the desired drug and reveal a list of potential dosing options.
    • Documenting exceptions: The system provides standardized, easy-to-use methods to document a wide variety of legitimate reasons for not providing the service or treatment in question. Examples include patient refusals, “global” exceptions (such as the patient being terminally ill), and other medical reasons. Once exceptions have been documented, the system generally uses them in determining the existence of care gaps at future visits, effectively turning off the alert. In some situations, however, the system may not turn off the alert. For example, physicians generally continue to receive alerts for patients who refuse to accept a needed service, with the hope that they can convince the patient to accept it. The alert can be turned off if it becomes clear that the patient will not accept the care and does not want to be “bothered” anymore. Exceptions appear in a common place in the EMR.
  • Accurate quarterly performance reports: Physicians receive a quarterly report summarizing their performance on each measure, as compared with the practice average and their own prior performance. The report provides the number of eligible patients within each measure, along with the number and percent of patients in three categories—those receiving recommended care, those with legitimate reasons documented for not receiving such care, and those in need of such care (i.e., legitimate gaps). Before implementation of this program, physicians received similar reports, but those reports did not incorporate exceptions, making the data inaccurate (and hence easy to ignore).
  • Monthly lists of those with medication gaps: Each month, physicians receive a list of all their patients with an identified medication gap, many of whom have not been seen in the clinic recently. The lists serve as a reminder to reach out to these patients to get them in for care, or to proactively prescribe a medication for them. The list does not include patients with other types of care gaps (e.g., cancer screening); program leaders feared that a longer list including those gaps could prove overwhelming to physicians, who do not have time to reach hundreds of individuals each month.
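
As referenced in the first bullet above, the following sketch restates the alert, exception, and reporting logic described in this section as Python code. It is a minimal illustration written for this profile, not the practice's actual implementation; the status model, category names, and helper functions (for example, classify and quarterly_report) are hypothetical.

from collections import Counter
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class MeasureStatus:
    # One eligible patient on one measure, within one physician's panel (hypothetical model).
    patient_id: str
    measure_id: str
    measure_type: str              # e.g., "medication", "screening", "laboratory"
    care_documented: bool          # recommended test/treatment ordered or recorded
    exception_documented: bool     # legitimate reason recorded for not providing care
    exception_suppresses_alert: bool = True   # refusals may be kept visible at the physician's request

def classify(s: MeasureStatus) -> str:
    # The three mutually exclusive categories used in the quarterly report.
    if s.care_documented:
        return "received_care"
    if s.exception_documented:
        return "exception_documented"
    return "care_gap"

def alert_tab(statuses: List[MeasureStatus]) -> List[str]:
    # Measures highlighted on the side tab: open gaps, plus any documented
    # exception (such as a refusal) that the physician chose to keep visible.
    items = []
    for s in statuses:
        category = classify(s)
        if category == "care_gap" or (category == "exception_documented"
                                      and not s.exception_suppresses_alert):
            items.append(s.measure_id)
    return items   # an empty list means the tab is not displayed

def quarterly_report(panel: List[MeasureStatus]) -> Dict[str, Dict[str, float]]:
    # Per-measure eligible count and percentage in each of the three categories.
    counts: Dict[str, Counter] = {}
    for s in panel:
        counts.setdefault(s.measure_id, Counter())[classify(s)] += 1
    report: Dict[str, Dict[str, float]] = {}
    for measure_id, c in counts.items():
        eligible = sum(c.values())
        report[measure_id] = {"eligible": eligible}
        for category in ("received_care", "exception_documented", "care_gap"):
            report[measure_id][category] = round(100.0 * c[category] / eligible, 1)
    return report

def monthly_medication_gap_list(panel: List[MeasureStatus]) -> List[str]:
    # Outreach list limited to open medication gaps, to keep it actionable.
    return sorted({s.patient_id for s in panel
                   if s.measure_type == "medication" and classify(s) == "care_gap"})

In this sketch, a physician's panel of MeasureStatus records would be passed to alert_tab at the point of care and to quarterly_report and monthly_medication_gap_list on the reporting schedule.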

Context of the Innovation

The program took place in a 39-physician internal medicine practice that is part of the Northwestern Medical Faculty Foundation. The practice had been using an EMR for roughly 10 years. Before implementation of the program, the practice worked with the American Medical Association to test the validity of roughly a dozen quality measures. This testing found that an EMR in a high-performing practice may miss many legitimate exceptions, thus making the alerts and performance feedback systems prone to inaccuracies. In fact, physicians in the practice had been receiving quarterly performance reports for approximately 2 years, and receiving embedded “popup” alerts related to a few of the measures. However, physicians generally paid no attention to these alerts and some may not have used the reports due to these inaccuracies and other problems outlined earlier. As a result, program leaders decided to revamp the system to make it more accurate and actionable and less disruptive for physicians.

Did It Work?


Results

The program improved performance on the vast majority of the 16 targeted measures by increasing provision of recommended care and/or improving documentation of exceptions.
  • Improved performance across most measures: During the first 12 months after implementation, performance improved by a statistically significant amount in 14 of the 16 measures. By contrast, during the 12-month period before implementation, statistically significant improvements occurred in only 8 of 15 measures (data for cervical cancer screening was not available for the year before implementation).1 Two factors drove the better performance after implementation, as outlined below:
    • Enhanced provision of services: Part of the improvement resulted from more patients receiving recommended processes or achieving desired outcomes, with statistically significant increases in 9 of the 16 measures in the year after implementation.
    • Better documentation of exceptions: Physicians increased their documentation of legitimate exceptions, with statistically significant increases in documentation in all measures, ranging from 0.5 percent (for breast and colorectal cancer screening) to 16.6 percent (for use of anticoagulation in heart failure and atrial fibrillation).
  • Significant improvements in drug prescribing: The program has improved the prescribing of drugs to those with care gaps, as outlined below:
    • More physicians meeting targets: Program leaders initially set an aggressive goal of having each physician prescribe recommended drugs or document a legitimate reason for not doing so for all eligible patients. The proportion of physicians meeting this 100-percent goal increased in the year after implementation across all measures (though not all of these increases were statistically significant). Program leaders later realized that the 100-percent goal was likely too ambitious, because they wanted to keep alerts active for some patients who had refused a needed medication, and any active alert counts as a care gap, making perfect adherence virtually impossible. Consequently, they performed a subsequent analysis using a more realistic goal (90-percent performance), which showed significant improvements in virtually all drug-prescribing measures in the first year after implementation. (A small sketch of this threshold calculation appears after this list.)
    • More proactive prescribing: Anecdotal evidence suggests that physicians act on the monthly list of patients with medication gaps, as the practice has experienced increases in prescriptions for aspirin and cardiovascular drugs, not just improved documentation.
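
As noted in the bullet above, the threshold analysis is straightforward arithmetic. The sketch below is an assumed restatement of that calculation, not the study's analysis code; the data layout and function name are hypothetical, and the example uses made-up numbers.

from typing import List, Tuple

# (physician_id, eligible_patients, patients_with_drug_prescribed_or_exception_documented)
PanelCounts = Tuple[str, int, int]

def percent_of_physicians_meeting_goal(panels: List[PanelCounts], threshold: float) -> float:
    # Share of physicians whose per-panel performance meets or exceeds the threshold
    # (e.g., 1.0 for the "perfect" goal, 0.9 for the "near-perfect" goal).
    meeting = sum(1 for _, eligible, addressed in panels
                  if eligible > 0 and addressed / eligible >= threshold)
    return 100.0 * meeting / len(panels) if panels else 0.0

# Example with made-up numbers: three physicians on one drug-prescribing measure.
example_panels = [("dr_a", 40, 40), ("dr_b", 55, 51), ("dr_c", 30, 26)]
print(percent_of_physicians_meeting_goal(example_panels, 1.0))   # 33.3... (only dr_a is perfect)
print(percent_of_physicians_meeting_goal(example_panels, 0.9))   # 66.6... (dr_a and dr_b)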

Evidence Rating

Moderate: The evidence consists of comparisons of the rate of performance improvement on 16 standardized measures during the 12-month period before implementation with the rate during the 12-month period after implementation. It also includes comparisons of the percentage of physicians achieving “perfect” (100 percent) or “near-perfect” (above 90 percent) performance on drug-prescribing measures at baseline and a year after program implementation.

How They Did It


Planning and Development Process

Key steps included the following:
  • Producing “thought” piece: Several physician leaders collaborated to produce a conceptual framework for the ideal system. The framework called for creation of a “virtuous cycle” in which the system captured legitimate exceptions at the point of care, thus allowing it—over time—to produce highly accurate alerts and performance reports that doctors trusted and therefore acted on by addressing care gaps.
  • Securing grant funding: As the leaders conceptualized this framework, the Agency for Healthcare Research and Quality (AHRQ) announced the availability of grant funding to support design, implementation, and testing of decision support systems. Northwestern applied for and received a grant to support the program's development.
  • Iterative design and testing with physicians: System design and testing proceeded in an iterative fashion, with designers developing mockups based on clinician input. Physicians tested those designs and provided feedback on needed changes in monthly 1-hour meetings over lunch. This highly iterative process led to several important design features, including steps to make the alerts fit into the flow of the clinical encounter and the selective use of prepopulated order sets where appropriate (as described earlier).
  • Training physicians on system: Clinicians not involved in system design became familiar with the program through a 1-hour meeting and electronic training materials, both of which emphasized the benefits of the new, less intrusive approach and the need to use the revamped decision support tools to provide recommended care or document legitimate reasons for not doing so. The training made clear that compensation would not be affected by performance, but that peers would review all documented medical exceptions.

Resources Used and Skills Needed

  • Staffing: Staff required for upfront design, testing, and implementation varies depending on the current state of an organization’s EMR system, including the ease with which it integrates decision support. For Northwestern, an experienced front-end programmer spent roughly one-fourth of his time over a 6-month period developing the order sets and related tools. On the back end, another programmer spent roughly half his time for a year designing systems to collect, analyze, and report on performance for the various quality measures. As noted, physicians also provided regular feedback throughout the development process. Because national groups are now developing greater specificity related to standardized quality measures, the upfront time required may be less for a new adopter of this program. Once the program is up and running, the amount of required staff time drops off considerably; the process may eventually be completely automated and hence require no meaningful staff time.
  • Costs: Program-related expenses consist primarily of the programming time outlined above. A would-be adopter would likely spend $100,000 to $125,000 to cover these costs, plus the associated costs of the time spent by physicians providing input. Northwestern spent $1.2 million on this program, but much of this effort involved having physician researchers engage in one-time activities related to measure specifications, groupings, and other activities that a would-be adopter need not do.

Funding Sources

Agency for Healthcare Research and Quality
Financial support for system design, testing, and implementation came from an AHRQ grant (#1R18HS17163–01) and career development award (1K08HS015647–01).

Adoption Considerations


Getting Started with This Innovation

  • Seek partners (particularly smaller practices): Small practices likely need financial and/or technical assistance in implementing this type of system. Hospitals and health systems may want to play this role as a way of improving relations with physicians and stimulating quality improvement, particularly given recent loosening of Federal legislation to allow hospitals to provide such support. The Federal government has also set up Regional Extension Centers to assist physician practices in meeting the Office of the National Coordinator’s “meaningful use” criteria related to EMRs and health information technology.
  • Offer “one-stop” shop for alerts: The program works best if it provides information on all missing care elements in one place, rather than forcing doctors to see or look for them on a piecemeal basis.
  • Avoid obtrusive popups: As noted, alerts consisted of an unobtrusive bar on the side of the EMR screen that did not disrupt the physician. This passive approach runs counter to conventional wisdom that physicians will only pay attention to popup alerts.
  • Incorporate exception documentation: Physicians will not pay attention to reminders or performance data if they believe the information is inaccurate. Systems that can easily capture exceptions will, over time, lead to fewer false positives. This approach also makes it easy for physicians to focus on those patients who truly have care gaps.
  • Make it easy to do right thing: The system should make it as easy as possible for physicians to address care gaps by ordering/providing needed services and/or documenting legitimate exceptions.
  • Exclude unnecessary information: The program provides only relevant, actionable information to physicians. For example, alerts appear only for care gaps; the system does not inform the physician about all the appropriate care given to the patient. Program leaders used the same philosophy in designing the monthly patient lists. They decided to focus only on medications, both to make the list manageable in size and to give physicians information that they alone need to act on (because only physicians can prescribe medications or document a legitimate reason for not doing so). By contrast, most other care gaps (e.g., patients in need of screening tests) can be addressed more effectively by having other staff reach out to patients due for services.

Sustaining This Innovation

  • Provide actionable information on ongoing basis: Regular performance reports and monthly lists of individual patients with care gaps may spur physicians into action. Overall, the high levels of performance achieved by the practice during the initial study have been sustained.
  • Consider benchmarking versus top performers: As noted earlier, Northwestern provides physicians with information on their performance as compared with the practice average. However, offering comparisons with the best performers (e.g., the top 10 percent) may create more motivation to improve.
  • Consider financial incentives: Despite the initial success, a handful of physicians still do not fully buy into the program. Financial incentives could be considered as a way to get these remaining physicians completely on board.
  • Share success stories: Physicians often realize over time that the program has made their lives easier and improved their productivity. Sharing these “success stories” can build momentum and enthusiasm with other physicians.
  • Consider adding other care processes to target disease control: The program improved performance on process measures much more than outcomes measures that evaluate disease control. Incorporating additional relevant care process measures could help to reduce this gap.

Use By Other Organizations

NorthShore University HealthSystem in Evanston, IL, and the University of Texas Southwestern Medical Center in Dallas, TX, have implemented a similar approach.

Additional Considerations

Program success depends in part on physicians already being familiar with using an EMR. As noted, participating physicians had 10 years of EMR experience before implementation of this program.

More Information


Contact the Innovator

Stephen D. Persell, MD, MPH
Division of General Internal Medicine
Feinberg School of Medicine
Northwestern University
750 N. Lake Shore Drive, 10th Floor
Chicago, IL 60611
E-mail: spersell@nmff.org

Innovator Disclosures

Dr. Persell has received grant funding from AHRQ to support the work in the profile. He has received additional grant funding from AHRQ, NIH, and HRSA. He also is a paid consultant for the American Board of Internal Medicine and the American Medical Association and is employed by Northwestern Medical Faculty Foundation and Northwestern University.

References/Related Articles

Persell SD, Kaiser D, Dolan NC, et al. Changes in performance after implementation of a multifaceted electronic-health-system-based quality improvement system. Medical Care. 2011;49(2):117-125. [PubMed]

Footnotes

1 Persell SD, Kaiser D, Dolan NC, et al. Changes in performance after implementation of a multifaceted electronic-health-system-based quality improvement system. Medical Care. 2011;49(2):117-125. [PubMed]

Disclaimer: The inclusion of an innovation in the Innovations Exchange does not constitute or imply an endorsement by the U.S. Department of Health and Human Services, the Agency for Healthcare Research and Quality, or Westat of the innovation or of the submitter or developer of the innovation.

Original publication: July 06, 2011.
Original publication indicates the date the profile was first posted to the Innovations Exchange.

Last updated: February 12, 2014.
Last updated indicates the date the most recent changes to the profile were posted to the Innovations Exchange.

Date verified by innovator: June 21, 2013.
Date verified by innovator indicates the most recent date the innovator provided feedback during the annual review process. The innovator is invited to review, update, and verify the profile annually.