Policy Innovation Profile

Hospital-Affiliated Physician Group Offers Modest Performance-Based Incentives to Salaried Physicians, Leading to Sustained Improvement on Quality-Related Metrics



Snapshot

Summary

The Massachusetts General Physicians Organization Quality Incentive Program offers modest financial incentives to salaried physicians every 6 months based on their performance on three designated quality metrics for that period. Physicians receive regular comparative performance reports and other communications intended to help them improve performance. The program has generated sustained improvements in institution-wide and department-specific performance on a wide array of quality-related metrics.

Evidence Rating

Moderate: The evidence consists primarily of pre- and post-implementation comparisons of physician performance on numerous quality metrics targeted by the incentive program, along with comparisons of actual to expected payouts and post-implementation feedback on the program from participating physicians.

Developing Organizations

Massachusetts General Physicians Organization

Use By Other Organizations

Geisinger Health System has a somewhat similar compensation system for its employed physicians, with 20 percent of compensation tied to performance on quality metrics aligned with the organization’s strategic aims and applied to specific clinical service groups.1

Date First Implemented

December 2006

Problem Addressed

Traditional compensation systems create little or no incentive for individual physicians to focus on quality improvement activities. Newer pay-for-performance (P4P) systems often create incentives for quality at the practice or organizational level, but in many cases these incentives do not filter down to the individual doctor working as an employee of a larger organization.
  • Failure of traditional compensation systems to promote quality: Traditional fee-for-service (FFS) and productivity-based compensation systems encourage physicians to provide more—not necessarily better—care, as income depends almost entirely on the quantity rather than the appropriateness or quality of services delivered.
  • Failure of P4P to create incentives for individual employed physicians: Payer-led P4P programs that tie practice- or organization-wide payments to performance in specific areas have generally had mixed results, with some generating modest improvements in performance on targeted metrics.2,3,4,5 Potential reasons for this relatively lackluster impact include inadequately sized incentives, poorly structured measures (including wide variability across payers in the metrics used), and the failure to translate organization-wide incentives to the individual doctor.6 In addition, the data used by payers in P4P programs are often not current and may require extensive review and adjustment to be usable, meaning that incentive payments do not arrive until long after the period being evaluated has ended. The problem of inadequate incentives for individual doctors will become larger as the proportion of physicians working as salaried employees continues to grow. As of 2013, 61 percent of practicing physicians worked as employees for a group practice, hospital, or other organization, up from 43 percent in 2000.7

What They Did


Description of the Innovative Activity

The Massachusetts General Physicians Organization (MGPO) Quality Incentive Program offers modest financial incentives to salaried physicians every 6 months based on their performance on three designated quality metrics for that period. Physicians receive regular comparative performance reports and other communications intended to help them improve. Key elements of this financial incentive policy are detailed below:
  • Open to most physicians with a minimum level of clinical activity: The program covers clinically active physicians other than trainees who participate in at least two managed care contracts and have at least 50 relative value units (RVUs) of activity over a 6-month period. (RVUs are a common measure of physician work.) With variation across specialties, 50 RVUs translates into spending roughly a day a week on clinical activity.
  • Maximum incentive based on level of activity: The maximum incentive payment for each 6-month performance period ranges from $500 to $2,500 depending on a physician’s level of clinical activity. For each 6-month period, the 73 percent of MGPO physicians with 750 or more RVUs can earn up to $2,500, while the 19 percent with 250 to 750 RVUs can earn up to $1,250, and the 8 percent with 50 to 250 RVUs can earn up to $500.6
  • Three metrics and associated performance targets: During each 6-month performance period, performance is evaluated against targets set for three quality measures. Two measures are chosen by program leaders and generally apply to all physicians, while one is chosen by department/division leaders and applies only to physicians in that clinical area, as described below. To date, most quality measures have assessed processes, with less than 10 percent assessing outcomes.6
    • Two universal measures, with some alternatives: Program leaders choose two measures that generally apply to all participating physicians. Because physicians in different specialties and departments engage in different clinical activities, one or both of these metrics may not apply to certain specialties, in which case alternative measures are used. For example, hospital-based pathologists do not prescribe medications, so an overall measure related to medication reconciliation does not apply; and anesthesiologists do not enter information into patients' electronic health records (EHRs), so any metric targeted at improving EHR documentation would not apply. Alternative measures can almost always be found; in the rare instances in which they cannot, physicians automatically receive the full incentive for the measure so they are not penalized.
    • One measure specific to department or division: Each performance period, leaders within 40 designated clinical areas choose the third metric and associated performance targets in consultation with leaders of the overall program. Initially, most departments relied on existing measures used by specialty societies or other organizations. Over time, some have begun to develop their own measures to focus physician incentives on specific issues or regulatory requirements faced by the division/department. Examples of department-specific measures are detailed below:
      • Primary care: For two consecutive 6-month terms, primary care leaders tied incentives to reducing emergency department (ED) use among patients of the 18 primary care practices affiliated with Massachusetts General Hospital.
      • Radiology: The radiology department initially focused on reducing the mean time from a preliminary reading to a final report on the results of a screening. More recently, the department focused on systematic peer review assessments, setting the performance target for physicians to participate in twice-a-week meetings to review randomly selected images interpreted in the past 3 days.
      • Hematology/oncology: For two consecutive 6-month terms, leaders of the hospital cancer center focused physicians on reducing the frequency of exceptions to cycle 1 chemotherapy treatment, measuring exception orders as a percentage of total orders at the level of the individual physician.
    • Targeting and evaluation: Each metric has a specific performance target for each 6-month period. During the program's first few years, targets for most measures were set with the expectation that 80 percent of participating doctors would achieve the goal. For each measure, performance is calculated at the level of the individual physician, practice group, department, or hospital, depending on the nature of the measure and the availability of data. In some cases, the same or related measures may be used for multiple 6-month periods to sustain physicians’ focus on a particular high-priority area. In these instances, the performance targets often increase, or the unit of measurement for a given measure may change over time. For example, a quality measure might apply first to the entire hospital, then to a department or group, and then to individual physicians. In the following examples, the measures remained the same or similar over time, while performance targets and units of measurement changed.
      • EHR documentation: In 2007, Partners Community Health Care, Inc. (PCHI, a regional network that includes MGPO) adopted a requirement that all physicians must document patient interactions through EHRs by 2010. The MGPO Quality Incentive Program was used to help physicians learn the new workflows and documentation systems associated with this requirement. To that end, the initial measure focused on completing 4 hours of classroom training to improve technical proficiency in using the EHR. The measure later changed to focus on the timely incorporation of preliminary notes into the EHR, with an initial target of 80 percent or more being documented within 5 days of the visit, measured at the department level. Once the departments reached that target, the measure switched to focus on the final (rather than preliminary) note, the timeliness criteria changed to within 8 days of the visit, the performance target increased to 90 percent, and the locus of measurement switched to the individual physician rather than the group.
      • Electronic order entry for radiology studies: A similar staged approach was employed to encourage physicians to use an electronic order entry system with decision support to order high-cost outpatient imaging studies. Performance targets increased incrementally over three terms, after which these radiology studies could be ordered only through the electronic system.
      • Electronic prescribing: The program focused on encouraging use of electronic prescribing over seven 6-month terms, beginning with a simple metric measuring whether physicians used the electronic system to generate at least 10 prescriptions during the 6-month period. This relatively low bar was chosen to help clinicians become familiar with the new system, overcome their reluctance to adopt it, and encourage them to solve associated workflow problems. During subsequent terms, metrics and associated performance targets increased; ultimately, the target called for 85 percent of prescriptions to be issued electronically, measured at the level of the individual physician.
  • Communication: Program leaders use the hospital’s physician-credentialing database to get an accurate, up-to-date list of clinically active physicians. Leaders then communicate with these physicians throughout each 6-month period, as outlined below:    
    • Personalized e-mail at start of period: Each eligible physician receives a personalized e-mail message at the start of each 6-month term listing the three quality measures and associated performance targets.
    • Regular communications throughout period: During the term, the program generates 20 to 30 e-mail messages that are branded with a logo to distinguish them from other communications. (On average, physicians open 60 percent of these branded e-mails, a higher percentage than for other messages sent by MGPO.) In many cases, messages are targeted to a particular subgroup of doctors. For example, the system tracks which physicians do and do not open a message, and a repeat message can be sent to those who did not open the first one. In addition, messages can be targeted based on performance. For example, roughly half way through each period, physicians get an e-mail summarizing their performance on key measures. Those performing at or above target receive a message applauding their performance and encouraging them to keep up the good work. Those performing below target levels are encouraged to improve, often with specific suggestions. Similarly, for measures that relate to completing a training activity (e.g., attending a seminar or taking an online course), e-mails can be sent only to those who have not yet done so.
    • End-of-term performance report: At the end of the term, each physician receives an e-mail showing whether he or she achieved the target level of performance on each of the three metrics, with a green check mark if the target has been met and a red “X” if not. The e-mail includes that physician's performance scores and provides a link to a list detailing who within the department did and did not meet the target. This comparative information allows an underperforming physician to identify and seek out advice from peers who performed well. Physicians also get information on the program in a monthly newsletter and through a password-protected intranet site.
  • Calculation and distribution of incentive payment: Physicians who meet the performance targets for all three measures receive the maximum payment for the period; those who meet fewer targets receive a partial payment. For example, a physician with the highest level of clinical activity earns the full $2,500 incentive for meeting all three targets, roughly $1,666 for meeting two targets, or $833 for meeting one (a simple calculation sketch appears after this list). The incentive payment is included in the physician’s paycheck and is identified separately from regular pay. (For the first few years, MGPO mailed separate checks for the incentive; however, this process was cumbersome, and sometimes checks were never received or cashed.)
  • Appeals: At the end of each 6-month term, physicians have the right to appeal if they believe there is an issue with how their performance has been calculated. During the first 13 performance periods, an average of 4 percent of physicians appealed, ranging from 0 to 11 percent. Overall, 65 percent of appeals were found to be “meritorious” (meaning the physician complaint was considered valid) and the incentive payments were made.6 Program leaders regularly use the appeals process to improve data and measurement systems, which has led to declines in the frequency of appeals.
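
The tiered maximums and pro-rated payouts described above reduce to simple arithmetic. The sketch below illustrates that calculation in Python; the RVU tier cut-offs and the one-third-of-maximum-per-target pro-rating reflect the figures cited in this profile, while the function names, exact tier boundaries, and rounding behavior are illustrative assumptions rather than a description of MGPO's actual payment system.

    # Illustrative sketch only: tier cut-offs and pro-rating reflect figures cited in
    # this profile; names and rounding are hypothetical, not MGPO's actual system.
    def max_incentive(rvus_6mo: float) -> float:
        """Maximum 6-month incentive based on clinical activity (RVUs)."""
        if rvus_6mo >= 750:
            return 2500.0
        if rvus_6mo >= 250:
            return 1250.0
        if rvus_6mo >= 50:
            return 500.0
        return 0.0  # below the 50-RVU eligibility threshold

    def incentive_payment(rvus_6mo: float, targets_met: int) -> float:
        """Pro-rated payment: one-third of the maximum for each of the three targets met."""
        return round(max_incentive(rvus_6mo) * targets_met / 3, 2)

    # A high-activity physician (750+ RVUs) who meets two of three targets earns
    # roughly $1,666.67 of the $2,500 maximum; meeting one target earns roughly $833.33.
    print(incentive_payment(900, 2))  # 1666.67
    print(incentive_payment(900, 1))  # 833.33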

Context of the Innovation

The MGPO employs more than 98 percent of physicians on the medical staff at Massachusetts General Hospital. MGPO is part of PCHI, a regional network of providers that contracts with payers. In the 1990s, PCHI signed several full-capitation contracts with payers, but by 2001 these had changed to P4P contracts, with a portion of payments being withheld and subsequently paid out to hospitals and physicians depending on their performance on agreed-upon measures related to the quality and efficiency of care. As described earlier, this approach had several weaknesses, notably that targets were set and performance measured at the aggregate (not individual physician) level and the various payers involved in these programs often used different metrics, making it difficult for the organization as a whole or individual physicians to know where to focus their efforts. In addition, the data were not current and often needed extensive review and adjustment, causing incentive payments to lag behind the start of each performance period by up to 2 years. As a result, many physicians did not fully engage in the program. To address these issues, MGPO leaders decided to establish an internal program that would create more meaningful and timely incentives for individual physicians to improve.

Did It Work?


Results

The program has generated sustained improvements in institution-wide and department-specific performance on a wide array of quality-related metrics.
  • Better institution-wide performance: The incentive program has led to meaningful, sustained improvement in performance on a variety of the universal measures included in one or more 6-month periods. Selected examples include the following:
    • Greater use of information technology (IT): The incentive program has spurred significant improvements in institution-wide performance on various metrics related to use of electronic systems by physicians, including completing 4 hours of EHR training (which rose from 0 percent at baseline to 92 percent); using the radiology order entry system for high-cost imaging tests (28 to 52 percent); using electronic prescribing, first to create 10 prescriptions during a 6-month period (0 to 99 percent), then to create 85 percent of all prescriptions during a period (62 to 95 percent for specialists and 81 to 95 percent for primary care physicians); and completing the final clinical note within 8 days of an outpatient visit (72 to 91 percent). In addition, physician performance on a variety of measures related to Meaningful Use requirements increased significantly between 2011 and 2012. These improvements helped MGPO meet many of the requirements and qualify for $15.5 million in incentive payments in 2013.6
      Figure 1. Line graph showing that MRSA infection rates declined over a 10-year period as physician hand hygiene improved. Image courtesy of MGPO/Massachusetts General Hospital. Used with permission.

    • Better hand hygiene and fewer infections: Compliance with hand-hygiene protocols (as measured through observation) before contact with the patient increased from 74 percent at baseline to 92 percent during the last period in which the measure was used. Compliance after patient contact improved from 89 percent to 95 percent between baseline and the last period in which the measure was used. These improvements have largely been sustained over time.6 Over the same period, the incidence of methicillin-resistant Staphylococcus aureus (MRSA) infections declined markedly (see Figure 1). Program leaders believe the decline in MRSA infections stems in part from the improvements in hand hygiene. 
    • Completion of training and assessment activities: The program has improved physician participation in a variety of required and suggested training and assessment activities, including implementing the Joint Commission’s Ongoing Professional Practice Evaluation process (from 0 percent at baseline to 100 percent in the final period in which the measure was used), completing online training to prepare for the Joint Commission accreditation survey (0 to 96 percent for the first round of training and 0 to 97 percent for the second round), and completing physician–patient communication training (0 to 91 percent). After physicians took the communication training, scores on several HCAHPS (Hospital Consumer Assessment of Healthcare Providers and Systems) survey questions related to the quality of physician communication increased. For example, the proportion of patients answering “always” on communication-related questions (e.g., that the physician always respected and listened to them and always explained things in a way they could understand) increased from 79.6 percent in 2009 to 82 percent in 2012.6
  • Better department-specific performance: The incentive program has led to meaningful, sustained improvement in performance on a variety of department-specific measures; descriptions of a few of these improvements appear below:            
    • Less ED use by primary care patients: After primary care leaders chose to focus on reducing ED use for two 6-month periods, ED visits by patients in the hospital’s 18 affiliated primary care practices fell by 3.7 percent between 2009 and 2010. Overall, 16 of 18 practices met the target level of performance.6
    • Faster radiology turnaround times: The mean time from a preliminary reading to a final report fell from 23 hours to 4 hours.8
    • Fewer chemotherapy exceptions: The proportion of exception orders for cycle 1 chemotherapy fell from over 12 percent in the fall of 2008 to under 4 percent 2 years later, with more than 90 percent of physicians meeting the targeted level of reduction. Exception orders continued to decline even after the measure was no longer used, reaching roughly 2 percent by the fall of 2012.6
  • Higher-than-expected payouts: Over the program’s first thirteen 6-month terms, physicians earned 90 percent of the available incentive dollars, above the 80-percent payout rate that program leaders expected when they set the performance targets.6
  • High satisfaction among physicians: An internal survey of clinically active physicians conducted in 2010 found that 78 percent of respondents believed the program had increased their focus on quality-related issues and 79 percent wanted it to continue.6

Evidence Rating

Moderate: The evidence consists primarily of pre- and post-implementation comparisons of physician performance on numerous quality metrics targeted by the incentive program, along with comparisons of actual to expected payouts and post-implementation feedback on the program from participating physicians.

How They Did It


Planning and Development Process

Key steps included the following:
  • Forming steering committee to oversee program development and implementation: Starting in mid-2006, a multidisciplinary steering committee took charge of developing the program over a period lasting approximately 6 months. The committee initially included the MGPO medical director, the vice president of communications, several computer experts and project managers, program staff with responsibility for physician compensation, and program staff with responsibility for quality/safety. During the development and implementation process, the committee met as a whole on a weekly basis, with individual members conducting additional work between meetings. Key activities during this process are outlined below:
    • Developing database of active physicians: The committee combined financial data with information from an existing credentialing database to create an “active-physician” database. On an ongoing basis, the active-physician database can identify physicians who meet the minimum criteria for clinical activity (50 RVUs over a 6-month period) and determine their precise level of activity (and hence maximum incentive payment). (A minimal query sketch illustrating this step appears after this list.)
    • Determining size of maximum payout: Aware of the total dollars available for the program (initially around $5 million a year, now $6 to $7 million a year) and the approximate number of eligible physicians, the committee decided on a maximum annual per-physician payout of $5,000. While in most cases this figure represented only 1 or 2 percent of a physician’s total income, program leaders felt that, with a properly structured program, this level of incentive would be enough to stimulate improvement.
    • Aggregating physicians into clinical groups: Even though many subspecialty physicians practice in unique niches, program leaders recognized the need to limit the number of measures used. Consequently, they decided to aggregate physicians into clinical groups and at first identified as many as 90 clinical areas. The leaders quickly realized they needed a more manageable number and ultimately created 40 groups.
    • Introducing program: At the program’s outset, all eligible physicians received a letter introducing the program, along with a one-time payment equivalent to the maximum 6-month incentive for which the physician qualified based on level of clinical activity. Program leaders chose to provide this one-time payment for a variety of reasons. First, it helped get the attention of physicians, making it easier to educate them about the program. Second, the payment tied into “prospect theory”—the idea that individuals are more motivated to avoid a loss than to realize a gain. Program leaders felt that after receiving a one-time full incentive payment upfront, physicians would be highly motivated to do whatever was necessary to continue receiving (i.e., not lose) the payment in the future. 
  • Ongoing meetings to manage program: The steering committee continues to meet on a weekly basis, focusing primarily on determining the appropriate universal measures for each period, reviewing proposals for department-specific measures, determining performance targets for each measure, monitoring performance, managing communications with physicians, making sure physicians receive the right payment, and reviewing and adjudicating appeals. The composition of the group tends to change over time, with content experts rotating in and out depending on the areas being targeted. For example, a pharmacy analyst might join the committee to provide expertise on measures related to electronic prescribing, while an infection control specialist would join when the focus shifts to hand hygiene.
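
As a companion to the active-physician database step described above, the following minimal sketch shows one way a per-term eligibility filter might be expressed. It assumes a merged credentialing/financial extract with one row per physician per 6-month term; the column names, schema, and use of pandas are hypothetical illustrations, not a description of MGPO's actual systems.

    # Hypothetical schema: one row per physician per 6-month term, with columns
    # physician_id, term, is_trainee, managed_care_contracts, rvus.
    import pandas as pd

    def eligible_physicians(activity: pd.DataFrame, term: str) -> pd.DataFrame:
        """Return non-trainee physicians meeting the program's minimum criteria for a
        term: at least two managed care contracts and at least 50 RVUs of activity."""
        rows = activity[(activity["term"] == term) & (~activity["is_trainee"])]
        eligible = rows[(rows["managed_care_contracts"] >= 2) & (rows["rvus"] >= 50)]
        # The RVU total also determines each physician's maximum incentive
        # (see the earlier payment sketch).
        return eligible[["physician_id", "rvus"]].sort_values("rvus", ascending=False)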

Resources Used and Skills Needed

  • Staffing: Administering the program requires four full-time equivalent (FTE) staff.6
  • Costs: The incentives cost between $6 and $7 million annually ($3 to $3.5 million for each 6-month term); the four FTEs dedicated to the program add roughly 7 percent to this figure ($420,000 to $490,000 a year).6

Funding Sources

Massachusetts General Physicians Organization
The incentive payments come out of a fee that physician practices pay MGPO to handle their billing and administrative functions (e.g., credentialing, IT), which is calculated as a percentage of practice revenues. During the several-year period before the incentive program began, two factors led to a substantial increase in the money collected through this fee—higher reimbursement rates from payer contracts and increased efficiencies by MGPO in providing services to the practices. As a result, MGPO has not yet needed to raise the fee to cover the costs of the incentive program. Because physician reimbursement rates have stagnated in recent years, program leaders may consider a fee increase at some point in the future.

Tools and Other Resources

A technical summary that includes all measures used by MGPO since implementing the program is available by request. Contact the innovators for more information.

Adoption Considerations


Getting Started with This Innovation

  • Set right length for performance period (roughly 6 months): Lengthy performance periods (such as a full year) force physicians to wait too long after performing an action to receive the incentive for doing so. Lengthy periods also limit the ability of the organization to change metrics in response to the external environment, such as a major payer launching a new program or incentive targeted at a specific area. Shorter periods (such as 3 months) create undue administrative burdens and may not be enough time for physicians to act or for their actions to have an impact.
  • Be flexible, including offering appeals process: Data collection and analysis problems will inevitably occur, and program leaders must be willing to review the data and admit when something needs to be corrected. Building in such flexibility, including giving physicians the right to appeal, improves the program’s credibility and provides a vehicle for identifying and addressing systemic problems related to measurement and analysis.
  • Consider initial bonus payment to garner attention: As noted, MGPO gave eligible physicians a one-time payment equal to the maximum incentive payment for which they qualified. This payment helped to engage physicians in the program.
  • Consider setting maximum incentive at relatively modest level: Program leaders believe that relatively modest incentives are enough to motivate improvement in performance, as long as they are targeted at the right areas and communicated in an effective manner. While some experts suggest that incentive systems will not work unless they offer a maximum of 10 to 20 percent of income, the MGPO Quality Incentive Program has generated substantial improvements with incentives that represent only 1 or 2 percent of total compensation for most doctors.

Sustaining This Innovation

  • Choose metrics tied to existing priorities and requirements: This strategy helps to ensure that the program affects areas that matter most to the organization as a whole and to individual departments and divisions, including priorities laid out in internal strategic plans and those mandated by payers and regulators (e.g., Meaningful Use requirements). This type of program sends a strong signal to physicians about what really matters and hence motivates them to improve in these areas.
  • Allow program leaders to review department-proposed metrics and targets: In some cases, metrics or performance targets proposed by clinicians at the division or department level might be relatively easy to achieve, such as attending staff meetings or grand rounds or achieving small improvements in performance. To avoid this problem, program leaders should review all department-proposed metrics and targets and, as necessary, work with department leaders to adjust them.
  • Communicate clearly and consistently with participants: Regular communication with participating physicians and division/department leaders is critical to the program’s success, including easy-to-read performance reports and timely and clear explanations of any program-related changes.
  • Elicit and respond to feedback: Physicians will inevitably have issues with particular measures or other program-related issues. They should be given mechanisms to voice their objections and receive a timely, clear response when they do.
  • Explain use of group-level measurement: Physicians generally prefer to be accountable only for their own performance, not for the performance of their entire practice or department. However, in some cases, data limitations make it impossible to measure accurately at the level of the individual physician. In these instances, program leaders should clearly explain the rationale behind the decision not to use physician-specific measurement.
  • Coordinate with other internal incentive programs: A practice or department might have existing incentive systems in place. Program leaders need to communicate with the leaders of these initiatives so they can coordinate activities and avoid confusion or overlap. For example, the Massachusetts General Hospital primary care department recently created its own internal incentive program focused on helping practices achieve accreditation as medical homes. Leaders of the MGPO Quality Incentive Program send monthly communications to these physicians explaining how the larger incentive program relates to the department-specific one and supports their efforts to become accredited.
  • Search for alternative measures for subspecialists: Faced with increasing subspecialization and limited resources, program leaders have found it challenging to find measures that apply to all physicians. In fact, during the first 6-month term, 28 percent of physicians received incentive payments for measures that did not apply to them.6 To the extent possible, programs should try to find alternative measures that do apply. MGPO has been able to get this “free-rider” figure down to 1 percent.6 
  • Adjust targets and level of measurement over time: As noted, MGPO often began with relatively easily achieved metrics and targets within a particular area and then changed them over time. Changes might include switching to a different, related metric; raising the performance bar to qualify for the incentive payment; or narrowing the unit of measurement (e.g., from hospital to department, or from department to individual physician).
  • Consider team-based incentives: As organizations increasingly adopt medical homes and other care delivery models that use multidisciplinary teams, incentive systems may need to expand beyond physicians to incorporate other team members.
  • Consider “term limits” for individual metrics: While the appropriate amount of time to focus on a particular area will vary depending on the initial level of performance and rate of improvement, it may make sense to set a limit of three 6-month terms for any specific measure.
  • Adjust metrics (particularly for specialists) as organization takes on more risk: As organizations begin signing payer contracts that entail greater accountability and risk sharing, elements of the incentive payment system may need to change accordingly, including the structure of the incentive and the specific metrics and targets used. New metrics to evaluate the appropriateness and costs of care may be needed. This evolution is particularly important for specialists, for whom relatively few such measures exist today. For example, in the past year MGPO has asked each of its departments to identify two or three areas where cost reduction might be possible. In response, several departments have come up with cost-reduction strategies, including greater reliance on generic drugs (dermatology) and increased use of guidelines and algorithms to determine the appropriate course of action (gastroenterology, cardiology, urology). In some cases, department leaders have proposed use of specific measures tied to these initiatives as part of the MGPO incentive program.

Use By Other Organizations

Geisinger Health System has a somewhat similar compensation system for its employed physicians, with 20 percent of compensation tied to performance on quality metrics aligned with the organization’s strategic aims and applied to specific clinical service groups.1

More Information


Contact the Innovator

Deborah G. Colton
Senior Vice President for Strategic Communication
Massachusetts General Physicians Organization
55 Fruit Street, Suite 205D
Boston, MA 02114
(617) 726-1357
E-mail: dcolton@partners.org  

Sarah K. Lenz
Director of Physician Incentive Programs
Massachusetts General Physicians Organization
100 Cambridge Street, Room 1531
Boston, MA 02114
(617) 643-0405
E-mail: slenz@partners.org

Innovator Disclosures

Ms. Colton and Ms. Lenz reported having no financial or business/professional relationships relevant to the work described in this profile.

References/Related Articles

Torchiana DF, Colton DG, Rao SK, et al. Massachusetts General Physicians Organization’s quality incentive program produces encouraging results. Health Aff (Millwood). 2013;32(10):1748-56. [PubMed]

Meyer GS, Torchiana DF, Colton D, et al. The use of modest incentives to boost adoption of safety practices and systems. In: Henriksen K, Battles JB, Keyes MA, et al., eds. Advances in patient safety: new directions and alternative approaches. Vol. 3: Performance and tools. AHRQ Publication No. 08-0034-3. Rockville (MD): Agency for Healthcare Research and Quality; August 2008:334-47.

Footnotes

1 Lee TH, Bothe A, Steele GD. How Geisinger structures its physicians’ compensation to support improvements in quality, efficiency, and volume. Health Aff (Millwood). 2012;31(9):2068-73. [PubMed]
2 Glickman SW, Ou FS, DeLong ER, et al. Pay for performance, quality of care, and outcomes in acute myocardial infarction. JAMA. 2007;297(21):2373-80. [PubMed]
3 Lindenauer PK, Remus D, Roman S, et al. Public reporting and pay for performance in hospital quality improvement. N Engl J Med. 2007;356(5):486-96. [PubMed]
4 Casalino L, Gillies RR, Shortell SM, et al. External incentives, information technology, and organized processes to improve health care quality for patients with chronic diseases. JAMA. 2003;289(4):434-41. [PubMed]
5 Scott A, Schurer S, Jensen PH, et al. The effects of an incentive program on quality of care in diabetes management. Health Econ. 2009;18(9):1091-108. [PubMed]
6 Torchiana DF, Colton DG, Rao SK, et al. Massachusetts General Physicians Organization’s quality incentive program produces encouraging results. Health Aff (Millwood). 2013;32(10):1748-56. [PubMed]
7 Gamble M. 42 statistics on independent physicians from 2000 to 2013. Becker’s Hospital Review. 2012 Nov 19. Available at: http://www.beckershospitalreview.com/hospital-physician-relationships/42-statistics-on-independentphysicians-from-2000-to-2013.html.
8 Boland GW, Halpern EF, Gazelle GS. Radiologist report turnaround time: impact of pay-for-performance measures. AJR Am J Roentgenol. 2010;195(3):707-11. [PubMed]

Disclaimer: The inclusion of an innovation in the Innovations Exchange does not constitute or imply an endorsement by the U.S. Department of Health and Human Services, the Agency for Healthcare Research and Quality, or Westat of the innovation or of the submitter or developer of the innovation.

Original publication: July 16, 2014.
Original publication indicates the date the profile was first posted to the Innovations Exchange.

Last updated: August 13, 2014.
Last updated indicates the date the most recent changes to the profile were posted to the Innovations Exchange.