Abstract
Background The UK government introduced two financial incentive schemes for primary care to tackle underdiagnosis in dementia: the 3-year Directed Enhanced Service 18 (DES18) and the 6-month Dementia Identification Scheme (DIS). The schemes appear to have been effective in boosting dementia diagnosis rates, but their unintended effects are unknown.
Aim To identify and quantify unintended consequences associated with the DES18 and DIS schemes.
Design and setting A retrospective cohort quantitative study of 7079 English primary care practices.
Method Potential unintended effects of financial incentive schemes, both positive and negative, were identified from a literature review. A practice-level dataset covering the period 2006/2007 to 2015/2016 was constructed. Difference-in-differences analysis was employed to test the effects of the incentive schemes on quality measures from the Quality and Outcomes Framework (QOF); and four measures of patient experience from the GP Patient Survey (GPPS): patient-centred care, access to care, continuity of care, and the doctor–patient relationship. The researchers controlled for effects of the contemporaneous hospital incentive scheme for dementia and for practice characteristics.
Results National practice participation rates in DES18 and DIS were 98.5% and 76% respectively. Both schemes were associated not only with a positive impact on QOF quality outcomes, but also with negative impacts on some patient experience indicators.
Conclusion The primary care incentive schemes for dementia appear to have enhanced QOF performance for the dementia review, and have had beneficial spillover effects on QOF performance in other clinical areas. However, the schemes may have had negative impacts on several aspects of patient experience.
INTRODUCTION
Dementia is an umbrella term covering a range of progressive neurological conditions. It is a terminal condition that has a devastating effect on individuals and their families, and presents a huge challenge to society.1 In 2015 around 850 000 people were estimated to be living with dementia in the UK and this number is expected to rise to >2 million by 2052.2 However, in 2009 underdiagnosis was ‘the norm’,1 with between one-half and two-thirds of people in the UK with dementia having received no formal diagnosis.1,3 A key aim of the 2009 Dementia Strategy was to encourage earlier diagnosis.1 A raft of measures was introduced including two voluntary financial incentive schemes in primary care: Directed Enhanced Service 18 (DES18), for ‘facilitating timely diagnosis of and support for dementia’,4–6 and the complementary Dementia Identification Scheme (DIS).7
DES18 ran from April 2013 to March 2016.4–6 It supported a proactive and timely approach for assessing patients considered at risk of developing dementia, and then testing them as appropriate. DES18 also aimed to improve support for individuals who were newly diagnosed with dementia and their carers by referring them to specialist services and offering a care plan or a carer health check.
DIS ran from 1 October 2014 to 31 March 2015 and was designed to support and complement DES18.7,8 The aim was to encourage GP practices to adopt a proactive approach in identifying patients with dementia and, working with their clinical commissioning groups (CCGs), to develop relevant services and care packages. Like DES18, this involved identifying at-risk patients, working with care or nursing homes to find symptomatic patients, and offering them a dementia assessment to improve the recording of dementia on the practice’s dementia register and hence to improve care.
DES18 and DIS appear to have boosted diagnosis rates,9 but the unintended effects (positive or negative) of the schemes are unknown. Incentive schemes can unintentionally impact on other aspects of patient care; for example, by diverting clinical and administrative resources away from core and/or unincentivised services or conditions.10 The aims of this study were to test the effects of these schemes on quality measures from the Quality and Outcomes Framework (QOF) and on patient experience.
METHOD
A literature review was conducted to inform the researchers’ selection of unintended effects (positive or negative) of the two primary care incentive schemes. The search was restricted to UK studies of incentive schemes in primary care that were published between 2006 and 2016. The authors searched MEDLINE, Embase, PsycINFO, CINAHL, and HMIC.
How this fits in
A previous evaluation of two primary care schemes for tackling underdiagnosis in dementia demonstrated that the schemes had been effective in terms of their intended effects, but their unintended consequences are unknown. This study addresses that gap in the evidence base. The researchers show that the schemes are associated with higher-quality care both for dementia and for other long-term conditions, but that some aspects of patient experience may have been adversely affected. Feedback from the GP Patient Surveys could help practices to identify and mitigate adverse effects.
The researchers screened 509 records and identified 22 relevant studies.10–31 None of these articles investigated the unintended consequences of DES18 or DIS. In total, evidence on 12 unintended effects from other incentive schemes was found. Effects on provider behaviours included: gaming (inappropriate exception reporting);10,17,28 reduced clinical autonomy;17,24,26,29 internal motivation;17,26 and provider professionalism.10,24 Effects on practices included greater use of computers and widespread adoption of electronic medical records.21,23,27 For patients, there were effects on health inequalities;10,11,15–20,22,30 loss of patient-centredness;13,14,18,19,21,25,27,30 the doctor–patient relationship;14,21,27 access to care;14 and continuity of care.13,14,18,19,22,28,30 Studies also identified spillover effects on the quality of care falling outside of the schemes.10,12,14,17,21,25,30,31
Data
This retrospective cohort quantitative study was conducted on primary care practices in England. The cohort was a balanced panel, so all practices contributed data in all 10 years of the study from 2006/2007 to 2015/2016. A list of the datasets used to construct the dependent and explanatory variables is provided in Box 1.
Datasets used for the analysis
| Dataset | Reporting level | Year range | Type of variable(s) derived | Details of variable |
|---|---|---|---|---|
| QOF | GP practice | 2006/2007 to 2015/2016 | Dependent | Overall QOF achievement on clinical domain. Used to generate variables of the achievement of different QOF clinical indicators |
| | | | Control | Practice-list size, percentage of practice patients aged ≥65 years |
| GPPS (unweighted)a | GP practice | 2008/2009 to 2015/2016 | Dependent | Practice-level responses to each question in the survey. Used to generate variables to investigate: patient-centred care, access to care, continuity of care, and the doctor–patient relationship |
| Dementia assessments data | GP practice | 2013/2014 to 2015/2016 | Policy | Used to identify participation in Directed Enhanced Service 18 (DES18): Facilitating Timely Diagnosis and Support for People with Dementia |
| List of participation for DIS | GP practice | 2013/2014 to 2015/2016 | Policy | Used to identify participation in DIS |
| Dementia Assessment and Referral data collection | GP practice | 2013/2014 to 2015/2016 | Control | Used to construct ‘hospital effort’ indicator |
| HES | Patient | 2013/2014 to 2015/2016 | Control | Used to construct ‘hospital effort’ indicator |
| GMSb | GP practice | 2011/2012 to 2015/2016 | Control | Proportion of practice patients in different age and sex bands (≥65 years) used to derive expected dementia registers. GMS contract status |
| ADS | GP practice | 2006/2007 to 2015/2016 | Control | Numbers of practice patients in each LSOA. Used to generate practice-level weighted averages of rurality and deprivation |
| ONS: urban | LSOA | 2004 to 2011 | Control | Source of urban classifications. Combined with ADS to derive practice rurality measure. The 2004 data were used for missing values in 2011 |
| ONS: deprivation | LSOA | 2010 to 2015 | Control | Source of IMD classifications. Combined with ADS to derive practice deprivation measure. The 2010 data were used for missing values in 2015 |
| CCG code | GP practice | 2006/2007 to 2015/2016 | Control | Practice CCG code |
a GPPS is a questionnaire sent to a sample of registered patients at each practice in England and is designed to collect data on different aspects of patient experience.32
b The method of GMS data collection changed in 2015/2016 and data are missing for around 15% of practices. ADS = Attribution Dataset. CCG = clinical commissioning group. DIS = Dementia Identification Scheme. GMS = General and Personal Medical Services dataset. GPPS = GP Patient Survey. HES = Hospital Episode Statistics. IMD = Index of Multiple Deprivation. LSOA = lower-layer super output area. ONS = Office for National Statistics. QOF = Quality and Outcomes Framework.
Five patient-focused measures that could be captured from available data were selected from the 12 potential unintended consequences. These domains are detailed in Box 2 along with the measures used to evaluate the effects of DES18 and DIS.
Outcomes for the analysis of unintended consequences
| Domain | Measure |
|---|---|
| 1. Schemes’ impacts on quality of care outside DES18 and DIS | Population achievement of all QOF clinical indicators excluding the dementia annual review and diagnosis indicator (a weighted measure of overall achievement of the QOF clinical domains [excluding dementia review and diagnosis indicator], with the maximum points for each indicator used as weights). Population achievement of the QOF dementia annual review indicator |
| 2. Patient-centred care | Mean percentage of responders answering ‘good’ or ‘very good’ to each part of the question: ‘Last time you saw or spoke to a GP from your GP surgery: …’ |
| 3. Access to care | Percentage of responders answering ‘good’ or ‘very good’ to: ‘Last time you saw or spoke to a GP from your GP surgery, how good was that GP at giving you enough time?’ |
| 4. Continuity of care | Percentage of responders answering ‘almost always’ or ‘always’ to: ‘How often do you see (or speak to) the doctor you prefer to see?’ |
| 5. Doctor–patient relationship | Percentage of responders answering ‘good’ or ‘very good’ to: ‘Last time you saw or spoke to a GP from your GP surgery, how good were they at explaining tests and treatments?’ Percentage of responders answering ‘yes, definitely’ to: ‘Did you have confidence and trust in the GP you saw or spoke to?’ |
DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. QOF = Quality and Outcomes Framework.
Domain 1: The researchers used two measures of practice performance from the QOF data to evaluate the schemes’ impacts on the quality of care outside of the schemes.
The QOF is a voluntary financial incentive scheme designed to improve the quality of primary care.33,34 It incentivises 19 clinical areas as well as public health indicators;33 dementia was added to the QOF in 2006.35 Practices can exclude (‘exception report’) patients from specific indicators; excluded patients are not counted when calculating achievement for payment purposes.33 In contrast, a ‘population achievement’ measure includes exception-reported patients in the denominator, and the present researchers used this approach in both of the measures:
a QOF composite measure of all clinical indicators excluding the two dementia-specific indicators (annual review and post-diagnostic tests for reversible dementia); and
the QOF dementia annual review indicator.
The first measure aimed to investigate the impact of participation in the dementia schemes on the quality of care for long-term conditions other than dementia. In theory, this effect could be negative, for example, diverting a practice’s resources towards dementia assessments could adversely impact the quality of care in other areas; or it could be positive, in so far as better organised practices might perform well on both the dementia schemes and on QOF. The second measure was used to assess the impact of the schemes on the dementia annual review for existing patients. It is plausible that attention could be focused on newly diagnosed patients at the expense of those with an existing diagnosis (negative effect); alternatively, increased resources for dementia could have beneficial spillover effects on existing patients. The authors did not assess the impact on the QOF indicator for incentivising tests for reversible dementia in newly diagnosed patients, as better-quality post-diagnostic care is an intended effect of the schemes.
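The distinction between payment achievement (which removes exception-reported patients from the denominator) and the population achievement used here can be illustrated with a minimal sketch; the patient counts below are invented for illustration only.

```python
def payment_achievement(treated, eligible, excepted):
    # Exception-reported patients are removed from the denominator,
    # as in QOF achievement calculated for payment purposes
    return 100 * treated / (eligible - excepted)

def population_achievement(treated, eligible):
    # Population achievement keeps all eligible patients in the denominator
    return 100 * treated / eligible

# Hypothetical practice: 100 eligible patients, 10 exception reported, 80 reviewed
print(round(payment_achievement(80, 100, 10), 1))   # 88.9
print(round(population_achievement(80, 100), 1))    # 80.0
```

Population achievement is therefore always at or below payment achievement, and is less sensitive to how liberally a practice exception reports.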
Domains 2 to 5: patient experience domains included patient-centred care, access to care, continuity of care, and the doctor–patient relationship. The measures in these domains were constructed from the GP Patient Survey (GPPS). The rationale for including these indicators is similar to that of the first quality measure: that the impact of participation in the dementia incentive schemes on other patient experiences of, and access to, primary care could be either negative or positive, depending on how practices managed their resources.
The researchers’ key explanatory variables were practice participation in the incentive schemes. For any particular year in which DES18 was active, a practice was defined as a participant if it provided data on the number of dementia assessments conducted that year. Even practices that recorded zero assessments were counted as DES18 participants, because they had engaged with the scheme by signing up (for which they were paid) and by reporting data.
NHS England provided data on practices that participated in DIS, which was based on information collected by Local Area Teams for payment purposes.
As other factors may impact practices’ outcomes, the researchers adjusted for a range of practice characteristics: the proportion of patients aged ≥65 years; the practice list size; the proportion of patients living in the 20% most deprived small areas; the proportion of patients in urban areas; the number of full-time equivalent GPs per 1000 patients (in deciles); and whether practices had a General and Personal Medical Contract. The Office for National Statistics (ONS) provides data on the Index of Multiple Deprivation score and rural–urban classification for small areas known as lower-layer super output areas (LSOAs). These were attributed to practices as weighted averages of the proportions of registered practice patients in each LSOA. A variable to capture dementia screening activity in local hospitals (a ‘hospital effort’ indicator used in a previous study9) was also included, based on one of the Commissioning for Quality and Innovation Framework (CQUIN) schemes.37–39 The researchers also accounted for regional characteristics (CCG).
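The attribution of LSOA-level deprivation (or rurality) to a practice as a patient-weighted average can be sketched as follows; the LSOA names, patient counts, and IMD scores are hypothetical.

```python
# Hypothetical data: registered practice patients in each LSOA (from ADS),
# and each LSOA's Index of Multiple Deprivation (IMD) score (from ONS)
patients_per_lsoa = {"LSOA_A": 600, "LSOA_B": 300, "LSOA_C": 100}
imd_score = {"LSOA_A": 12.0, "LSOA_B": 30.0, "LSOA_C": 45.0}

total_patients = sum(patients_per_lsoa.values())

# Practice-level deprivation: LSOA scores weighted by the share of the
# practice's registered patients living in each LSOA
practice_imd = sum(
    patients_per_lsoa[lsoa] * imd_score[lsoa] for lsoa in patients_per_lsoa
) / total_patients

print(practice_imd)  # 20.7
```

The same weighting applied to a binary urban indicator yields the proportion of a practice’s patients living in urban areas.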
Statistical modelling
All the dependent variables were continuous measures ranging from 0 to 100. A difference-in-differences (DID) design was used to model the impact of DES18 and DIS on the unintended consequences. DID is a method that has been used extensively in the policy evaluation literature,40 and the approach was previously used to evaluate the effectiveness of the two schemes in terms of their intended consequences.9 This design is appropriate when information before and after the introduction of the incentive schemes is available for both the treatment group (those who participated in the schemes) and control group (those who never participated). An important assumption is that the treatment and control groups are subject to the same time trends, known as the ‘common trends’ assumption.40
DIS operated during the period when DES18 was active, and practices could participate in one, both, or neither of the schemes. As DES18 ran for 3 years, practices could participate in DES18 in any number of these years. To account for these features in the model, the same eight DES18 groups and two DIS groups defined in the authors’ previous research were used.9
A mixed-effects linear DID model that allowed for multiple periods and multiple incentive schemes was applied (technical details of the model are available from the authors on request).9,41–43
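The core DID logic can be illustrated with a simple two-group, two-period sketch on simulated data. The authors’ actual model is a mixed-effects specification with multiple participation groups and periods; this is only a minimal illustration, with all numbers (including the assumed true policy effect of 2.0 percentage points) invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated practice-level outcomes (0-100 scale). Components: baseline (50),
# common time trend (+3 after the scheme starts), a level difference for
# participants (+1), and the policy effect (+2.0) only for participants
# observed after the scheme starts.
def simulate(treated, post, n=200):
    return 50 + 3 * post + 1 * treated + 2.0 * treated * post + rng.normal(0, 1, n)

y_treat_pre, y_treat_post = simulate(1, 0), simulate(1, 1)
y_ctrl_pre, y_ctrl_post = simulate(0, 0), simulate(0, 1)

# DID estimate: change over time among participants minus the change among
# non-participants; under common trends this isolates the policy effect
did = (y_treat_post.mean() - y_treat_pre.mean()) - (y_ctrl_post.mean() - y_ctrl_pre.mean())
print(round(did, 2))  # close to the true effect of 2.0
```

Subtracting the control group’s change nets out the common time trend, which is why the ‘common trends’ assumption is essential: if participants and non-participants were on different trajectories anyway, the difference of differences would mix the trend gap into the estimated policy effect.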
RESULTS
National practice participation rates in DES18 and DIS were 98.5% and 76% respectively. In total, 7079 practices were included in the study sample. Table 1 shows the number of practice-years within each participation group.
Practice participation in DES18 and DIS from 2006/2007 to 2015/2016
Sample statistics are presented in Table 2, and Table 3 summarises the DIS and DES18 policy effects on the quality of care and on patient experience. Results of the analysis of effects on the composite measure of care quality for long-term conditions in QOF (excluding dementia) are available from the authors on request. Figure 1 shows the trends of the mean QOF clinical composite measure (excluding dementia indicators). Figures 2 and 3 show the trends of the remaining measures. The formal tests (available from the authors on request) show that the pre-intervention time trends for the control and treatment practices are parallel at the 0.1% significance level.
Descriptive statistics of the estimation sample, N = 7079 practices
Results of the policy variables of DIS and DES18 on outcomes
Trends in the quality of primary care for long-term conditions in QOF (excluding dementia): variation by participation in the schemes. DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. QOF = Quality and Outcomes Framework. Light grey: DES18: April 2013 to March 2016. Dark grey: DIS: October 2014 to March 2015.
Trends of outcomes for DES18 scheme. DES18 = Directed Enhanced Service 18. QOF = Quality and Outcomes Framework. Light grey: DES18: April 2013 to March 2016. Dark grey: DIS: October 2014 to March 2015.
Trends of outcomes for DIS scheme. DES18 = Directed Enhanced Service 18. QOF = Quality and Outcomes Framework. DIS = Dementia Identification Scheme. Light grey: DES18: April 2013 to March 2016. Dark grey: DIS: October 2014 to March 2015.
Quality of care
Practices that participated in either or both of the schemes to incentivise early diagnosis of dementia had significantly higher overall quality of clinical care compared with non-participants. Participation in DES18 and DIS increased practice achievement by 0.743 percentage points and 0.429 percentage points, respectively (Table 3).
Participation in DES18 was associated with a statistically significant positive effect on practice performance on the annual dementia review, with participation in DES18 increasing practice achievement by 1.302 percentage points on average. Participation in DIS had no significant effect (Table 3).
Patient experience
The schemes were associated with some negative effects on patient experience. Participation in DES18 decreased GPPS indicators of patient-centred care (−0.525 percentage points, 95% confidence interval [CI] = −0.755 to −0.296), access to care (−0.364 percentage points, 95% CI = −0.582 to −0.145), explaining tests and treatments (−0.371 percentage points, 95% CI = −0.620 to −0.122), and confidence and trust in the GP (−0.520 percentage points, 95% CI = −0.837 to −0.202), but had no effect on continuity of care (patients’ ability to see their preferred clinician). Conversely, participation in DIS negatively impacted continuity of care (−0.663 percentage points, 95% CI = −1.147 to −0.180), but was linked to improved patient experience on one indicator of the doctor–patient relationship: explaining tests and treatments (0.265 percentage points, 95% CI = 0.006 to 0.525).
DISCUSSION
Summary
Analysis by the researchers of the unintended consequences of the schemes revealed mixed effects. The schemes appear not only to have enhanced QOF performance in dementia review, but also to have had beneficial spillover effects on QOF performance in other clinical areas. This is possibly due to the use of extra funds attracted through the schemes to improve other areas of care at the practice; alternatively, it could be capturing practices’ organisational skills, such as the ability to comply with incentive schemes. Whatever the reason, it is reassuring that there was no adverse effect on either the annual dementia review or the quality of care for patients with other long-term conditions. However, the present study also uncovered some negative consequences. Analysis of the GPPS indicators identified deleterious effects for DES18 on several aspects of patient experience. For DIS, the only significant negative impact found was on continuity of care. A possible causal mechanism for each of these negative effects is that practices diverted efforts towards assessments for dementia, reducing the time available for other patients in a variety of ways, as described more fully in the data section.
Strengths and limitations
The major strength of this study was that it addressed a gap in the evidence base on the unintended effects of the incentive schemes for underdiagnosis in dementia. Comprehensive datasets were used covering almost all English general practices over a 10-year period.
There are several reasons why the present findings on the effects on patient experience should be interpreted with caution. First, the GPPS data are derived from a small sample of practice patients and so may not be representative. Second, the impact on different types of patients, such as those with or without dementia, or their carers, is unclear. It is possible that patient experience may have improved for some types of patients. Third, the researchers did not control for other DES schemes, as data on uptake are not available.
There are methodological weaknesses inherent in observational studies. Randomised controlled trials are considered the optimal study design for identifying causal effects as they control for known and unknown biases.40 However, DID is a good alternative method for non-experimental policy changes, such as the schemes evaluated in this study, if there are large numbers of observations (the schemes were national), if participation varies over time (as it did), and the dataset covers a reasonably long time-series. Although there was an extensive list of covariates to control for practice and regional characteristics, and hospital effort was included, there may be other confounders that could bias the present results. In addition, the authors could not test some potential unintended consequences due to lack of data.
Comparison with existing literature
There have been no previous studies on the unintended consequences of DES18 or DIS. Studies investigating the unintended consequences of the QOF or other local incentive schemes have found mixed effects on the quality of care outside of the schemes.10,12,14,31 One study found no significant effect on access to care or on the doctor–patient relationship,14 but two studies showed that continuity of care declined significantly.14,19
Implications for research and practice
The present study indicates that the schemes could have had a small adverse effect on patient experience. Alongside the unintended effects, policymakers should also consider that the schemes had a positive impact on tackling underdiagnosis.9 Depending on the relative values placed on improving the diagnosis of dementia as opposed to the small negative effects on some aspects of patient experience, policymakers may consider this trade-off acceptable. Future evaluations of incentive schemes should include analysis of the unintended as well as the intended effects. Feedback from the GPPS could help practices to identify and mitigate any potential adverse effects of this nature.
One potential area for future research is gaming (inappropriate exception reporting), which was highlighted in the literature review as a potential unintended consequence of QOF. There are no data to test whether practices assessed cases inappropriately in order to gain financially from the schemes, though qualitative work may shed light on this issue. There is also a risk of misdiagnosis, which can have ‘truly tragic consequences’,44 especially if doctors feel pressured into providing an early diagnosis.
There are variations in the availability of post-diagnosis support between CCGs,45 which may be a response to higher diagnosis rates in areas where the incentive scheme had most impact. Policymakers could focus on monitoring future schemes and ensuring practices are supported to deliver sufficient high-quality post-diagnostic support.
Acknowledgments
The authors would like to thank Kath Wright from the Centre for Reviews and Dissemination at the University of York for her support in completing the literature search. They are also grateful for feedback on an earlier draft of this article by attendees at the Health Economics Study Group (winter 2018, City University), and for comments from the project advisory group and from two referees.
Notes
Funding
This article is based on independent research commissioned and funded by the National Institute for Health Research (NIHR) Policy Research Programme (Policy Research Unit in the Economics of Health and Social Care Systems: reference 103/0001). The views expressed in this article are those of the authors and not necessarily those of the NHS, the NIHR, the Department of Health and Social Care, arm’s length bodies, or other government departments.
Ethical approval
Ethical approval was not required for this study.
Provenance
Freely submitted; externally peer reviewed.
Competing interests
The authors have declared no competing interests.
- Received June 29, 2018.
- Revision requested August 6, 2018.
- Accepted August 27, 2018.
- © British Journal of General Practice 2019