
Background: Warfarin reduces the risk for ischemic stroke in patients with atrial fibrillation (AF) but increases the risk for hemorrhage. Dabigatran is a fixed-dose, oral direct thrombin inhibitor that achieves similar or lower rates of ischemic stroke and intracranial hemorrhage than warfarin in patients with AF.

Objective: To estimate the quality-adjusted survival, costs, and cost-effectiveness of dabigatran compared with adjusted-dose warfarin for preventing ischemic stroke in patients 65 years or older with nonvalvular AF.

Design: Markov decision model.

Data Sources: The RE-LY (Randomized Evaluation of Long-Term Anticoagulation Therapy) trial and other published studies of anticoagulation. The cost of dabigatran was estimated on the basis of pricing in the United Kingdom.

Target Population: Patients 65 years or older with nonvalvular AF and risk factors for stroke (CHADS₂ score ≥1 or equivalent) and no contraindications to anticoagulation.

Time Horizon: Lifetime.

Perspective: Societal.

Intervention: Warfarin anticoagulation (target international normalized ratio, 2.0 to 3.0); dabigatran, 110 mg twice daily (low dose); and dabigatran, 150 mg twice daily (high dose).

Outcome Measures: Quality-adjusted life-years (QALYs), costs (in 2008 U.S. dollars), and incremental cost-effectiveness ratios.

Results of Base-Case Analysis: The quality-adjusted life expectancy was 10.28 QALYs with warfarin, 10.70 QALYs with low-dose dabigatran, and 10.84 QALYs with high-dose dabigatran. Total costs were $143,193 for warfarin, $164,576 for low-dose dabigatran, and $168,398 for high-dose dabigatran. The incremental cost-effectiveness ratios compared with warfarin were $51,229 per QALY for low-dose dabigatran and $45,372 per QALY for high-dose dabigatran.
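
The incremental cost-effectiveness ratios above follow from dividing the incremental cost by the incremental QALYs gained versus warfarin. A minimal sketch using the base-case figures reported here (the computed ratios differ slightly from the published $51,229 and $45,372 per QALY because the abstract reports rounded costs and QALYs):

```python
# Incremental cost-effectiveness ratio (ICER) = (cost_new - cost_ref) / (qalys_new - qalys_ref)
# Base-case values as reported in the abstract; warfarin is the reference strategy.
strategies = {
    "warfarin":             {"cost": 143_193, "qalys": 10.28},
    "low-dose dabigatran":  {"cost": 164_576, "qalys": 10.70},
    "high-dose dabigatran": {"cost": 168_398, "qalys": 10.84},
}

ref = strategies["warfarin"]

def icer(strategy: dict, reference: dict) -> float:
    """Incremental cost per QALY gained versus the reference strategy."""
    return (strategy["cost"] - reference["cost"]) / (strategy["qalys"] - reference["qalys"])

for name in ("low-dose dabigatran", "high-dose dabigatran"):
    print(f"{name}: ${icer(strategies[name], ref):,.0f} per QALY")
```

With the rounded inputs above, this yields roughly $50,900 per QALY for low-dose and $45,000 per QALY for high-dose dabigatran.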

Results of Sensitivity Analysis: The model was sensitive to the cost of dabigatran but was relatively insensitive to other model inputs. The incremental cost-effectiveness ratio increased to $50,000 per QALY at a cost of $13.70 per day for high-dose dabigatran but remained less than $85,000 per QALY over the full range of model inputs evaluated. The cost-effectiveness of high-dose dabigatran improved with increasing risk for stroke and intracranial hemorrhage.

Limitation: Event rates were largely derived from a single randomized clinical trial and extrapolated to a 35-year time frame from clinical trials with approximately 2-year follow-up.

Conclusion: In patients 65 years or older with nonvalvular AF at increased risk for stroke (CHADS₂ score ≥1 or equivalent), dabigatran may be a cost-effective alternative to warfarin depending on pricing in the United States.

Primary Funding Source: American Heart Association and Veterans Affairs Health Services Research & Development Service.

Publication Type: Journal Articles
Journal Publisher: Annals of Internal Medicine
Authors: Douglas K. Owens

117 Encina Commons, Stanford, CA 94305
(650) 736-0405; (650) 723-1919

Martin Connor is presently a Harkness Fellow in Health Care Policy and Practice. These fellowships are awarded by the Commonwealth Fund and support mid-career physicians, health service managers, and researchers in studying in the US. During his time at CHP/PCOR, Martin will study integrated care and its potential to contribute to the delivery-system aspects of Health Reform, while maintaining long-standing interests in UK policy developments and in physician leadership and accountability.

Prior to starting the fellowship, Martin was Director of the Trafford Integrated Care Organisation Programme in the UK NHS, a follow-on to his role as Deputy Chief Executive of Trafford PCT in Manchester, England.

Before this, he worked from 2005 to 2008 as special policy adviser to the Department of Health in Northern Ireland, leading the development of national policy at Permanent Secretary and ministerial level. He went on to lead the reform programme and establish the Service Delivery Unit in Northern Ireland, which transformed waiting times for elective assessment and treatment, increased the involvement of clinical professionals in decision making, and developed a novel, high-frequency, patient-level information base to support strategic decision making.

Between 2002 and 2005, he was Associate Director (Health Reform) for the Greater Manchester Strategic Health Authority. He co-authored the strategy for GMSHA, which led to the area moving from 'special measures' to 'high performing' within two years. This strategy included the first health-authority-wide demand management system in the NHS, which was commended by the Audit Commission.

In his twenties, he studied classical and linguistic philosophy following the award of a studentship from Durham University where he received his doctorate in 2001.  He joined the NHS on the graduate management training programme in 1999.

Adjunct Affiliate at the Center for Health Policy and the Department of Health Policy

Objectives: To determine the relation between the HIV/AIDS epidemic and support for dependent elderly people in Africa.

Design: Retrospective analysis using data from Demographic and Health Surveys.

Setting: 22 African countries between 1991 and 2006.

Participants: 123 176 individuals over the age of 60.

Main Outcome Measures: We investigated how three measures of the living arrangements of older people have been affected by the HIV/AIDS epidemic: the number of older individuals living alone (that is, the number of unattended elderly people); the number of older individuals living with only dependent children under the age of 10 (that is, in missing generation households); and the number of adults age 18-59 (that is, prime age adults) per household where an older person lives.

Results: An increase in annual AIDS mortality of one death per 1000 people was associated with a 1.5% increase in the proportion of older individuals living alone (95% CI 1.2% to 1.9%) and a 0.4% increase in the number of older individuals living in missing generation households (95% CI 0.3% to 0.6%). Increases in AIDS mortality were also associated with fewer prime age adults in households with at least one older person and at least one prime age adult (P<0.001). These findings suggest that in our study countries, which encompass 70% of the sub-Saharan population, the HIV/AIDS epidemic could be responsible for 582 200-917 000 older individuals living alone without prime age adults and 141 000-323 100 older individuals being the sole caregivers for young children.

Conclusions: Africa's HIV/AIDS epidemic might be responsible for a large number of older people losing their support and having to care for young children. This population has previously been under-recognised. Efforts to reduce HIV/AIDS deaths could have large "spillover" benefits for elderly people in Africa.

Publication Type: Journal Articles
Journal Publisher: BMJ
Authors: Eran Bendavid, Grant Miller

Background: Sodium consumption raises blood pressure, increasing the risk for heart attack and stroke. Several countries, including the United States, are considering strategies to decrease population sodium intake.

Objective: To assess the cost-effectiveness of 2 population strategies to reduce sodium intake: government collaboration with food manufacturers to voluntarily cut sodium in processed foods, modeled on the United Kingdom experience, and a sodium tax.

Design: A Markov model was constructed with 4 health states: well, acute myocardial infarction (MI), acute stroke, and history of MI or stroke.
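
The Markov design tracks a cohort through the named health states year by year, applying a transition-probability matrix each cycle. A minimal sketch of that machinery with the four states from the abstract plus an absorbing death state; the transition probabilities below are hypothetical placeholders, not values estimated in the study:

```python
import numpy as np

# States named in the abstract: well, acute MI, acute stroke, history of MI or stroke;
# "dead" is added because a lifetime Markov model needs an absorbing mortality state.
states = ["well", "acute_mi", "acute_stroke", "post_event", "dead"]

# Hypothetical annual transition probabilities (each row sums to 1).
P = np.array([
    # well   MI     stroke post   dead
    [0.980, 0.006, 0.004, 0.000, 0.010],  # well
    [0.000, 0.000, 0.000, 0.900, 0.100],  # acute MI resolves to history or death
    [0.000, 0.000, 0.000, 0.850, 0.150],  # acute stroke resolves to history or death
    [0.000, 0.010, 0.010, 0.940, 0.040],  # history of MI or stroke
    [0.000, 0.000, 0.000, 0.000, 1.000],  # dead (absorbing)
])
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

cohort = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # everyone starts well at age 40
for year in range(45):                         # one annual cycle per year, ages 40-85
    cohort = cohort @ P                        # redistribute the cohort across states

print(dict(zip(states, cohort.round(3))))
```

In the full analysis each state-year would also accrue costs and quality-adjusted life-years; this sketch shows only the state-transition core.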

Data Sources: Medical Panel Expenditure Survey (2006), Framingham Heart Study (1980 to 2003), Dietary Approaches to Stop Hypertension trial, and other published data.

Target Population: U.S. adults aged 40 to 85 years.

Time Horizon: Lifetime.

Perspective: Societal.

Outcome Measures: Incremental costs (2008 U.S. dollars), quality-adjusted life-years (QALYs), and MIs and strokes averted.

Results of Base-case Analysis: Collaboration with industry that decreases mean population sodium intake by 9.5% averts 513 885 strokes and 480 358 MIs over the lifetime of adults aged 40 to 85 years who are alive today compared with the status quo, increasing QALYs by 2.1 million and saving $32.1 billion in medical costs. A tax on sodium that decreases population sodium intake by 6% increases QALYs by 1.3 million and saves $22.4 billion over the same period.

Results of Sensitivity Analysis: Results are sensitive to the assumption that consumers have no disutility with modest reductions in sodium intake.

Limitation: Efforts to reduce population sodium intake could result in other dietary changes that are difficult to predict.

Conclusion: Strategies to reduce sodium intake on a population level in the United States are likely to substantially reduce stroke and MI incidence, which would save billions of dollars in medical expenses.

Primary Funding Source: Department of Veterans Affairs, Stanford University, and the National Science Foundation.

Publication Type: Journal Articles
Journal Publisher: Annals of Internal Medicine
Authors: Douglas K. Owens

Background: Since California lacks a statewide trauma system, there are no uniform interfacility pediatric trauma transfer guidelines across local emergency medical services (EMS) agencies in California. This may result in delays in obtaining optimal care for injured children.

Objectives: This study sought to understand patterns of pediatric trauma patient transfers to the study trauma center as a first step in assessing the quality and efficiency of pediatric transfer within the current trauma system model. Outcome measures included clinical and demographic characteristics, distances traveled, and centers bypassed. The hypothesis was that transferred patients would be more severely injured than directly admitted patients, primary catchment transfers would be few, and out-of-catchment transfers would come from hospitals in close geographic proximity to the study center.

Methods: This was a retrospective observational analysis of trauma patients ≤ 18 years of age in the institutional trauma database (2000–2007). All patients with a trauma International Classification of Diseases, 9th revision (ICD-9) code and trauma mechanism who were identified as a trauma patient by EMS or emergency physicians were recorded in the trauma database, including those patients who were discharged home. Trauma patients brought directly to the emergency department (ED) and patients transferred from other facilities to the center were compared. A geographic information system (GIS) was used to calculate the straight-line distances from the referring hospitals to the study center and to all closer centers potentially capable of accepting interfacility pediatric trauma transfers.
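
The GIS step computes straight-line (great-circle) distances between referring hospitals and candidate trauma centers. A minimal sketch of that calculation using the haversine formula; the coordinates below are illustrative California cities, not the study hospitals:

```python
from math import asin, cos, radians, sin, sqrt

def straight_line_miles(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle ("straight-line") distance in miles via the haversine formula."""
    r = 3958.8  # mean Earth radius in miles
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Illustrative pair: Sacramento to San Francisco, roughly 75 miles apart.
print(round(straight_line_miles(38.5816, -121.4944, 37.7749, -122.4194), 1))
```

Straight-line distance understates actual road transport distance, which is one reason such figures are best read as lower bounds on transfer burden.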

Results: Of 2,798 total subjects, 16.2% were transferred from other facilities within California; 69.8% of transfers were from the catchment area, with 23.0% transferred from facilities ≤ 10 miles from the center. This transfer pattern was positively associated with private insurance (risk ratio [RR] = 2.05; p < 0.001) and negatively associated with age 15–18 years (RR = 0.23; p = 0.01) and Injury Severity Score (ISS) > 18 (RR = 0.26; p < 0.01). The out-of-catchment transfers accounted for 30.2% of the patients, and 75.9% of these noncatchment transfers were in closer proximity to another facility potentially capable of accepting pediatric interfacility transfers. The overall median straight-line distance from noncatchment referring hospitals to the study center was 61.2 miles (IQR = 19.0–136.4), compared to 33.6 miles (IQR = 13.9–61.5) to the closest center. Transfer patients were more severely injured than directly admitted patients (p < 0.001). Out-of-catchment transfers were older than catchment patients (p < 0.001); ISS > 18 (RR = 2.06; p < 0.001) and age 15–18 (RR = 1.28; p < 0.001) were predictive of out-of-catchment patients bypassing other pediatric-capable centers. Finally, 23.7% of pediatric trauma transfer requests to the study institution were denied due to lack of bed capacity.

Conclusions: From the perspective of an adult Level I trauma center with a certified pediatric intensive care unit (PICU), delays in definitive pediatric trauma care appear to be present, secondary to initial transport to nontrauma community hospitals within close proximity of a trauma hospital, long transfer distances to accepting facilities, and lack of capacity at the study center. Given the absence of uniform trauma triage and transfer guidelines across state EMS systems, there appears to be a role for quality monitoring and improvement of the current interfacility pediatric trauma transfer system, including defined triage, transfer, and data collection protocols.

Publication Type: Journal Articles
Journal Publisher: Academic Emergency Medicine

Background: Genetic variability among patients plays an important role in determining the dose of warfarin that should be used when oral anticoagulation is initiated, but practical methods of using genetic information have not been evaluated in a diverse and large population. We developed and used an algorithm for estimating the appropriate warfarin dose that is based on both clinical and genetic data from a broad population base.

Methods: Clinical and genetic data from 4043 patients were used to create a dose algorithm based on clinical variables only and an algorithm in which genetic information was added to the clinical variables. In a validation cohort of 1009 subjects, we evaluated the potential clinical value of each algorithm by calculating the percentage of patients whose predicted dose of warfarin was within 20% of the actual stable therapeutic dose; we also evaluated other clinically relevant indicators.

Results: In the validation cohort, the pharmacogenetic algorithm accurately identified larger proportions of patients who required 21 mg of warfarin or less per week and of those who required 49 mg or more per week to achieve the target international normalized ratio than did the clinical algorithm (49.4% vs. 33.3%, P<0.001, among patients requiring ≤21 mg per week; and 24.8% vs. 7.2%, P<0.001, among those requiring ≥49 mg per week).

Conclusions: The use of a pharmacogenetic algorithm for estimating the appropriate initial dose of warfarin produces recommendations that are significantly closer to the required stable therapeutic dose than those derived from a clinical algorithm or a fixed-dose approach. The greatest benefits were observed in the 46.2% of the population that required 21 mg or less of warfarin per week or 49 mg or more per week for therapeutic anticoagulation.
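
The study's headline validation metric, the share of patients whose predicted dose falls within 20% of the actual stable therapeutic dose, is simple to compute once predictions exist. A sketch of that metric with fabricated dose pairs (not data from the validation cohort):

```python
# Fraction of patients whose predicted weekly warfarin dose lies within 20%
# of the actual stable therapeutic dose. The dose pairs below are fabricated
# illustrations, not data from the 1009-subject validation cohort.
def within_20_percent(predicted: list[float], actual: list[float]) -> float:
    """Proportion of predictions satisfying |predicted - actual| <= 0.20 * actual."""
    hits = sum(abs(p - a) <= 0.20 * a for p, a in zip(predicted, actual))
    return hits / len(actual)

actual_mg_per_week    = [21, 35, 49, 28, 56, 14]
predicted_mg_per_week = [24, 33, 38, 29, 50, 20]

print(f"{within_20_percent(predicted_mg_per_week, actual_mg_per_week):.0%}")
```

In the study this proportion was compared between the clinical-only and pharmacogenetic algorithms on the same validation cohort.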

Publication Type: Journal Articles
Journal Publisher: New England Journal of Medicine
Authors: Russ B. Altman

Background: It is unclear whether functional status before dialysis is maintained after the initiation of this therapy in elderly patients with end-stage renal disease (ESRD).

Methods: Using a national registry of patients undergoing dialysis, which was linked to a national registry of nursing home residents, we identified all 3702 nursing home residents in the United States who were starting treatment with dialysis between June 1998 and October 2000 and for whom at least one measurement of functional status was available before the initiation of dialysis. Functional status was measured by assessing the degree of dependence in seven activities of daily living (on the Minimum Data Set-Activities of Daily Living [MDS-ADL] scale of 0 to 28 points, with higher scores indicating greater functional difficulty).

Results: The median MDS-ADL score increased from 12 during the 3 months before the initiation of dialysis to 16 during the 3 months after the initiation of dialysis. Three months after the initiation of dialysis, functional status had been maintained in 39% of nursing home residents, but by 12 months after the initiation of dialysis, 58% had died and predialysis functional status had been maintained in only 13%. In a random-effects model, the initiation of dialysis was associated with a sharp decline in functional status, indicated by an increase of 2.8 points in the MDS-ADL score (95% confidence interval [CI], 2.5 to 3.0); this decline was independent of age, sex, race, and functional-status trajectory before the initiation of dialysis. The decline in functional status associated with the initiation of dialysis remained substantial (1.7 points; 95% CI, 1.4 to 2.1), even after adjustment for the presence or absence of an accelerated functional decline during the 3-month period before the initiation of dialysis.

Conclusions: Among nursing home residents with ESRD, the initiation of dialysis is associated with a substantial and sustained decline in functional status.

Publication Type: Journal Articles
Journal Publisher: New England Journal of Medicine