As hospitalized children become more medically complex, hospital-to-home care transitions will become increasingly challenging. During a quality improvement (QI) initiative, we developed an electronic tool to improve the quality of our hospital discharge process.
We modeled the tool on the paper-based Early Screen for Discharge Planning – Child Version, which identifies children with multiple medical conditions, home nursing-care needs, tube feedings, intravenous lines or drains, or posthospital care that requires coordination.1 We opted for an electronic tool to automate screening and increase visibility of patients’ transitional care needs via the electronic health record (EHR).
The tool was designed by our QI team to address weaknesses in our discharge process (eg, discharge instructions that are not translated appropriately) and causes of preventable readmission at our institution (eg, problems with discharge teaching, home care, and medications).2,3 The tool’s components were selected because they might complicate or delay discharge care and included indicators of home health, polypharmacy, and caregiver language preference. Additional features were considered but deferred from the initial tool for several reasons noted in the Methods.
We describe the development and implementation of this electronic tool. Given the paucity of pediatric risk models, we conducted an analysis of the tool’s potential to predict readmissions. We anticipated good predictive performance because the tool includes measures previously associated with readmission (eg, technology dependence, polypharmacy, and language barrier).4-7 If successful in discriminating readmission risk, this embedded discharge planning tool could also serve as a pediatric readmission risk score.
This work was conducted at the Children’s Hospital Colorado as part of a national QI collaborative. The hospital’s EHR is Epic (Verona, Wisconsin). The project was approved as QI by the Children’s Hospital Organizational Research Risk and Quality Improvement Review Panel, precluding review from the Colorado Multiple Institutional Review Board.
Tool Design, Implementation, and Use
A team of clinicians, nurse–family educators, case managers, social workers, and informatics experts helped design the instrument between 2014 and 2015. In addition to the selected features (number of discharge medications, presence of home health, and language preference), we considered adding the number of consulting specialists but had previously improved our process for scheduling follow-up appointments. Diagnoses were not documented systematically or discretely enough to be reliably extracted in real time. We excluded known readmission predictor variables (such as length of stay [LOS] and prior hospitalizations) from the initial model to maintain emphasis on modifiable discharge processes. Additional considerations, such as health literacy and social determinants of health, were not measured systematically enough to be operationally usable.
To generate the score, children are categorized as receiving home care based on clinical documentation of home-health orders. Each home-care equipment or service category is documented in a separate flowsheet row, allowing identification of distinct categories (Table). Total parenteral nutrition, intravenous medications, and durable medical equipment and supplies are counted as home care. The number of discharge medications is approximated by inpatient medication orders and finalized as the number of discharge medication orders; it includes new, historic, and as-needed medications (if included among discharge medication orders). Home oxygen is not counted as a medication. The preferred language of the family caregiver is recorded during patient registration or by the admitting inpatient nurse and is gleaned from either of these flowsheet sources.
The electronic score is displayed within the EHR’s Discharge Readiness Report8 and updates automatically as relevant data are entered. The tool displays the individual components as well as a composite score of 0-3 points. A patient accrues 1 point for each criterion met: (1) the dichotomous discharge-medication criterion (ie, ≥6 medications), (2) the dichotomous home-health order criterion (ie, ≥1 home-care order), and (3) documentation of a non-English-speaking caregiver. The tool serves as a visual reminder of discharge planning needs during daily coordination rounds attended by clinicians, nursing managers, case managers, and social workers. Case managers use the home-care alert to verify the accuracy of home-care orders.
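The scoring logic above can be sketched as follows. This is a minimal illustration of the 0-3 composite, not the institution's Epic build; the field names (`discharge_med_count`, `home_care_order_count`, `caregiver_language`) are hypothetical placeholders for the corresponding EHR data elements.

```python
def composite_score(discharge_med_count: int,
                    home_care_order_count: int,
                    caregiver_language: str) -> int:
    """Return the 0-3 discharge planning composite score.

    One point each for: >=6 discharge medications, >=1 home-care
    order, and a documented non-English-speaking caregiver.
    Field names are illustrative, not actual EHR identifiers.
    """
    score = 0
    if discharge_med_count >= 6:          # polypharmacy criterion
        score += 1
    if home_care_order_count >= 1:        # any home-health order
        score += 1
    if caregiver_language != "English":   # non-English-speaking caregiver
        score += 1
    return score
```

For example, a child with 7 discharge medications, no home-care orders, and an English-speaking caregiver would score 1 point.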
Evaluation of Predictive Utility for Readmission
We performed a retrospective cohort study of patients aged 0-21 years who were discharged between January 1, 2014, and December 30, 2015. The study was designed to determine optimal cut points for the continuous variables (discharge medications and home-care orders) and to evaluate the predictive value of the composite score.
Unplanned readmission within 30 days was the primary outcome. For patients with >1 admission, the index hospitalization was randomly selected to avoid biasing the results with multiple hospitalizations from individual patients.
Patient characteristics were summarized using percentages for categorical variables and the median and interquartile range (IQR) for continuous variables. We examined bivariate associations between each of the tool’s predictor elements and readmission using chi-square and Wilcoxon tests (level of statistical significance was set at 0.05). Receiver operating characteristic (ROC) analyses established optimal dichotomization points for medications and home-care orders by maximizing Youden’s index (sensitivity + specificity – 1).9 Dichotomization of these variables was selected for ease of implementation and interpretation. Similarly, the composite score was treated as a categorical predictor because a sensitivity analysis using it as a continuous predictor did not improve the tool’s discriminatory properties.
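The cut-point selection can be illustrated with a short sketch: for each candidate threshold on a count variable, compute sensitivity and specificity against the binary readmission outcome and keep the threshold with the largest Youden's J. This is a simplified stand-in for the ROC analysis described above (which was performed in SAS), with illustrative toy data.

```python
import numpy as np

def youden_cut_point(values, outcomes):
    """Find the threshold maximizing Youden's J = sensitivity + specificity - 1.

    values   -- count variable (e.g., number of discharge medications)
    outcomes -- binary outcome (1 = 30-day unplanned readmission)
    Patients at or above the cut are flagged as "positive."
    """
    values = np.asarray(values)
    outcomes = np.asarray(outcomes, dtype=bool)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        predicted = values >= cut
        sensitivity = np.mean(predicted[outcomes])     # true-positive rate
        specificity = np.mean(~predicted[~outcomes])   # true-negative rate
        j = sensitivity + specificity - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j
```

On perfectly separated toy data such as medication counts `[1, 2, 3, 7, 8, 9]` with outcomes `[0, 0, 0, 1, 1, 1]`, the routine selects a cut of 7 with J = 1.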
The area under the ROC curve (AUC) was estimated to evaluate the performance of the composite score (as a categorical variable) using a predictive logistic regression model. To establish internal validity, we performed 10-fold cross validation analysis.10 Measures of predictive performance included the c statistic and the Brier score. The c statistic represents the discriminative ability of a model. A model with AUC > 0.9 signifies high discrimination, 0.7-0.9 indicates moderate discrimination, and 0.5-0.7 suggests low discrimination. The Brier score is a measure of the overall accuracy of the predictions and ranges from 0 to 1, with 0 representing perfect accuracy.10 Analyses were performed using SAS v9.4.
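The two performance measures can be expressed concisely: the c statistic equals the probability that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted patient (ties counted half), and the Brier score is the mean squared difference between predicted probability and observed outcome. A minimal sketch, independent of the SAS implementation used in the study:

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared error of predicted probabilities (0 = perfect)."""
    probs = np.asarray(probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    return float(np.mean((probs - outcomes) ** 2))

def c_statistic(probs, outcomes):
    """Probability a readmitted patient outranks a non-readmitted one.

    Compares every readmitted/non-readmitted pair; ties count as 0.5.
    Equivalent to the area under the ROC curve.
    """
    probs = np.asarray(probs, dtype=float)
    outcomes = np.asarray(outcomes, dtype=bool)
    pos, neg = probs[outcomes], probs[~outcomes]
    diffs = pos[:, None] - neg[None, :]   # all pairwise comparisons
    return float((np.sum(diffs > 0) + 0.5 * np.sum(diffs == 0)) / diffs.size)
```

For predictions `[0.9, 0.8, 0.2, 0.1]` with outcomes `[1, 1, 0, 0]`, the c statistic is 1.0 (perfect ranking) and the Brier score is 0.025.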
Analysis was restricted to patients with at least 30 days of available follow-up time after the index admission (N = 29,542 patients; Figure). Patient characteristics from the index admission are shown in the Table. The median age was 5 years, and median LOS was 2 days. A total of 19% of patients were discharged with ≥6 discharge medications, 15% received ≥1 home-health orders, and 10% were documented to have a non-English speaking family caregiver. Almost 28% had a composite score of 1 and 8% a score ≥2. The unplanned 30-day readmission rate was 4%. In bivariate analysis, children with readmission had longer LOS, more discharge medications, and more home care than children without readmission. Caregiver language preference was not associated with readmission.
ROC analysis indicated that dichotomizing the number of medications at ≥6 vs. <6 and home health at 0 vs. ≥1 maximized the sensitivity and specificity for predicting 30-day unplanned readmissions. In the predictive logistic regression analysis, the odds of readmission were significantly higher in children with a composite score of 1 vs. 0 (odds ratio [OR], 1.7; 95% CI, 1.5-2.0) and a score of ≥2 vs. 0 (OR, 4.2; 95% CI, 3.6-4.9). The c statistic for this model was 0.62, and the Brier score was 0.037. Internal validation of the predictive logistic regression model yielded identical results.
Since implementation, we have not audited the frequency of the tool’s use or whether its use changes care. Such feedback would be a first step in demonstrating its utility in QI. Our organization is undertaking an overhaul of the discharge process to reduce unnecessary discharge delays, with plans to actively incorporate the instrument into workflows. For example, the tool can be used to prompt more timely and complete translation and interpretation, verify home-care order accuracy, and flag appropriate families for early and optimal teaching in the use of home-care equipment. The medication component can be used to improve safe discharge practices for children with polypharmacy (eg, as a reminder to fill prescriptions and review the accuracy of medication lists prior to discharge). These interventions may have a greater impact on discharge efficiency and family-reported measures of satisfaction or discharge readiness than on readmission outcomes.
Despite its potential benefit in QI, this instrument will need further validation to ensure that it captures the factors it is intended to capture. About 75% of patients were missing home-care values in the original data and were assumed to have not received home health. While there were no missing data for medication number or language, we did not assess the accuracy of these variables. Patients may therefore have been misclassified because of clinical documentation omissions or errors.
The instrument’s framework is relatively simple and should reduce barriers to implementation elsewhere. However, this tool was developed for one setting, and the design may require adjustment for other environments. Regional or institutional variation in home-health eligibility or clinical documentation may affect home-care and medication scores. The score may change at discharge if home-health or medication orders are modified late. The tool does not measure regimen complexity, identify high-risk medications, distinguish new from preexisting medications or home care, or measure health literacy, parent education, or psychosocial risk. Adding these features might enhance the model. Finally, readmission rates did not rise linearly with each added point. A more sophisticated scoring system (eg, differentially weighting each risk factor) may also improve the performance of the tool.
Despite these limitations, we have implemented a real-time electronic tool with practical potential to improve the discharge process but with low utility for discriminating readmissions. Additional validation and research are needed to evaluate its impact on hospital discharge quality metrics and family-reported outcome measures.
The authors have no relevant financial relationships to disclose.
This study was supported by an institutional Clinical and Operational Effectiveness and Patient Safety Small Grants Program.