The Quality of Electronic Discharge Summaries for Post-Discharge Care - Hospital Panel Assessment and IT to Support Improvement

Wednesday, December 1st, 2010
Mehnaz Adnan, Jim Warren, Martin Orr
University of Auckland

Andrew Ewens, John Scott, Shona Trubshaw
Waitemata District Health Board


A Discharge Summary (DS) is an important tool for communicating the post-discharge framework of care between hospital physicians and primary care, as well as to patients and their families. This paper assesses the quality of information in Electronic Discharge Summaries (EDSs) with respect to adequacy and specificity for optimal post-discharge care of the patient. Forty EDSs were randomly selected from patients discharged from two secondary care hospitals affiliated with Waitemata District Health Board, Auckland, and assessed by questionnaire with an expert panel composed of two medical specialists and a clinical coder. The data were analysed using descriptive statistics and thematic analysis. Irrelevant data in laboratory results, and insufficient management and follow-up information for patients and primary care physicians, were identified as the most common quality issues. We conclude by outlining an ongoing programme of work to support remediation of these quality issues using Information Technology (IT), with a focus on improving the patient advice section of the EDS through decision support for EDS authors and embedding of readability support in the EDS documents themselves.

1. Introduction
A Discharge Summary (DS) is an important multipurpose clinical report, primarily used to communicate the post-discharge framework of care between hospital physicians and primary care [1-3] as well as to patients and their families [2,4]. In addition, clinical coders rely on these summaries in the process of clinical classification [5,6]. A DS is considered high quality when it is short, delivered quickly, and contains the clinical information considered important for adequate follow-up care [2,3,7-9]. This information includes the patient's chief complaint, diagnostic findings, a summary of clinical management, advice on ongoing management of the clinical condition, appropriate use of medications, relevant laboratory results and a follow-up plan.

Traditionally, paper DSs have been completed manually by a health professional at the completion of an inpatient event of care. With advances in technology, paper DSs have given way to Electronic Discharge Summaries (EDSs) in most developed countries, including New Zealand. EDS systems were found to provide timely [10,11] and more legible summaries [10,12,13] that simplified work practices [10,14]. However, several studies have reported quality issues in EDSs [15-18], with deficiencies in summary content [15-18] and accuracy [15-17]. In a recent New Zealand study [15], General Practitioners (GPs) raised concerns about inaccurate and suboptimal follow-up information in EDSs. In one study, Kazmi [17] identified the problems of incorrect consultant names and missing follow-up appointments, discharge dates and diagnoses in EDSs. Similarly, Callen et al. [16] reported omitted discharge dates, secondary diagnoses and follow-up instructions to GPs. Were et al. [18] reported missing information about pending test results and follow-up providers in EDSs.

Although New Zealand is among the leading developed countries in its use of health information technology [19], and there are ongoing efforts at Waitemata District Health Board (WDHB), Auckland, to improve the quality of EDSs, there remains a need for ongoing quality improvement. We have embarked on a project to produce more readable EDS content for patients (i.e., health consumers) through interactive computer-based support both in the authoring and in the reading of EDSs. As a first phase of this research we analysed the text characteristics of EDS content and found that the advice to patient section is brief and does not tend to lengthen in proportion to the clinical management section [20]. The present study, the second stage of the research, presents a panel assessment of EDS information with respect to its quality for supporting post-discharge care. We conclude by providing an overview of our IT-based remediation plan, focused on addressing the issues in information for patients identified in the text analysis and panel assessment phases.

2. Methods
2.1.    Study Sample and Design
We randomly collected 200 printouts of EDSs for patients discharged between August 2007 and July 2008, 50 each from medicine, surgery, emergency and older adult health services in two secondary care hospitals (North Shore and Waitakere hospitals) managed by WDHB. The documents were anonymised with regard to patient and author identity before data collection. In a previous study we analysed the text characteristics of these 200 EDSs [20]. For the present study, we included 40 randomly selected EDSs. Random selection was achieved by manually shuffling the hard copies and then taking the first ten from each specialty.

A panel was convened consisting of two senior hospital specialists, one each from Emergency Services and Older Adult Services, and one experienced specialist clinical coder from the Medical Records department (hereafter Specialist 1, Specialist 2 and Clinical Coder, respectively). The panel gave input on needs assessment and helped to create a questionnaire (see appendix) to be applied to all 40 selected EDSs. The questionnaire was designed to measure the quality and relevancy of the written information in the items identified by previous studies as important for continuing management of the patient [1,2,21]. Each panel member examined the printed copies of the EDSs individually. Before the study began, a pilot on one EDS was carried out, in which the expert panel examined, discussed and tested the evaluation protocol thoroughly in a meeting.

2.2.    Assessment of Quality
All summaries were evaluated for quality of information using yes/no options in the questions for the following key items: diagnosis, relevant medical problems, follow-up, advice to GP, relevant results, major interventions, discharge destination, referral to other services and advice to patient. The questions asked the specialists whether the information contained in each field was optimal to support ongoing management of the patient in primary care; for the advice to patient, specialists were asked whether the written information was adequate for patients to manage their post-discharge self-care. 'Yes' was marked for satisfactory and 'No' for unsatisfactory written information in the respective sections. Not applicable (NA) was entered during the data analysis process when the respondent wrote 'n/a' and/or the section was not present in the EDS. Generally 'n/a' would be written when the section was not present; we did not verify whether a specialist entered 'n/a' for any non-blank section. These questions also provided space for free-text comments, to be used if a panel member found the information quality to be sub-optimal. The aim was to identify the origin of, and reasons for, possible deficiencies in the EDSs.

2.3.    Assessment of Relevancy
To assess the relevancy of the EDS text, the panel members were asked to code the reports using orange, green and blue highlighters: orange for 'most important' content, which should be available as first-level information; green for 'important' content, which should be accessible as a second level of information (e.g., via a hyperlink in an online document); and blue for 'least important' content, which would be better excluded from the EDS report (and is thus described as 'irrelevant'). The aim of this colour coding was to capture the perceived relative importance of the content in each EDS section for the purpose of ongoing patient management in general practice, and to provide insight for future EDS enhancements such as online hypertext.

2.4.    Statistical Analysis
Data were stored and analysed using Microsoft Excel 2007. Forced-choice questions and word counts of the coloured coding were analysed via descriptive statistics. We investigated agreement among the medical specialists for the forced-choice questions by calculating total perfect agreement and partial agreement. Differences between coloured codings were tested for significance using the chi-squared test. Open-ended comments about quality-of-information issues in the significantly deficient sections were combined to create one note per question and analysed using data-driven thematic content coding [22]. The first author segmented the free-text comments into sentences or clauses, with each segment representing one idea. This coder then read the comments, assigning each a descriptive label; whenever possible, the label was selected from those already existing, otherwise a new label was created. The labels were then arranged into a two-level hierarchical scheme with four branches: information quality, information missing, ease of access and language. For example, 'information quality' has a sub-hierarchy of incorrect information, insufficient information, irrelevant information, no useful information and incomplete information. This study was approved by the University of Auckland Human Participants Ethics Committee under protocol number 2008/221 and by the WDHB Knowledge Centre under project code RM0980710342.
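The agreement tally can be illustrated with a short sketch (in Python, rather than the Excel analysis actually used). The response lists are hypothetical, and the reading of 'partial agreement' as one member answering while the other marks NA is an assumption made for illustration only:

```python
# Illustrative agreement tally for one questionnaire item across EDSs.
# Responses are "yes", "no", or "NA"; the lists below are hypothetical.
def tally_agreement(responses_a, responses_b):
    perfect = partial = 0
    for a, b in zip(responses_a, responses_b):
        if a == "NA" or b == "NA":
            # Assumed reading of "partial agreement": exactly one
            # panel member answered while the other marked NA.
            if (a == "NA") != (b == "NA"):
                partial += 1
        elif a == b:
            # Perfect agreement: both gave the same yes/no answer.
            perfect += 1
    return perfect, partial

specialist1 = ["yes", "no", "no", "NA", "yes", "no"]
specialist2 = ["yes", "no", "yes", "no", "yes", "NA"]

perfect, partial = tally_agreement(specialist1, specialist2)
print(perfect, partial)  # → 3 2
```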

3. Findings
3.1.    Assessment of Quality
The agreement between medical specialists and clinical coder response for the forced choice yes/no questions is shown in Table 1.

The table shows that the medical specialists agreed in finding 45% of relevant results sections (18 out of 40), 42% of advice to patient, 30% of advice to GP and 20% of follow-up sections to be deficient in quality, while the clinical coder found the majority of the advice to patient, advice to GP and follow-up sections not applicable for coding purposes.

The results of thematic coding focusing on the quality issues of the four sections with the most deficiencies, as agreed by both Specialists, are shown in Table 2. N is the total number of codes assigned for every segment of the comment (see section 2.4). Forty-nine comments were made about quality issues for the information written in relevant results sections. Among 43 codes in the information quality category, the most frequent issue was irrelevant information e.g. “redundant details”. In the missing information category, 3 were “missing interpretation” and 3 noted that important results are missing. In the ease of access category, 7 comments noted that the information is hard to access and 6 noted that the information is written in another section; one comment was “poor formatting”.

Sixty-one comments noted information quality issues in advice to patient sections. Among 21 comments in the information quality category, 10 pointed out that no useful information or synopsis is given, 10 noted that advice is incomplete and one noted that the advice to patient is very brief. Eleven comments pointed out that information is missing (e.g. 8 times the comment was “missing advice” and 3 times the comment was “no advice”). Among the 8 codes in the language category, 4 pointed out the use of abbreviations, 3 noted medical jargon and 1 noted a spelling mistake.

Table 1 - Assessment of Quality by Discharge Summary Section


Table 2 - Comments Regarding Deficiencies by Category per Section

Of the 25 labels in the information quality category for the GP advice section, 10 each pointed out the information to be insufficient or not useful (e.g. "no useful information/synopsis"), while 3 noted that the goals were unclear, 1 noted incorrect information and 1 noted incomplete advice. Among the 7 labels in the missing information category, 5 noted that management advice was missing and 2 noted that the field was not present in the EDS. For ease of access, both comments noted that the information was written in the clinical management section.

For the information quality category in the follow-up section, it was noted 8 times that follow-up advice was incomplete, while 2 codes pointed out that no useful information was written in this section. In the missing information category, 4 comments pointed out that important information was missing, e.g. "clinic not described". In the information access category, all 11 comments pointed out that follow-up information was written in the clinical management section.

3.2.    Assessment of Relevancy
The proportions for each level of importance in the EDS documents as perceived by each panel member are shown in Figure 1 (un-coded data are not shown, so the percentages do not sum to 100%). The figure shows that there was no consensus between the specialists on the relevancy of the EDS content overall: one specialist marked 69% of the content as most important while the other found only 25% of the information to be most important, and the clinical coder marked only 24% of the information as most important for the purpose of coding. The percentages of content marked as least important by the two specialists and the clinical coder were 9%, 34% and 63%, respectively. The majority of the text marked as least important (i.e., better excluded) came from the laboratory results section, with laboratory results text the most likely to be marked as least important (P<0.01).


Figure 1 - Proportion of Importance per EDS Section (percentages labelled for key findings)

4. EDS Quality Issues and IT-based Remediation Plan
The present assessment found deficiencies in the quality of information, notably in the sections relating to ongoing management advice for patients and primary care physicians. This complements our previous text analysis of EDSs [20], in which we found that the advice to patient section has fewer abbreviations and a simpler clinical vocabulary but is a very brief component of the total document and hence does not provide comprehensive information for patients. Other sections of the EDS, which are written primarily for a clinical audience, provide comprehensive information about in-hospital and post-discharge care, but contain clinical jargon and abbreviations and hence impose a comprehension barrier for a lay person.

To provide better advice to patients and to improve the readability of EDSs for consumers we are working on an IT-based remediation plan as shown in Figure 2.


Figure 2 - IT-based Remediation Plan for Patient Support


This plan has two major components:

  1. To provide writing support for EDS authors, we implemented a prototype interactive Clinical Decision Support System (CDSS) that gives medication advice recommendations at the point of authoring patient advice. The system offers the EDS author the option to include pre-formulated 'auto text' patient advice specific to a given medication for a set of identified high-risk discharge medications. In addition, the system provides a critique alert if any of the recommended advice is missing from the patient advice section [23].
  2. To provide readability support for consumers, we demonstrated [24] the use of semantic annotations to provide synonyms and automatic hyperlinking to online resources for difficult terms, as a means of making the content more comprehensible for patients in a web interface where consumers can view their EDSs online. This readability support has also been embedded in EDSs in portable document format (PDF), which can be easily transmitted to health care providers and consumers.
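The critique behaviour of the writing support component can be sketched as follows; the medication names and advice texts are hypothetical placeholders, not the actual content of the prototype CDSS [23]:

```python
# Hypothetical mapping of high-risk discharge medications to
# pre-formulated patient advice 'auto text' (illustrative content only).
ADVICE_AUTOTEXT = {
    "warfarin": "Attend regular blood tests (INR) and report any unusual bleeding.",
    "prednisone": "Do not stop this medicine suddenly; follow the tapering plan.",
}

def critique(discharge_medications, patient_advice_text):
    """Return alerts for high-risk medications whose recommended
    advice is absent from the patient advice section."""
    alerts = []
    advice_lower = patient_advice_text.lower()
    for med in discharge_medications:
        autotext = ADVICE_AUTOTEXT.get(med.lower())
        if autotext and autotext.lower() not in advice_lower:
            alerts.append(f"Missing advice for {med}: {autotext}")
    return alerts

alerts = critique(["Warfarin", "Paracetamol"], "Take medicines as charted.")
# One alert is raised: Warfarin is high risk and its advice is absent.
```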

Our principal aim for the IT-based remediation plan is to improve the EDS content for patients. Another major area is to improve the relevance of the 'relevant results' section of the EDSs. Our assessment of content relevance suggests that laboratory results should be included more selectively, to spare the reader from information overload. Our EDS writing support approach has the potential to be applied to the selective inclusion of information in the relevant results section as well. A sensible default behaviour may be to include all abnormal results automatically and then give the EDS author the opportunity to include any further results they believe are important.
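A minimal sketch of that default behaviour, assuming each result carries a numeric value and a reference range (the test names and ranges here are illustrative, not a clinical specification):

```python
# Each result: (test name, value, lower reference limit, upper reference limit).
# Values and ranges below are illustrative placeholders.
results = [
    ("Sodium", 140, 135, 145),
    ("Potassium", 5.9, 3.5, 5.2),   # abnormal: above range
    ("Haemoglobin", 95, 130, 175),  # abnormal: below range
]

def default_inclusion(results):
    """Auto-select out-of-range results; the EDS author may then add more."""
    return [name for name, value, low, high in results
            if not (low <= value <= high)]

print(default_inclusion(results))  # → ['Potassium', 'Haemoglobin']
```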

5. Discussion
In this study from two secondary care hospitals we found that the written information in EDSs was frequently of inadequate quality in all assessed fields, with laboratory results, advice to patient, GP advice and follow-up the most deficient. The most dominant reason for these deficiencies was the inclusion of large amounts of irrelevant information in the laboratory results, which comprised 44% of the overall data. Nonetheless, many EDSs provided incomplete or less-than-comprehensive management and follow-up advice to patients and GPs. The lack of comprehensive advice and complete follow-up information for patients is of particular importance because patients are often discharged with problems that, while improving, are not yet resolved, and are therefore more vulnerable to adverse events during the hospital-to-home transition [25]. Our findings agree with other studies that found missing and suboptimal information for patients and families [18,26], as well as with respect to the use of abbreviations and clinical terminology in discharge instructions [20]. Suboptimal information has been shown to lead to patients' dissatisfaction with discharge instructions and noncompliance [4,27], while the use of abbreviations and medical jargon has been found to impose comprehension barriers in other studies [4,28,29]. Allied to patient advice is the need for GP tasks to be clear and focused on the follow-up goals of patients. It is a well recognised problem that DSs are inadequate in communicating valuable patient management information [3,15,30,31] to general practice. Since the DS is the primary tool for transferring information, incomplete and inadequate information for primary care physicians in these summaries may negatively affect continuity of care and contribute to adverse events.

This indicates a need to improve the written information while focusing on the needs of patients and primary care physicians. The advice to patient should be in lay language and cover aspects of the treatment plan and self-management (e.g., potential medication side effects), while GP tasks should focus on goals in the follow-up management of the patient. As a first step, there should be greater emphasis in the training of EDS authors on the importance of conveying the required information for the patient and GP audiences who will need to interpret the document for the patient's ongoing care. To complement and support cultural changes in the style of communication in the EDS, we are working on a technology-driven remediation plan with two components, writing and reading support, to improve the quality of the EDS for patients' ongoing self-care. The writing support component [23] is implemented as a CDSS that gives EDS writers interactive feedback, based on the medications mentioned, to provide specific instructions for self-care. The reading support component is an interpretative layer in the EDS document itself, in the form of consumer-friendly names and hyperlinks to specific web resources as demonstrated in [24], to make EDS content more comprehensible for consumers.
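The reading support layer can be illustrated with a simple annotation pass over the EDS text; the glossary entry, synonym and URL below are placeholders, not the semantic annotations demonstrated in [24]:

```python
import re

# Hypothetical glossary: medical term -> (consumer-friendly synonym, resource URL).
GLOSSARY = {
    "myocardial infarction": ("heart attack", "https://example.org/heart-attack"),
}

def annotate(text):
    """Wrap known difficult terms in HTML links whose tooltip shows
    a consumer-friendly synonym."""
    for term, (synonym, url) in GLOSSARY.items():
        pattern = re.compile(re.escape(term), re.IGNORECASE)
        text = pattern.sub(
            lambda m: f'<a href="{url}" title="{synonym}">{m.group(0)}</a>',
            text,
        )
    return text

html = annotate("Admitted with myocardial infarction.")
# The term is preserved but gains a link and a lay-language tooltip.
```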

Our study has several limitations. The most significant is its small scale in terms of numbers of patients and doctors. Although the amount of EDS data is small, the level of agreement in the patient ongoing management sections (patient advice, GP advice and follow-up) is sufficient to support discussion of the sub-optimal information quality in these sections. In addition, the panel is made up of professionals with different backgrounds, each having different subjective needs; the two specialists exhibited substantial individual differences in their assessment of EDS content relevancy, which may be influenced by their individual perspectives and values. Furthermore, the study covers the perceptions of hospital physicians and a clinical coder rather than the GPs and patients who are the primary audience of EDSs. It complements a recent study [15] of primary care physicians in WDHB which reported GPs' views on EDSs. A further study would be helpful in seeking patients' feedback on their discharge information. Moreover, there is an opportunity to promote more selective inclusion of laboratory results in the EDS, possibly using a relevance critique strategy similar to the one we have developed for authoring support for medication information in the patient advice section.

This study, along with other similar studies [15-17,30], demonstrates the importance of improving the quality of EDSs to better serve their purpose of supporting a seamless transition at the primary-to-secondary care interface. We also believe that in order to reap the benefits of EDSs the content should be tailored to consumers’ information needs to empower patients in their post-discharge self care. We believe IT has the potential to play a significant supporting role in improving the quality of the EDS for ongoing care.

6. Acknowledgements
The authors gratefully acknowledge the help of the Waitemata District Health Board (WDHB) staff especially Jo-Anne Benjamin and Zina Ayar in making available the de-identified Discharge Summary data essential to this research. This work was supported by a Higher Education Commission, Pakistan scholarship.

7. References
[1] Newton J, Eccles M, Hutchinson A. Communication between general practitioners and consultants: what should their letters contain? British Medical Journal. 1992;304:4.
[2] van Walraven C. What is Necessary for High-Quality Discharge Summaries? American Journal of Medical Quality. 1999;14(4):10.
[3] Kripalani S, LeFevre F, Phillips CO, Williams MV, Basaviah P, Baker DW. Deficits in communication and information transfer between hospital-based and primary care physicians: implications for patient safety and continuity of care. The Journal of American Medical Association. 2007 Feb 28;297(8):831-41.
[4] Engel KG, Heisler M, Smith DM, Robinson CH, Forman JH, Ubel PA. Patient Comprehension of Emergency Department Care and Instructions: Are Patients Aware of When They Do Not Understand? Annals of Emergency Medicine. 2008(10).
[5] Campbell SE, Campbell MK, Grimshaw JM, Walker AE. A systematic review of discharge coding accuracy. J Public Health Med. 2001 Sep;23(3):205-11.
[6] Kirkman MA, Albert AF. Accuracy of hospital discharge coding is important in assessing trends in stroke rates. Swiss Med Wkly. 2009 Aug 8;139(31-32):463.
[7] Bull MJ, Roberts J. Components of a proper hospital discharge for elders. Journal of Advanced Nursing. 2001;35:571-81.
[8] Rao P, Andrei A, Fried A, Gonzalez D, Shine D. Assessing quality and efficiency of discharge summaries. American Journal of Medical Quality. 2005 Nov-Dec;20(6):337-43.
[9] Hall C, Bjorner T, Martinsen H, Stavem K, Weberg R. [The good discharge summary--criteria and evaluation]. Tidsskr Nor Laegeforen. 2007 Apr 19;127(8):1049-52.
[10] Archbold RA, Laji K, Suliman A, Ranjadayalan K, Hemingway H, Timmis AD. Evaluation of a computer-generated discharge summary for patients with acute coronary syndromes. Br J Gen Pract. 1998 Apr;48(429):1163-4.
[11] O'Leary KJ, Liebovitz DM, Feinglass J, Liss DT, Baker DW. Outpatient physicians' satisfaction with discharge summaries and perceived need for an electronic discharge summary. J Hosp Med. 2006 Sep;1(5):317-20.
[12] Bolton P. A quality assurance activity to improve discharge communication with general practice. J Qual Clin Pract. 2001 Sep;21(3):69-70.
[13] Turner S, Birrell G. What do general practitioners and paediatricians want from discharge letters? Ambulatory Child Health. 2000;6:5.
[14] Craig J, Callen J, Marks A, Saddik B, Bramley M. Electronic discharge summaries: the current state of play. Health Inf Manage J. 2007;36(3):30-6.
[15] Hopcroft D, Calveley J. What primary care wants from hospital electronic discharge summaries – a North/West Auckland perspective. NZ Family Physician. April 2008;35(2).
[16] Callen JL, Alderton M, McIntosh J. Evaluation of electronic discharge summaries: a comparison of documentation in electronic and handwritten discharge summaries. International Journal of Medical Informatics. 2008 Sep;77(9):613-20.
[17] Kazmi SMB. Quality of Electronic Discharge Summaries at Newham University Hospital: An Audit. British Journal of Medical Practitioners. 2008;1(1):3.
[18] Were MC, Li X, Kesterson J, Cadwallader J, Asirwa C, Khan B, et al. Adequacy of hospital discharge summaries in documenting tests with pending results and outpatient follow-up providers. Journal of General Internal Medicine. 2009 Sep;24(9):1002-6.
[19] Davis K, Doty MM, Shea K, Stremikis K. Health information technology and physician perceptions of quality of care and satisfaction. Health Policy. 2009 May;90(2-3):239-46.
[20] Adnan M, Warren J, Orr M, editors. Assessing Text Characteristics of Electronic Discharge Summaries and their Implications for Patient Readability. Second Australasian Workshop on Health Data and Knowledge Management 2010.
[21] Solomon JK, Maxwell RB, Hopkins AP. Content of a discharge summary from a medical ward: views of general practitioners and hospital doctors. J R Coll Physicians Lond. 1995 Jul-Aug;29(4):307-10.
[22] Krippendorff K. Content analysis: an introduction to its methodology. Sage Publications; 2004.
[23] Adnan M, Warren J, Orr M, editors. Ontology Based Semantic Recommendations for Discharge Summary Medication Information for Patients. 23rd IEEE Symposium on Computer-Based Medical Systems (CBMS 2010); 2010; Perth, Australia.
[24] Adnan M, Warren J, Orr M. Enhancing Patient Readability of Discharge Summaries with Automatically Generated Hyperlinks. Health Care and Informatics Review Online. December 2009.
[25] Forster AJ, Murff HJ, Peterson JF, Gandhi TK, Bates DW. The incidence and severity of adverse events affecting patients after discharge from the hospital. Ann Intern Med. 2003 Feb 4;138(3):161-7.
[26] Thomas EJ, Burstin HR, O'Neil AC, Orav EJ, Brennan TA. Patient noncompliance with medical advice after the emergency department visit. Annals of Emergency Medicine. 1996 Jan;27(1):49-55.
[27] Clarke C, Friedman S, Shi K, Arenovich A, Culligan C. Emergency department discharge instructions comprehension and compliance study. Canadian Journal of Emergency Medicine. Jan 2005;7(1):7.
[28] Maloney LR, Weiss ME. Patients' perceptions of hospital discharge informational content. Clin Nurs Res. 2008 Aug;17(3):200-19.
[29] Makaryus AN, Friedman EA. Patients' understanding of their treatment plans and diagnosis at discharge. Mayo Clin Proc. 2005 Aug;80(8):991-4.
[30] Raval AN, Marchiori GE, Arnold JMO. Improving the continuity of care following discharge of patients hospitalized with heart failure: is the discharge summary adequate? Can J Cardiol. 2003 Mar 31;19(4):365-70.
[31] Alderton M, Callen J. Are general practitioners satisfied with electronic discharge summaries? Health Inf Manage J. 2007;36(1):7-12.