
Comparison of methods for detecting medication errors in 36 hospitals and skilled-nursing facilities

ELIZABETH A. FLYNN, KENNETH N. BARKER, GINETTE A. PEPPER, DAVID W. BATES, AND ROBERT L. MIKEAL

The Institute of Medicine (IOM) report on the quality of patient care entitled "To Err Is Human" has drawn national attention to the occurrence, clinical consequences, and cost of adverse drug events in hospitals, which is estimated at $2 billion annually in the United States. Leape et al. studied adverse drug events by reviewing solicited self-reports and conducting daily chart reviews and found that 44% of these events occurred after the prescription order was written (i.e., during the medication delivery and administration processes). Drug therapy cannot be successful unless, and until, both the prescribing and medication delivery processes are conducted correctly. In this report, we focused on the errors that occur after physicians' medication orders are written.

Current estimates of the frequency of medication errors are undoubtedly low because many errors are undocumented and unreported.

Abstract: The validity and cost-effectiveness of three methods for detecting medication errors were examined.

A stratified random sample of 36 hospitals and skilled-nursing facilities in Colorado and Georgia was selected. Medication administration errors were detected by registered nurses (R.N.s), licensed practical nurses (L.P.N.s), and pharmacy technicians from these facilities using three methods: incident report review, chart review, and direct observation. Each dose evaluated was compared with the prescriber's order. Deviations were considered errors. Efficiency was measured by the time spent evaluating each dose. A pharmacist performed an independent determination of errors to assess the accuracy of each data collector. Clinical significance was judged by a panel of physicians.

Observers detected 300 of 457 pharmacist-confirmed errors made on 2556 doses (11.7% error rate), compared with 17 errors detected by chart reviewers (0.7% error rate) and 1 error detected by incident report review (0.04% error rate). All errors detected involved the same 2556 doses. All chart reviewers and 7 of 10 observers achieved at least good comparability with the pharmacist's results. The mean cost of error detection per dose was $4.82 for direct observation and $0.63 for chart review. The technician was the least expensive observer at $2.87 per dose evaluated. R.N.s were the least expensive chart reviewers at $0.50 per dose. Of 457 errors, 35 (8%) were deemed potentially clinically significant; 71% of these were detected by direct observation.

Direct observation was more efficient and accurate than reviewing charts and incident reports in detecting medication errors. Pharmacy technicians were more efficient and accurate than R.N.s and L.P.N.s in collecting data about medication errors.

Index terms: Charts; Costs; Economics; Errors, medication; Hospitals; Methodology; Nurses; Personnel, pharmacy; Pharmacists; Pharmacy, institutional, hospital; Quality assurance; Skilled nursing facilities
Am J Health-Syst Pharm. 2002; 59:436-46

ELIZABETH A. FLYNN, PH.D., is Associate Research Professor, Center for Pharmacy Operations and Designs, Department of Pharmacy Care Systems, Harrison School of Pharmacy, Auburn University (AU), Auburn, AL. KENNETH N. BARKER, PH.D., is Sterling Professor and Director, Center for Pharmacy Operations and Designs, Harrison School of Pharmacy, AU. GINETTE A. PEPPER, PH.D., is Associate Professor, School of Nursing, University of Colorado Health Sciences Center, Denver. DAVID W. BATES, M.D., M.SC., is Associate Professor, Harvard School of Medicine, Boston, MA; Chief, Division of General Internal Medicine and Primary Care, Brigham and Women's Hospital, Boston; and Director, Clinical and Quality Analysis, Center for Applied Medical Information Systems, Partners Healthcare, Boston. ROBERT L. MIKEAL, PH.D., is President, DACE Company, West Monroe, LA.

Address correspondence to Dr. Flynn at 128 Miller Hall, Auburn University, AL 36849-5506.

The contributions of Robert M. Cisneros, Helen Deere-Powell, Loriann E. DeMartini, Samuel W. Kidder, Lucian L. Leape, G. Neil Libby, Robert E. Pearson, Linda A. Pfaff, and Rick Shannon are acknowledged.

Supported by grant 500-96-P605 from the Alabama Quality Assurance Foundation.

Presented at the ASHP Midyear Clinical Meeting, Las Vegas, NV, December 7, 2000.

Copyright © 2002, American Society of Health-System Pharmacists, Inc. All rights reserved. 1079-2082/02/0301-0436$06.00



Errors resulting in serious harm are reported because they are easy to identify and hard to conceal, yet they represent the "tip of the iceberg." Reported errors are a small subset of the totality of errors that signal major system breakdowns with possible grave consequences for patients.

The IOM report devotes an entire chapter to error-reporting systems, recognizing the self-report method as "the way to learn from errors," yet it says nothing about the use of other methods. However, other error detection methods, such as observation, have a long history of use, for example in the evaluation studies that established the effectiveness of the unit-dose concept and in the evaluation of expensive automated medication systems. Incident report review and chart review have also been used extensively. In the early 1980s, the Health Care Financing Administration (HCFA, now the Centers for Medicare & Medicaid Services [CMS]) evaluated and adopted a technique involving the observation by HCFA surveyors of selected nurses as they administered medications. The results were considered an outcome indicator of the quality of the drug delivery system. Since then, HCFA has required all 50 states to conduct annual surveys of medication errors in nursing homes and nonaccredited hospitals. The regulation defines a standard for medication error rates (5%) that must not be exceeded.

The goal of this study was to identify the most efficient and practical error detection method that produces valid and reliable results. A proposal in 1997 called for the extension of HCFA responsibilities for evaluating the quality of medication administration to include all hospitals. Should this occur, knowledge of the accuracy and efficiency of various types of methods used by different types of data collectors (e.g., registered nurses [R.N.s], licensed practical nurses [L.P.N.s], and pharmacy technicians) would be critical to the success of expanding the scope of CMS's responsibility. This would also provide a quality assurance mechanism for medication administration in all hospitals. For such widespread application, the data collection method would need to be robust (i.e., not reliant on research-educated personnel). Thus, a measure of the accuracy achievable under such circumstances was essential. Studies have compared the effectiveness of incident report review with direct observation in detecting medication errors, but a direct comparison of doses evaluated by both direct observation and chart review had not been performed.

Barker and McConnell compared incident report review and voluntary report review with direct observation in 1962 and projected that the errors observed represented 1422 times the number identified by incident report review. Shannon and De Muth compared a combination of chart and incident report reviews with direct observation in 30 long-term-care facilities. Direct observation detected a mean medication-error rate of 9.6%, while document review yielded an error rate of 0.2%. Other investigators have compared the number of incident reports filed with the number of errors detected by observation and found similar results. One study considered using errors found by incident report review to compare the quality of drug distribution system errors in hospitals but concluded that such comparisons would not be meaningful. However, incident report review continues to be used widely in research, risk management, and quality improvement.

Cost comparisons of direct observation and incident reports have been conducted. Shannon and De Muth found that the time required per patient to directly observe 20 patients was significantly less than the time required per patient to review incident reports. Brown calculated that completing and analyzing one incident report in 1979 cost $6.71.

The objective of this study was to compare the ability of three error detection methods to detect the same medication errors and the incurred cost when employed by three types of data collectors: R.N.s, L.P.N.s, and pharmacy technicians.

Methods

The literature was reviewed to find the best methods for detecting medication administration errors in U.S. hospitals and skilled-nursing facilities, comparing the methods' validity and cost-efficiency. Twelve methods were considered for inclusion in the present study:

1. Directly observing medication administration
2. Reviewing patients' charts
3. Reviewing incident reports involving medication errors
4. Attending medical rounds to listen for clues that an error has occurred
5. Interviewing health care personnel to stimulate self-report
6. Analyzing doses returned to the pharmacy
7. Testing urine for evidence of omitted drugs and unauthorized drug administration
8. Examining death certificates
9. Attending nursing change-of-shift report
10. Comparing the medication administration record (MAR) with physicians' orders
11. Performing computerized analysis to identify patients receiving target drugs that may be used to treat a medication error, or searching for serum drug concentration orders that may indicate an overdose
12. Comparing drugs removed from an automated drug-dispensing device for patients with physicians' orders

A panel of six experts on medication errors considered these methods and recommended that incident report review, chart review, and direct observation be compared for their ability to detect medication administration errors. (The inclusion of prescribing errors was discussed and tried but found too difficult for the data collectors recruited in the time available.)

The accuracy of the three error detection methods was compared by using them to evaluate an identical sample of doses called the "comparison doses." The standard for comparison comprised the direct-observation results from all data collectors after the accuracy of each dose (in error or not in error) was confirmed independently by a research pharmacist. The efficiency and cost of each method and data collector type were determined by comparing the time required to examine each dose. The sample of doses studied was drawn from the medications administered or omitted in the stratified random sample of 36 facilities in Georgia and Colorado.

R.N.s, L.P.N.s, and pharmacy technicians were compared as data collectors. Pharmacists have been compared with R.N.s previously and were found to be capable of obtaining results that are not significantly different. The effectiveness and efficiency of L.P.N.s and technicians as observers, chart reviewers, and incident report reviewers have not been compared with pharmacists' abilities.

Sample. Suspecting that differences in the medication-use process might exist between the East and West, and between hospitals accredited by the Joint Commission on Accreditation of Healthcare Organizations, nonaccredited hospitals, and skilled-nursing facilities that might influence the findings, we decided to use a stratified sample. A stratified random sample of 36 facilities was drawn from the Atlanta, Georgia, metropolitan statistical area (MSA) and the Denver-Boulder-Greeley, Colorado, consolidated MSA. Lists of Medicare- and Medicaid-certified hospitals and skilled-nursing facilities located in each MSA were provided by HCFA. Six primary and 18 back-up facilities were randomly selected for each of the 3 facility types in each state. Letters soliciting participation were mailed to the administrator. The first 18 sites in each MSA that qualified and agreed to participate were included in the sample (26 sites declined). All sites met the inclusion criterion of having an incident report system in place.

The population from which nonaccredited hospitals were randomly selected was expanded to include the entire states of Georgia and Colorado after we found that there were not six nonaccredited hospitals within each MSA that would agree to participate. A minimum bed-size requirement of 25 was established after we found that the census was unpredictable in hospitals with fewer than 25 beds (which at times did not have any inpatients).

The sampling unit comprised doses observed as given or omitted during a one- to four-day period on a nursing unit identified by nursing administration as having a high medication volume. If the facility had several high-medication-volume units, up to four were included in the sample. A different nursing unit was studied each day to enable the data collectors to work on a new sample of patients. Direct observation occurred during the morning medication administration session, when the greatest volume of doses was given, or until 50 doses were observed. Only staff nurses were observed.

The comparison doses were defined as the set of doses for which data were collected using all three methods. This set comprised the doses observed as given or omitted but was limited to those for which the chart review could be completed within the eight-hour workday of the data collector.

Definitions. A medication error was defined as any discrepancy between the prescriber's interpretable medication order and what was administered to a patient. Definitions for the following categories of medication errors employed in this study appear in Appendix A: omission, unauthorized drug, wrong dose, extra dose, wrong route, wrong form, wrong technique, and wrong time. Doses were considered to have been given at the wrong time if they were administered more than 60 minutes before or after they were due, unless a drug was ordered to be administered with a meal, in which case a 30-minute window was allowed (based on HCFA guidelines). The schedule of administration for routine doses as listed on the MAR at each site was consulted to determine whether a dose was administered within the acceptable time.
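To make the timing rule concrete, the sketch below encodes the 60-minute window (and the 30-minute window for doses ordered with a meal) just described. It is an illustration only; the function name, the use of Python datetimes, and the example times are not from the study.

```python
from datetime import datetime, timedelta

def is_wrong_time(scheduled: datetime, administered: datetime,
                  with_meal: bool = False) -> bool:
    """Timing rule described above: a dose is a wrong-time error if it was
    given more than 60 minutes before or after it was due, or more than
    30 minutes when the order specified administration with a meal."""
    window = timedelta(minutes=30 if with_meal else 60)
    return abs(administered - scheduled) > window

# Illustrative check: a routine 09:00 dose given at 10:15 exceeds the 60-minute window.
print(is_wrong_time(datetime(2002, 3, 1, 9, 0), datetime(2002, 3, 1, 10, 15)))  # True
```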

A dose was defined to be an opportunity for error (a measure long used as the basic unit of data in observational error studies). A dose included any dose given plus any dose ordered but omitted. Any dose given was designated only as correct or incorrect (error or no error) when calculating the error rate. This definition prevented the error rate from exceeding 100%. For the observation method, doses were defined to include only those for which both the preparation and administration of the medication were witnessed by the observer. For chart review, doses were defined as those recorded in the chart or scheduled to be administered in accordance with the prescriber's orders but omitted, plus as-needed medications for which there was evidence of administration on the MAR during the observation period.

In determining whether an omission error had occurred, the data collectors always sought an explanation for the omission. For example, a patient's blood pressure might have been too low for the patient to receive antihypertensives, or a patient's heart rate might have been below the ordered limit for a certain medication.



Data collectors were instructed to review a prescriber's orders twice when they detected an unauthorized-drug error, to ensure that an order was not in the patient's chart, and when an omission error was found, to ensure that there was not an order to discontinue or temporarily stop administration of the drug.

Efficiency was defined as the time required to perform the tasks associated with each error detection method. The starting and ending times for each task were recorded by each data collector, and the total time spent on each patient was indicated for each task. The time spent on all aspects of data collection for individual patients (e.g., copying drug orders and reviewing a patient's chart) was documented. The time spent on tasks not associated with individual patients (e.g., introducing themselves to nurses, observing a medication administration session, and calculating an error rate) was divided and assigned equally to all patients. Nonproductive tasks, such as breaks and lunch, were excluded. Efficiency was measured in minutes per dose and calculated by dividing the time spent on each patient by the number of doses evaluated for that patient using the assigned method. Data collectors were paid for an eight-hour workday, regardless of how long it took them to complete the work. This removed the possible incentive to spend longer than necessary on each patient. A maximum of eight hours was spent on each nursing unit.
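As a rough illustration of this bookkeeping, the sketch below assigns patient-specific time directly, splits time not tied to any one patient equally across patients, and divides by the number of doses evaluated. The function and the sample numbers are hypothetical, not the study's data.

```python
def minutes_per_dose(patient_minutes: dict, doses_per_patient: dict,
                     shared_minutes: float) -> dict:
    """Efficiency as defined above: time spent on each patient, plus an equal
    share of time not associated with individual patients, divided by the
    number of doses evaluated for that patient."""
    share = shared_minutes / len(patient_minutes)
    return {pid: (patient_minutes[pid] + share) / doses_per_patient[pid]
            for pid in patient_minutes}

# Hypothetical example: 40 shared minutes split across two patients.
print(minutes_per_dose({"pt1": 30, "pt2": 20}, {"pt1": 6, "pt2": 4}, shared_minutes=40.0))
# {'pt1': 8.33..., 'pt2': 10.0}
```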

Data collection and error detection methods. Each data collector type was randomly assigned to use one of the three methods at each facility, so that each method was employed an equal number of times. For example, at the first site, the R.N. might have been assigned to use direct observation, the technician might have been assigned to review charts, and the L.P.N. might have been assigned to review incident reports. The data collectors worked independently of each other and were asked not to discuss their findings.

Incident reports. Data collectors allowed two to three weeks to pass after the observation period before returning to analyze reports and classify any errors reported. Incident reports had to be filed at least 7 days before the first day of observation in hospitals, at least 30 days before the first observation day in skilled-nursing facilities, and no more than 7 days after the last observation day (to allow enough time to identify incidents and file reports for events occurring on the observation day). The time required to complete this process was recorded for each incident report. To assess the accuracy of each data collector's work, the research pharmacist reviewed photocopied incident reports, if available, or went to the facility to review the original documents.

Chart review. Chart review was based on the research method described by Bates and colleagues and modified in accordance with suggestions made by Bates for this study. A list of up to 10 patients who were directly observed during the medication administration session was provided to the chart reviewer after the observer completed her work. The charts were reviewed the day after the medication administration session to allow errors that occurred during that time to exhibit effects on the patients. The following sections of each medical chart were evaluated by the data collectors: physician's orders, laboratory test results, physician's progress notes, nurse's notes, and the MAR. Trigger events that could result from a medication error included laboratory test results that exceeded normal values to a degree that made it likely an error had occurred (e.g., a blood glucose concentration of >400 mg/dL could indicate that an insulin dose was omitted). Forms were provided to data collectors to remind them of trigger events. A list of medication orders was extracted from each patient's chart to facilitate the comparison of each order with the MAR. Data collectors noted the use of "red flag" drugs, such as naloxone and flumazenil, as indicators that a medication error may have occurred. A complete list of these drugs appears in Appendix B.

A retrospective study using chart review and incident report review was also performed. The time period reviewed during chart review extended from the last day of observation to 7 days in hospitals and 30 days in skilled-nursing facilities (if the patient was there at the time). (Note that such retrospective data were not a source of comparison doses but were used to compare the efficiency and effectiveness of chart reviewers with incident report reviewers.)

Direct observation. Direct observation required the data collector to accompany the nurse administering medications and observe the preparation and administration of each dose. The observer recorded exactly what the nurse did with the medication and witnessed the drug's administration to the patient. Data recorded included related procedures, such as measuring the patient's heart rate and giving medications with food.

Each observer attended the change-of-shift report on the nursing unit to meet the nursing staff and answer questions they had about the study. The observer then followed the first nurse to begin preparing medications and observed other nurses periodically to try to witness the administration of at least 50 doses. The observer made handwritten copies of the medication orders that were in the patient's chart and compared each dose observed with each prescriber's order. Uninterpretable orders (n = 6) were excluded from the study. Any deviations between the prescriber's order and what was observed were recorded as errors.



After examining all of the doses witnessed, the observer then tallied all of the doses omitted. The medication error rate was calculated by dividing the number of errors by the sum of the number of doses given plus the number of omissions and then multiplying the result by 100.
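The error-rate formula is simple enough to state directly; the sketch below reproduces it and checks it against the study's comparison-dose totals (457 confirmed errors on 2556 opportunities for error). The function is an illustration, not the authors' code, and the split of the 2556 doses into given versus omitted shown in the example is hypothetical.

```python
def medication_error_rate(n_errors: int, n_doses_given: int, n_omissions: int) -> float:
    """Error rate = errors / (doses given + doses ordered but omitted) x 100.
    Because each dose counts once, as either correct or in error, the rate
    cannot exceed 100%."""
    return 100.0 * n_errors / (n_doses_given + n_omissions)

# The paper reports only the combined denominator (2556 comparison doses =
# doses given + omissions), so the split below is made up; the total and the
# resulting rate match the reported true error rate of ~17.9%.
print(round(medication_error_rate(457, n_doses_given=2400, n_omissions=156), 1))  # 17.9
```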

Error confirmation. To establish a standard set of "true errors" for comparison, the observer first finished the error determination and then gave the data to a research pharmacist. The researcher made a blinded, independent determination of errors by comparing each dose on the observer's medication administration session worksheet with the pharmacist's copy of the prescriber's orders. Each dose was then confirmed, classified as a false positive (i.e., the data collector noted an error when the research pharmacist had evidence to the contrary), or classified as a false negative (i.e., the data collector failed to detect an error that was subsequently found by the research pharmacist). The research pharmacist for the Colorado area also reviewed the data collected at the Georgia sites in order to address inconsistencies.

Recruiting and training data collectors. To emulate procedures practical for most health care facilities, newspaper advertisements were placed to recruit R.N.s, L.P.N.s, and pharmacy technicians for training as data collectors. Applicants took a qualifying test to determine their base knowledge of medications and administration techniques. Two R.N.s, two L.P.N.s, and one pharmacy technician were hired in Denver, and two R.N.s, two L.P.N.s, and two pharmacy technicians were employed in Georgia.

A five-day training program was held in each city and included lectures about each data collection method, practice sessions using each method with sample patients, an interactive videotape about observation, practice sessions on nursing units in a hospital, and examinations. Additional practice observations and chart reviews were performed in hospitals and skilled-nursing facilities after the training was completed to develop proficiency in each method before the study began. One R.N. in Georgia withdrew from the project for personal reasons before completing the training program.

Two examinations about the observation method were given to evaluate each data collector's ability to detect errors. One exam determined intraobserver and interobserver reliability. The overall percentage of agreement on the error detection exam was 96%, with the agreement on each individual question ranging between 89% and 100%, indicating interrater reliability. Repeated measures analysis of variance found that the split-halves test scores were not significantly different within subjects, indicating intrarater reliability.

Assessment of clinical significance. The frequency of clinically significant errors was based on comparison doses and compared for each method. A clinically significant medication error was defined as a medication error with the potential for causing a patient discomfort or jeopardizing a patient's health and safety. A three-physician panel evaluated (by majority decision) each medication error detected for potential significance by considering the patient's condition, the drug category involved, and the frequency of the error. HCFA guidelines for determining significance were provided to the panel (Appendix C). The research pharmacist reviewed each chart and provided information to the panel about each patient's condition, including sex, age, allergies, diseases, laboratory test results associated with a medication, "red flag" drugs ordered, and noteworthy physicians' and nurses' progress notes.

Statistical methods. The number of errors detected by each method was compared by method and site. The rate of agreement between observation and chart review was determined, and the rates of false negatives and false positives were calculated.

Kappa values were calculated to assess the accuracy of the data collectors' information compared with that from the research pharmacist. The purpose of using the kappa statistic was to determine whether the rate of agreement with the errors confirmed by the research pharmacist (the standard) was greater than what would have been found by chance alone.
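A minimal sketch of that calculation follows, treating each dose as a yes/no "in error" judgment by the data collector and by the research pharmacist. The standard Cohen's kappa formula is used here, not the study's SAS code; the check uses the counts reported for one observer in Table 3 (TechCO: 180 errors detected correctly, 14 false positives, 53 false negatives, 414 nonerrors detected), which reproduce the reported kappa of 0.77.

```python
def cohens_kappa(tp: int, fp: int, fn: int, tn: int) -> float:
    """Agreement between a data collector and the pharmacist standard on
    whether each dose was in error, corrected for chance agreement.
    tp = errors detected correctly, fp = false positives,
    fn = false negatives, tn = nonerrors correctly identified."""
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n
    # Expected chance agreement from each rater's marginal "error"/"no error" rates.
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

# Table 3, observer TechCO.
print(round(cohens_kappa(tp=180, fp=14, fn=53, tn=414), 2))  # 0.77
```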

The cost of each data collector type and method was compared by using chi-square analysis after it was found that the cost-per-dose results were not normally distributed, based on the Shapiro-Wilk test result in SAS version 6.12 for each error detection method (SAS Inc., Cary, NC). Significance was set at p < 0.05.
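The text names the Shapiro-Wilk check and a chi-square comparison at p < 0.05 but not the exact procedure. The sketch below shows one plausible reading of that sequence, using a Kruskal-Wallis test (whose statistic is referred to a chi-square distribution) once normality is rejected; the cost values are made up for illustration, and scipy stands in for the SAS routines actually used.

```python
from scipy import stats

# Hypothetical cost-per-dose samples (dollars) for three data-collector types;
# these values are illustrative, not the study's data.
rn_costs   = [0.42, 0.55, 0.48, 0.61, 0.50, 0.47]
lpn_costs  = [0.58, 0.66, 0.71, 0.52, 0.63, 0.69]
tech_costs = [0.51, 0.60, 0.49, 0.57, 0.62, 0.55]

# Step 1: Shapiro-Wilk normality test for each group, as described above.
for name, costs in [("R.N.", rn_costs), ("L.P.N.", lpn_costs), ("Tech", tech_costs)]:
    w, p = stats.shapiro(costs)
    print(f"{name}: W = {w:.3f}, p = {p:.3f}")

# Step 2: if normality is rejected, compare the groups with a rank-based test
# whose statistic is chi-square distributed (one possible reading of the
# paper's "chi-square analysis").
h, p = stats.kruskal(rn_costs, lpn_costs, tech_costs)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f} (significant if p < 0.05)")
```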

Results

Comparison of methods. The research pharmacist confirmed 457 of the 2556 comparison doses to be in error, producing a true error rate of 17.9%. Direct observation detected 300 of these errors and 73 false positives, which produced an error rate of 14.6%. For the same doses, chart review detected 17 of the 457 errors and 7 false positives, yielding an error rate of 0.9%, while incident report review detected only 1 error, for an error rate of 0.04%. However, the data collectors missed 157 errors during direct observation, 440 during chart review, and 456 during incident report review.

Because additional information was available to observers who were at the patients' bedside during drug administration, some doses were classified differently by chart reviewers and observers. For example, a dose of a stool softener that an observer recorded as "refused by the patient" (and therefore not included as an opportunity for error) was labeled as an omission error by a chart reviewer. In another case, a dose of potassium chloride was classified as an omission error by a chart reviewer but was classified as a late dose by an observer who actually witnessed its administration.




Clinical significance. The physician panel evaluated all 457 errors among the comparison doses for potential clinical significance. Thirty-five (8%) errors were considered potentially clinically significant. Of these errors, 25 (71%) were detected by direct observation, 3 (9%) by chart review, and none by incident report review.

Comparison by region and facility type. Table 1 summarizes the errors detected by each method by region and type of facility. The number of errors shown in this table includes false positives because these doses were categorized as errors by the data collectors.

Retrospective analysis by chart review and incident report. Retrospective data were collected because they were accessible at a low incremental cost once the data collector was at the site. Overall, the error rate detected by chart review was 3.0% (2,536 pharmacist-confirmed errors, compared with 2,283 errors detected by data collectors on 85,197 doses evaluated). The overall error rate detected by incident report was 0.04% (32 pharmacist-confirmed errors, compared with 31 errors detected by data collectors on 85,197 doses). Chart review and incident report review data agreed on 18 errors, while chart review failed to detect 14 errors detected by incident report review. Incident reports failed to detect 2504 errors detected by chart review.

Error category analysis. Table 2 lists the number of errors in each error category for each method. Chart review detected all but three types of errors (wrong route, wrong form, and wrong technique) in this study, but at much lower frequencies than direct observation.

Accuracy of data collectors. The accuracy of the three different types of data collectors was compared with the research pharmacist's standard by calculating kappa statistics for direct observation and chart review. The results from individual data collectors appear in Tables 3 and 4. Note that it was possible to use observation data in addition to comparison-dose data because more data were available for patients than the chart reviewer could evaluate because of time constraints.

Table 1. Results for Comparison Doses by Method, Facility Type, and Region

Region and Facility Type | No. Comparison Doses(a) | No. Errors Confirmed(a) | Errors Detected(b): Direct Observation | Chart Review | Incident Report Review
Colorado: Accredited hospitals | 568 | 90 | 80 | 9 | 0
Colorado: Nonaccredited hospitals(c) | 201 | 54 | 38 | 4 | 1
Colorado: Skilled-nursing facility | 623 | 161 | 136 | 3 | 0
Colorado: Subtotal | 1392 | 305 | 254 | 16 | 1
Georgia: Accredited hospitals | 355 | 52 | 48 | 3 | 0
Georgia: Nonaccredited hospitals | 249 | 22 | 22 | 1 | 0
Georgia: Skilled-nursing facility | 560 | 78 | 49 | 4 | 0
Georgia: Subtotal | 1164 | 152 | 119 | 8 | 0
Total | 2556 | 457 | 373 | 24 | 1

(a) Doses and errors as confirmed by research pharmacist. (b) Includes false positives. (c) Five nonaccredited hospitals that became accredited before data collection were included in the nonaccredited hospital category.

Table 2. Frequency of Errors for Comparison Doses Detected, by Medication Error Category and Method

Error Category | Standard(b) | Errors Detected(a): Direct Observation | Chart Review | Incident Report Review
Omission | 126 | 118 (32) | 9 (2) | 1
Wrong dose | 84 | 69 (10) | 3 | 0
Wrong form | 18 | 3 | 0 | 0
Unauthorized drug | 17 | 21 (6) | 4 (3) | 0
Extra dose | 8 | 4 (1) | 2 | 0
Wrong route | 6 | 5 | 0 | 0
Wrong technique | 2 | 2 | 0 | 0
Subtotal | 261 | 222 (49) | 18 (5) | 1
Wrong time | 196 | 151 (24) | 6 (2) | 0
Total | 457 | 373 (73) | 24 (7) | 1

(a) Numbers in parentheses are the false positives included. (b) Confirmed by research pharmacist.


Table 3. Accuracy of Individual Data Collectors in Detecting Errors by Observation, Based on Total Doses Observed(a)

Data Collector | Total Doses Compared | Errors Confirmed(b) | Nonerrors Confirmed(b) | Nonerrors Detected | Errors Detected Correctly | False Positives | False Negatives | Kappa Value
TechCO | 661 | 233 | 428 | 414 | 180 | 14 | 53 | 0.77
RN2CO | 168 | 30 | 138 | 137 | 20 | 1 | 10 | 0.75
RN1CO | 291 | 51 | 240 | 237 | 34 | 3 | 17 | 0.73
RNGA | 498 | 70 | 428 | 420 | 46 | 8 | 24 | 0.71
LPN2CO | 420 | 49 | 371 | 368 | 29 | 2 | 20 | 0.70
LPN1GA | 527 | 61 | 466 | 452 | 41 | 19 | 20 | 0.64
Tech1GA | 194 | 20 | 174 | 169 | 12 | 5 | 8 | 0.61
LPN2GA | 116 | 18 | 98 | 98 | 5 | 0 | 13 | 0.39
Tech2GA | 84 | 19 | 65 | 62 | 6 | 3 | 13 | 0.33
LPN1CO | 257 | 54 | 203 | 172 | 22 | 31 | 32 | 0.26
Total | 3216 | 605 | 2611 | 2529 | 395 | 86 | 210 | 0.67

(a) Comparison doses plus doses not studied by other methods (2556 + 650). (b) Confirmed by research pharmacist.

Table 4. Accuracy of Individual Data Collectors in Detecting Errors Using Chart Review, Based on Total Doses Evaluated(a)

Data Collector | Total Doses Compared | Errors Confirmed(b) | Nonerrors Confirmed(b) | Nonerrors Detected | Errors Detected Correctly | False Positives | False Negatives | Kappa Value
RN2CO | 14,161 | 544 | 13,617 | 13,606 | 310 | 11 | 234 | 0.71
RNGA | 9,653 | 207 | 9,446 | 9,320 | 203 | 126 | 4 | 0.75
TechCO | 5,726 | 157 | 5,569 | 5,501 | 115 | 68 | 42 | 0.67
Tech1GA | 17,369 | 649 | 16,720 | 16,688 | 587 | 32 | 62 | 0.92
LPN1CO | 1,091 | 38 | 1,053 | 1,050 | 18 | 3 | 20 | 0.60
LPN2CO | 11,544 | 496 | 11,048 | 11,032 | 390 | 16 | 106 | 0.86
LPN1GA | 15,927 | 297 | 15,630 | 15,567 | 229 | 63 | 68 | 0.77
RN1CO | 9,726 | 134 | 9,592 | 9,569 | 89 | 23 | 45 | 0.72
Total | 85,197 | 2,522 | 82,675 | 82,333 | 1,941 | 342 | 581 | 0.80

(a) Comparison doses plus doses not studied by other methods (2,557 + 82,640). (b) Confirmed by research pharmacist.

Kappa values of 0.75 or higher were considered to have excellent reproducibility, values between 0.40 and 0.74 had good reproducibility, and values less than 0.40 indicated marginal reproducibility.

When kappa values were calculated for each type of data collector, the highest kappa value for direct observation was that for pharmacy technicians (0.74), followed by R.N.s (0.72) and L.P.N.s (0.53). For chart review, R.N.s had the lowest kappa value (0.72) and pharmacy technicians had the highest (0.87). L.P.N.s had a kappa of 0.82 for chart review. Results varied widely among individual pharmacy technicians.

Agreement with the pharmacist was calculated to evaluate the accuracy of incident report analysis. The mean level of agreement was 91.4% (range, 79-100%). After all 254 incident reports were analyzed (retrospective results plus comparison doses), 18 false positives and 6 false negatives were found.

Comparison of accuracy of individual data collectors. Of the data collectors using direct observation, one had an excellent kappa value (0.75), six had good kappa values, and three had poor values. (If the three outliers were excluded, the overall kappa value for observation was 0.74.) Of the data collectors using chart review, four had excellent kappa values, and the remaining four had good values.

Efficiency and cost of error detection methods. A technician using direct observation for one hour detected 80% of the true errors, while only 7% would be found by the technician with chart review, at a cost of $15. An L.P.N. using direct observation for one hour detected 92% of the true errors versus 2% if the L.P.N. used chart review, at a cost of $20; an R.N. found 70% of the true errors in one hour by direct observation, and only 6% when chart review was used, at a cost of $25. These figures do not include the cost of the pharmacist's time spent completing follow-up paperwork to confirm the errors detected.




Table 5 compares the time and cost required per dose for each type of data collector using each method. The data do not control for the incremental cost of retrospective chart review. Technicians were the least expensive, performing observation at $2.87 per dose (t = 70.11, d.f. = 12, p < 0.0001). R.N.s were the most cost-efficient, performing chart review at $0.50 per dose, but not significantly more so than technicians ($0.56 per dose) (χ² = 17.87, d.f. = 14, p = 0.21). Pharmacy technicians completed incident report review most efficiently ($2.61 per dose) (χ² = 18.37, d.f. = 8, p = 0.02).
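Since the study priced each method as the labor needed to examine a dose, the per-dose figures follow from the hourly rates quoted earlier ($15 for a technician, $20 for an L.P.N., $25 for an R.N.) and the minutes spent per dose. The sketch below shows that arithmetic; the minutes-per-dose values are assumptions back-solved only to show how costs of roughly $2.87 (technician observation) and $0.50 (R.N. chart review) arise, not figures taken from Table 5.

```python
def cost_per_dose(hourly_rate: float, minutes_per_dose: float) -> float:
    """Labor cost of evaluating one dose: hourly rate times the fraction of
    an hour spent on that dose."""
    return hourly_rate * minutes_per_dose / 60.0

# Assumed times per dose, for illustration only.
print(round(cost_per_dose(hourly_rate=15.0, minutes_per_dose=11.48), 2))  # ~2.87 (technician, observation)
print(round(cost_per_dose(hourly_rate=25.0, minutes_per_dose=1.2), 2))    # ~0.50 (R.N., chart review)
```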

Discussion

Accuracy of error detection methods. These data show that direct observation detected administration errors at a much higher rate and more accurately than either chart review or incident report review.

False positives and false negatives. The detection rate of false positives is particularly important in situations where the detection of an error rate above a fixed standard will trigger a response involving expensive or legal interventions. The rate of false positives for the observation method was 3.5%; for chart review it was 0.3%. Incident report review found no false positives on the comparison doses.

The detection rate of false negatives is important when the goal is to detect the absolute maximum number of errors occurring. Direct observation by all of the data collectors missed 34% of the true errors but was far superior to the other methods (chart review missed 96% of the 457 errors).

Representativeness. Errors detected by incident report review were too few to be representative of the actual error rate. Errors detected by chart review, though far fewer than those found by direct observation, were correlated with errors detected by observation when the number of errors detected by each method during each observation day was compared (r = 0.411, p = 0.002, R² = 0.169).
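The day-level relationship reported here appears to be an ordinary correlation of per-observation-day error counts (the reported R² of 0.169 is simply 0.411 squared). A small sketch of that computation follows; the daily counts are made up and stand in for the study's data, and a Pearson correlation is assumed.

```python
from scipy import stats

# Hypothetical errors detected per observation day (illustrative only).
observation_errors = [12, 8, 15, 5, 9, 11, 4, 7]
chart_review_errors = [1, 0, 2, 0, 1, 1, 0, 1]

r, p = stats.pearsonr(observation_errors, chart_review_errors)
print(f"r = {r:.3f}, p = {p:.3f}, R^2 = {r * r:.3f}")
# The paper reports r = 0.411 (p = 0.002) across observation days,
# i.e., R^2 = 0.411**2, or about 0.169.
```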

Accuracy of data collectors. All three R.N.s had excellent or good reproducibility values for observation and chart review. Two of the four L.P.N.s and two of the three pharmacy technicians had good reproducibility scores for direct observation. Three L.P.N.s and two technicians who performed chart review had good reproducibility (one technician had a kappa value of 0.92, the highest of all chart reviewers). Although these data collectors took a qualifying test to confirm a basic knowledge of medications and passed an exam after about a week of training, the performance of three particular persons using direct observation (two L.P.N.s and one technician) suggested that additional training (or replacement) was necessary.

Errors rated potentially clinically significant. Medication errors are, at minimum, a waste of resources and may even represent fraud when patients are charged for omitted doses. They become a threat to patient safety when they adversely affect therapy and patient care. Though all three methods evaluated proved capable of detecting some errors of potential clinical significance, the true frequency of such errors was best estimated by direct observation.

The most efficient approach to reducing clinically significant errors may be simply to focus on reducing the rate of all errors detected by direct observation.

Cost of error detection. The cost of all three methods was measured by comparing the cost of the labor required to examine every individual dose for errors by each method. The mean cost per examined dose was $0.67 for chart review and $4.82 for observation.

One explanation for the higher expense of observation is the inclusion of delays during data collection (e.g., waiting for a nurse to start a medication administration session and dealing with the interruptions that occur during such sessions).
