Drugs and Driving

Introduction

Motor vehicle accidents are the leading cause of death in the United States for people aged 1-34 years, and ethanol is a factor in nearly half of the traffic fatalities each year. Studies investigating the prevalence of drugs other than ethanol in fatally injured drivers have reported varied results, ranging from 6 to 37%. Among individuals stopped for reckless driving who were judged to be clinically intoxicated, urine drug testing indicated that 85% were positive for cannabinoids, cocaine metabolites, or both.
These relatively high prevalence rates reinforce the general assumption that psychoactive drugs are capable of impairing driving. Drug prevalence rates do not in themselves imply impaired driving; however, given the reliability with which certain drugs degrade psychomotor and cognitive performance in the laboratory, it is highly likely that many drug-related vehicular accidents and arrests for driving under the influence or driving while impaired/intoxicated involve impaired behaviors critical for safe driving. Some studies have investigated the effects of drugs on actual driving; however, the majority of our knowledge concerning drugs and driving is inferred from laboratory studies using driving simulators or tasks that are assumed to measure critical aspects of driving skill, such as reaction time, divided attention, and vigilance. The information presented in the following section emphasizes data from studies of actual driving on closed courses or highways, but also includes pertinent data from laboratory studies examining the effects of drugs on human performance.


Specific Drugs and Driving

Psychomotor stimulants (cocaine, d-amphetamine, and methamphetamine)
The psychomotor stimulants cocaine, d-amphetamine, and methamphetamine share a similar psychopharmacological profile, such that low to moderate acute doses of these drugs increase positive mood, energy, and alertness in nontolerant individuals. It is well known that amphetamines can increase the ability to sustain attention over prolonged periods of time when performing monotonous tasks. Amphetamines and cocaine have also been shown to improve performance on auditory and visual reaction time tests; the digit symbol substitution test (DSST), a test of psychomotor skills and attention; and tests of selective and divided attention.
Although numerous studies have shown that psychostimulants can enhance performance on tasks that appear related to driving (e.g. reaction time, motor coordination), no studies that directly assessed the effects of amphetamines in actual or simulated driving situations were found in recent literature searches. An indication of the incidence of stimulant use among truck drivers was recently reported. Of 317 drivers who volunteered urine or blood samples, 5% were positive for prescription amphetamines (amphetamine, methamphetamine, phentermine) and 12% were positive for over-the-counter sympathomimetic drugs, such as phenylpropanolamine, ephedrine, and pseudoephedrine.
Chronic use of psychostimulants can result in a drug dependence syndrome, such that abrupt cessation of drug use can result in depression or fatigue. When stimulants are taken chronically in large doses, a toxic psychosis can develop that is characterized by vivid auditory and visual hallucinations, paranoid delusions, and disordered thinking, and is often indistinguishable from paranoid schizophrenia. Thus, although occasional use of low to moderate doses of amphetamines or cocaine can enhance performance, chronic use is not compatible with situations requiring clear thinking and coordination, such as driving a motor vehicle.

Cold and allergy medications (antihistamines)

All first-generation antihistamines produce sedation and have been shown to impair psychomotor and cognitive performance as well as on-road driving. For example, in a test of highway driving, diphenhydramine (50 mg) significantly impaired the ability to maintain a steady lane position and to follow speed changes in a lead car. Second-generation antihistamines are less lipophilic than the first-generation compounds and thus cross the blood-brain barrier less readily, which accounts for the lower levels of sedation observed with the newer drugs. Terfenadine was the first second-generation antihistamine to be approved by the Food and Drug Administration and marketed in the United States. In general, studies have found that terfenadine, in acute doses that exceeded the therapeutic dose (60 mg, b.i.d.), did not impair psychomotor and attentional performance as measured in the laboratory and in on-road driving tests. However, when terfenadine (120 mg, b.i.d.) was administered for 3 days, driving impairment was observed. Similar results have been obtained with other second-generation antihistamines, such as loratadine, cetirizine, and mizolastine, when therapeutic doses are exceeded. Thus, the second-generation antihistamines produce less sedation than first-generation compounds and are generally safe at therapeutic doses. However, when therapeutic doses are exceeded, the so-called ‘nonsedating’ antihistamines are capable of producing sedation, which could result in impaired driving.

Sedative-hypnotics (benzodiazepines)

Because of their sedative effects, benzodiazepines generally impair psychomotor performance in nontolerant individuals. Not surprisingly, benzodiazepines are associated in a dose-dependent manner with a significantly increased risk for traffic accidents. In studies of on-road driving, diazepam (5 mg, t.i.d.), lorazepam (0.5 mg, t.i.d.), and lorazepam (2 mg, b.i.d.) increased lateral lane position and slowed response time to a lead car’s change in speed. Several benzodiazepines (oxazepam 50 mg, flurazepam 30 mg, and lormetazepam 2 mg) impaired driving the morning after dosing. In studies using driving simulators, diazepam (5-15 mg) produced deficits in lane position, speed maintenance, and emergency decisions. Diazepam (15 mg) impaired performance on a clinical test for drunkenness, which comprised 13 tests assessing motor, vestibular, mental, and behavioral functioning. Studies have also documented the effects of benzodiazepines in tracking tests in which subjects attempt to maintain a cursor within a moving target. Some tracking tests have been considered laboratory tests of driving ability because the device used to control the cursor was a steering wheel. Tracking performance is uniformly impaired by acute, therapeutic doses of numerous benzodiazepines.
Numerous studies have reported that acute doses of various benzodiazepines slow response time in simple and choice visual reaction time tests and impair attentional performance on the DSST and the Stroop test, a measure of selective attention. Benzodiazepines have also been shown to impair performance on divided attention tests that require splitting attention between a central tracking task and responding to stimuli in the peripheral visual field. The impairment caused by benzodiazepines in tests of sustained attention or vigilance is not secondary to sedation, but rather reflects a direct effect on perceptual sensitivity, resulting in fewer hits and increased response time in detecting stimulus targets.

Cannabis (marijuana)

Marijuana consists of the crushed leaves and stems of the plant Cannabis sativa. In the United States, marijuana is typically smoked, although in various parts of the world other preparations of the cannabis plant are eaten, or fumes from the ignited plant material are inhaled. The primary psychoactive ingredient of marijuana is Δ9-tetrahydrocannabinol (THC), which currently averages 4.6% in commercial-grade marijuana, but can range up to 25% in certain areas of the United States. Marijuana readily disrupts performance in complex tasks requiring continuous monitoring and the ability to shift attention rapidly between various stimuli. These same abilities are required when driving a car.
Marijuana has been shown to impair performance in driving simulators and laboratory tests that model various components of driving, such as reaction time, coordination, and tracking. Standardized field sobriety tests used by law enforcement officials to determine whether a person can drive safely were also shown to be impaired by marijuana. Several studies conducted in the 1970s indicated that marijuana impaired on-road driving abilities. A more recent comprehensive study of on-road driving found that marijuana moderately increased lateral movement of the vehicle within the driving lane on a highway.

Validity of Field Assessment Techniques

Background

The drug evaluation and classification (DEC) program was developed by the Los Angeles Police Department during the 1970s because of a need to identify and document whether a driver was impaired due to recent drug ingestion. The DEC program consists of a standardized evaluation conducted by a trained police officer (drug recognition examiner, DRE) and the toxicological analysis of a biological specimen. The evaluation involves a breath alcohol test; examination of the suspect’s appearance, behavior, and eyes; field sobriety tests (FSTs); vital signs; and questioning of the suspect. From the evaluation, the DRE concludes: (1) whether the suspect is behaviorally impaired such that he or she is unable to operate a motor vehicle safely; (2) whether the impairment is drug-related; and (3) the drug class(es) likely to be causing the impairment. The toxicological analysis either confirms or refutes the DRE’s drug class opinion.
Several field studies have indicated that DREs’ opinions were confirmed by toxicological analysis in 74-92% of cases when DREs concluded suspects were impaired. These studies attest to the validity of the DEC program as a measurement of drug-induced behavioral impairment in the field. However, the validity of the DEC evaluation has not been rigorously examined under controlled laboratory conditions.
Recently, one of the authors (SJH) conducted two placebo-controlled laboratory studies to examine the validity of the individual variables of the DEC evaluation as predictors of drug intake and to determine the accuracy of DREs in detecting whether subjects had been dosed with various drugs. In the first study, placebo and two active doses of ethanol, cocaine, and marijuana were administered to research volunteers, who were then evaluated by DREs.
Results showed that 17-28 variables of the DEC evaluation predicted the presence or absence of each of the three drugs with a high degree of sensitivity and specificity and low rates of false-positive and false-negative errors. In contrast, DREs’ predictive accuracy of drug-induced impairment was not highly consistent with toxicological data. Of the 81 cases in which DREs predicted impairment, toxicology was positive for any drug(s) in 75 cases (92.6%). DREs’ predictions were consistent with toxicology in 41 cases (50.6%), including 9 cases in which the DRE concluded impairment was due to ethanol alone. Because the DRE’s breath test provided a priori confirmation of ethanol, an ethanol-only prediction was guaranteed to be consistent. Excluding those 9 cases resulted in 72 predictions that named a non-ethanol drug class; DREs’ predictions were consistent with toxicology in 44.4% of those cases.
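The arithmetic behind these percentages can be made explicit. The short sketch below (in Python, using only the case counts quoted in the paragraph above) reproduces the 92.6%, 50.6%, and 44.4% figures.

```python
# Case counts reported in the first study (taken from the text above)
predicted_impaired = 81       # cases in which DREs predicted impairment
tox_positive_any_drug = 75    # of those, cases positive for any drug
consistent_with_tox = 41      # predictions that matched toxicology
ethanol_only = 9              # consistent cases attributed to ethanol alone

print(tox_positive_any_drug / predicted_impaired)   # 0.926 -> 92.6%
print(consistent_with_tox / predicted_impaired)     # 0.506 -> 50.6%

# Excluding ethanol-only predictions (confirmed a priori by the breath test)
non_ethanol_cases = predicted_impaired - ethanol_only        # 72 predictions
non_ethanol_consistent = consistent_with_tox - ethanol_only  # 32 predictions
print(non_ethanol_consistent / non_ethanol_cases)   # 0.444 -> 44.4%
```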
In the second study, research volunteers were administered placebo and two active doses of alprazolam, d-amphetamine, codeine, or marijuana. The ability of the DEC evaluation to predict the intake of the four drugs or placebo was optimal when using 2-7 variables from the evaluation. DREs’ decisions of impairment were consistent with the administration of any active drug in 76% of cases, and their drug class decisions were consistent with toxicology in 32% of cases.
Of particular relevance to the issue of drugs and driving safety is the part of the DEC evaluation in which four standardized FSTs are evaluated. The FSTs were Romberg balance (RB), walk and turn (WT), one-leg stand (OLS), and finger-to-nose (FN). The RB assessed body sway and tremor while subjects stood for 30 s with feet together, arms at sides, head tilted back, and eyes closed. The WT test required subjects to take nine heel-to-toe steps along a straight line marked on the floor, turn, and return with nine heel-to-toe steps. The OLS assessed balance by having subjects stand on one leg, with the other leg elevated in a stiff-legged manner 15 cm off the floor for 30 s. Subjects were given a brief rest between right and left leg testing. In the FN test, subjects stood as in the RB and brought the tip of the index finger of the left or right hand (as instructed) directly to the tip of the nose.
The broadest range of impairment was produced by alprazolam, which significantly increased body sway in the RB and disrupted balance in the WT and OLS. Marijuana impaired balance in the OLS, increased errors in the FN, and produced minor impairment in the WT. The relatively low doses of ethanol used in the study (see below) increased body sway in the RB and impaired coordination in the FN. Codeine increased errors in the OLS and FN. In the doses tested, neither cocaine nor d-amphetamine impaired performance on any of the tests; in fact, d-amphetamine decreased body sway in the OLS.

Plasma drug concentration

During both studies, repeated blood samples were taken to determine the relation between plasma drug concentration and impaired performance. In the first study, plasma concentration data are reported from the blood sample obtained halfway through the DEC evaluation. For ethanol, this was about 30 min after drinking ended, and ethanol (0.28 and 0.52 g kg-1) produced mean ± standard error (SE) plasma concentrations of 24.3 ± 2.2 and 54.4 ± 6.0 mg dl-1, respectively. Cocaine (48 and 96 mg 70 kg-1) produced mean plasma concentrations of 74.7 ± 7.2 and 180.5 ± 17.1 ng ml-1, respectively.
Benzoylecgonine concentrations were 95.4 ± 27.7 and 210.7 ± 47.3 ng ml-1, and ecgonine methyl ester concentrations were 10.8 ± 3.0 and 26.1 ± 6.3 ng ml-1 for low- and high-dose cocaine, respectively. Marijuana produced peak plasma concentrations immediately after smoking, which had declined by the time of the DEC evaluation to 15.4 ± 3.0 and 28.2 ± 4.2 ng ml-1 for 1.75% and 3.55% THC, respectively.
In the second study, plasma drug concentrations are reported for blood samples obtained 20 min before and immediately after the DEC evaluation. These blood samples corresponded to the following postdrug times: alprazolam, 60 and 105 min; d-amphetamine, 120 and 145 min; codeine, 60 and 105 min; and marijuana, 2 and 45 min. Mean ± SE plasma concentrations of alprazolam at 60 min postdrug were 12.9 ± 1.7 and 25.8 ± 2.9 ng ml-1 and at 105 min postdrug were 15.5 ± 1.5 and 31.3 ± 2.0 ng ml-1 for the 1 and 2 mg doses, respectively. d-Amphetamine produced mean ± SE plasma concentrations of 21.0 ± 2.5 and 42.8 ± 4.2 ng ml-1 at 120 min postdrug and 23.8 ± 1.6 and 46.6 ± 3.4 ng ml-1 at 145 min postdrug for the 12.5 and 25 mg doses, respectively. Codeine produced mean ± SE peak plasma concentrations at 60 min postdrug of 120.8 ± 14.8 and 265.8 ± 28.8 ng ml-1 for the 60 and 120 mg doses, respectively. At 105 min postdrug, plasma codeine concentrations were 114.1 ± 9.1 and 229.6 ± 17.8 ng ml-1 for the low and high doses, respectively. Marijuana produced mean ± SE peak plasma THC concentrations 2 min after smoking of 28.1 ± 3.6 and 61.2 ± 9.2 ng ml-1 for the low and high doses, respectively. By 45 min postsmoking, THC concentrations had declined to 5.4 ± 0.7 and 9.9 ± 1.5 ng ml-1 for the low and high doses, respectively. Plasma THC concentrations for both doses were less than 1 ng ml-1 at 10 h postsmoking.

Conclusion

The validity of the DEC evaluation was examined by developing mathematical models based on discriminant functions that identified which subsets of variables best predicted whether subjects were dosed with placebo or each active drug. For all drugs except codeine, the subset of variables predicted the presence or absence of drug with a moderate to high degree of sensitivity and specificity. For codeine, sensitivity was low, and false-negative rates were extremely high. A secondary goal of these studies was to determine the accuracy of the DREs’ evaluations in detecting whether an individual had been dosed with active drug. In the first study, non-ethanol drug class decisions were consistent with toxicology in 44% of cases. In the second study, DREs’ drug class decisions were consistent with the administration of any active drug in 76% of cases, but consistent with toxicology in only 32% of cases. Thus, it would appear that DREs are able to detect drug-induced impairment in general, but have difficulty discriminating between various drugs.
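For readers unfamiliar with this type of analysis, the sketch below illustrates the general approach: a linear discriminant function is fit to a set of predictor variables, and its sensitivity and specificity are then computed. The variable names, data, and effect sizes are invented for illustration only; they are not the variables or models reported in the studies discussed here.

```python
# Illustrative sketch of a discriminant-function classifier predicting
# active drug vs. placebo from a small set of evaluation variables.
# The predictors and values below are hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: pulse rate, body sway score, walk-and-turn errors
placebo = rng.normal(loc=[70.0, 2.0, 1.0], scale=[8.0, 0.8, 1.0], size=(n, 3))
drugged = rng.normal(loc=[78.0, 3.5, 2.5], scale=[8.0, 0.8, 1.0], size=(n, 3))
X = np.vstack([placebo, drugged])
y = np.array([0] * n + [1] * n)            # 0 = placebo, 1 = active drug

model = LinearDiscriminantAnalysis().fit(X, y)
pred = model.predict(X)

tp = int(np.sum((pred == 1) & (y == 1)))   # true positives
tn = int(np.sum((pred == 0) & (y == 0)))   # true negatives
fp = int(np.sum((pred == 1) & (y == 0)))   # false positives
fn = int(np.sum((pred == 0) & (y == 1)))   # false negatives

print("sensitivity:", tp / (tp + fn))      # proportion of dosed subjects detected
print("specificity:", tn / (tn + fp))      # proportion of placebo subjects cleared
print("false-negative rate:", fn / (tp + fn))
```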
These data clearly indicate that the variables of the DEC evaluation alone did not permit DREs to predict impairment and drug intake with the accuracy observed in field studies. There were several differences between the controlled laboratory conditions of this study and the field conditions under which DREs normally conduct the DEC evaluation that may have contributed to this outcome, such as the abridged form of the DEC evaluation, lack of meaningful drug-related cues (e.g. erratic driving, marijuana odor, drug paraphernalia), inability to interview subjects concerning drug use, and the possibility that greater drug doses encountered in the field would result in clearer behavioral signs of impairment. These findings suggest that predictions of impairment and drug use may be improved if DREs focused on a subset of variables associated with each drug class, rather than attempting to synthesize the entire DEC evaluation.

Postmortem Issues

Ethanol blood concentrations above a certain specified threshold are, by law, proof of intoxication. In theory, the same thresholds apply, even if the driver is dead. The situation becomes much more confusing when other drugs are involved. In some jurisdictions, the presence of any drug (parent or metabolite) is deemed proof of impairment. The question then arises as to whether postmortem blood drug concentrations in any way reflect drug concentrations at the time of death, and whether impairment can be inferred from the results of postmortem drug measurement.
At a minimum, postmortem drug concentrations depend on:
• the amount of drug ingested;
• the time elapsed from the last dose until the time of death;
• the time elapsed from death until specimen collection;
• where in the body the specimen is collected;
• sexual and racial differences in drug metabolism;
• drug tolerance (decreasing drug effect at a constant dose);
• the drug’s volume of distribution.
Drug tolerance is almost certainly the most important factor. Concentrations of opiates and stimulants in trauma victims, where drugs have nothing to do with the cause of death, often exceed drug concentrations in decedents where the drug is clearly the cause of death. Unfortunately, after death, tolerance cannot even be estimated, let alone measured.

Factors affecting postmortem drug concentrations

Dose If illicit drugs are involved, the total amount of drug ingested is almost never known. Drug dealers, known as ‘body stuffers’, often ingest their inventory when arrest seems imminent. Even if the total number of ‘rocks’ swallowed is known, the amount of cocaine is not. Depending on the skill of the drugmaker, the rocks may contain much more bicarbonate than cocaine. Similar considerations apply to heroin. According to the Drug Enforcement Administration, retail heroin sold in the United States is, on average, 70% pure. But, unless a sample is available for analysis, there is no way to determine the purity of the fatal dose.
Premortem interval Parent drugs and metabolites generally have different half-lives. In the appropriate setting, this difference can be used to estimate the pattern of use, if not the exact time when the drug was taken. Very low concentrations of cocaine, in the face of high concentrations of benzoylecgonine, suggest that cocaine was taken many hours before death. In the case of heroin, measurable concentrations of 6-monoacetylmorphine (6-MAM), a heroin metabolite with an extremely short half-life, suggest use just prior to death. But unless death is witnessed and sudden, and an autopsy is performed immediately, firm conclusions are impossible. The conversion of cocaine to benzoylecgonine continues after death, and heroin users may remain comatose for hours before expiring, allowing time for all the heroin to be converted to 6-MAM and thence to morphine.
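The parent/metabolite reasoning rests on simple first-order elimination. A rough sketch is shown below; the half-lives used (about 1 h for cocaine and 6 h for benzoylecgonine) are assumed round figures for illustration, and metabolite formation is ignored, so the output indicates only the qualitative pattern.

```python
# Illustration only: first-order decay with assumed half-lives, ignoring
# the kinetics of metabolite formation and postmortem conversion.
COCAINE_HALF_LIFE_H = 1.0          # assumed, for illustration
BENZOYLECGONINE_HALF_LIFE_H = 6.0  # assumed, for illustration

def fraction_remaining(half_life_h: float, hours: float) -> float:
    """Fraction of the initial concentration left after first-order elimination."""
    return 0.5 ** (hours / half_life_h)

for hours in (1, 4, 8, 12):
    parent = fraction_remaining(COCAINE_HALF_LIFE_H, hours)
    metabolite = fraction_remaining(BENZOYLECGONINE_HALF_LIFE_H, hours)
    print(f"{hours:>2} h after use: cocaine {parent:.3f}, benzoylecgonine {metabolite:.3f}")

# By 8-12 h the parent drug is a tiny fraction of its starting concentration
# while benzoylecgonine remains substantial, which is why a low-cocaine /
# high-benzoylecgonine pattern points to use many hours before death.
```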
Time to specimen collection After death, bacteria escape from the gastrointestinal tract. Some of these bacteria produce alcohol, while others metabolize it. Alcohol concentrations of 150 mg dl-1, or higher, have been measured in alcohol-free cadavers stored for 2 days at room temperature. In the case of cocaine, if the postmortem interval is long enough, all the cocaine will be converted to benzoylecgonine, and ultimately to ecgonine. The changes that occur with other illicit drugs are unpredictable. Concentrations of drugs stored in fat, such as phencyclidine and marijuana, can be expected to increase, but the rate of increase has never been determined.
Site dependency Blood drug concentrations measured at autopsy depend on where the sample is obtained. Methamphetamine concentrations in left heart blood are 2-3 times higher than concentrations in the right heart, but only a tenth as high as concentrations measured in blood from the pulmonary artery. Cocaine concentrations in the subclavian artery decrease after death, while those in the femoral artery rise. In general, the increases are greatest in central sites (heart and liver) and least in peripheral locations, such as the femoral artery. In individual cases, the changes are completely unpredictable. This situation comes about because of a process known as postmortem redistribution. After death, drugs diffuse along a concentration gradient. The process is more likely to occur if a drug is highly bound to protein and if it is sequestered in a major organ, such as a lung or the liver. Redistribution begins immediately after death and continues indefinitely, although the biggest changes occur within the first 24 h, with the greatest concentration increases occurring at central sites such as the heart and liver.
Sexual and racial differences The forensic importance of genetic polymorphism has only recently been recognized. Equivalent doses of ethanol produced higher peak blood levels in American Indians than in American Whites, and some speculate that this difference may account for higher rates of alcoholism and death among American Indians. In a similar fashion, CYP2D1 polymorphism (the animal equivalent of CYP2D6 in humans) appears to explain the differential mortality rates observed after methamphetamine treatment in rats, and presumably in humans. Opiate metabolism also varies from individual to individual. The enzymatic conversion of hydrocodone to hydromorphone is catalyzed by cytochrome P450 2D6, an enzyme that is inactive in about 7% of Caucasians (poor metabolizers). Death and impairment in poor metabolizers are likely to occur after much smaller amounts of drug have been ingested than in individuals with a normal cytochrome system. In addition, sex hormones also affect drug concentrations and responses. Women metabolize cocaine differently than men, and the same woman will metabolize cocaine differently at different times in her menstrual cycle. The list of drugs affected by such differences is extensive and growing longer.

Estimating drug concentrations at the time of death

Volume of distribution The volumes of distribution for morphine, cocaine, and methamphetamine are all over 3 l kg-1. As a consequence, less than 2% of a given dose is to be found circulating in the blood at any given moment. Since only 2% is actually circulating in the blood, the release of only 1% more from deep tissue stores would be enough to double observed postmortem blood concentrations. In controlled animal studies, concentrations of free morphine, for example, more than doubled during the first 24 h after death. The situation becomes even more complicated when metabolites are considered. The volume of distribution for morphine is more than 10 times that of either morphine-3-glucuronide or morphine-6-glucuronide. In contrast to free morphine, which diffuses into tissues throughout the body, almost all the conjugated morphine is to be found circulating in the blood. Any inference based on measurements of total morphine circulating in the blood may therefore wildly over- or underestimate the actual situation at the time of death.
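A back-of-the-envelope calculation makes the 2% figure concrete. The sketch below assumes a 70 kg adult with roughly 5 l of blood; both figures are illustrative assumptions, not values from the text.

```python
# Rough illustration of why a large volume of distribution (Vd) leaves only a
# small fraction of the dose in the circulating blood.
# The 70 kg body weight and 5 l blood volume are assumptions for illustration.
body_weight_kg = 70.0
blood_volume_l = 5.0
vd_l_per_kg = 3.0                       # morphine, cocaine, methamphetamine: Vd > 3 l/kg

apparent_volume_l = vd_l_per_kg * body_weight_kg       # ~210 l
fraction_in_blood = blood_volume_l / apparent_volume_l
print(f"fraction of dose in blood: {fraction_in_blood:.1%}")  # ~2.4%; smaller for larger Vd

# Shifting even 1-2% of the dose from deep tissue stores into this small
# circulating pool can therefore roughly double the measured blood concentration.
```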
Ethanol estimates Premortem ethanol measurements are, by statute, made with plasma, not whole blood. Postmortem alcohol measurements are made on whole blood, which contains 10-15% less water than serum or plasma (the red cells take up room). In practice, the difference may be much greater, because all tissues in the body lose water after death, including blood. As water is lost, the concentration of alcohol changes. Still, provided the appropriate specimens are obtained at the time of autopsy, it is generally possible to distinguish antemortem intoxication from postmortem alcohol production. One way to do this is by measuring vitreous alcohol. Vitreous fluid contains more water than blood, and a blood:vitreous ratio >0.95 is a good indication that death occurred before the blood and vitreous were in equilibrium, suggesting that blood alcohol concentrations (BACs) were rising at the time of death. Similarly, a urine alcohol concentration (UAC):BAC ratio <1.2 is good evidence that the blood concentration was rising at the time of death. UAC:BAC ratios greater than 1.3 are consistent with a postabsorptive stage at the time of death. Ratios much greater than 1.3 suggest heavy consumption over a long period of time. The relationship between BAC and UAC is not accurate enough for forensic purposes, but BACs can be estimated by dividing the autopsy UAC by 1.35.
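The ratio rules and the UAC-based estimate described above reduce to a few lines of arithmetic. A minimal sketch is given below; the thresholds (1.2 and 1.3) and the 1.35 divisor are those quoted in the text, while the example concentrations are hypothetical.

```python
# Minimal sketch of the UAC:BAC interpretation rules and the UAC-based BAC
# estimate described in the text. Concentrations in mg/dl; example values
# are hypothetical.
def interpret_uac_bac(uac: float, bac: float) -> str:
    ratio = uac / bac
    if ratio < 1.2:
        return "BAC likely rising at time of death (absorptive phase)"
    if ratio > 1.3:
        return "postabsorptive at time of death; much larger ratios suggest prolonged heavy drinking"
    return "indeterminate"

def estimate_bac_from_uac(uac: float) -> float:
    """Rough BAC estimate from autopsy urine alcohol (not forensically definitive)."""
    return uac / 1.35

print(interpret_uac_bac(uac=180.0, bac=100.0))                     # ratio 1.8 -> postabsorptive
print(f"estimated BAC: {estimate_bac_from_uac(180.0):.0f} mg/dl")  # ~133 mg/dl
```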

Conclusions

Except for alcohol, where impairment is defined by statute, relating impairment to specific blood concentrations of any abused drug is, even in the living, problematic. More often than not, it is impossible. After death, the issues become much more complex. Given our current state of knowledge, inferring impairment from any postmortem drug measurements (except perhaps alcohol) is simply not possible.
