Defining Substance Use Disorders

An occasion of drug taking may be a passing indulgence of an adolescent, perhaps initiated in a moment of immature judgment. All too often, the drug taking can become repetitive and may lead to a syndrome of abuse or addiction, possibly in an interaction of inherited vulnerability traits with environmental conditions and processes. The development of addiction and the dependence syndrome in adolescents are emotion-laden, controversial, and often misunderstood topics. Public opinion leaders often do not appreciate the scientific evidence that tends to favor a disease concept of addiction or dependence on alcohol, tobacco, and other drugs such as cocaine or cannabis (McLellan, Lewis, O'Brien, & Kleber, 2000). This evidence includes a neuronal basis for many of the prominent clinical features of dependence (Dackis & O'Brien, 2003), genetic vulnerability (Vanyukov & Tarter, 2000), and a characteristic chronic, relapsing course that resembles that of many medical illnesses. Unfortunately, these biological bases of addiction are often forgotten or misunderstood by a nonetheless opinionated American public that can view adolescent drug problems as purely behavioral and morally objectionable. Consequently, many affected adolescents end up being managed by their parents, school authorities, or the judicial system rather than being treated in specialized adolescent treatment programs. Even those seeking treatment often discover that appropriate programs do not exist in their geographic region or are difficult to access because of low capacity or managed care policies. While there would be a public uproar if treatment were not available to adolescents with head injuries, diabetes, or cancer, obvious disparities in addiction treatment have been tolerated for decades and may well reflect public skepticism about the biological bases of drug dependence and addiction. Unfortunately, we have much to learn about the onset, nature, and treatment of these conditions in adolescents. It is our contention that gaps in our knowledge should be addressed with research and clinical experience, and not cited to justify an inadequate treatment infrastructure.

Syndromes of alcohol, tobacco, and other drug abuse and addiction have been defined and redefined over the past several decades, and these definitions have now achieved international acceptance. The central clinical features generally include (a) disturbances of mental life in the form of obsession-like ruminations or even craving for drugs or drug-related experiences, (b) disturbances of behavior in the form of compulsion-like repetitive drug taking or drug-related behaviors to the detriment of normal activities, and (c) manifestations of neuroadaptation to drug exposure, in the form of pharmacological tolerance and (sometimes) observable and characteristic withdrawal syndromes when there is an abrupt cessation of drug use. Many definitions of abuse and addiction reference a loss of control during cycles of euphoria and craving. Any definition must account for the continuum of severity that ranges from minimal use with limited consequences on one end to compulsive use with serious functional impairment on the other. Some adolescents show progressive impairment and move along the continuum, whereas others remain at a relatively stable level of severity. The American Psychiatric Association's (1994) Diagnostic and Statistical Manual of Mental Disorders, 4th ed. (DSM-IV) uses the label substance abuse to describe use with limited negative consequences, whereas the term dependence refers to a generally more serious loss of control of drug taking and a clinical syndrome that clusters the clinical features mentioned above. The term dependence as used in the official DSM-IV manual often leads to semantic confusion with dependence in the pharmacological sense, which is a normal response to repeated use of many different types of medications, including drugs for the treatment of hypertension, depression, and pain. Thus many clinicians prefer the term addiction when referring to dependence as defined in DSM-IV. This definition applies the same diagnostic criteria to all pharmacological classes, acknowledging that similar disturbances of mental life and behavior, as well as manifestations of neuroadaptation, may develop regardless of whether the drug taking involves alcoholic beverages, tobacco (nicotine), or other drugs such as cocaine, heroin, or cannabis. Accordingly, alcohol is grouped with other dependence-producing drugs in the DSM-IV classification and will be categorized as such in this volume.

Seeking a fundamental understanding of the processes that lead toward drug addiction, clinical investigators have found evidence that drug-induced euphoria is linked to pharmacological activation of reward-related brain regions that normally mediate natural rewards such as food, water, and sex (Dackis & O'Brien, 2003b). Reward-related circuits that have evolved over millions of years to ensure survival may actually be dysregulated by the chronic use of drugs. The clinical features of craving, loss of control, and impaired hedonic function have also been linked to dysregulation of brain reward centers (Dackis & O'Brien, 2001, 2003b).

Tolerance is manifested in an escalation of drug dose to achieve a given level of drug-induced effect. The process of developing tolerance can begin with the first exposure to the drug, with subsequent compensatory neuroadaptational changes that often oppose the acute effects of the drug. For example, the chronic administration of heroin and diazepam can produce so much tolerance that a 100-fold increase in dose is required to produce certain pharmacological effects that were apparent at the first low dose. Tolerance may also occur over brief periods, as evidenced by ever-decreasing euphoria with successive cocaine doses during a protracted binge. Tolerance develops more rapidly to some drug effects than to others: drug-induced euphoria typically requires ever-increasing doses, whereas tolerance to toxic effects (e.g., heroin-induced respiratory depression) may develop more slowly. Individual variability in the rate at which tolerance develops appears to be genetically determined and may help explain variations in the risk of drug overdose.

In contrast with pharmacological tolerance as ordinarily defined, a reverse tolerance or sensitization can occur, manifested as an increased drug response after repeated administration. For instance, successive equal doses of cocaine can be followed by increases in motor activity (i.e., the opposite of what is expected if tolerance has developed) in rats given the drug at daily intervals. Examples of reverse tolerance or sensitization in humans are difficult to demonstrate but may include cocaine-induced psychosis, seizures, and cue-induced craving (Dackis & O'Brien, 2001). Tolerance and sensitization phenomena are influenced by genetic factors that affect the distribution of drugs in the body (pharmacokinetics) and the receptor-mediated nervous system adaptations that follow drug exposure (pharmacodynamics).

A withdrawal syndrome is seen most clearly when there is abrupt discontinuation of repetitive drug taking. It is possible to conceive of the withdrawal features as manifestations of compensatory or homeostatic brain changes that have been established in response to chronic or repeated drug exposures. If the drug is abruptly discontinued, these changes are suddenly unopposed and there can be a rebound in the form of a characteristic cluster of withdrawal signs and symptoms. These clusters or drug withdrawal syndromes can vary markedly across classes of addictive drugs and typically include clinical features that are opposite those seen during intoxication. The onset, duration, and clinical course of withdrawal also can vary from one individual to another. It should be emphasized that withdrawal from alcohol or sedative or hypnotic agents (especially barbiturates) is potentially lethal and often requires intensive medical inpatient treatment. Treatment of drug withdrawal (detoxification) is often accomplished by administering descending doses of a medication with the same types of action as the dependence-producing drug. Therefore, opioids (such as methadone) are used to reverse heroin withdrawal, longer-acting benzodiazepines are used to counter withdrawal from shorter-acting benzodiazepine drugs, and longer-acting barbiturates are used to counter withdrawal due to shorter-acting barbiturates. In addition, benzodiazepines and barbiturates effectively reverse the alcohol withdrawal syndrome, based on their shared action on underlying neuronal (GABA) circuitry. Although detoxification is an important clinical intervention that allows the brain to equilibrate in the absence of the dependence-producing drug, it is seldom sufficient to arrest the cycle of craving and euphoria. Consequently, detoxified patients require referral to continued drug rehabilitative treatment.

As we seek an understanding of vulnerability to drug dependence and addiction, we can turn to three major domains of influence. This triad includes (1) the drug used (the “agent”), (2) the constitutional characteristics of the user (the “host”), and (3) the physical and psychosocial setting (the “environment”). As we look across classes of drugs, the nature and potency of drug-induced euphoria vary considerably, and this euphoria may be an important reinforcer of repetitive drug-taking behavior. Among addiction specialists, there is a general consensus about a hierarchy of addictiveness or dependence liability among drugs, although it is arguable how central stimulants, opioids, alcohol, nicotine, sedatives and hypnotics, marijuana, and hallucinogens should be specifically ranked. Animal models provide one means of gauging the reinforcing functions of each drug as an agent in the process of developing drug dependence. When properly equipped with intravenous delivery systems, laboratory animals will self-administer various drugs with varying enthusiasm. For instance, laboratory animals with unlimited access to cocaine or amphetamine generally will self-administer these agents until they die; in contrast, training animals to self-administer alcohol or cannabis can be difficult. Evidence for the biological basis of drug reward includes the phenomenon that animals will self-administer drugs like cocaine without special training, as well as the finding that nearly all drugs with dependence liability (cocaine, amphetamine, heroin, alcohol, nicotine, marijuana) share a common action of activating brain reward systems, as indicated by elevated dopamine levels in the nucleus accumbens (Dackis & O'Brien, 2003b). In fact, researchers routinely use this neurochemical signature of drug reward to screen the addictive potential of compounds synthesized in the pharmaceutical research and development process. Natural rewards such as food, water, and sex also increase dopamine levels in the nucleus accumbens, consistent with the notion that dependence-producing drugs cause euphoria by activating natural pleasure pathways. Although dopamine plays a central role in natural and drug-induced reward, there is also evidence that endogenous opioid, glutamate, γ-aminobutyric acid (GABA), and serotonin systems are mechanistically involved.

The ability of a drug to induce euphoria in humans is correlated with its dependence liability or addictiveness and can be greatly enhanced when the drug is taken by routes of administration that produce rapidly rising brain levels of the agent. The importance of the route-of-administration principle is illustrated in the history of cocaine use. People living in the Andes Mountains have chewed coca leaves since antiquity with few deleterious effects, largely because cocaine brain levels increase very slowly when the drug is absorbed through the buccal mucosa. However, once pure cocaine was chemically isolated from the coca leaf in the form of hydrochloride (HCl) powder, it became possible to administer high doses by the more efficient intranasal and intravenous routes. Once cocaine HCl powder was administered intravenously or intranasally, its toxic effects became more obvious. During the 1980s the by-then illegal cocaine marketplace changed markedly when an underground pharmaceutical innovation appeared in the form of crack-cocaine, which is heated and vaporized so that the fumes can be inhaled (“smoked”). Whereas the intranasal route typically yields a cocaine “high” over the course of minutes after insufflation, the onset of the high is far more rapid with smoked crack-cocaine. Indeed, within seconds after being smoked and absorbed in the lungs, the vaporized cocaine reaches the brain, and the almost instantaneous rush of euphoria may fuel a crack-associated rapid emergence of the cocaine dependence process, as indicated by recent epidemiological evidence (Chen & Anthony, 2004). A similar rapid onset of effects can be achieved by injecting cocaine, but smoking crack is more convenient and more socially acceptable than injecting cocaine, especially when adolescents already have a history of smoking cigarettes or marijuana.

In addition to characteristics of the drug agent and its route of administration, host factors can contribute significantly to the onset and course of the dependence process. Disenfranchised adolescents might use drugs to gain peer acceptance, and thrill-seeking adolescents may be attracted to drug euphoria. Adolescents with clinical features of depression, anxiety, or phobias may sometimes use drugs for relief of these symptoms. Unfortunately, chronic brain changes produced by drugs often exacerbate the very symptoms they initially alleviated, and these drugs may well produce far more psychiatric symptoms than they relieve.

Inherited traits correlated with a family history of drug dependence are also now appreciated as important host characteristics associated with the risk of becoming drug-dependent. Children of alcoholics are at increased risk of developing alcoholism, even after they have been adopted at birth and raised by nonalcoholic parents, and identical twins are more likely than fraternal twins to be concordant for alcoholism (Schuckit, 2000) and nicotine dependence (Sullivan & Kendler, 1999). Drug effects vary markedly from one individual to another and are influenced by genetic variations that affect drug absorption, metabolism, excretion, and receptor-mediated neuronal responses. Researchers are actively investigating profiles of candidate genes that encode the enzymes or other protein products related to these functions, as described in later sections about specific agents.

Individuals with extensive family history loadings for drug dependence do not necessarily become addicted, however, partly because environmental factors (supply, psychosocial norms, and peer pressure) also influence the onset and development of dependence processes. Drug supply is an essential and permissive factor that has been targeted by law enforcement initiatives. Without drug access, there can be no addiction. Aside from the physical availability of a drug, access can also be affected by its cost, with, for example, the increased use of cocaine and heroin in the past decade coinciding with the declining street price of these drugs. Psychosocial norms also affect the timing of first and subsequent uses of addictive drugs. In some communities, drug users and drug dealers are actually viewed as successful role models to be respected and emulated by young people. This unfortunate situation is often compounded by inadequate educational, vocational, and diversionary options that would otherwise offset the attractiveness of a drug-related lifestyle. Drug users often attempt to regain control of their drug use, but relapse to compulsive use is almost inevitable. When the user returns to a neighborhood where he or she previously used drugs, the environmental cues may reflexively precipitate craving and resumption of drug taking. Environmental cues (people, places, and things) that can trigger craving are an important, research-based influence on both continued drug use and relapse after treatment (O'Brien, Childress, McLellan, & Ehrman, 1992). Since agent, host, and environmental factors are cumulatively involved in addictive illness, effective prevention and treatment interventions typically address all three realms.

THE EXTENT AND NATURE OF THE PROBLEMS

The field of epidemiology offers a description in statistical terms of the nature and extent of problems associated with adolescent use of alcohol, tobacco, and other psychoactive drugs. In this section, we describe trends and offer estimates for the prevalence of different forms of drug taking, as well as the occurrence of the clinically defined syndromes of abuse and addiction. As we seek to understand the nature and extent of associated problems we stress the plural. Indeed, a description of “the problem” would oversimplify the complex phenomenon of drug use and addiction described previously in this chapter.

Evolution of the Drug Epidemic Among Mainstream Youth in the United States: 1960s–1990s

Against a background of quite prevalent alcohol and tobacco use in the United States, the prevalence of illegal drug use grew dramatically among American adolescents beginning in the 1960s. Since then, there have been dynamic changes. New drugs have emerged (e.g., crack-cocaine; MDMA, or ecstasy). Major historical events led to great social upheaval and alienation among many youth (e.g., the Vietnam War, the Watergate scandal). Concurrently, youth involvement in illegal drug use changed from being a set of experiences and behaviors concentrated mostly in fairly small, identifiable subgroups on the periphery of mainstream American society to becoming a widespread and more normative set of experiences and behaviors. In terms of the numerical increases of new users over previously observed and expected numbers, these illegal drug use experiences came to be characterized as an epidemic, and drug use became an important “generation gap” issue, with youths and adults often holding radically different views of the acceptability of drug use—particularly the use of certain drugs such as marijuana and hallucinogens (National Commission on Marijuana and Drug Abuse, 1972).

At the peak of this epidemic of illegal drug use, between 1975 and 1980, fully two thirds of American adolescents had tried an illegal drug by the time they finished high school (Johnston, O'Malley, & Bachman, 2002a; see also Substance Abuse and Mental Health Services Administration (SAMHSA), 2002a). This represented a dramatic increase over the prevalence proportions observed in the mid-1960s, when the epidemic began (Johnston, 1973; National Commission on Marijuana and Drug Abuse, 1972). During this interval, illegal drug use came to be associated with political beliefs—particularly, being against the Vietnam War—and with certain lifestyle orientations, as reflected in the counterculture (Johnston, 1973; Zinberg & Robertson, 1972). At the population level, there were also associations with other forms of rule-breaking behavior, deviance, or delinquency unrelated to using drugs. These associations remain, even though many drug-using youths are otherwise rule abiding and do not show other conduct problems, and some youths with conduct problems do not take illegal drugs (Jessor & Jessor, 1977; Johnston, 1973; Osgood, Johnston, O'Malley, & Bachman, 1988).

If the Vietnam War and other historical events of the 1960s and early 1970s accounted for the dramatic increase in the epidemic of illegal drug use, one might have expected a downward trend with passage of time since these events. That turned out not to be the case. The Vietnam War ended in 1973, but the rise in drug use continued into the late 1970s, albeit with some fading of the symbolic meanings of illegal drug use. During this interval, the illegal drug use epidemic developed its own forward momentum.

As gauged in relation to the number of new adolescent users year by year, it was not until the late 1970s that the epidemic trend turned downward. Even so, the proportion of young people continuing to use drugs did not decline across the board until after 1985. The decline in each year's prevalence proportion persisted into the early 1990s. During this interval of time, norms among young people against the use of many of the illegal drugs strengthened considerably. Our surveys showed an increased appreciation of harms associated with illegal drug use, in particular marijuana, cocaine, crack, and phencyclidine (PCP) (Bachman, Johnston, & O'Malley, 1990, 1998; Johnston, 2003).

This change in perceptions about drugs and a concomitant decline in prevalence of illegal drug use during the late 1980s, however, may have helped to sow the seeds of its own reversal and to create a context for the emergence of a noteworthy “failure of success.” By this we mean that important new historical events were emerging in the late 1980s and early 1990s to take center stage in the popular imagination. Headlines and media attention shifted away from domestic matters such as illegal drug use and toward concerns about terrorism abroad, the Iraqi invasion of Kuwait, and the emerging Gulf War. With these declines in coverage of the drug issue by the media, Congressional attention and budgeting for drug prevention programs shrank considerably, and political attention to the issue declined generally. Young people were hearing and reading much less about illegal drug use, and perhaps more importantly, their perceptions reflected less familiarity with the hazards that go along with illegal drug use (Johnston, 1991, 2003). In our retrospective analysis of the domains of influence that govern youths' appreciation of the hazards of illegal drug use, we build from a foundation of observations by historian David Musto and others, highlighting (a) media coverage, (b) drug prevention programming, (c) personal experiences, and (d) vicarious experiences (i.e., learning from peers, parents, or other relatives of the drugs' hazards experienced by others). To the extent that society achieves success in dampening the prevalence of illegal drug use (by whatever means), we may expect declines in the first two domains: (a) media coverage falls, and (b) support for prevention programming wanes. Part and parcel with declines in the prevalence of illegal drug use come less personal experience with these drugs, fewer chances to try the drugs, and fewer young people experiencing the hurt that often goes along with drug taking. In addition, there is less of the vicarious experience with the associated hazards that can be gained by personal acquaintance with other young people or adults whose lives have been harmed by illegal drug use. Hence, on the downward side of an epidemic curve of illegal drug use, the same processes that fuel the continuing decline in illegal drug use are fueling a decline in adolescents' personal and vicarious knowledge of its hazards. In this sense, a “success” in the form of declining prevalence of illegal drug use sows the seeds for a “failure” and later rebound—to the extent that knowledge of drug-associated hazards helps to promote resistance when the young person faces the first or subsequent chance to try an illegal drug.
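The feedback loop just described can be made concrete with a toy model. The sketch below (in Python, purely for illustration) couples two variables, prevalence of use and perceived risk; every coupling constant and starting value is an assumption invented for the example rather than an estimate from any survey, but the qualitative behavior matches the cycle of decline and rebound described above.

```python
# Toy difference-equation model of the "failure of success" dynamic.
# All parameter values below are illustrative assumptions, not data.

def simulate(years=40, prevalence=0.10, perceived_risk=0.80):
    """Each year, perceived risk drifts toward the level of visible harm
    (which tracks current prevalence via media coverage, prevention
    programming, and personal/vicarious experience), while prevalence
    rises when perceived risk is low and falls when it is high."""
    history = []
    for year in range(years):
        history.append((year, prevalence, perceived_risk))
        perceived_risk += 0.30 * (4.0 * prevalence - perceived_risk)
        perceived_risk = min(max(perceived_risk, 0.0), 1.0)
        prevalence += 0.05 * (0.5 - perceived_risk)
        prevalence = min(max(prevalence, 0.0), 1.0)
    return history

for year, prev, risk in simulate()[::5]:
    print(f"year {year:2d}: prevalence {prev:.2f}, perceived risk {risk:.2f}")
```

Run over a few simulated decades, the two variables chase each other in damped cycles: a decline in use erodes perceived risk, which in turn sets up the rebound, just as the narrative suggests.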

These seeds for a resurgence of illegal drug use among American adolescents had been sown in the late 1980s and early 1990s, and for most illegal drugs, the epidemic curve turned upward in the early 1990s, with a generally persistent trend of increasing proportions of new users and continuing users into the late 1990s. Marijuana use exhibited the sharpest rise during this “relapse phase” in the epidemic, but use of most of the drugs in the ever-growing list of alternatives increased during this period as well (Johnston, O'Malley, & Bachman, 2003b; SAMHSA, 2002a,b).

By the late 1990s there were once again signs of improvement, with prevalence of inhalant use beginning to decline in 1996, and prevalence of marijuana use peaking or stabilizing by 1997. Other drugs began to decline at various points, including LSD (lysergic acid diethylamide), cocaine, and finally heroin. But one drug was growing sharply in popularity in the late 1990s—namely, ecstasy, or MDMA (3,4-methylenedioxymethamphetamine). With respect to prevalence of recent use, this newcomer to the list of popular drugs reached peak values in the 2000–2002 interval, and there now may be a persisting downward trend in prevalence of MDMA use, concomitant with increases in the proportion of young people perceiving adverse effects and hazards of MDMA use, as happened with a number of other drugs in prior years (Johnston et al., 2003b).

The ecstasy epidemic among youth and young adults illustrates one very important feature of the larger epidemic of illegal drug use over the past three decades: that there has been a continuing march of new drugs onto the scene, each presenting youths with new alternatives. While American youth may have learned about the dangers of most of the existing alternatives on the menu, we may expect continuing innovations and “designer drugs,” perhaps with claims that the new drug compounds have no adverse consequences. As an elaboration of the “failure-of-success” concept, we now appreciate that when one birth cohort of adolescents comes to appreciate the dangers of a drug, by learning about its consequences through the media, prevention programming, and personal and vicarious experience, and enters the lower-risk developmental period of adulthood, the more recently born cohorts of children enter adolescence without the history of accumulated knowledge and belief about the hazards connected with the drug. These adolescents are thus prone to reexperience and relearn these dangers on their own.

The ecstasy/MDMA example illustrates another important point of particular relevance to prevention. A careful examination of the trends over time for the various classes of illegal drugs will show that, to a considerable degree, they each have unique cross-time profiles (Johnston et al., 2003a; SAMHSA, 2002a). The use of one drug may be rising at the same time that the use of another is falling and perhaps a third is holding steady. This means that the different drugs are responding to influences that are specific to them—very likely, factors such as the perceived benefits of using that drug as well as the perceived dangers of doing so. Although there may be some larger social forces, such as the Vietnam War, that change the overall proportions of youth willing to engage in illegal drug use generally, there are also many important drug-specific influences that must be taken into account. This means that drug education, communication, and persuasion efforts may be most valuable when they address each class of drug separately. The attitudes and beliefs related to drugs are so varied that, in addition to pursuing goals common to all drugs, such as strengthening resistance to peer influence in favor of any illegal drug, there should be educational efforts specific to individual drugs (Johnston et al., 2003a; SAMHSA, 2002a).

Major Data Sources Documenting Adolescent Drug Experiences in the United States

There are several ongoing survey series available for assessing the size and nature of the American adolescent drug experience, based on scientifically selected national samples. Two are based on in-school surveys using self-completed paper-and-pencil questionnaires administered to students in group settings: the Centers for Disease Control and Prevention's (CDC's) Youth Risk Behavior Survey (YRBS), and the University of Michigan's Monitoring the Future (MTF) study, sponsored by the National Institute on Drug Abuse. The MTF was launched in 1975 and the YRBS in 1991.

In the YRBS, a nationally representative sample of some 13,000 to 14,000 students in grades 9 through 12, enrolled in about 150 public and private high schools, is surveyed by means of self-administered, optically scanned questionnaires (Grunbaum et al., 2002; Kann, 2002). Data are gathered biennially. Measurement is spread across a range of risk behaviors for adolescents, so little information is gathered on attitudes, beliefs, or social surroundings specifically related to drug use. This is a repeated cross-section design.

In the MTF, some 45,000 students in grades 8, 10, and 12, enrolled in public and private schools in the coterminous United States, are surveyed annually. Self-administered, optically scanned questionnaires are used in this study as well. Extensive information is gathered on attitudes, beliefs, and various social influences from the family, school, work, and mass-media environments. In addition, representative panels of high school seniors are selected for follow-up each year and are then surveyed by mail for some years after high school graduation in this cohort-sequential study design.1

The third source of population survey data derives from national household surveys that include and report separately on youth 12 to 17 years old—the National Household Survey on Drug Use and Health (NHSDUH), known until very recently as the National Household Survey on Drug Abuse (NHSDA).2 Data were gathered from adolescents for many years by personal interview in combination with private answer sheets on drug use, but very recently the methodology has shifted to computer-assisted interviewing. This series began in 1971 with a survey for the National Commission on Marijuana and Drug Abuse (1972). Over the years there have been a number of changes in measurement content, measurement methods, and sample sizes, all of which have made accurate trend estimation more of a challenge. In general, however, trends have been reasonably parallel to those generated by MTF and YRBS. The NHSDUH uses a repeated cross-section design, with surveys conducted at various intervals in the past, but on an annual basis in recent years.3

The National Comorbidity Survey (NCS; Kessler, 1994) represents a somewhat different approach to generating prevalence data on drug use, in that it measures the prevalence and correlates of DSM-III-R disorders, as well as the connection of drug dependence and related disorders to the various other psychiatric disorders (Anthony, Warner, & Kessler, 1994). One key finding is that there is a significant level of such comorbidity among individuals with a drug dependence syndrome (see, for example, Kessler et al., 1996, 2001).4 Unfortunately, the relatively small sample of adolescents in the first iteration of the NCS limits the precision of estimates specific to adolescents. The NCS household sample included around 8,000 people ages 15 to 54 in the noninstitutionalized population of the United States, and it was fielded between 1990 and 1992. Sequel surveys are now being carried out that will, among other things, provide trend estimates on many of the conditions measured in the first wave. In addition, and of particular relevance here, a supplementary sample of 10,000 adolescents is included in the current work to specifically examine the prevalence and correlates of drug dependence and other disorders among adolescents.

Still another, completely different, type of information is gathered nationwide from hospital emergency rooms and coroners' offices as part of the Drug Abuse Warning Network (DAWN; e.g., SAMHSA, 2002b), in which case counts are made of people treated for medical emergencies involving any of a range of drugs and of people who die with identifiable evidence of drug use present. Unlike the population surveys, which attempt to estimate prevalence and trends in drug use in major segments of the national population, the DAWN system is intended to generate data on case counts of drug-related “casualties.”

Prevalence and Trends in the Adolescent Use of Various Drugs

Prevalence rates of adolescent use of specific substances (heroin, cocaine, alcohol, and tobacco especially) are given in the discussion of those substances later in this chapter. Here we focus on the prevalence rates and trends of use across various substances.

The 2002 MTF survey shows that a quarter (25%) of today's young people have tried some illegal drug before finishing eighth grade—that is, by age 13 or 14—and more than half (53%) have done so by the end of high school (Johnston et al., 2003b). If inhalants are included in the definition of illegal drugs, the numbers are even higher (32% and 55%, respectively). Prevalence rates for any illegal drug, marijuana, cigarettes, and binge drinking of alcohol observed among 8th-, 10th-, and 12th-grade students in these nationally representative surveys are given in Figures 17.1–17.4. (Grade 8 students are 13 or 14 years old for the most part, and grade 12 students are mostly 17 or 18 years old.) Trend data for the three grades illustrate a number of the points made above, including the dynamic nature of this class of problem behaviors among youth, and the fact that different drugs tend to vary independently of one another.5

[Figure 17.1 Trends in annual prevalence of any illicit drug other than marijuana in three populations.]

[Figure 17.3 Trends in 30-day prevalence of cigarette smoking in three populations.]

[Figure 17.4 Trends in 2-week prevalence of binge drinking in three populations.]

By comparison, Tables 17.1 and 17.2 provide trends in the proportion of 8th-, 10th-, and 12th-grade students in MTF who reported receiving treatment for their use of alcohol and/or illegal drugs in recent years.6 As would be expected, the values rise with age (and, therefore, with prevalence), but not as much as one might expect. Among 12th graders (the only ones asked about the distinction), the number receiving outpatient treatment or counseling exceeds the number receiving inpatient treatment by a ratio of 3:1 to 5:1 in recent years, but by larger ratios in earlier years (a quick recomputation of these ratios from Table 17.2 appears after the tables). The lifetime prevalence of treatment also rose toward the end of the 1990s, no doubt reflecting the relapse in the epidemic of adolescent drug use from the early to mid-1990s.

Table 17.1 How Many Have Ever Received Drug Treatment or Counseling?: Cumulative Proportion (%) Estimated for Each Year, 1988–2001

Grade             1988–1989  1990–1991  1992–1993  1994–1995  1996–1997  1998–1999  2000–2001  Average
8                     —          —         2.9        2.7        3.3        3.0        2.8        2.9
10                    —          —         2.7        3.1        3.9        3.9        4.0        3.5
12 (total)           3.7        3.6        3.3        3.9        4.5        4.7        4.1        4.0
12 (residential)     1.2        1.1        1.1        1.5        1.4        1.2        1.1        1.2
12 (outpatient)      3.3        3.4        2.9        3.4        3.9        4.5        3.7        3.6

— = data not available.

Source: Johnston, L. D., O'Malley, P. M., & Bachman, J. G. (2002a). National survey results from the Monitoring the Future study, 1975–2001. Volume I: Secondary school students. Bethesda, MD: National Institute on Drug Abuse.

Table 17.2 How Many Received Drug Treatment or Counseling Each Year?: Prevalence Proportion (%) Estimated for Each Year, 1988–2001

Grade             1988–1989  1990–1991  1992–1993  1994–1995  1996–1997  1998–1999  2000–2001  Average
8                     —          —         1.4        1.2        1.5        1.4        1.4        1.4
10                    —          —         1.3        1.5        1.9        2.0        1.9        1.7
12 (total)           1.6        1.6        1.6        1.9        2.3        2.2        1.9        1.9
12 (residential)     0.4        0.2        0.4        0.5        0.4        0.5        0.5        0.4
12 (outpatient)      1.6        1.5        1.4        1.8        2.1        2.1        1.7        1.7

— = data not available.

Source: Johnston, L. D., O'Malley, P. M., & Bachman, J. G. (2002a). National survey results from the Monitoring the Future study, 1975–2001. Volume I: Secondary school students. Bethesda, MD: National Institute on Drug Abuse.
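As a quick arithmetic check on the outpatient-to-residential ratio mentioned before the tables, the short sketch below (Python, used here purely for illustration) recomputes the ratio from the 12th-grade annual-prevalence rows of Table 17.2.

```python
# Values transcribed from the grade-12 rows of Table 17.2 above.
periods     = ["1988-89", "1990-91", "1992-93", "1994-95",
               "1996-97", "1998-99", "2000-01"]
outpatient  = [1.6, 1.5, 1.4, 1.8, 2.1, 2.1, 1.7]  # % treated each year
residential = [0.4, 0.2, 0.4, 0.5, 0.4, 0.5, 0.5]  # % treated each year

for period, out, res in zip(periods, outpatient, residential):
    print(f"{period}: outpatient-to-residential ratio = {out / res:.1f}:1")
```

The recent periods come out near the 3:1 to 5:1 range cited in the text, while the early 1990s show the larger ratios also noted there (the 1990–91 value is inflated by the very small residential denominator).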

Other important survey series that attempt to measure clinically defined cases of drug dependence and related problems are the National Household Surveys mentioned above (e.g., SAMHSA, 2002a) and the National Comorbidity Survey (Anthony, Warner, & Kessler, 1994; Kessler, 1994). However, as explained above, it is still too soon to derive much from the NCS with specific regard to adolescents.

Subgroup Differences in Substance Use

Not all adolescent and young adult subgroups in society are at equal risk of being a recently active drug user (Johnston et al., 2002a; Wallace et al., 2003). During adolescence, at least, African-American youngsters have substantially lower prevalence of use of the full range of legal and illegal substances than their white, Hispanic, or Native American counterparts, although there seems to be some reversal of this difference in early and later adulthood (SAMHSA, 2002a). Native Americans tend to have the highest prevalence of use, and in early adolescence, Hispanic youth tend to have the next highest. Adolescent boys are somewhat more likely than girls to use most drugs, and quite a bit more likely to use them frequently. Van Etten, Anthony, and colleagues have pursued a line of research that traces this basic male–female difference in drug experience back to an earlier male excess in the experience of the first chance to try drugs; that is, males have the chance to try illegal drugs earlier than females and at any given age are more likely to have experienced a chance to try these drugs. However, once the first chance to try a drug is presented, girls are just as likely as boys to actually consume the drug (e.g., see Van Etten & Anthony, 2001; Van Etten, Neumark, & Anthony, 1999). Contrary to popular opinion, rural areas now generally do not have lower prevalence of most types of drug use than more urban areas, and sometimes actually have higher rates, which speaks to how thoroughly the drug epidemic has diffused to all parts of American society. The finding that most forms of drug use do not vary much as a function of the educational level of the parents, a measure of socioeconomic status, also reflects this diffusion. The use of certain drugs does tend to concentrate in particular regions of the country—crystal methamphetamine use in the West, for example—but for the most part regional similarities are more noteworthy than the differences. Sometimes, when a new drug is coming onto the scene, such as crack in the 1980s and ecstasy in the late 1990s, there are larger geographic differences until the diffusion process takes place. Most of the demographic and family background subgroup differences mentioned above tend to enlarge during periods of greater use of a drug and to diminish during periods of contraction in use; still, they tend to remain consistent in direction over long historical periods, with very few exceptions. Regional differences in cocaine use expanded dramatically in the early 1980s, at the height of the cocaine epidemic, with the West and the Northeast attaining considerably higher prevalence of use than the South or the North Central region. But during the contraction period that followed, beginning in 1987, the West and Northeast also showed the most dramatic declines (Johnston et al., 2003b).

An International Comparison

The American drug epidemic of the 1960s and 1970s spread around the globe, as a mobile generation of young people traveled the world. Even so, illegal drug use generally did not penetrate so deeply into youth populations of other countries. A recent 30-country prevalence survey of illegal drug use among 15-year-olds (mostly in European countries) shows generally larger values for American adolescents than for any of the 30 other countries (Hibell et al., 2000). Despite the long-ago passing of forces giving special impetus to the American epidemic, and despite progress in prevention, we still are in the top rank.7

A caution is in order, however. When an epidemic has occurred, as happened with cocaine in the 1970s and early 1980s, it can give rise to a considerable population of continuing, dependent users. A result is that “casualty” counts of impaired users needing treatment can lag behind general trends in prevalence of use.

CLINICAL ASPECTS OF SPECIFIC SUBSTANCE USE DISORDERS

Our understanding of adolescent addiction, in both its pharmacological and behavioral realms, is somewhat limited, and research advances in this area have been hampered by several considerations. First, the controlled drug-administration experiments commonly conducted with adult research volunteers have, for the most part, not been possible with adolescents. Thus much of what is known about the pharmacology of drugs in adolescents must be inferred from experience with adults. The ethical, regulatory, and related informed-consent issues are such that much of the pharmacologic research on drugs that adolescents should not be using must necessarily take place with animal models or with adult volunteers. Of course, useful information can be learned from clinical experience and observations, but even our clinical experience has been limited by adolescent resistance to treatment, by social stigma, and by an inadequate addiction treatment infrastructure in the United States. Furthermore, anecdotal clinical information is much less reliable than that gleaned from controlled studies, such as have been performed on adult substance abusers. Experience from clinical settings, such as emergency rooms and treatment clinics, provides information on the pharmacology of adverse drug consequences, but less information on the more typical pharmacologic effects of illicit drug use experienced by the majority of adolescent users, who never appear for treatment of adverse consequences.

Another consideration when describing effects of drugs in adolescents is that it is traditional to present and discuss the pharmacology of each drug or drug class individually. However, adolescents who abuse drugs, particularly those of most concern who use drugs regularly, seldom take only one drug during an evening or day of drug use. All of the drugs reviewed here are typically used in various combinations rather than individually. For example, when considering the pharmacology of cannabis, it is worth noting that hardly anyone begins to use cannabis regularly before becoming experienced with alcohol and tobacco, although recent data suggest this pattern may be changing. More often than not, once someone has become a regular user, all three of these drugs (and often others) are used in close proximity or together. This is true to varying degrees for all the drugs attractive to some adolescents. The pharmacology and toxicity of drug combinations can be complex and different from the pharmacology of the drugs used individually.

Heroin and Other Opioid Dependence in Adolescence

Most adolescents view heroin as extremely dangerous, and few if any plan to become addicted to this agent. Yet heroin use has increased since 1992 among all age groups, and the average age of first-time use has declined (SAMHSA, 1997). Furthermore, heroin is now available to adolescents living in urban, suburban, and even rural settings, where alarming numbers of adolescents are seeking treatment for heroin addiction and presenting to emergency rooms with heroin-related problems. There has also been an uptrend among adolescents in the abuse of prescription pills with opioid agonist action (see Table 17.3) that are capable of producing the same clinical elements characteristic of heroin dependence. In past years heroin was primarily injected, and the reluctance of adolescents to use needles probably provided a significant barrier to first-time use. But the purity of heroin has increased significantly, and the drug is now widely administered by the less efficient but more socially acceptable nasal route, making first-time use less onerous to adolescents who are introduced to heroin by peers, acquaintances, and family members. In addition, a precipitous decline in the price of heroin has made the drug affordable to adolescents.

Table 17.3 Opioid Pain Relievers

Codeine (Phenergan, Robitussin)
Hydrocodone (Hycodan)
Hydromorphone (Dilaudid)
Meperidine (Demerol)
Methadone (Dolophine)
Morphine (Roxanol)
Oxycodone (OxyContin, Percocet, Percodan)
Pentazocine (Talwin)
Propoxyphene (Darvocet, Darvon)

Data on adolescents from the National Household Surveys on Drug Use and Health (NHSDUH) indicated that in the 12- to 17-year age range, there were 47,000 active heroin users in the United States (0.2% prevalence), of whom 34,000 had recently begun using heroin (0.1% incidence) (SAMHSA, 2000). Chen and Anthony (under review) studied the clinical course of adolescent heroin users by analyzing NHSDUH data from calendar years 2000 and 2001 and found that approximately 22% of first-time users became addicted. In addition, clinical features reported by the adolescent heroin users (time spent seeking heroin and recovering from its effects, failed efforts to quit, dose escalation, continued use despite emotional and physical problems, reduction in nondrug activities) were generally much greater than corresponding estimates for stimulants, alcohol, marijuana, hallucinogens, sedatives and hypnotics, and inhalants. There are approximately 1,262,000 adolescents using prescription pain relievers (5.4% prevalence), 722,000 recent-onset users (3.3% incidence), and a 9% likelihood of becoming addicted (SAMHSA, 2000).
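To make the relation between these counts and the quoted percentages concrete, the sketch below back-calculates rates from an assumed denominator of roughly 23.5 million U.S. adolescents aged 12 to 17; that denominator is implied by the reported counts and percentages, not stated in the survey reports, and the results will not match the published figures exactly because the published values are rounded and incidence denominators exclude prior users.

```python
# Illustrative only: ADOLESCENT_POPULATION is back-calculated from the
# reported counts and rounded percentages, not taken from the survey.
ADOLESCENT_POPULATION = 23_500_000  # approx. U.S. residents aged 12-17

def rate(count, population=ADOLESCENT_POPULATION):
    """Percentage of the adolescent population represented by `count`."""
    return 100.0 * count / population

print(f"heroin prevalence:        {rate(47_000):.2f}%")     # chapter: 0.2%
print(f"heroin incidence:         {rate(34_000):.2f}%")     # chapter: 0.1%
print(f"pain-reliever prevalence: {rate(1_262_000):.1f}%")  # chapter: 5.4%
print(f"pain-reliever incidence:  {rate(722_000):.1f}%")    # chapter: 3.3%
```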

The addictiveness of heroin stems largely from the intense euphoria associated with heroin intoxication. Heroin euphoria has been compared to sexual orgasm or described as “God's warmest blanket” by heroin users, who often become obsessed with the drug. The rewarding effect of heroin and of all opioid agonists derives from their ability to activate endogenous opioid receptors that densely populate reward-related brain regions. Opioid receptors are normally activated by endogenous opioid peptides (i.e., β-endorphin, enkephalin, dynorphin) that play important roles in natural reward and satiety (Dackis & O'Brien, 2003b). An injection of heroin leads to a rush of euphoria that lasts several minutes and is followed by a persistent period of sedation and satiety, during which time the user typically nods and falls asleep. It is noteworthy that this tranquil response to heroin contrasts markedly with the stimulation, gregariousness, and intense craving that follow cocaine use.

Tolerance develops more rapidly to the rewarding effect of heroin than to its toxic effects, such as respiratory depression, which increases the risk of lethal overdose in users pursuing euphoria. In the presence of adequate supply, heroin users can progressively increase their daily dose by 100-fold. Street heroin varies widely in potency and unusually pure shipments are notorious for leaving a trail in the medical examiner's office. Although naloxone (an opioid receptor antagonist) rapidly reverses heroin overdose, timely medical treatment is often unavailable in adolescent overdose situations that can easily result in death. Heroin is often used in combination with other drugs of abuse: benzodiazepines, cocaine, alcohol, marijuana, and prescription opioids. It is common for intravenous users to inject heroin and cocaine simultaneously (termed “speedballing”) to experience additive subjective effects of these agents.

Heroin addiction produces marked functional impairment in adolescents as they progressively lose control over the amount used and over behaviors directed toward heroin procurement. Reports indicate that the risk of developing specific clinical features of drug dependence is consistently greater for heroin than for other drugs of abuse (SAMHSA, 2002a). School performance, family relations, and social functioning typically deteriorate significantly as heroin becomes the adolescent's first priority. Heroin-addicted adolescents often resort to illegal activities, including shoplifting, dealing, prostitution, and robbery, as a means of paying for their increasing heroin dose requirement. Consequently, they risk arrest, conviction, and incarceration, along with the stigma and disadvantages that are associated with a criminal record. Heroin users also risk physical trauma associated with the dangerous drug-seeking lifestyle. As their heroin addiction intensifies, adolescents are usually shielded from awareness of their impairment by denial, an essential feature of the addiction. Minimization, rationalization, intellectualization, and other aspects of denial must be addressed by treatment interventions that make adolescents aware of their loss of control.

Adolescents who experiment with heroin are often surprised by the rapid onset of heroin withdrawal. A withdrawal syndrome can appear within days or weeks of beginning regular heroin use, with symptoms typically emerging within 8 to 12 hr of abstinence and lasting 3 to 5 days. Physiological, genetic, and psychological factors can significantly affect the duration and severity of heroin withdrawal. It is noteworthy that the signs and symptoms of heroin withdrawal are diametrically opposite those of heroin intoxication. This phenomenon results from the fact that compensatory brain responses to chronically administered heroin are unopposed during heroin abstinence, resulting in rebound withdrawal symptoms (O'Brien, 2001). Although the heroin withdrawal syndrome (see Table 17.4) is extremely unpleasant, it is not medically dangerous.

Table 17.4 Signs and Symptoms of Heroin Intoxication and Withdrawal

                    Heroin Intoxication        Heroin Withdrawal
Signs (Observed)
                    Bradycardia                Tachycardia
                    Low blood pressure         Elevated blood pressure
                    Low body temperature       Fever, sweats
                    Sedation                   Insomnia
                    Small (pinpoint) pupils    Enlarged pupils
                    Reduced movement           Pacing
                    Slurred speech             Piloerection (“gooseflesh”)
                    Head nodding               Yawning, tearing, runny nose
                    Slow breathing             Increased breathing rate
Symptoms (Reported)
                    Euphoria                   Anxiety, depression
                    Elevated pain threshold    Bone and muscle pain
                    Calmness                   Cramps, nausea, vomiting, diarrhea
                    Satiation                  Craving for heroin
                    —                          Restlessness, irritability
                    —                          Reduced pain threshold

Heroin-addicted individuals experience panic and intense irritability during withdrawal, and their urgent drive for heroin often leads to risky drug-seeking behaviors. This tendency is compounded in adolescents who are characteristically impulsive. Generally, heroin users will actively avoid withdrawal symptoms by using heroin on a regular and daily basis. Consequently, the binge pattern of use that is characteristic of cocaine dependence is seldom reported among heroin users. Heroin users routinely experience withdrawal symptoms when their supply of heroin or money is interrupted. Furthermore, severely addicted individuals must use heroin several times per day to avoid withdrawal, and are constantly oscillating between periods of heroin intoxication and withdrawal, creating a vicious cycle that positively (euphoria) and negatively (withdrawal/craving) reinforces continued heroin use. When alcohol or other sedatives (benzodiazepines, barbiturates) are also abused, sedative withdrawal symptoms may complicate the symptoms of opioid withdrawal.

Cocaine and Other Stimulant Dependence in Adolescence

Cocaine is the most heavily abused nervous system stimulant in the United States and therefore the primary focus of this section. Other central stimulants (most notably methamphetamine, amphetamine, and dextroamphetamine) produce cocaine-like subjective effects that result from their similar neurotransmitter actions in reward-related brain circuits (Dackis & O'Brien, 2001). Methamphetamine is used predominantly in the western regions of the United States, and is particularly addictive when injected or smoked. Although many adolescents are able to experiment with stimulants without suffering long-term consequences, others are pulled into a tenacious cycle of addiction that all too often persists into adulthood. About 6% of the adolescents who try cocaine become addicted within 1 year, and most of the additional 10% who ultimately become addicted (roughly 16% in all) do so within 3 years (Wagner & Anthony, 2002a). The constitutional and environmental vulnerabilities that predispose individuals to cocaine addiction are not entirely understood, and it is very difficult to predict which adolescents will ultimately become afflicted. Considerable resources have been allocated to prevention, on the basis of the undeniable fact that cocaine addiction cannot occur if cocaine is never tried. Prevention initiatives include educational and advertising campaigns that convey the dangers of cocaine, school and community programs, and law enforcement efforts that target the key variable of cocaine supply. However, preventive measures (discussed in Chapter 19) are most likely to benefit cocaine-naive individuals and are of limited value to adolescents who are already in the grips of full-fledged cocaine addiction.

Historical patterns of cocaine use have varied extensively, with use reaching epidemic levels when its addictiveness was unappreciated and receding when perceived risk was high. The chemical isolation of cocaine in 1860 produced a white powder that could be efficiently consumed by oral, intranasal, and intravenous routes. Cocaine became immensely popular in Europe and the United States in the 19th century, and was sold in wine or soda (a bottle of Coca-Cola originally contained 10 mg of cocaine) for its medicinal, antidepressant, and energy-enhancing qualities. Its availability and perceived harmlessness proved to be essential ingredients that quickly unleashed a cocaine epidemic in the late 19th century. Cocaine use declined precipitously after the risk of medical, psychiatric, and behavioral consequences became widely appreciated. Amphetamine was developed in the mid-20th century and widely abused until its risks became appreciated, partly in response to the “speed kills” prevention initiative. Its appearance illustrates both the latent human demand for stimulants and the potential danger posed by new compounds, including designer drugs that have similar actions on reward-related brain circuits. The reversibility of perceived risk was demonstrated by the reemergence of cocaine in the 1980s as a popular drug with a mythology of harmlessness.

In recent years, it is likely that the availability of inexpensive crack has actually increased adolescents' access to cocaine. Crack can now be purchased for as little as $2 in many regions of the United States. Although there was a reduction in cocaine use among adolescents between 1985 and 1995, findings of the U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) and the United Nations Office for Drug Control and Crime Prevention provide evidence that cocaine use in the United States and throughout the world actually increased during the late 1990s (Chen & Anthony, 2004). Data on American adolescents (ages 12–17) from the National Household Surveys on Drug Use and Health (SAMHSA, 2000) reported 397,000 active cocaine (including crack) users (1.7% prevalence) and 275,000 (1.2% incidence) recent-onset users. Stimulants other than cocaine were actively used by 561,000 adolescents (2.4% prevalence), of whom 322,000 (1.4% incidence) were recent-onset users. These data indicate that many adolescents continue to experiment with cocaine and other stimulants even though their risks are widely known.

Crack provides adolescents with a convenient and highly efficient means of administering cocaine that is particularly acceptable to adolescents who are already smoking tobacco or marijuana. Marketing inexpensive crack, whether by design or chance, has apparently provided the illegal drug industry with adolescent cocaine customers that number in the hundreds of thousands. In addition, smoking crack (a free-base form of cocaine that can be vaporized without loss of potency) has long been recognized to be more hazardous than snorting cocaine HCl (Hatsukami & Fischman, 1996). Crack is taken by the intrapulmonary route that delivers cocaine to the brain much more rapidly than snorting (6 to 8 sec vs. 3 to 5 min), resulting in more intense euphoria and a greater likelihood of addiction (Volkow, Fowler, & Wang, 1999). A recent epidemiological study concluded that smoking crack might double the likelihood of developing cocaine dependence (Chen & Anthony, 2004). To make matters worse, crack is often sold in dangerous urban areas where drug trafficking, crime, prostitution, and infectious diseases present convergent hazards for adolescents. The risk of suffering procurement-related medical hazards, including trauma, is especially high because adolescents who use drugs are more likely to engage in unprotected sex and illegal behavior (Jessor, 1991).

Although large numbers of adolescents need effective treatment for cocaine dependence, specialized adolescent addiction treatment programs are scarce and difficult to access throughout the United States. This situation is incongruous with the clinical importance of arresting a progressive but reversible disorder at an early stage, thereby averting functional impairment, morbidity, and mortality. In fact, adolescence is the ideal age for recovery. Unfortunately, cocaine addiction often persists into adulthood with predictable medical, psychiatric, behavioral, and societal ramifications that probably exhaust more resources than would be expended by a serious attempt to establish an appropriate treatment infrastructure for our children.

The transition from cocaine use to addiction is influenced by the route of administration (Chen & Anthony, 2004), the environment (Dackis & O'Brien, 2001), and constitutional factors that affect the attractiveness and rewarding qualities of cocaine (Tsuang et al., 1999). Environmental and psychosocial factors strongly influence the likelihood of first-time use. In some communities, drug dealers are viewed as successful role models and are actually emulated by adolescents who have few educational or vocational alternatives. Disenfranchised adolescents might be particularly vulnerable to cocaine use as a means of gaining peer acceptance, and parents are well advised to be cognizant of changes in their child's peer group (DuRant, 1995). Family members, particularly older siblings, are often instrumental in providing adolescents with their first dose of cocaine or in normalizing its use through example. Studies indicate that vulnerability to cocaine dependence is enhanced when there is a family history of alcoholism or drug dependence. Epidemiological studies conclude that the vulnerability to develop cocaine dependence is partially inherited: twin studies report significantly higher concordance rates for identical twins than for nonidentical twins (Cadoret, Troughton, O'Gorman, & Heywood, 1986; Tsuang et al., 1996; van den Bree, Johnson, Neale, & Pickens, 1998), although research into candidate genes that encode enzymes involved in cocaine metabolism and receptors that mediate cocaine effects has not identified reliable genetic vulnerability markers.

Clinical Aspects of Stimulant Dependence

The reinforcing effect of stimulants correlates directly with the rate at which these drugs enter the brain and block dopamine transporters (Volkow, Fowler, et al., 1999), which are membrane-based proteins that regulate the amount of dopamine available to stimulate postsynaptic receptors. Cocaine euphoria has also been associated, to a lesser degree, with glutamate, β-endorphin, GABA, norepinephrine, and serotonin neuronal systems (Dackis & O'Brien, 2003a). By activating brain pleasure centers, cocaine places adolescents at immediate risk of developing stimulant addiction (Dackis & O'Brien, 2001). Indeed, the powerful biological basis of stimulant reward is illustrated by the finding that animals with unlimited access will consistently self-administer cocaine and amphetamine to the point of death.

Cocaine administration produces a rush of euphoria that lasts only a few minutes but far exceeds the normal range of human pleasure, explaining its remarkable ability to dominate thoughts, behaviors, and priorities of adolescents. Cocaine intoxication also produces racing thoughts, self-confidence, increased energy, heightened alertness, reduced appetite, and enhanced libido (see Table 17.5). The last effect may lead to promiscuity and unprotected sex, with the accompanying risk of pregnancy and venereal disease. Interestingly, cocaine euphoria appears to last only as long as brain cocaine levels are rising, and declining levels (even when still very elevated) are associated with craving and cocaine-seeking behavior (O'Brien, 2001). Physical manifestations of cocaine intoxication, which may provide warning signs to parents or teachers, include dose-dependent tachycardia, dilated pupils, diaphoresis, excessive movement, pressured speech, elevated blood pressure, and increased body temperature. Cocaine-intoxicated adolescents are likely to be talkative and gregarious, although higher doses can precipitate irritability, aggressiveness, and psychosis (especially paranoia and hallucinations) with a host of behavioral risks.

Table 17.5 Signs and Symptoms of Cocaine Intoxication and Withdrawal

                        Cocaine Intoxication                        Cocaine Withdrawal
Signs (Observed)        Elevated blood pressure and pulse           Slow pulse
                        Elevated body temperature, perspiration     Low body temperature
                        Alertness, vigilance                        Somnolence
                        Pacing, sweats, enlarged pupils             Reduced movement
Symptoms (Reported)     Euphoria, grandiosity                       Depression, low self-esteem
                        Increased energy                            Low energy
                        Reduced appetite                            Increased appetite
                        Increased sex drive                         Reduced sex drive, impotence
                        Racing thoughts                             Poor concentration
                        Insomnia                                    Oversleeping

Within minutes, cocaine euphoria gives way to depression, irritability, and cocaine-induced craving (Jaffe, Cascella, Kumor, & Sherer, 1989; O'Brien et al., 1992). Cocaine's ability to beget its own craving promotes a characteristic binge use pattern that typically exhausts the available supply of cocaine and cash. In fact, the unexplained disappearance of money may be the first indication that an adolescent is using cocaine. At the end of a cocaine binge, alcohol, sedatives, and even heroin might be used to reduce insomnia, paranoia, and irritability. The combination of alcohol and cocaine is particularly hazardous because of the formation of cocaethylene, a psychoactive substance with cocaine-like actions but greater toxicity and lethality than cocaine alone (McCance-Katz, Kosten, & Jatlow, 1998). Cocaine has a half-life of about 50 min and is metabolized mainly by hydrolysis of its ester groups; its principal metabolite, benzoylecgonine, can be detected in urine for 2 to 5 days after a cocaine binge.
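
The practical meaning of these figures follows from simple first-order elimination arithmetic. The short sketch below (Python) is an illustration only: the 50-min half-life is the value cited above, while the 1% threshold is an arbitrary stand-in for “essentially gone,” not a laboratory cutoff.

    import math

    def time_to_fraction(half_life_min, fraction):
        # Minutes until drug level falls to `fraction` of its peak,
        # assuming simple first-order (exponential) elimination.
        return half_life_min * math.log2(1.0 / fraction)

    # With a 50-min half-life, cocaine itself falls below 1% of its peak
    # in about 5.5 hr, which is why urine screens rely on the longer-lived
    # metabolite benzoylecgonine (detectable for 2 to 5 days).
    print(f"{time_to_fraction(50, 0.01) / 60:.1f} hours")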

At the end of a cocaine binge, cocaine withdrawal symptoms may develop. The cocaine withdrawal syndrome includes depression, overeating, low energy, somnolence, psychomotor retardation, bradycardia, and poor concentration (Weddington et al., 1990). Although most symptoms resolve within 1 to 3 days, cocaine withdrawal can affect the motivation and school performance of adolescents who binge frequently. Severe cocaine withdrawal is associated with poor clinical outcome and may result from cocaine-induced disruptions of brain reward centers that have been hypothesized to produce hedonic dysregulation (Dackis & O'Brien, 2002). Stimulant withdrawal can also be associated with profound depression and suicidality.

Cocaine craving can persist after weeks, months, and even years of abstinence, especially in response to environmental cues that have been associated with cocaine through conditioning. Cue-induced cocaine craving has been extensively studied in the laboratory and is associated with robust limbic activation in addicted individuals in both positron emission tomography (PET) studies (Childress, Mozley, McElgin, Fitzgerald, Reivich, & O'Brien, 1999) and functional magnetic resonance imaging (fMRI) studies (Garavan et al., 2000). Since the same limbic regions can be activated by sexually explicit videos (Garavan et al., 2000), sexual arousal and cocaine craving appear to share common neuronal substrates. One might imagine how difficult it would be for adolescents, notoriously susceptible to sexual urges, to resist a similar lure to use cocaine. In actively addicted individuals, cocaine craving alternates with cocaine euphoria to form a cycle of addiction that becomes increasingly tenacious and uncontrollable.

Whereas some individuals may use cocaine intermittently for years, others experience rapidly progressive impairment involving the family, educational, interpersonal, medical, psychiatric, and legal domains. These impairments, resulting from loss of control over drug intake, are typically minimized by denial. Denial may be particularly formidable in adolescents, who notoriously view themselves as invincible. Poor school performance is common in adolescent cocaine users and may be an early warning sign of the problem. Legal and behavioral problems (especially theft) should also raise the question of cocaine use. Medical complications of cocaine include sudden death (usually as a result of cardiac arrest or hyperthermia), myocardial infarction, seizures, cardiac arrhythmias, aortic dissection, and hemorrhage. Many of the medical complications of cocaine result from its ability to constrict blood vessels and impede blood supply, potentially leading to stroke, renal failure, spontaneous abortion, and even bowel necrosis. Psychiatric problems include depression (especially during cocaine withdrawal), suicide, and panic anxiety. Paranoia is a classic complication of cocaine and amphetamine intoxication and may be associated with hallucinations and violent behavior.

Central stimulants may produce tolerance or sensitization, depending on the response in question. During cocaine bingeing, tolerance often develops rapidly to the euphoric effect of cocaine. Cocaine also produces sensitization in animal models, as evidenced by an increase in cocaine-induced hyperactivity with repeated dosing. The relevance of sensitization in the clinical arena is unclear. It has been hypothesized that cocaine users become sensitized to cocaine-induced seizures and cocaine-induced psychosis. Sensitization may also be associated with cue-induced craving as both phenomena are persistent and involve similar perturbations in dopamine and glutamate neurotransmission (Dackis & O'Brien, 2002).

Marijuana Use and Abuse in Adolescence

Marijuana is the most commonly used illicit drug among adolescents in the United States (see Figs. 17.1 and 17.2 for comparison). It shares some attributes and possible health consequences with tobacco in that marijuana is a plant material, is most commonly smoked, and contains hundreds of compounds including at least 60 termed cannabinoids that are unique to the cannabis plant. The pharmacology of most of the cannabinoids is relatively unknown, but the most potent psychoactive agent, Δ-9-tetrahydrocannabinol (THC), has been isolated, can be synthesized, and has been well researched in adults since the early 1970s. The noncannabinoid materials in the plant and its combustion products when smoked are similar to many of those from tobacco leaf smoking with, of course, the exception of nicotine.

Figure 17.2 Trends in annual prevalence of marijuana in three populations.

In recent years the technology of growing and distributing illicit marijuana has become sophisticated and much improved. The THC content of plants from different sources and strains varies a great deal. Improved growing techniques, particularly plant breeding, have raised the THC content from a typical 10 mg per marijuana cigarette in the 1960s to 150 to 200 mg in a typical 1-g marijuana cigarette today. One consequence of the increased potency is that much of the human research done in the 1970s and 1980s with relatively low-potency smoked marijuana may be less relevant to the pharmacology of, and consequences from, the marijuana now readily available to adolescents in most parts of the world. What is clear from past research is that the biological effects of THC are dose dependent. Because the availability of potent marijuana has greatly increased, far higher doses of THC are available to adolescent marijuana users than were possible 10 or 20 years ago.

Although marijuana is typically smoked in the form of cigarettes or from pipes, THC can also be easily extracted with ethanol and the THC extract or raw plant material can be added to baked goods or to sugar cubes or even an oral spray. Because THC and other cannabinoids are not water soluble, intravenous use leads to major toxic effects unless very special preparations and delivery systems are used.

The pharmacokinetics of cannabinoids—that is, the manner in which they are distributed and metabolized in the body—are more complex than those of most psychoactive drugs. As with any smoked drug, the final absorbed dose is very much under the control of the individual and subject to learning processes similar to those involved in learning to smoke tobacco. Thus, beginning adolescent marijuana smokers may well underdose or overdose themselves until they learn how to smoke. That most marijuana smoking traditionally follows some prior experience with tobacco smoking probably facilitates the learning process. About half of the THC in a marijuana cigarette enters the lungs, and most of that is absorbed rapidly, reaching the brain within seconds of a puff. After oral ingestion, absorption is smaller, slower, and more variable; the onset of effects after an oral dose of THC can be delayed as long as an hour, and absorption continues slowly. Thus, whether by chance or for other reasons, overdose is more likely with oral ingestion than with smoking.

THC and other cannabinoids move rapidly into fat and other body tissues during smoking but are only very slowly released from those tissue stores back into the blood (and brain) over days, weeks, and months; they are gradually cleared from the body in urine and feces. Thus, the elimination half-life of even a single, modest dose of THC from tissue stores is very long, from 7 to 18 days, and complete elimination of cannabinoids from one smoked dose can take 30 or more days. One consequence of this slow elimination is that, with repeated doses, even doses taken only a few times weekly, cannabinoids gradually accumulate throughout the body, including the brain. Concentrations vary across brain areas but are highest in cortical, limbic, sensory, and motor regions. Cannabinoids are primarily metabolized in the liver into a host of metabolites, most of which are not known to be biologically active. The metabolites are very slowly cleared from the body, which makes urine tests useful as indicators of past marijuana use. Because cannabinoids remain in the body so long, there is no relationship between plasma or urine concentrations of THC or its metabolites and the degree of intoxication once an hour or so has passed after smoking.
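
Because elimination is so slow relative to typical dosing intervals, this accumulation can be illustrated with simple arithmetic. The sketch below (Python) assumes a hypothetical 10-day tissue half-life (within the 7- to 18-day range cited above) and an idealized fixed dose taken every 3.5 days; it is meant only to show how residue builds up, not to model real cannabinoid kinetics.

    def residual_fraction(days_elapsed, half_life_days=10.0):
        # Fraction of one dose remaining after days_elapsed,
        # assuming first-order elimination from tissue stores.
        return 0.5 ** (days_elapsed / half_life_days)

    def body_burden(dose_days, now_day):
        # Total residue, in dose-equivalents, from all doses taken so far.
        return sum(residual_fraction(now_day - d) for d in dose_days if d <= now_day)

    # Hypothetical schedule: one dose every 3.5 days (roughly twice weekly).
    schedule = [3.5 * i for i in range(17)]  # 8 weeks of use
    for week in (1, 4, 8):
        print(f"week {week}: ~{body_burden(schedule, week * 7):.1f} dose-equivalents on board")

Under these assumptions the body burden nearly doubles between week 1 and week 4 and then plateaus between four and five dose-equivalents, consistent with the gradual accumulation described above.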

THC alters brain functions by binding to specific receptors widely distributed throughout the brain and elsewhere in the body. As the actions of naturally occurring ligands that alter the state of these receptors have become better understood, it has become apparent that there is a complex system of multiple cannabinoid receptors interacting with a series of endogenous ligands; one of these ligands is called anandamide. The neuropharmacology of this cannabinoid receptor system is only beginning to be understood, but it appears to involve cannabinoids acting as neuromodulators of other neurotransmitter systems such as dopamine. Given the extensive distribution of cannabinoid receptors, it should not be surprising that virtually every system in the body is affected to some degree by marijuana. Marijuana appears to have sedative, analgesic, anxiolytic, hallucinogenic, appetite-suppressing, and appetite-enhancing properties, and less well-characterized effects on the immune system. Firm conclusions about any medical treatment applications of marijuana are premature, but discussion of marijuana as a treatment makes it a more interesting drug to an adolescent.

Marijuana is used by adolescents to produce a mild, relatively short period of intoxication, often imprecisely characterized as euphoria. THC is an extremely potent psychoactive drug: less than a milligram smoked can produce relaxation, decreased anxiety, and feelings of lessened inhibition. The intoxication typically lasts a few hours. With higher doses, or in an inexperienced or sensitive individual, a single dose can produce severe anxiety, paranoid delusional thinking, and perceptual distortions not unlike those produced by hallucinogenic drugs. Individuals with a genetic predisposition for developing schizophrenia, depression, or other mood disorders may be particularly vulnerable to such adverse effects at marijuana doses well tolerated by others.

Cannabis-induced intoxication clearly impairs cognitive and psychomotor performance in a dose-dependent manner, with complex, demanding tasks the most affected. The spectrum of behavioral effects is similar to that of other central nervous system (CNS) depressant drugs such as alcohol, and the effects are additive with those of concurrently used depressant drugs. The perceptual and psychomotor alterations measurable in research settings are large enough that complex tasks such as driving, and other tasks with high demands on attention, information processing, and reaction speed, can reasonably be assumed to be impaired. Of particular relevance when considering consequences of adolescent marijuana use is that, in laboratory settings, overlearned or well-practiced tasks are relatively less affected by marijuana. Thus, a beginning or relatively inexperienced driver may be more subject to marijuana-induced cognitive, motor, and perceptual impairments than an adult who has been driving for many years.

Although the evidence for cognitive impairment lasting some hours after a dose of marijuana is quite consistent and has been replicated in many laboratories over many years, there is less unanimity about the consequences of long-term, chronic cannabis use. The consensus of recent studies is that individuals who have used cannabis over long periods of time show impaired performance on tests even when not acutely intoxicated. The impaired cognitive functions are attention, memory, and processing of complex information, and the impairment appears to last months, perhaps years, after cessation of use. Uncertainty remains as to whether some of these individuals performed poorly before becoming involved with cannabis, but the data are quite consistent that the performance of heavy, frequent users is impaired compared with that of shorter-term, less frequent marijuana users.

Tolerance to many of cannabis's subjective and behavioral effects develops rapidly with relatively few exposures, not unlike the pattern of tolerance that develops to nicotine and cocaine effects when smoked. For many users, tolerance likely leads to more frequent or higher-dose use to achieve the sought-after psychological effects. A cannabis withdrawal state has been clearly demonstrated in laboratory animals given marijuana or other cannabinoids over a relatively short period of time. Clinically significant cannabis withdrawal symptoms have been well described in both human laboratory studies and clinical settings. With abrupt discontinuation after only a few days of repeated administration of THC or marijuana in a laboratory setting, markedly disturbed sleep, decreased appetite, restlessness, irritability, sweating, chills, and nausea develop within hours of the last dose. Although most symptoms disappear in a day or two, irritability and sleep disturbance can persist for weeks. Frequent marijuana users in a clinical setting report similar symptoms when they stop marijuana smoking, along with craving for marijuana, depressed mood, increased anger, wild dreams, and headaches. The pattern of withdrawal symptoms suggests to some investigators that it may contribute to continued use of marijuana in cannabis-dependent individuals. As with other addicting drugs, however, the precise links between withdrawal symptoms and continued use or relapse remain a matter of some uncertainty.

As with nicotine dependence, early exposure to cannabinoids during adolescence appears to have more adverse consequences, including patterns of drug taking consistent with addiction. Individuals who began regular cannabis use in adolescence, relative to those who began regular use at an older age, appear to be at greater risk for heavier use of other illicit drugs, depression, suicidal ideation and suicide attempts, and violent or property crimes. When adolescents were followed over time into adulthood, weekly cannabis use in adolescence predicted an increased risk of dependence in young adulthood.

The sometimes marked cardiovascular and autonomic effects of cannabis appear to be well tolerated by adolescent users. However, heavy use over time is associated with pulmonary symptoms and problems, because pulmonary toxins are present in marijuana smoke just as they are in tobacco smoke. Recent estimates that cannabis smoking may eventually cause as many as 30,000 deaths a year in Britain are reasonable extrapolations from the toxins in marijuana smoke and what is known about the adverse effects of tobacco smoking. Laboratory evidence of cannabinoid-induced alterations in immune response leaves open the question of whether prolonged marijuana exposure produces clinically relevant immune impairment.

Alcohol Use and Abuse in Adolescence

In this section we provide an overview of the phenomenology of alcohol drinking and alcohol use disorders (AUDs) in adolescents ages 11 to 19 years. Included in this overview are a current description of “drinking youths”; prevalence rates of adolescent drinking, binge drinking, AUDs, and drinking-related consequences; pertinent diagnostic issues; and potential etiological factors that may enhance our understanding of alcohol use and the development of problem drinking in adolescents.

Alcohol is a sedative and is the only drug in this category discussed in this chapter; other sedatives, such as benzodiazepines, barbiturates, and other sleeping pills, are used so uncommonly by adolescents that they do not merit a full discussion. A discussion of alcohol naturally overlaps with the discussions of other substance use disorders elsewhere in this chapter. However, there are some important distinctions to bear in mind. Alcohol use by persons 21 years or older is legal in the United States, making alcohol more readily available to adolescents and exposing them to seductive advertisements. In addition, low to moderate alcohol use is an integral part of adult community life: alcohol is served in many restaurants, sold in grocery stores in many states, and available in liquor stores throughout the country. It is readily accepted in social settings, frequently accompanies a meal, and is incorporated in many religious ceremonies. Over the past decade, the health benefits of one or two glasses of wine per day have been widely covered by the media. Finally, parents and other authorities frequently overlook adolescent drinking, relegating it to experimentation or “rites of passage.” In contrast, the illegality of many of the other abused substances (e.g., marijuana, cocaine, heroin) makes them taboo in most adult circles and causes much alarm and concern in adult communities when adolescent use of illegal substances is uncovered.

General Description of Adolescent Drinking

Drinking alcohol can be a highly pleasurable experience for many people, regardless of age. It is frequently described as relaxing, euphoric, anxiety reducing, and disinhibiting. Nonetheless, as alcohol is absorbed, metabolized, and eliminated from the body, it can also be associated with poor motor coordination, some confusion, irritability, depression, sleeplessness, nausea, and vomiting, among other ill effects. Ingesting excessive amounts of alcohol in a relatively brief period of time can cause extreme confusion, unconsciousness, and sometimes death.

Beer is the most commonly consumed alcoholic beverage among adolescents. The National Center on Addiction and Substance Abuse (CASA) at Columbia University estimated that nearly 20% of all alcoholic beverages purchased in 1999 were consumed by underage drinkers (CASA, 2003). For these underage drinkers (12 to 20 years of age), 76% of expenditures were for beer, 19% for liquor (distilled spirits), and 4% for wine. These percentages are likely to change as alcohol manufacturers market new types of beverages that appeal to adolescents. The most recent arrivals are sweet-tasting, fruit-flavored, malt-based, colorful beverages known as “alcopops” or “malternatives.” These beverages are gaining in popularity, are easily accessible, and are preferred to beer and mixed drinks by adolescents (CASA, 2003). While adolescents will use elaborate means to obtain alcohol (e.g., having fake identification cards made or asking strangers to buy alcohol for them), they more commonly obtain it from their own homes, their friends' homes, their parents, or other adults (CASA, 2003; National Research Council and Institute of Medicine [NRCIM], 2003).

Adolescents report drinking for many of the same reasons that adults drink—that is, they expect positive effects from drinking. Younger adolescents report that drinking alcohol reduces tension and that they like the mild impairment it causes to their cognitive and behavioral functioning. Older adolescents say they drink primarily for the euphoria they experience and/or the altered social and emotional behaviors that occur when they drink. The oldest adolescents refer to the empowering effects of alcohol. Adolescent males rate the pleasurable effects and sexual enhancement of alcohol more highly than do females, who, in contrast, rate the tension-reduction effects more favorably (CASA, 2003; NRCIM, 2003).

Problem Drinking in Adolescents

The hallmarks of problem drinking are loss of control over drinking (i.e., drinking more than planned or in inappropriate settings) and the occurrence of negative consequences from drinking (driving under the influence [DUI], high-risk sexual behaviors, fights, medical problems). The development of addiction is associated with repeated, heavy drinking over time, potentially as a continual attempt to recreate the pleasurable state associated with the initiation of drinking and intoxication. Repeated drinking can also lead to physiological dependence, marked primarily by tolerance to alcohol and by withdrawal symptoms between drinking periods. Tolerance is the need to drink progressively greater amounts of alcohol to achieve the same pleasurable effects. It is one of the most commonly reported dependence symptoms in both community and clinical samples of adolescents (Chung, Martin, Armstrong, & Labouvie, 2002; Martin & Winters, 1998).

Although less frequently reported among adolescents than among adults, heavy drinking can also lead to alcohol withdrawal symptoms between drinking periods (see Table 17.6). Severe withdrawal can be life threatening and may present as delirium tremens (DTs), which include symptoms of confusion, delirium, hallucinations, and psychosis (Dackis & O'Brien, 2003b). Delirium tremens is more likely if patients are malnourished, dehydrated, or suffering from infection or electrolyte imbalance. A careful history is critical, because withdrawal can produce seizures, especially in patients who have had withdrawal seizures before.

Table 17.6 Signs and Symptoms of Alcohol Intoxication and Withdrawal

                        Alcohol Intoxication        Alcohol Withdrawal
Signs (Observed)        Decreased heart rate        Increased heart rate
                        Lower blood pressure        Elevated blood pressure
                        Lower body temperature      Elevated body temperature
                        Sedation                    Sweating
                        Decreased respiration       Tremors and muscle spasm
                        Loss of balance             Vomiting and diarrhea
                        Restlessness                Seizures
                        Slurred speech              Confusion
                                                    Delirium
                                                    Psychosis
Symptoms (Reported)     Relaxation                  Craving for alcohol
                        Sense of well-being         Anxiety
                        Euphoria                    Irritability
                        Dizziness                   Insomnia
                        Fatigue                     Nausea
                        Nausea                      Hallucinations
                        Blackouts

The psychological, behavioral, and physical effects of alcohol are related to an individual's blood alcohol level (BAL), which is determined primarily by the quantity, frequency, and potency of the alcohol consumed. The BAL (the ratio of milligrams of alcohol per 100 ml of blood) can be easily estimated by exhaling into instruments called breathalyzers, which are commonly available to treatment providers and law enforcement agencies. Legal impairment of judgment and coordination due to alcohol is defined by a BAL of 0.08% in most states (as of May 2003, 39 states used this legal limit; for the latest statistics see http://www.nhtsa.dot.gov/people/Crash/crashstatistics/). Most European countries set the legal limit lower because discernible impairment from alcohol usually begins at about 0.05% or below. Adolescents may reach higher BALs than adults because of their high-quantity, peer-influenced drinking patterns (Deas, Riggs, Langenbucher, Goldman, & Brown, 2000). Nonetheless, all states have zero-tolerance laws and allow no legal BAL for drivers under the age of 21.
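
For readers who want the arithmetic, a rough BAL can be estimated with the classic Widmark calculation. The sketch below (Python) assumes 14 g of ethanol per U.S. standard drink, Widmark distribution factors of 0.68 (male) and 0.55 (female), and elimination of about 0.015% per hour; these are textbook approximations, and actual values vary considerably among individuals.

    def estimate_bal_percent(standard_drinks, weight_kg, hours_since_start, sex="male"):
        # Widmark estimate of blood alcohol level in g per 100 ml (i.e., %).
        grams_ethanol = standard_drinks * 14.0          # ~14 g per U.S. standard drink
        r = 0.68 if sex == "male" else 0.55             # Widmark distribution factor
        peak = grams_ethanol / (r * weight_kg * 10.0)   # g/L divided by 10 -> g/100 ml
        return max(0.0, peak - 0.015 * hours_since_start)  # ~0.015%/hr elimination

    # A 70-kg male after three standard drinks: about 0.09% at peak (above
    # the 0.08% limit), falling to about 0.06% two hours later.
    print(round(estimate_bal_percent(3, 70, 0), 3), round(estimate_bal_percent(3, 70, 2), 3))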

Prevalence of Adolescent Use and Abuse of Alcohol

It is generally acknowledged that some use of alcohol is the norm among adolescents (Schulenberg & Maggs, 2002; Windle, 1999). According to national surveys, alcohol is the most widely used psychoactive substance among adolescents (excluding caffeine) (Grunbaum et al., 2002; Johnston et al., 2003b). The most recent Monitoring the Future (MTF) annual survey found that by senior year, nearly 80% of students reported some use of alcohol (Johnston et al., 2003b). In 1991, the CDC began conducting the Youth Risk Behavior Surveillance System (YRBSS) survey every 2 years, interviewing approximately 11,000 youths (ages 12 to 21) from a nationally representative sample. The results of the most recently available survey (2001) indicated that the percentage of high school students who had had at least one drink of alcohol ranged from 73.1% of 9th graders to 85.1% of 12th graders (Grunbaum et al., 2002). Data are also available from the National Survey of Parents and Youths, conducted by the Annenberg School for Communication at the University of Pennsylvania (Hornik, 2003). This survey included a nationally representative sample of 2,435 youths who were initially interviewed in 2000 and reinterviewed 18 months later. Use of any alcohol increased in a linear fashion from about 5% at age 11 to approximately 89% at age 18. Thus, the most recent national surveys indicate that by senior year, approximately 80% to 90% of high school students have had at least a drink of alcohol.

Binge drinking.

One particular concern is the amount of binge drinking by adolescents. Wechsler and colleagues (Wechsler, Davenport, Dowdall, Moeykens, & Castillo, 1994) are generally credited with first using the term binge drinking to refer to excessive alcohol drinking by some adolescent and college-aged drinkers. Excessive or binge drinking has been defined in multiple ways (NRCIM, 2003), but the standard definition is drinking five or more drinks in a single episode (CASA, 2003; Wechsler et al., 1994; Windle, 1999). This pattern of drinking in adolescents is associated with a broad range of problems, including date rape, vandalism, and academic failure (Baer, 1993). According to the YRBSS survey (Grunbaum et al., 2002), 25.5% of 9th graders, 28.2% of 10th graders, 32.2% of 11th graders, and 36.7% of 12th graders had had at least one binge-drinking episode in the past 30 days. The MTF survey (Johnston et al., 2003b) obtained information on binge drinking in the 2 weeks prior to the interview and found that 12.4% of 8th graders, 22.4% of 10th graders, and 28.6% of 12th graders had had at least one binge-drinking episode. The MTF survey also asked respondents to report whether they had been “drunk” in the past month; 6.7% of 8th graders, 18.3% of 10th graders, and 30.3% of 12th graders responded affirmatively.

Perhaps some of the most innovative work to date has combined a developmental perspective with the definition of more homogeneous adolescent and young-adult subgroups according to their amount of binge drinking. Schulenberg, O'Malley, Bachman, Wadsworth, and Johnston (1996) identified six different patterns or trajectories of binge drinking on the basis of data from the MTF survey; these trajectories accounted for 90% of the sample. The most common trajectories were never (36%) and rarely (17%) reporting binge drinking; the other four were binge drinking that decreased over time (12%), that increased over time (10%), that increased and then decreased (10%), and that remained chronic and sustained over time (7%).

Drinking-related consequences among adolescents.

According to the National Institute on Alcohol Abuse and Alcoholism (NIAAA), “underage alcohol use is more likely to kill young people than all illegal drugs combined” (NIAAA, 2003). As in adult circles, excessive drinking and intoxication have serious consequences in the adolescent population. Most notable are automobile accidents. In 2001, 22.1% of high school seniors drove after drinking, and 32.8% rode with a driver who had been drinking (Grunbaum et al., 2002). Driving skills appear to be more readily impaired by alcohol in adolescent than in adult drivers, and the alcohol-involved fatality rate is twice as high among adolescent as among adult drivers (NIAAA, 2003).

Other harmful behaviors frequently related to excessive drinking among adolescents are high-risk sexual behaviors (unplanned and unprotected); rapes, including date rape; assaults; homicides; and suicides (NIAAA, 2003; Windle, 1999). Having multiple sexual partners, failing to use condoms, and engaging in other high-risk sexual behaviors have been associated with alcohol use in adolescents (NIAAA, 2003). Furthermore, alcohol use by the offender, victim, or both has been linked to sexual assault, including date rape. Using the MTF data, Bachman and Peralta (2002) reported that heavy alcohol use increased the likelihood of violence for either gender, even after controlling for home environment, grades, and ethnicity. Alcohol generally is a disinhibiting intoxicant, and it may also potentiate mood and stress states that lead to suicide attempts or other life-threatening behaviors. For example, heavy drinking has been correlated with suicide attempts in eighth-grade girls (Windle, Miller-Tutzauer, & Domenico, 1992). Finally, alcohol is considered by some to be a “gateway” substance (along with nicotine) for illicit drugs such as marijuana (Hornik, 2003; Wagner & Anthony, 2002b). That is, on the basis of a longitudinal study of a nationally representative sample (Hornik, 2003), researchers concluded “that marijuana is a behavior taken up after alcohol and tobacco use, and only if these behaviors are present as well” (p. 342). The gateway hypothesis is discussed further in Chapter 19 on prevention.

Alcohol use disorders in adolescents.

The national surveys mentioned above are representative of drinking patterns in the general population but do not specifically address the prevalence of AUDs in adolescents. Recently, Chung and colleagues (2002) reviewed the epidemiological literature on diagnosing AUDs in adolescents. Although this review summarized both community and clinical groups, the community groups are of specific relevance here. Five community samples, with sizes ranging from 220 to 4,023 adolescents ages 12 to 19, were identified from studies in peer-reviewed journals. Two of the studies were representative of the entire U.S. population, and the other three were representative of individual states (North Carolina, Oregon, and Pennsylvania). In these surveys, the percentage of adolescents meeting criteria for alcohol abuse ranged from 0.4% to 9.6%, and for alcohol dependence, from 0.6% to 4.3%.

Issues in Determining Alcohol Use Disorders in Adolescence

Diagnostic criteria for alcohol abuse and dependence are detailed in the DSM-IV (American Psychiatric Association, 1994), and the same criteria are applied to all substances and all populations. To meet a diagnosis of alcohol abuse, at least one of four abuse criteria must be met:

  • 1. Recurrent use resulting in failure to fulfill major role obligations

  • 2. Recurrent use in situations that are physically hazardous

  • 3. Recurrent alcohol-related legal problems

  • 4. Continued use despite persistent social or interpersonal problems

To meet a diagnosis of alcohol dependence (alcoholism), at least three of seven dependence criteria must be met (a sketch of these counting rules follows the list):

  • 1. Withdrawal

  • 2. Tolerance

  • 3. Larger amounts consumed than intended

  • 4. Unsuccessful attempts to stop

  • 5. Excessive time spent drinking

  • 6. Important activities given up

  • 7. Continued use despite awareness of negative effects of drinking
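
The counting logic of these two criteria sets is simple enough to state procedurally. The sketch below (Python) applies the thresholds given above, along with the DSM-IV convention that a dependence diagnosis preempts an abuse diagnosis; the “diagnostic orphan” label anticipates the discussion later in this section.

    def dsm_iv_alcohol_diagnosis(abuse_criteria_met, dependence_criteria_met):
        # abuse_criteria_met: count of the 4 abuse criteria met (0-4)
        # dependence_criteria_met: count of the 7 dependence criteria met (0-7)
        if dependence_criteria_met >= 3:
            return "alcohol dependence"    # at least 3 of 7 criteria
        if abuse_criteria_met >= 1:
            return "alcohol abuse"         # at least 1 of 4 criteria
        if dependence_criteria_met > 0:
            return "diagnostic orphan"     # symptoms, but no diagnosis
        return "no AUD diagnosis"

    # One or two dependence symptoms with no abuse symptoms yields no
    # diagnosis -- the "diagnostic orphan" pattern described below.
    print(dsm_iv_alcohol_diagnosis(0, 2))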

While the number and type of symptoms needed to determine a diagnosis appear to be valid for adults (e.g., Schuckit et al., 2001), investigators have questioned the validity of the DSM-IV diagnostic criteria for identifying AUDs in adolescent populations (Winters, 2001). This skepticism is not surprising given that six of the seven research sites in the DSM-IV field trials did not contribute data on adolescents (Cottler et al., 1995). In addition, consequences salient for adolescents, such as risky sexual behaviors, are not explicitly assessed in the currently accepted list of symptoms used to determine a diagnosis of alcohol abuse or dependence.

Pollock and Martin (1999) studied 372 adolescent regular drinkers. More than 10% of this sample reported one or two symptoms of alcohol dependence (too few for a dependence diagnosis) without any symptoms of alcohol abuse; such individuals are termed “diagnostic orphans.” However, these individuals had drinking-related problems similar to those of adolescents who did meet diagnostic criteria for an AUD, and they had significantly more drinking-related problems than did adolescents with no symptoms of alcohol abuse or dependence. In the review by Chung and colleagues (2002), diagnostic orphans represented from 1.9% to 16.7% of adolescent community samples and from 7.5% to 33.7% of adolescent clinical samples. Clearly, additional conceptual and empirical research is needed to adequately diagnose AUDs in adolescents.

Finally, the two physiological symptoms (withdrawal and tolerance) that are part of the DSM-IV criteria for determining whether an individual has an AUD may have limited utility in diagnosing AUDs in adolescents (Martin, Kaczynski, Maisto, Bukstein, & Moss, 1995). Withdrawal symptoms are infrequently reported by adolescents (Chung et al., 2002), even in clinical samples, presumably because these symptoms take years to develop. Tolerance, by contrast, seems to develop quickly in most adolescents, but it does not readily distinguish problem from normative drinkers in the adolescent population.

Etiology: Risk and Protective Factors

The greater the number of risk factors an adolescent has for developing an AUD, the more likely he or she is to abuse or become dependent on alcohol (Jaffe & Simkin, 2002; Newcomb, 1997). However, protective factors can sometimes counteract risk factors. For example, a strong religious commitment, dedication to constructive activities such as sports, intense anti-alcohol beliefs, and high self-esteem can all serve to neutralize inherent risk factors for developing an AUD (Liepman, Calles, Kizilbash, Nazeer, & Sheikh, 2002).

The relevance of some of these risk factors varies with the age, gender, and ethnicity of the adolescent. In addition, other factors have been identified that influence the risk for AUDs as well as other substance use disorders. Newcomb (1997) classified these risk factors into four generic domains: cultural/societal, interpersonal, psychobehavioral, and biogenetic. The first three domains are summarized here; genetic factors are discussed at the end of this chapter.

Cultural and societal factors.

While many factors contribute to the availability and acceptability of alcohol in a community (cultural, economic, legal, etc.), probably the single most influential factor relating to alcohol consumption in adolescents is the attitude of the adult community in the particular geographic location (see review by Newcomb, 1997). For example, the purchase of alcohol by underage drinkers is prohibited in all 50 states. However, in some places the laws are not regularly enforced by police officers, and liquor stores and bars do not consistently require identification from minors (Windle, 1999). Underage drinking at family gatherings or special celebrations may be acceptable to parents and relatives in some communities. An in-depth examination of the relationship between community attitudes and alcohol availability (economic and legal) is of paramount importance but beyond the scope of this book.

Interpersonal factors.

Interpersonal factors that relate to AUD risk include parental (here referring to influences other than heredity), sibling, and peer influences. Although parental influence wanes with the passage of time, parents' levels of nurturing, monitoring, and communication, and their own alcohol use, affect the amounts and patterns of alcohol drinking in their adolescents (see reviews by Gilvarry, 2000; Liepman et al., 2002; Schulenberg & Maggs, 2002; Windle, 1999). That is, although adolescents may reject many of their parents' ideas and behaviors, the majority do not seem to reject their parents' drinking behaviors. Higher levels of maternal and paternal alcohol consumption are related to higher levels of alcohol use among adolescents (e.g., Kilpatrick et al., 2000; Webb & Baer, 1995). However, parents can have a positive influence on their adolescents. Higher levels of emotional support and warmth (nurturance), higher levels of appropriate monitoring and limit setting, more time spent together, and higher levels of parent–adolescent communication have been associated with lower levels of adolescent alcohol-related problems (Windle, 1999).

Siblings represent another familial influence. Older siblings typically serve as role models and there is a greater likelihood that younger siblings will drink alcohol before they are adults if their older siblings drink. This relationship is stronger if the older sibling is closer in age and the same gender (Windle, 1999).

The commonly held notion that peers exert considerable influence on the initiation and maintenance of alcohol use is supported empirically (Schulenberg & Maggs, 2002). But there is little support for overt peer pressure causing the initiation of alcohol use. Rather, most studies support a more complex, developmental, interactional process in which an adolescent selects and deselects peer groups, is influenced by the evolving behaviors and attitudes of these groups, and in turn influences them (Schulenberg & Maggs, 2002). However, overt peer pressure can play a role in relapse (Brown, 1993).

Psychobehavioral influences.

Newcomb (1997) cites age of onset and comorbid psychopathology as primary psychobehavioral influences in alcohol use. An earlier age of first use of alcohol is frequently associated with increased alcohol-related problems then and later in life. But it is not clear whether excessive drinking in early adolescence is a marker of serious preexisting or coexisting psychopathological factors (e.g., fractured gene pool, flawed personality, poor parental modeling, nurturing, monitoring, etc.), or, alternatively, whether excessive alcohol drinking in adolescence introduces a “toxin” that negatively impacts a person who is still experiencing a critical period of growth toward adulthood.

Recently, Ellickson, Tucker, and Klein (2003) published a 10-year prospective study in which students recruited from 30 Oregon and California schools were assessed at grades 7 and 12, and then later at age 23 (N = 6,338, 4,265, and 3,369, respectively). Young drinkers in both middle school and high school, compared to nondrinkers, were more likely to report academic problems, delinquent behaviors, and other substance use. At age 23, compared to nondrinkers, those who had been adolescent drinkers were more likely to report employment problems, continued “other” substance abuse, and criminal and violent behaviors.

These findings were supported by several retrospective reports as well. Analyzing adults interviewed as part of the National Longitudinal Alcohol Epidemiologic Survey, Grant and Dawson (1997) found that over 40% of adults who reported using alcohol before age 14 developed alcohol dependence later in their lives, compared with a rate of less than 10% among those who said they did not start drinking alcohol until after the age of 18. Similarly, DeWit, Adlaf, Offord, and Ogborne (2000) examined adult reports from a larger health survey. Of 5,856 drinkers, 501 had a DSM-IV diagnosis of lifetime alcohol abuse, and 473 had a DSM-IV diagnosis of lifetime alcohol dependence. Approximately 13.6% of the respondents who said they had their first drink between 11 and 14 years of age met diagnostic criteria for alcohol abuse 10 years later, compared with only 2% of those who said they had their first drink after the age of 18. The rates for alcohol dependence showed a parallel pattern: 15.9% of the respondents who said they had their first drink between 11 and 12 years of age reported alcohol dependence 10 years later, compared with only 1% of those who had their first drink after age 18.

Comorbidity.

Comorbid psychiatric disorders frequently co-occur with AUDs (Deas & Thomas, 2002; Gilvarry, 2000), but it is often difficult to distinguish etiological from consequential associations. For example, it is easy to imagine that a psychiatric disorder can result from continual, excessive alcohol consumption, especially in a physiological and psychological developmental period such as adolescence. In this example, the AUD would precede the comorbid disorder. Another scenario, however, is drinking alcohol to treat the symptoms of a psychiatric disorder; this is called “self-medication.” For example, an individual with a social phobia may desire the relaxing and disinhibitory effects of a few drinks prior to attending a social gathering. In this case, if an AUD is identified, it is likely that the comorbid disorder preceded the AUD. Finally, both AUD and a psychiatric disorder may have the same etiology (e.g., genetic, neurochemical). Naturally, an understanding of the etiology of both concomitant disorders can guide treatment decisions.

Behavioral disorders, mood disorders, and anxiety disorders have been frequently associated with AUDs (Deas & Thomas, 2002; Gilvarry, 2000). Attention deficit-hyperactivity disorder (ADHD) has been associated with AUDs, but its independence from conduct disorder (CD) has not been well established (Gilvarry, 2000). In a 4-year longitudinal study of ADHD children and controls (6 to 15 years old), there was no difference in the prevalence of AUDs between youths with and without ADHD (Biederman et al., 1997). Conduct disorder proved to be a significant predictor of AUDs in the target and control groups. In another study, Moss and Lynch (2001) used structural modeling to illustrate an association between ADHD and AUD for adolescent males but not females, yet CD symptoms had the strongest association with AUD in adolescents.

As implied above, there is strong empirical evidence relating CD and similar disorders (e.g., oppositional defiant disorder [ODD]) to AUDs (Clark, Vanyukov, & Cornelius, 2002). Clark, Bukstein, and Cornelius (2002) provide convincing evidence that childhood antisocial behaviors precede and predict adolescent AUDs. They argue that the association between the behavior disorders and AUDs is best understood as a manifestation of common underlying causes. These include poor behavioral regulation (possibly related to prefrontal cortex abnormalities), common genetic pathways (for instance, genetic variations influencing the dopamine system), and similar environmental factors (for instance, low levels of parental monitoring). Early identification and intervention efforts for these individuals with childhood antisocial behaviors may ameliorate later AUDs.

In adults, a person with alcohol dependence is nearly four times more likely to have major depression than a person without alcohol dependence (Petrakis, Gonzalez, Rosenheck, & Krystal, 2002). Gilvarry (2000) reported that up to one third of adolescents in addiction treatment facilities are diagnosed with mood disorders, especially major depression and dysthymia. Deas-Nesmith, Campbell, and Brady (1998) reported that 73% of inpatient adolescents who used substances met diagnostic criteria for depression. Furthermore, in 80% of those cases, the depressive symptoms predated the substance use, suggesting that the mood disorder for these adolescents was an important risk factor for developing a subsequent AUD. In the Biederman et al. (1997) study, bipolar disorder predicted substance use disorders, independent of ADHD. Although not all studies have found that mood disorders predate substance use disorders (e.g., Rohde, Lewinsohn, & Seeley, 1996), these observations suggest that mood disorders may be a risk factor for developing an AUD in some adolescents.

Anxiety disorders, especially social phobia and posttraumatic stress disorder (PTSD), may also be risk factors for AUDs. Rohde and colleagues (1996) reported that alcohol use among female high school students was associated with anxiety disorders that preceded the alcohol problems. Deas-Nesmith, Brady, and Campbell (1998) found that 60% of adolescents seeking treatment for addiction met diagnostic criteria for a social anxiety disorder. Furthermore, the anxiety symptoms generally predated substance dependence by about 2 years.

Posttraumatic stress disorder has also been implicated as a risk factor for AUDs. Kilpatrick and colleagues (2000) explored PTSD as a risk factor for substance use problems in adolescents. These investigators found that physical or sexual abuse, assault, or the witnessing of violence (e.g., murder, sexual assault) increased the risk of abuse of several illicit drugs, including alcohol. In another study, Clark et al. (1997) found that adolescents with an AUD were more likely to have a history of physical and sexual abuse compared to an adolescent control group. Furthermore, the association of PTSD and alcohol dependence was stronger in females than in males.

Special Case of College Drinking

Thus far, this chapter has focused on alcohol drinking and disorders observed in adolescent youth, most of whom attend middle school and high school. However, significant numbers of adolescents have just graduated from high school and may be attending college (typical college ages range from approximately 18 to 24 years). Drinking on college campuses has its own culture, with easy access to alcohol, and this clearly distinguishes it from our traditional view of adolescent drinking patterns, prevalence, and disorder development. In addition, access to alcohol on most college campuses, from both attitudinal and economic perspectives, is unparalleled in any other large, established adult community, and it has been associated with a high frequency of serious and sometimes life-threatening drinking-related negative behaviors. While it is not within the scope of this chapter to detail the phenomenology of college campus drinking, we briefly mention here the nature of the problem and the need for further research.

Basically, the prevalence of drinking and heavy drinking among college students is higher than that of their peers who do not attend college (U.S. Department of Health and Human Services [DHHS], 2002). This difference is due to many factors, including the influence of sororities and fraternities, greater amounts of unstructured time, easy access to those who can obtain alcohol legally, differential economic circumstances (parents or scholarships typically provide some financial support), and alcoholic beverage advertising targeted specifically at the college population. College surveys reveal that approximately 40% of college students report heavy drinking in the 2 weeks prior to the survey. Consumption is heavier among males, highest among Caucasian students, and highest in the Northeast and North Central regions of the country (DHHS, 2002).

Hingson, Heeren, Zakocs, Kopstein, and Wechsler (2002) recently reported on the serious consequences of college drinking in the United States. On an annual basis, alcohol consumption is associated with 1,400 deaths, 500,000 unintentional injuries, 600,000 assaults, and 70,000 sexual assaults of college students. Approximately 2.1 million college students drive while intoxicated each year, 400,000 report having unprotected sex while drinking, and over 150,000 develop health-related problems due to their drinking (Hingson et al., 2002). In an effort to address the serious problems of college drinking, the NIAAA (2003) recently released a program announcement “to provide a rapid funding mechanism for timely research on interventions to prevent or reduce alcohol-related problems among college students” (see http://grants1.nih.gov/grants/guide/pa-files/PAR-03-133.html).

Tobacco Use in Adolescence

Adolescent tobacco use is widely recognized as a major public health problem (Windle & Windle, 1999). According to recent data, 64% of adolescents reported ever having smoked cigarettes, 28% reported having smoked on at least 1 day in the past month, and 14% reported having smoked on at least 20 of the last 30 days (CDC, 2002b). Among high school seniors who indicated that they currently smoked, 29% reported symptoms that met the DSM-III-R criteria for nicotine dependence (Stanton, 1995). Moreover, over one half of adolescent smokers indicated that they experience withdrawal symptoms following a quit attempt, and 70% regret ever having started smoking (CDC, 1998; Colby, Tiffany, Shiffman, & Niaura, 2000a). Thus, early patterns of tobacco use among adolescents may develop into lifelong nicotine addiction.

Nicotine and Nicotine Dependence

Nicotine, a potent alkaloid in tobacco leaves, is what sustains tobacco smoking, which efficiently delivers nicotine to the brain (Benowitz, 1990). Nicotine, steam distilled from burning tobacco plant material in a cigarette, is inhaled into the lungs on small tar droplets and absorbed rapidly into arterial blood, reaching the brain within 20 sec after each puff. Nicotine resembles the neurotransmitter acetylcholine and binds to a complex family of nicotinic cholinergic receptors distributed throughout the brain and elsewhere in the body. During cigarette smoking, with each puff, nicotine levels in brain tissues briefly rise and then decline rapidly, more because of rapid distribution into tissues than because of metabolic breakdown. Each puff acts like an individual dose of drug.

Blood and brain nicotine levels peak immediately after each cigarette, but nicotine gradually accumulates over 6 to 10 hr of repeated smoking because of its 2-hr half-life. During sleep, nicotine levels fall, and they begin to rise again when the first cigarette of the day is smoked. Thus, someone smoking 10 cigarettes a day exposes the brain to nicotine 24 hr a day, along with rewarding perturbations in brain nicotine levels after each of the roughly 100 puffs. Each cigarette in effect delivers about 10 separate doses of nicotine to the brain. Marijuana and cocaine smoking involve a similar pattern of drug delivery.
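
The interplay of a 2-hr half-life with repeated dosing can be shown numerically. The sketch below (Python) treats each cigarette as one arbitrary unit dose on a hypothetical schedule of ten cigarettes at 90-min intervals; it is a toy model of daytime accumulation and overnight washout, not of real nicotine kinetics.

    import math

    K = math.log(2) / 2.0  # first-order elimination rate for a 2-hr half-life

    def relative_level(cigarette_times_hr, t_hr):
        # Relative blood nicotine at hour t: decayed sum of all prior unit doses.
        return sum(math.exp(-K * (t_hr - c)) for c in cigarette_times_hr if c <= t_hr)

    # Ten cigarettes, one every 1.5 hr from 08:00 to 21:30.
    cigarettes = [8.0 + 1.5 * i for i in range(10)]
    for t in (9.0, 15.0, 22.0, 31.0):  # 31.0 is 07:00 the next morning
        print(f"{t % 24:04.1f} h: relative level {relative_level(cigarettes, t):.2f}")

Under these assumptions the level climbs to about twice a single-cigarette dose by evening and falls by more than 90% overnight, mirroring the daily cycle described above.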

Adolescent smokers quickly learn to regulate their smoked nicotine dose on a puff-by-puff basis, maintaining a brain concentration of the drug that just avoids nicotine toxicity yet satisfies the increasing need for nicotine as dependence develops. Tobacco smoking is initially aversive for almost everyone. It is unlikely that a young person would begin tobacco or other drug smoking without the support and teaching of peers, the observation of admired or envied adult smokers, and the reinforcement associated with the tobacco industry's multibillion-dollar advertising that promotes the rewards of cigarette smoking.

Nicotine delivered by cigarettes offers a beginning smoker individualized and personal control of psychoactive drug dose unobtainable by any other drug delivery system. Rapid onset of nicotine toxicity, particularly the early symptoms of nausea, weakness, and sweating, gives rapid feedback that the absorbed dose is higher than optimal, exceeding the acquired tolerance level. After repeated exposure to smoking, the difficulty in concentrating and other symptoms of nicotine withdrawal that develop when brain levels are falling offer another set of cues that it is time for a cigarette to be smoked.

If nicotine toxicity is avoided, adult tobacco smokers report enhanced concentration and improved mood. Attention to task performance improves, as do reaction time and problem solving. Adult smokers report enhanced pleasure and reduced anger, tension, depression, and stress after a cigarette. Whether improved performance and enhanced mood after smoking are due to relief of abstinence symptoms rather than to intrinsic effects of nicotine remains unclear. However, enhanced performance in nonsmokers given nicotine suggests some direct enhancement by the drug. Reports from adolescent tobacco users parallel those of adults, which suggests that nicotine has these same pharmacologic effects in an adolescent smoker (Corrigall, Zack, Eissenberg, Belsito, & Scher, 2001).

Nicotine, by its effects on nicotinic cholinergic receptors in the brain, enhances or modulates the release of many neurotransmitters—dopamine, norepinephrine, acetylcholine, serotonin, vasopressin, β-endorphin, glutamate, GABA, and others (Tobacco Advisory Group, Royal College of Physicians, 2000). Changes in brain neurochemistry after nicotine exposure are thus profound. Neurotransmitter release is assumed to mediate nicotine's positive effects on arousal, relaxation, and cognition and its relief of stress and depression. The mesolimbic dopamine system is important in mediating the pleasurable and other rewards of nicotine, as with other drugs of abuse, and is important for understanding the withdrawal phenomena as well.

When brain nicotine levels decrease, diminished neurotransmitter release contributes to a relative deficiency state. The resulting symptoms of withdrawal—craving, lethargy, irritability, anger, restlessness, inability to concentrate, anxiety, depressed mood, and other symptoms that characterize a nicotine withdrawal syndrome—develop rapidly (DiFranza et al., 2002). Regular adolescent smokers report withdrawal symptoms similar to those reported by adults. Whether the withdrawal symptoms experienced by an adolescent nicotine addict are more or less intense after comparable levels of nicotine exposure is not established.

Young smokers who are still experimenting are likely to become regular smokers surprisingly rapidly. The precise proportion that goes on to regular smoking, and the factors that influence any single individual's progression from experimentation to regular smoking, remain uncertain. Measurable symptoms of nicotine dependence occur within weeks of the beginning of occasional nicotine use, probably well before daily smoking has been established. One third to one half of adolescents who experiment with more than a few cigarettes become regular smokers (Colby, Tiffany, Shiffman, & Niaura, 2000).

Nicotine dependence is associated with tolerance, craving for tobacco, desire to use tobacco, withdrawal symptoms when the nicotine dose is decreased or unavailable, and loss of control over the frequency and duration of use. Dependence is diagnosed using the criteria common to all drugs, as defined in DSM-IV (American Psychiatric Association, 1994).

Although traditionally it has been assumed that a period of sustained, daily use is required to produce dependence, in recent years clinical observations of adolescent smokers and data from animal laboratory experiments suggest that dependence develops rapidly in adolescent smokers and in adolescent laboratory animals (Abreu-Villaca et al., 2003; DiFranza et al., 2000; Slotkin, 2002). Some adolescent smokers demonstrate evidence of nicotine dependence well before becoming daily smokers and possibly after only a few days of intermittent tobacco smoking (DiFranza et al., 2000; O'Loughlin, Tarasuk, DiFranza, & Paradis, 2002).

This pattern is consistent with a variety of evidence from animal research showing that the adolescent brain is more susceptible to rapid development of nicotine dependence (Abreu-Villaca et al., 2003). Animal researchers have focused on possible brain mechanisms that account for this special susceptibility (Slotkin, 2002). For example, nicotine exposure in adolescent rats results in greater and more persistent nicotinic receptor up-regulation and cholinergic activity than in adult animals. The rapidity of change in the animal models is consistent with reports of adolescent smokers who develop evidence of nicotine dependence after only a few days' experience with just a few cigarettes (DiFranza et al., 2000, 2002). Brief nicotine exposure results in alterations in cholinergic receptor activity lasting at least 1 month after exposure in rats, which suggests that brief exposure to nicotine changes cholinergic tone in a persistent manner. The level of exposure in the animal models was thought to be in the range experienced by adolescents occasionally smoking three to five cigarettes a day. The data suggest the possibility that the brain mechanisms that account for nicotine dependence can be activated by nicotine exposure from only occasional smoking. Although nicotine has a variety of systemic effects in a smoker, particularly cardiovascular and neuroendocrine changes, some animal researchers believe they have found evidence of a primary neurotoxicity as well, with lasting injury to cells, particularly those of the cholinergic system (Slotkin, 2002). There is no evidence of cholinergic toxicity in human studies. In summary, animal experiments with nicotine suggest rapid and persistent changes in nicotinic receptor and cholinergic function in adolescent rat brains at doses perhaps as low as one tenth of those ingested by regular tobacco smokers.

Determinants of Smoking

Nicotine is essential to maintain tobacco smoking, but the beginning of tobacco addiction, as with other addictions, is influenced mostly by nonpharmacologic, learned, or conditioned factors. Peer influence, social setting, personality, and genetics determine who begins and who continues to smoke. In order to develop and implement more effective prevention and treatment programs for adolescent tobacco use, a greater understanding of the determinants of these behaviors is needed. The following summarizes a few of these determinants.

Socioenvironmental factors.

Socioenvironmental factors can have an important influence on youth tobacco use. For example, smoking among peers is a powerful determinant of smoking initiation and progression (Choi, Pierce, Gilpin, Farkas, & Berry, 1997; Conrad, Flay, & Hill, 1992). Tobacco industry promotional activities can also have a significant impact on adolescent smoking behavior (Pierce, Choi, Gilpin, Farkas, & Berry, 1998). Of particular relevance to prevention strategies are socioenvironmental factors that protect against youth smoking. For example, adolescents who are involved in interscholastic sports and non-school-related physical activity are less likely to be established smokers (Escobedo, Marcus, Holtzman, & Giovino, 1993; Patton et al., 1998; Thorlindsson & Vilhjalmsson, 1991). Religious affiliation appears to be protective against smoking (Heath, Madden, et al., 1999), as are school and home smoking restrictions (Farkas, Gilpin, White, & Pierce, 2000; Wakefield et al., 2000).

Psychological factors.

Relatively less attention has been devoted to the role of psychological factors in youth smoking. Available data suggest that tobacco use and nicotine dependence are more common among adolescents who experience depressive symptoms (Escobedo, Kirch, & Anda, 1996; Wang et al., 1999), particularly those with more serious psychiatric conditions (Breslau, 1995). Adolescents with ADHD are at greater risk for tobacco use (Milberger, Biederman, Faraone, Chen, & Jones, 1997), and weight concerns appear to promote smoking initiation and current smoking in female adolescents (French, Perry, Leon, & Fulkerson, 1994).

While some socioenvironmental and psychological factors appear to play an important role in the early stages of smoking uptake, genetic factors may be more influential in the development of nicotine dependence. Differentiation of the precise set of factors that are important in each of these transitions is a critical step toward developing effective strategies to prevent progression to addicted smoking and facilitate quitting.

Individual variability.

There are abundant data supporting the heritability of cigarette smoking (see discussion at the end of this chapter). Genetic variability influences the subjective effects of nicotine. Nicotine has both positive reinforcing effects (e.g., enhanced alertness, arousal, and pleasure) and negative reinforcing effects (relief of adverse mood and withdrawal symptoms) (Pomerleau & Pomerleau, 1984). Individual differences in the rewarding effects of the initial dose of nicotine from a cigarette may account for the observation that some young adults become dependent smokers, whereas others can experiment and not progress to nicotine dependence (Eissenberg & Balster, 2000; Flay, d'Avernas, Best, Kersell, & Ryan, 1983). In support of this hypothesis, one cross-sectional analysis found that pleasant emotional and physiological effects of the initial smoking experience discriminated between teens who continued to experiment with cigarettes and those who did not (Friedman, Lichtenstein, & Biglan, 1985). Among adults, retrospective reports of the rewarding effects of the initial smoking experience (e.g., pleasurable rush or buzz, relaxation) were associated with current levels of nicotine dependence (Pomerleau, Pomerleau, & Namenek, 1998). Studies of genetic influences on nicotine metabolism may lead to better forms of smoking prevention and treatment. For example, on the basis of this new knowledge, researchers are testing medications that may reduce the rate of nicotine metabolism, thereby increasing the aversive effects of initial smoking experiences (Sellers, Tyndale, & Fernandes, 2003).

Personality traits.

Novelty seeking as a personality trait has been linked to tobacco use during adolescence (Wills, Vaccaro, & McNamara, 1994; Wills, Windle, & Cleary, 1998) and to early onset of smoking in adolescent boys (Masse & Tremblay, 1997). Genetic studies have related novelty seeking to genetic variants in the dopamine pathway (Noble et al., 1998; Sabol et al., 1999), suggesting that these genetic effects on smoking behavior may be mediated in part by novelty-seeking personality traits. For example, adolescents who are novelty seekers or risk takers may be exposed at a younger age to peer smoking influences. There is also evidence to suggest that hostility, impulsivity, or anxiety-related traits might mediate the influence of serotonergic gene variants on smoking behavior (Gilbert & Gilbert, 1995). While not yet tested in the tobacco arena, interventions with messages and formats targeted to adolescents with these predisposing personality traits may be more effective than broad-based appeals (Lerman, Patterson, & Shields, 2003).

Other Substances

MDMA (Ecstasy)

MDMA (3,4-methylenedioxymethamphetamine), also known as ecstasy among other names, has similarities to other amphetamines, with stimulant and hallucinogen-like properties (Green, Mechan, Elliott, O'Shea, & Colado, 2003). Usually taken orally, it can also be injected. MDMA produces feelings of energy along with a pleasurable, altered sense of time and enhanced perception and sensory experiences. MDMA effects last 3 to 6 hr. A typical oral dose is one or two tablets, each containing 60 to 120 mg of MDMA, although recently the average dose may be increasing. As is characteristic of all illicit drugs, the chemical content and potency of MDMA tablets vary; thus estimates of dose, or even assumptions about which drug was actually ingested, are unverified (Cole, Bailey, Sumnall, Wagstaff, & King, 2002).

Perhaps MDMA has a special appeal to adolescents because its usual effects include, along with mental stimulation, feelings of relatedness and empathy toward other people and feelings of well-being (Cole & Sumnall, 2003). These mood effects along with the experience of enhanced sensory perception make MDMA an appealing drug, particularly as typically used in social gatherings, dances, and concerts. At higher doses or in susceptible individuals, undesirable effects include rapid onset of anxiety, agitation, and feelings of restlessness (Gowing, Henry-Edwards, Irvine, & Ali, 2002). During the period of marked intoxication, memory is impaired, sometimes for days or longer in regular users. Information processing and task performance are disrupted. Regular users and sometimes even occasional users report withdrawal phenomena when MDMA effects are wearing off. Withdrawal effects include feelings of depression, difficulty concentrating, unusual calmness, fluctuating mood, and feelings of pervasive sadness sometimes lasting a week or more after an evening of moderate MDMA use (Parrott et al., 2002).

MDMA can be associated with addictive drug-using patterns. Some users report continued use to relieve the unpleasant feelings that follow MDMA use. Compared to nonusers, regular MDMA users report increased anxiety, greater impulsiveness, feelings of aggression, sleep disturbance, loss of appetite, and reduced sexual interest (Parrott, 2001). Whether these reports reflect consequences of MDMA use or symptoms and behaviors that predate it is not established.

As with other amphetamine-like stimulant drugs, high doses of MDMA, particularly if used with other stimulants, can be associated with nausea, chills, sweating, muscle cramps, and blurred vision. Anxiety, paranoid thinking, and, later, depression are common. After higher doses, markedly increased blood pressure, loss of consciousness, and seizures may occur, and under certain conditions of dose, drug combinations, and heat, the body's thermoregulatory mechanisms fail. The resultant marked increase in body temperature (hyperthermia) can be rapidly followed by multiple organ failure and death (Schifano, 2003). MDMA is commonly used with alcohol, which increases MDMA toxicity.

The nature of MDMA metabolism in the body contributes to toxicity. After a dose of MDMA is rapidly absorbed, it slows its own breakdown, resulting in unexpectedly high MDMA concentrations with repeated doses (Farre et al., 2004; Green, Mechan, et al., 2003). After regular use, tolerance to the desired MDMA effects develops (Verheyden, Henry, & Curran, 2003). This tolerance leads regular MDMA users to take larger or more frequent doses, resulting in the accumulation of toxic blood levels because of the drug-induced slowdown in its own metabolism.
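
As a purely qualitative illustration of this self-inhibited metabolism, the following toy model (arbitrary doses and rate constants; not a validated pharmacokinetic model of MDMA) contrasts ordinary first-order elimination with elimination that slows as drug accumulates:

```python
# Toy model: contrast fixed first-order elimination with elimination that
# slows as drug accumulates, mimicking a drug that inhibits its own
# metabolism. All doses and rate constants are arbitrary illustrations.
import math

DT = 0.1                        # time step, hours
DOSE_TIMES = (0.0, 2.0, 4.0)    # hours at which a unit dose is taken

def peak_level(self_inhibiting: bool) -> float:
    """Simulate 12 hr of repeated dosing and return the peak level reached."""
    level, peak = 0.0, 0.0
    for step in range(int(12 / DT)):
        t = round(step * DT, 3)
        if t in DOSE_TIMES:
            level += 1.0
        # Baseline elimination rate, optionally reduced as level rises.
        k = 0.35 / (1.0 + 2.0 * level) if self_inhibiting else 0.35
        level *= math.exp(-k * DT)
        peak = max(peak, level)
    return peak

print(f"fixed clearance peak:           {peak_level(False):.2f}")
print(f"self-inhibiting clearance peak: {peak_level(True):.2f}")
```

With the same three doses, the self-inhibiting variant reaches a disproportionately higher peak, mirroring the chapter's point that repeated dosing can produce unexpectedly high concentrations.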

MDMA increases the activity of brain serotonin, dopamine, and norepinephrine (Gerra et al., 2002; Vollenweider, Liechti, Gamma, Greer, & Geyer, 2002). Compared with methamphetamine, MDMA produces greater serotonin release and less dopamine release. In animals, moderate to high doses of MDMA are toxic to serotonergic nerve cells and are associated with persistent cellular changes. As MDMA's behavioral effects wear off, serotonin levels remain decreased for days, perhaps longer. One controversy about the toxicity of MDMA involves how to extrapolate between human doses and the doses used in animal models, in which signs of toxic effects can be observed directly (Green, Mechan, et al., 2003).

A relative serotonin deficit experienced by frequent MDMA users may account for the mood, sleep, and other behaviors and symptoms associated with frequent MDMA use (Parrott, 2002). As with all psychoactive drugs, the general considerations of gender, dose, frequency of exposure, and concurrent use of other drugs, along with genetic and environmental factors, are probably important determinants of the consequences of MDMA exposure for any specific individual (Daumann et al., 2003; De Win et al., 2004; Obrocki et al., 2002; Roiser & Sahakian, 2003). Certainly many people have used MDMA and appear to have avoided measurable harm, but some have died after taking MDMA. As with nicotine, animal experiments suggest that younger brains may be more susceptible to the neurotoxic effects of MDMA (Williams et al., 2003), although important experiments with adolescent animals have not yet been reported.

Inhalant Abuse

Thousands of chemicals produce vapors or can be delivered as aerosols and inhaled to produce psychoactive effects (Anderson & Loomis, 2003). Inhalants can be organized by their chemical classification (toluene or nitrous oxide, for example), by their legitimate use (as an anesthetic, solvent, adhesive, fuel, etc.), or by their means of delivery (as a gas, a vapor, or an aerosol) (Balster, 1987). What inhalants have in common is that they are rarely taken by other routes when abused, although some can be swallowed or injected.

Volatile solvents include common household and workplace products—cleaning fluids, felt-tip markers, glues, paint thinners, and gasoline (Anderson & Loomis, 2003). Volatile medical anesthetics, such as halothane or isoflurane, and other ethers occasionally turn up among adolescent inhalant users. Another category of inhalants, aerosols, is available as the solvents in spray cans that deliver paint, deodorants, hairspray, insecticides, and other products. Inhalant gases include household and commercial gases—butane in cigarette lighters and nitrous oxide in whipped cream dispensers or from medical sources. Nitrites are a special class of inhalants occasionally encountered by adolescents. When inhaled, nitrites dilate blood vessels and relax smooth muscle; unlike other inhalants, they are more stimulating than depressant and are used primarily to enhance sexual activities.

Inhalants have been used as intoxicants for hundreds of years. Inhalants, particularly solvents, are often among the first psychoactive drugs used by children; an estimated 6% of children have tried inhalants on at least one occasion by the fourth grade. Inhalants stand out among abused drugs in being used more by younger than by older children, though on occasion inhalant abuse persists into adulthood (Balster, 1987). Although inhalant abusers generally use whatever is available, preferred agents exist, varying from region to region in an almost fad-like way.

National and state surveys indicate inhalant use peaks around the seventh to ninth grades, with 6% of eighth graders reporting use of inhalants within the previous 30 days. The prevalence of inhalant use in young adolescents exceeds marijuana use and is more frequent in boys and in adverse socioeconomic conditions. Poverty, childhood abuse, poor grades, and early school dropout are associated with greater inhalant abuse (Beauvais, Wayman, Jumper-Thurman, Plested, & Helm, 2002; Kurtzman, Otsuka, & Wahl, 2001).

Inhalants are easy to self-administer and readily available, which explains their appeal to children. The solvents can be inhaled from a bag, from a cloth held over the face, or by sniffing from the container. Aerosol propellants can be sprayed directly into the mouth, inhaled from a balloon, or sprayed into a bag and then inhaled.

As with smoked drugs, inhalant effects depend on the substance used, efficiency of the inhalant delivery system, and the amount inhaled. Length and frequency of use are important because tolerance to many effects develops.

When inhaled, the drugs move from the lungs to the brain, and effects begin within seconds. Psychoactive effects dissipate within minutes after inhalation stops. The chemicals are distributed to other organs, potentially damaging the liver, kidneys, and peripheral nerves. The experience produced by most inhalants is similar to that of drinking alcohol: an initial feeling of relaxation, anxiety relief, and disinhibition. As intoxication increases with repeated doses, speech becomes slurred and fine motor movements and the ability to walk are impaired; with increasing and repeated doses, loss of consciousness and an anesthetic state or coma occur. The neural mechanisms by which inhalant intoxication occurs are not well understood (Balster, 1998). During the period of intoxication many neural systems become dysfunctional.

As intoxication wears off, a hangover state commonly ensues. The severity of the postintoxication effects depends on the dose and duration of exposure but typically includes headache or nausea. Inadvertent overdose is possible, particularly when the bag or other delivery system is positioned so that inhalant delivery continues after consciousness or coordination is lost.

Long-term effects vary with the inhalant and frequency of use but can include central nervous system symptoms such as fatigue, difficulty concentrating, and impaired memory (Lorenc, 2003). Some solvents or aerosols produce nosebleeds, bloodshot eyes, cough, and sores on the lips, nose, or mouth. If used over long periods of time, permanent brain damage or other organ damage (kidney, liver, and peripheral nerves) can develop (Aydin et al., 2002). Tolerance to the depressant effects develops with repeated use. In frequent users, withdrawal phenomena have been described on cessation of inhalant use.

A common cause of death during inhalant use is rapid inhalation of large amounts of solvents, followed by strenuous activity. This results in impaired cardiac function and arrhythmias. Injury and death may result from accidents associated with impaired judgment, motor impairment, or falls. Suffocation from inadequate air during the inhalation of concentrated gases or solvents is possible, and because many inhalants are flammable, fires or explosions may lead to injury or death.

When asked, children typically report that they sniff inhalants because it's fun and they like the feeling of intoxication. Initial use is often in a group with considerable peer pressure. Some users report that the intoxicated state is a way to avoid experiencing or dealing with worries and problems. Although most child inhalant use is transient and initially stems from curiosity, with the wrong kind of group pressure it can become a repeated behavior.

GHB (Gammahydroxybutyrate; Liquid Ecstasy, Georgia Home Boy, and Other Names)

A potent CNS depressant, GHB is typically taken to produce euphoria and a relaxed, uninhibited state similar to that produced by alcohol (Nicholson & Balster, 2001; Teter & Guthrie, 2001). GHB is a clear, odorless, slightly salty-tasting liquid. Because of its steep dose–effect curve, inadvertent overdose is frequent. Nausea, vomiting, slowed heart rate, loss of consciousness, coma, respiratory depression, and seizures can require emergency treatment. Coma, along with vomiting and an obstructed airway, can lead to death. The purity and strength of individual doses of GHB vary greatly and can contribute to overdoses, particularly in inexperienced users. When GHB is taken with other CNS depressants, lethality increases; deaths from GHB typically occur after combined use with alcohol. GHB is metabolized so rapidly that postmortem toxicology may underestimate the frequency of GHB-related deaths.

Regular users of GHB report that they must increase the dose to attain euphoric and relaxing effects; thus tolerance seems likely. A withdrawal state with increased heart rate, restlessness, anxiety, agitation, delirium, and disrupted sleep follows sudden cessation of regular GHB use (Miotto et al., 2001). GHB has been perceived as a safe drug because it was once available in health food stores as a dietary supplement, and its potential toxicity may be underestimated by adolescents (Mason & Kerns, 2002). Although GHB is now a controlled drug, its precursors are readily available through Internet distributors. Because it is odorless and relatively tasteless, GHB has reportedly been added to the drinks of unsuspecting victims. It can sedate or anesthetize an unwary recipient, leading to its use as a date rape drug (Schwartz, Milteer, & LeBeau, 2000).

Rohypnol: Flunitrazepam (Roofies, Rophie, Forget Me)

Rohypnol is a potent benzodiazepine sedative with similarities to Valium or Xanax, except for its greater potency (Simmons & Cupp, 1998). Although a prescribed medication in some countries, Rohypnol is not approved for prescription use in the United States. Taken by mouth in tablet form or dissolved in beverages, Rohypnol rapidly produces profound sedation or loss of consciousness and marked amnesia for events occurring during the period of intoxication. With no odor and almost no taste, it can easily be administered to someone without their knowledge. Like GHB, it has been associated with date rape and other sexual assaults (Schwartz et al., 2000; Slaughter, 2000).

Hallucinogens

Hallucinogens are a pharmacologically diverse group of drugs. They have in common the ability to produce profound distortions in sensory perception accompanied by a relatively clear level of consciousness (Hollister, 1968). The perceptual distortions are typically termed hallucinations, though in fact true hallucinations are relatively uncommon. The sought-after alterations in visual images, perception of sounds, and bodily sensations are sometimes accompanied by intense mood swings and feelings of being out of control that can be disturbing to the uninitiated (Strassman, 1984).

Humans have valued hallucinogenic drugs for thousands of years. The older plant-derived hallucinogens, for example, mescaline, psilocybin, and ibogaine, are chemicals structurally similar to brain neurotransmitters such as serotonin, dopamine, and norepinephrine. Historically, drug-induced hallucinogenic states were typically part of social and religious rituals rather than entertainment. Plant-based hallucinogens are still available and are even sold over the Internet (for example, psilocybin mushrooms and peyote cacti), but since the 1960s the prototype hallucinogen has been LSD (lysergic acid diethylamide), an extremely potent, chemically synthesized drug, readily available through illicit sources and, compared to many drugs, relatively inexpensive (Hofmann, 1994). LSD's physiologic effects are relatively few and mild—dilated pupils, increased deep tendon reflexes, increased muscle tension, and mild motor incoordination. Heart rate, blood pressure, and respiration increase, but not greatly. Nausea, decreased appetite, and increased salivation are common.

In nontolerant users, about 25 μg of LSD is a threshold dose. The psychological and perceptual state produced by LSD is in general similar to that produced by mescaline, psilocybin, and hallucinogenic amphetamine analogs; the major difference is potency, with LSD being hundreds to thousands of times more potent. Acquired tolerance to LSD can be profound. After 3 days of successive daily doses, a 4- or 5-day drug-free period is necessary to again experience the full sensory effects, which limits, to some extent, the frequency of use.

In recent years, LSD has been distributed as “blotter acid”—that is, on sheets of paper perforated into postage-stamp-size squares, each containing 30 to 75 μg of LSD and taken by chewing. The effects of a single dose last 6 to 12 hr, diminishing gradually.

LSD alters the function of brain serotonin receptors (Aghajanian & Marek, 1999). At higher doses LSD can produce a distressing drug-induced psychosis with similarities to naturally occurring psychotic states, such as acute schizophrenia. The user has difficulty in recognizing reality, thinking rationally, and communicating easily with others (Blaho, Merigan, Winbery, Geraci, & Smartt, 1997; Strassman, 1984).

For reasons not well understood, an LSD-induced experience can be psychologically traumatic, particularly for poorly prepared novices, and the symptoms can persist long after the pharmacologic effects of LSD have worn off (Blaho et al., 1997). A persistent LSD-induced psychosis with mood swings ranging from mania to depression, visual disturbances, and hallucinations is relatively uncommon. Individuals who are predisposed, for genetic or other unknown reasons, to developing schizophrenia may be more likely to experience it (Hollister, 1968).

A disorder known as flashbacks, or more formally in DSM-IV, hallucinogen persisting perception disorder, has been described (Halpern & Pope, 2003). Recurrent, primarily visual disturbances can follow even a single exposure to LSD or another hallucinogen and recur over days or months. Flashback symptoms typically last only a few seconds. Although only an occasional consequence of hallucinogen use, a flashback can be a substantial problem when it occurs. Considering the multiple hallucinogen doses taken by millions of people since the late 1950s, relatively few cases of flashback phenomena have been reported in the scientific literature (Halpern & Pope, 1999).

Other hallucinogens, such as mescaline, consumed in the form of peyote buttons from cacti, or tryptamine hallucinogens, for example, dimethyltryptamine (DMT), are less commonly used, probably because they are less available to adolescents. In recent years, however, trafficking over the Internet has enhanced availability (Halpern & Pope, 2001). Psilocybin is occasionally available to adolescents, usually ingested as psilocybin-containing mushrooms. Psilocybin sold illicitly as pills or capsules more likely contains phencyclidine or LSD than psilocybin.

Newly rediscovered hallucinogens appear regularly. An example is Salvia divinorum, recently popularized through Internet resources (Halpern & Pope, 2001). Salvia illustrates the rapid awareness, increased interest, and progression of use of an old substance that, without the Internet, would likely have remained a relatively unknown plant hallucinogen. Salvia is a mint plant long used as a medicine and sacred sacrament in rural Mexico (Sheffler & Roth, 2003). A relatively mild hallucinogen at usually ingested doses, it is easily cultivated and now extensively discussed, advertised, and sold inexpensively via the Internet. Its active and potent component, salvinorin A, is absorbed when the plant is chewed or the leaves are smoked. Salvia's pharmacologic effects and metabolic fate have not been adequately researched.

Phencyclidine (PCP, “Angel Dust”) and Ketamine (K, Special K, Vitamin K, Kat Valium)

Phencyclidine (PCP) and a shorter-acting analogue, ketamine, were developed as surgical anesthetics (Reich & Silvay, 1989). At lower doses, both alter perception and produce feelings of detachment and of being disconnected or dissociated from the environment, leading to use of the term dissociative anesthetics to describe this class of drugs and to distinguish them from hallucinogens. At anesthetic doses, patients are quiet, with eyes open in a fixed gaze, in a seemingly cataleptic state, and experience no pain during a surgical procedure. Both PCP and ketamine produce similar effects by altering the distribution of an important brain neurotransmitter, glutamate.

Phencyclidine anesthesia produced a sometimes distressing delirium as the anesthetic was wearing off, so ketamine, which is shorter acting and slightly less potent but associated with briefer and less troublesome delirium, replaced it. Most abusers do not overdose to full anesthetic levels (Freese, Miotto, & Reback, 2002). However, depending on drug dose and tolerance, PCP or ketamine intoxication can progress from feelings of detachment and perceptual changes through confusion, delirium, and psychosis to coma and coma with seizures (Dillon, Copeland, & Jansen, 2003; Jansen & Darracot-Cankovic, 2001). After overdose, the progression to recovery follows the reverse pattern. Treatment of symptoms is primarily supportive. Ketamine produces a shorter period of intoxication; in a surgical setting a single anesthetic dose produces coma for only 10 minutes as compared to a much longer coma after a single large dose of phencyclidine. When abused, these drugs can be taken by mouth or, for more rapid effects, smoked or sniffed. When used medically they are injected. With frequent use, tolerance and dependence develop (Pal, Berry, Kumar, & Ray, 2002).

Ketamine is odorless and tasteless and can be surreptitiously added to someone's drink to produce a period of impaired awareness and amnesia. Thus ketamine has been used during sexual assaults and date rape. Phencyclidine is inexpensive to produce and distribute, so it is often substituted for other illicit drugs—for example, it is misrepresented as MDMA or THC.

Club Drugs

The term club drugs refers to a variety of drugs that have in common only that they are typically used at all-night parties or “rave” dances, clubs, and bars (Smith, Larive, & Romanelli, 2002; Weir, 2000). The drugs in this group are varied. Their pharmacology and patterns of use vary in different regions (Gross, Barrett, Shestowsky, & Pihl, 2002). Patterns of use, dose, and popular drug mixes change over time. The most common club drugs, particularly marijuana, cocaine, MDMA (ecstasy), and methamphetamine, are discussed elsewhere in this chapter. Club drugs that have come to the attention of adolescents include GHB, flunitrazepam (Rohypnol), and ketamine. Thus the list includes stimulants, depressants, and hallucinogens. MDMA, GHB, and Rohypnol have received the most recent attention as club drugs. The special appeal of club drugs to an adolescent includes their novelty and fad-like qualities. Unfortunately, among users there is a misperception about the relative safety of club drugs (Koesters, Rogers, & Rajasingham, 2002). Their use, particularly by novices, can lead to serious health problems (Tellier, 2002).

THE NEUROBIOLOGY OF ADDICTION

As discussed above, most adolescents who experiment with drugs do not progress to clinical problems. Some of them progress to the level of abuse and a smaller number progress to addiction (dependence). The latter has been the focus of much biological research because the chronic relapsing nature of addiction suggests that changes in the brain underlie its persistent course. Over the past several decades, neuroscientists have uncovered compelling evidence supporting the notion that addiction is a disease, primarily affecting specific brain regions that mediate motivation and natural reward. With the help of animal models and direct studies on addicted human subjects, scientists are rapidly unraveling neuronal mechanisms that underlie many of the clinical features of addiction, including drug euphoria, tolerance, withdrawal, craving, and hedonic dysregulation. It is now apparent that brain reward circuitry is stimulated by addictive agents during drug-induced euphoria and disrupted over the course of chronic exposure. The chronic dysregulation of these reward-related regions explains many of the clinical manifestations of addiction, and the restoration of normal hedonic function through medical interventions should ultimately improve the prognosis of this refractory disorder. Interestingly, addictive agents as diverse as heroin, alcohol, and cocaine (to name only a few) produce many similar neurochemical effects, supporting the established classification of different substance dependence disorders within the single category of addiction.

The interaction between an addictive exogenous agent and endogenous reward-related circuitry produces two powerful forces, euphoria and craving, that initiate and drive addiction. Whether motivated by curiosity, boredom, peer pressure, or thrill seeking, the initial use of a euphoric drug indelibly embeds the experience into memory. Because organisms are neurologically “wired” to repeat pleasurable experiences, drug euphoria positively reinforces subsequent use. When used excessively, addictive drugs produce unpleasant states (craving, withdrawal, impaired hedonic function) that negatively reinforce use and alternate with euphoria to produce a vicious cycle of addiction that becomes increasingly entrenched and uncontrollable, regardless of negative consequences. Although psychological, psychosocial, and environmental factors play critical roles in the initiation and perpetuation of addiction, brain involvement explains many of the paradoxes of addiction and provides important clues for the development of more effective and durable treatments.

Biological Research Based on Animal Models

Since the discovery of “pleasure centers” by Olds in the early 1950s, extensive research has been conducted using animal models that address the acute and chronic effects of addictive drugs on reward-related brain regions. These studies have contributed tremendously to our understanding of addiction by delineating relevant neuronal mechanisms and proposing hypotheses to define the disorder. Drug addiction, also known as substance dependence (American Psychiatric Association, 1994), is a chronically relapsing disorder characterized by (1) compulsion to seek and take the drug, (2) loss of control in limiting intake, and (3) emergence of a negative emotional state (e.g., dysphoria, anxiety, irritability) when access to the drug is prevented (defined here as dependence; Koob & Le Moal, 1997). In experimental animals, occasional but limited use of an addictive agent is very distinct from escalated drug use and the emergence of chronic drug dependence. Therefore, an important goal of current research is to understand the neuropharmacological and neuroadaptive mechanisms within reward-related neurocircuits that mediate the transition between occasional, controlled drug use and the loss of behavioral control over drug seeking and drug taking that defines chronic addiction (Koob & Le Moal, 1997).

Addiction was originally defined as an acquired abnormal state in which a drug is needed to maintain a normal state (Himmelsbach, 1943). Eventually, the definition became tied to the emergence of intense physical disturbances when drug taking ceased (Eddy, Halbach, Isbell, & Seevers, 1965). However, this definition did not capture many aspects of addiction that are unrelated to physical withdrawal, necessitating a second definition of “psychic” dependence, in which a drug produces “a feeling of satisfaction and a psychic drive that require periodic or continuous administration of the drug to produce pleasure or to avoid discomfort” (Eddy et al., 1965). Producing pleasure and avoiding discomfort, encountered clinically as euphoria and craving, are now accepted as the primary forces that drive addiction.

From a modern perspective, drug addiction has aspects of both impulse control disorders and compulsive disorders (Fig. 17.5). Impulse control disorders are characterized by an increasing sense of tension or arousal before committing an impulsive act; pleasure, gratification, or relief at the time of committing the act; and, following the act, the possible presence of regret, self-reproach, or guilt (American Psychiatric Association, 1994). In contrast, compulsive disorders are characterized by anxiety and stress before committing a compulsive repetitive behavior and relief from the stress by performing the compulsive behavior. As an individual moves from an impulsive disorder to a compulsive disorder, the motivated behavior shifts from being driven by positive reinforcement (euphoria) to being driven by negative reinforcement (craving, or discomfort). Drug addiction can be viewed as a disorder that progresses from impulsivity to compulsivity in a collapsed cycle of addiction composed of three stages: preoccupation and anticipation, binge intoxication, and withdrawal and negative affect (Fig. 17.6). These stages have biological, social, and psychological aspects that feed into each other, intensify, and ultimately lead to the pathological state known as addiction (Koob & Le Moal, 1997).

Figure 17.5 Impulse control disorders and compulsive disorders in drug addiction.

Figure 17.6 Criteria for substance dependence (DSM-IV).

Given these considerations, the modern view of addiction has shifted from a focus on physical withdrawal symptoms to the motivational aspects of addiction. This shift in emphasis is supported by the clinical axiom that mere detoxification (the elimination of drug from the body with pharmacological suppression of physical withdrawal symptoms) is insufficient treatment for addiction. More central to the transition from drug use to addiction is the emergence of negative emotions, including craving, anxiety, and irritability, when access to the drug is prevented (Koob & Le Moal, 2001). Indeed, some have argued that the development of such a negative affective state should define addiction.

Animal Models of Drug Reward

Through extensive animal research, the neurotransmitters and brain circuits that mediate drug reward have been largely delineated; the biological basis of drug reward is exemplified by the fact that laboratory animals will press levers to receive addictive substances. When provided unlimited access, animals will consistently self-administer cocaine and amphetamine to the point of death, and the power of drug reward should not be underestimated in the clinical setting. Diverse classes of addictive drugs affect different neurotransmitter systems and produce distinct activation patterns within reward circuits. Many addictive substances (including heroin, cocaine, amphetamine, alcohol, nicotine, and marijuana) acutely increase the neurotransmitter dopamine in elements of the ventral striatum, specifically the nucleus accumbens, but this increase is most robust for psychomotor stimulants and much more modest for sedative hypnotics. Other neurotransmitter systems, including opioid peptides, GABA, glutamate, and serotonin, are also involved and play more critical roles as one moves out of the domain of psychomotor stimulants. Dopamine levels in the nucleus accumbens are also elevated during activities that lead to natural rewards, providing compelling evidence that addictive drugs tap into natural motivational circuits.

Animal Models of Motivational Effects of Withdrawal

Although considerable focus in animal studies has been directed toward the neuronal sites and mechanisms that produce drug reward, new animal models have been developed to examine the negative emotional states produced by neuroadaptations to repeated drug administration. Although drug reward certainly reinforces repeated use, the transition to drug addiction appears to require an additional source of reinforcement: the reduction of negative emotional states that are associated with repeated drug administration. The ability of addictive drugs to produce both reward and negative emotional states (which they temporarily alleviate) is a powerful combination that positively and negatively reinforces the compulsive cycle of addiction. Negative emotional states can be measured using animal models that evaluate hedonic function with electrical current delivered directly into brain reward regions, a technique termed intracranial self-stimulation (ICSS). Animals will press levers to deliver this current, but after chronic exposure to drugs (including cocaine, morphine, alcohol, marijuana, and nicotine), more current is required to support ICSS (Koob, Sanna, & Bloom, 1998). These animal studies support the hypothesis that repeated drug use leads to hedonic dysregulation, which is thought to be manifested clinically by craving, anxiety, irritability, and other unpleasant feelings that arise during drug withdrawal.

A framework that models the transition from drug use to addiction involves prolonged access to intravenous cocaine self-administration. Typically, after learning to self-administer cocaine, rats allowed access to cocaine for 3 hr or less per day establish highly stable levels of intake and patterns of responding between daily sessions. When animals are allowed longer access, they consistently self-administer almost twice as much cocaine at any dose tested. These findings suggest an upward shift in the set point for cocaine reward when intake escalates (Ahmed & Koob, 1998). In addition, escalation in cocaine intake was highly correlated with reduced hedonic function, as measured by ICSS: the decreased function of the reward system failed to return to baseline levels before the onset of each subsequent self-administration session, thereby deviating more and more from control levels. These studies provide compelling evidence for brain reward dysfunction in compulsive cocaine self-administration.

Extended Amygdala: A Key Component of the Brain Reward System

Historically, the brain reward (pleasure) system was thought to be a series of brain cells and pathways, termed the medial forebrain bundle, that projected bidirectionally between the midbrain and the forebrain. A part of the forebrain component of the reward system has been termed the extended amygdala (Heimer & Alheid, 1991) and may represent a common anatomical substrate for acute drug pleasure and the dysphoria associated with compulsive drug use. The extended amygdala is made up of three major structures: the bed nucleus of the stria terminalis (BNST), the central nucleus of the amygdala, and a transition zone in the medial subregion of the nucleus accumbens (the shell of the nucleus accumbens; Heimer & Alheid, 1991). Each of these regions shares certain cell types and connections (Heimer & Alheid, 1991), receives information from limbic structures such as the basolateral amygdala and hippocampus, and sends information to the lateral hypothalamus, a brain structure long associated with processing basic drives and emotions.

A principal focus of research on the neurobiology of the pleasurable effects of drugs of abuse has been the origins and terminal areas of the midbrain dopamine system, and there is now compelling evidence for the importance of this system in drug reward (Le Moal & Simon, 1991). The major component of this circuit is the projection of dopamine-containing brain cells from the ventral tegmental area (the site of dopaminergic cell bodies) to the basal forebrain, which includes the nucleus accumbens. Other chemical transmitters, including opioid peptides, GABA, glutamate, and serotonin, form the many neural inputs and outputs that interact with the ventral tegmental area and the extended amygdala (Koob, 1992). While dopamine is critical for the reward associated with cocaine, methamphetamine, and nicotine, it has a less critical role in the pleasure associated with opiates, PCP, and alcohol. Endogenous opioid peptides (such as β-endorphin and enkephalin) and their receptors have important roles in opiate and alcohol reward.

Role of the Extended Amygdala in the Dysphoria Associated With Addiction

Addictive drugs may produce dysphoric effects during their withdrawal by disrupting the same sites that they activate during drug reward. As such, negative emotional states associated with chronic drug exposure may reflect the dysregulation of the extended amygdala and midbrain dopamine systems that are implicated in drug reward. There is considerable evidence that dopamine activity is decreased during drug withdrawal (as opposed to being increased during drug reward), and alterations in the activity of other reward-related neurotransmitter systems, such as glutamate, endogenous opioids, GABA, and serotonin, have also been reported.

Stress-related chemical systems in the extended amygdala may also contribute to the dysphoria associated with dependence. Drugs of abuse activate not only the brain pleasure systems but also the “stress” systems within the brain. One major component of the brain stress system is the brain peptide corticotropin-releasing factor (CRF), which controls the master gland (pituitary) hormonal response to stress, the sympathetic nervous system (fight-or-flight) response to stress, and behavioral (emotional) responses to stress (Koob & Heinrichs, 1999). Increases in brain and pituitary CRF are associated with the dysphoria of abstinence from many drugs, including alcohol, cocaine, opiates, and marijuana (Koob et al., 1998). Another component of the brain stress systems is norepinephrine, which is also associated with the dysphoria of drug withdrawal. Conversely, acute withdrawal from some drugs, such as alcohol, is associated with decreases in the levels of the brain “anti-stress” neuropeptide Y (NPY) in the extended amygdala (Roy & Pandey, 2002). It has been hypothesized that decreased NPY activity, combined with increased CRF activity, may contribute significantly to the dysphoric effects of drug withdrawal. This suggests that addictive drugs not only reduce the functional integrity of reward-related neurotransmitter systems but also produce stress by enhancing stress-related chemicals (CRF and norepinephrine) and reducing the NPY anti-stress system. Should even a small part of these changes persist beyond acute withdrawal, a powerful drive for resumption of drug taking would be established.

Animal Models for Conditioned Drug Effects

Through classical conditioning, environmental cues that have been repeatedly paired with drug administration can acquire drug-like (rewarding) and drug-opposite (dysphoric) properties that contribute significantly to drug craving and relapse in the clinical setting. Human studies have shown that the presentation of stimuli previously associated with drug delivery or drug withdrawal produces craving and increases the likelihood of relapse (Childress, McLellan, Ehrman, & O'Brien, 1988; O'Brien, Testa, O'Brien, Brady, & Wells, 1977). A number of animal models are available to characterize the conditioning effects imparted to formerly neutral environmental stimuli that are subsequently paired with drug self-administration. For instance, stimuli that previously signaled drug availability will cause an animal to continue to press the lever even when drug is no longer available. In other situations, animals can be trained to work for a previously neutral stimulus that predicts drug availability. In an extinction procedure, responding with and without drug-related cues provides a measure of the rewarding effects of drugs by assessing the persistence of drug-seeking behavior. Drug-related cues can also reinstate responding by animals long after drug responding has been extinguished. These findings are important because the learning reinforced by drugs persists long after the drug has been eliminated from the body. In humans, these learned or conditioned effects can produce drug craving and possible relapse long after the patient is discharged from a drug-free rehabilitation program.
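
The acquisition and slow extinction of such cue value can be illustrated with the standard Rescorla-Wagner learning rule. This is a generic sketch of classical conditioning, not a model proposed in this chapter, and the learning rate and reward magnitude are arbitrary:

```python
# Illustrative sketch: Rescorla-Wagner learning, showing how a formerly
# neutral cue paired with drug delivery acquires predictive value that
# decays only gradually under extinction, paralleling persistent
# cue-triggered responding in animal models. Parameters are arbitrary.

ALPHA = 0.2       # learning rate
REWARD = 1.0      # "drug reward" on paired trials

v_cue = 0.0       # associative strength of the environmental cue

# Acquisition: 30 trials in which the cue is paired with drug delivery.
for trial in range(30):
    v_cue += ALPHA * (REWARD - v_cue)   # prediction-error update
print(f"after 30 paired trials:     V(cue) = {v_cue:.2f}")

# Extinction: the cue is presented without drug; value declines slowly.
for trial in range(10):
    v_cue += ALPHA * (0.0 - v_cue)
print(f"after 10 extinction trials: V(cue) = {v_cue:.2f}")
```

The residual cue value after extinction trials offers a simple intuition for why drug-paired cues can still trigger responding, and clinically craving, long after drug taking has stopped.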

Brain studies support an important role for discrete glutamate-containing brain regions in conditioned cue-related phenomena associated with addictive drugs, providing insight into the mechanism of craving and relapse vulnerability. Destruction of parts of the glutamate-rich basolateral amygdala (called “lesioning”) blocks the development of conditioned drug-like effects in a variety of situations, indicating that this region may be an important neural substrate for drug craving. Interestingly, these results are consistent with brain imaging studies in humans that have shown that cocaine cue-induced craving is associated with activation of the amygdala and anterior cingulate cortex. The neurochemical substrates underlying drug-like effects also involve the dopamine and opioid chemical systems located in the basal forebrain (for review, see Hyman & Malenka, 2001; Weiss et al., 2001). Evidence supports a role for opioid peptide systems in cue-induced reinstatement of drug administration, and previously neutral stimuli associated with opiate withdrawal have long been associated with dysphoric-like effects in animal models of conditioned withdrawal (Goldberg & Schuster, 1967). These animal studies are consistent with clinical findings that the opiate antagonist naltrexone attenuates craving associated with exposure to alcohol cues and reduces relapse rates in abstinent alcoholics (O'Brien, Childress, Ehrman, & Robbins, 1998). There are reciprocal interactions between dopamine and glutamate neurotransmission in the nucleus accumbens and hippocampus that are important in regulating the expression of long-term potentiation (LTP), a process considered essential for learning and memory. Thus addiction-related learning factors may involve limbic glutamate-dependent neuroadaptations.

The extended amygdala and its connections may also be critical substrates for drug-opposite effects. Basolateral amygdala lesions block the development of conditioned drug-opposite effects (Schulteis, Ahmed, Morse, Koob, & Everitt, 2000), and animal studies indicate that the aversive aspects of opiate withdrawal may involve both the central nucleus of the amygdala and the nucleus accumbens. The shell of the nucleus accumbens and central nucleus of the amygdala, key elements of the extended amygdala, showed the greatest opiate withdrawal-induced nerve cell activation that paralleled the development of conditioned place aversions (Schulteis & Koob, 1996).

In summary, brain substrates implicated in conditioned drug effects have been intensely researched, paired both with drug administration and drug withdrawal. The amygdala, and specifically the basolateral amygdala, is an important substrate in reward-related memory and conditioning. This structure has a critical role in the consolidation of emotional memories that have long been recognized as an essential component of the addictive process (Cahill & McGaugh, 1998). How such drug memories relate to memory circuitry in general and contribute to the dysregulation of already strained reward circuits is a subject for future studies.

Molecular Aspects of Addiction

The chronic administration of diverse addictive drugs produces similar molecular changes in second messengers (the immediate chemical actions that occur after a drug binds to a receptor), signal transduction pathways, and transcription factors (chemicals that change gene expression) within the cells that populate reward centers (Koob et al., 1998). The prolonged expression of transcription factors produced by drugs of abuse, such as cocaine, may be relevant to the sustained dysregulation of reward centers (Self & Nestler, 1995). One such transcription factor, ΔFosB, is produced at increased levels in specific brain regions after the repeated administration of addictive drugs (Nestler, Kelz, & Chen, 1999). The chronic administration of alcohol also produces molecular changes, including reduced activity of cyclic adenosine monophosphate response element binding protein (CREB) in the central nucleus of the amygdala, which is linked to alcohol withdrawal (Pandey, Zhang, & Roy, 2003). Decreased CREB activity has been linked to reduced expression of NPY in the central nucleus of the amygdala, previously described as an important anti-stress chemical affected by addictive drugs. Thus, changes in transcription factors may provide a model by which molecular events translate into neurochemical changes in the extended amygdala that affect motivation and contribute to the development of addiction.

Neuroadaptations and Allostasis

The counteradaptation hypothesis presented in this section essentially states that, within reward-related circuits, the acute rewarding effect of a drug is counteracted by regulatory changes that exert aversive states (Siegel, 1975; Solomon & Corbit, 1974). While the pleasurable effects of drugs occur immediately in response to the pharmacological activation of reward centers, negative hedonic effects emerge later, persist, and intensify with repeated exposure. Hedonic dysregulation worsens over time because repeated use, while providing temporary relief, merely exacerbates the problem. The concept of “allostasis” has been proposed to explain how physiological brain changes contribute to relapse vulnerability. In contrast to homeostasis, in which a system returns to normal function, allostasis describes a brain reward system that does not return to normal but remains in a persistent dysphoric condition because of a shift in the reward set point (Koob & Le Moal, 2001). Fueled not only by the dysregulation of reward circuits but also by the activation of brain and hormonal stress systems, this process leads gradually to the loss of control over drug intake. Disruptions in reward-related neurotransmitter systems (dopamine, glutamate, opioids, serotonin, GABA) that contribute to emotional dysfunction in drug addiction and alcoholism persist long into abstinence and are reasonable targets for pharmacological treatments. Restoring normal hedonic function in drug addiction and alcoholism could significantly reduce relapse in these treatment-refractory conditions.
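
A toy rendering of this opponent-process and allostasis idea (all parameters arbitrary; an illustration of the concept, not a quantitative model from the literature) shows how an opponent process that decays incompletely between doses both blunts the drug "high" and ratchets the hedonic baseline downward:

```python
# Toy rendering of counteradaptation/allostasis: each dose gives an
# immediate hedonic boost (a-process) and recruits a slower opponent
# b-process that decays incompletely between doses, so the hedonic
# baseline ratchets downward. All parameter values are arbitrary.

A_BOOST = 1.0     # immediate rewarding effect of one dose
B_GAIN = 0.4      # fraction of each boost converted into opposition
B_DECAY = 0.7     # per-interval retention of the b-process (slow decay)

b_process = 0.0
for dose in range(1, 11):
    b_process = b_process * B_DECAY + B_GAIN * A_BOOST
    net_high = A_BOOST - b_process   # blunted euphoria (tolerance)
    baseline = -b_process            # dysphoric between-dose state
    print(f"dose {dose:2d}: net high = {net_high:+.2f}, baseline = {baseline:+.2f}")
```

In this sketch the between-dose baseline never returns to zero and the net "high" shrinks with each dose, the qualitative signature of an allostatic shift in the reward set point rather than a homeostatic return to normal.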

Vulnerability of the Developing Nervous System: Focus on Adolescents

Despite great interest and relevance to the ultimate development of drug abuse and dependence, relatively little work has been done on the potentially unique vulnerability of the developing nervous system to drugs of abuse. Researchers have been slow to adapt animal models in the drug abuse field to studies of adolescent animals, largely because many of the established models, such as intravenous self-administration, historically have required extensive time and technical expertise to establish, and because the window of adolescence in rodent models is quite short (postnatal days 28–42; Varlinskaya, Spear, & Spear, 2001). However, recent studies with nicotine and alcohol have begun to characterize a pattern of results in adolescent rats that may provide critical insights into the importance of adolescent exposure for future vulnerability to addiction. When treated with nicotine, amphetamine, or alcohol, for instance, adolescent rodents show smaller responses to the acute effects of the drugs and less of a withdrawal response (Levin, Rezvani, Montoya, Rose, & Swartzwelder, 2003; Spear, 2002).

Human adolescents commonly experiment with drugs, but relatively few go on to develop entrenched patterns of addiction. It is not known why some individuals are more vulnerable to addiction, although family studies support a contributing hereditary role. Neuroscientists have identified reward-related molecular machinery (circuits, receptors, and enzymes) that could be encoded by specific genetic polymorphisms that significantly affect the rewarding and aversive qualities of addictive drugs. Constitutional factors might thereby enhance addiction vulnerability, affect drug preference, or provide inherent protection. Many questions also exist regarding the effects of addictive drugs on the adolescent brain, which continues to develop and mature well into early adulthood. Does early experimentation, even in childhood, enhance the likelihood of addiction? Does the early abuse of one substance, particularly marijuana or nicotine, biologically predispose the individual to developing other substance dependence disorders? Does early drug experimentation produce persistent, clinically significant brain changes? Unfortunately, these questions remain unanswered. However, the current rate of knowledge expansion should shed light on these and other critical issues, enhancing our understanding of addiction and guiding the development of more effective interventions.

Adults afflicted with addiction often report that their substance use began in adolescence. However, as stated above, only a relatively small subgroup of those adolescents who try alcohol or cigarettes or use illicit drugs will progress to addiction. Adolescence is clearly a period of vulnerability for those who will go on to addiction, but what is the nature of the vulnerability? Recent brain imaging research with addicted adults suggests that the vulnerability to addiction may lie in the function (and dysfunction) of two critical brain systems: (1) the ancient brain motivational system, which underlies the powerful motivation for natural rewards such as food and sex (and is described above), and (2) the brain's inhibitory or executive function systems, responsible for inhibiting behavior and putting on the brakes, that is, for deciding when pursuit of a desired reward would be a danger or a disadvantage in the long term.

Deficits in Executive Inhibitory Circuitry

In adolescents, changes in the reward system are powerfully evident, with hormonal changes readying the system's response to rewards (e.g., sexual opportunity) that will reinforce the all-important (from an evolutionary standpoint) behaviors directed toward reproduction. In contrast, the brain's executive circuitry is not yet fully developed in adolescence: the frontal lobes, so critical for good decision making, are now known to continue to mature well into the 20s. This asymmetry, of a fully developed reward system and a vulnerable, not yet fully developed executive inhibitory system, may help account for several familiar phenomena of normal adolescence, including the new pull of sexual rewards, increased risk taking, and decision making weighted more toward the moment than toward the future. The imbalance between these opposing brain systems in adolescence may represent a critical period of developmental vulnerability for exposure to powerfully rewarding drugs of abuse. The adolescent brain is able to respond to rewards, including powerful drug rewards, but the brain's systems for governing the pursuit of these rewards and for weighing the potential negative consequences of this pursuit often lag behind.

Individual differences in the brain's reward circuitry and in the powerful motivational response to drug cues may be important contributors to adolescent addiction vulnerability. However, there are now growing indications that defects in the brain's executive inhibitory circuitry may be an equally critical and complementary source of addiction vulnerability. The brain's frontal regions usually exert a modulatory or “braking” function on the downstream reward regions. Intact frontal regions are critical for good decision making and especially allow the individual to weigh the promise of immediate reward against other competing rewards. This attribute is particularly relevant to addiction because negative consequences resulting from active addiction are typically delayed, whereas the reward of drug intoxication is immediate. In children, the brain's frontal functions are not yet fully developed, which explains why young children have difficulty inhibiting impulses toward a reward, delaying gratification, and making decisions that go beyond the moment. These abilities develop further in adolescence, but the brain's frontal lobes, and their associated functions, continue maturing into young adulthood.

There appear to be striking individual differences in the effectiveness of the brain's executive inhibitory circuitry, and addicted adults show a variety of deficits. For instance, subjects with substance dependence often perform poorly on tests of long-term strategy and decision making (Monterosso et al., 2001; Petry, 2001) and show deficits on neuropsychological tests that assess the ability to inhibit overtrained or prepotent responses (Petry, 2001). Brain imaging data from chronic cocaine users show both functional and structural defects in frontal regions. Functionally, there is evidence of lower activity (both blood flow and glucose metabolism are reduced) in these critical frontal regions in cocaine patients than in nonusers (Childress, Mozley, et al., 1999; Volkow et al., 1992, 1993). Structurally, there is evidence of less concentrated gray matter (fewer nerve cells) in the frontal regions of cocaine patients (Franklin et al., 2002), and chronic alcoholics show a similar finding (Lingford-Hughes et al., 1998). These differences in the brain's executive inhibitory circuitry might explain why substance abusers find it so difficult to inhibit or manage their cravings for drugs. Glutamate-enhancing drugs, such as modafinil, might ultimately play a role in bolstering prefrontal function (Dackis & O'Brien, 2003a).

As with some of the previously discussed findings, it is not possible to tell from a cross-sectional imaging study of addicted adults whether an observed brain difference predates the long history of drug use or whether it reflects the impact of long-term drug exposure. Studies with primates do show that chronic exposure to stimulants can undermine frontal inhibitory functions (Jentsch, Olaussen, De La Garza, & Taylor, 2002), which may help explain the poor frontal function in some human cocaine users. But the primate findings do not preclude the possibility that adolescents with poorer frontal function may be at early risk for making poor choices regarding drug experimentation or other risky behaviors. Such adolescents would be very poorly equipped to handle the motivational significance of drugs and drug cues. Consistent with this latter notion, childhood psychiatric disorders such as ADHD and conduct disorder are risk factors for adolescent substance abuse (Biederman, Wilens, Mick, Spencer, & Faraone, 1999; Wilens, Faraone, Biederman, & Gunawardene, 2003), and both of these disorders are associated with frontal deficiencies (Biederman et al., 1999). Even children who fail to meet the full clinical criteria for ADHD or conduct disorder may have some degree of frontal impairment that would make it harder for them to manage the pull of rewarding drugs and their associated cues. The neurological basis of adolescent vulnerability is reviewed in more detail by Chambers, Taylor, and Potenza (2003).

Brain Imaging: The Addicted Brain

Although addiction has a very long human history, we have only recently acquired the technology to measure alterations in the living human brain that contribute to addiction vulnerability. Within the past two decades, human brain imaging techniques have revolutionized the field of psychiatric and neurological research, allowing us to visualize both the structure and the function of living human brains. Imaging research has also begun to identify differences in the reward and executive inhibitory brain systems of addicted individuals that may be critical in addiction vulnerability.

Most of the brain imaging research in addiction has been conducted with addicted adults, posing a difficulty in applying these findings to the adolescent brain. For instance, which of the brain differences observed in adults may have existed in childhood and adolescence, as a vulnerability that predated and perhaps even predisposed the person to drug addiction? Alternatively, which brain differences in the addicted adult brain result from years of exposure to the drug of abuse? Imaging studies at only one time point in adulthood have trouble answering this important “chicken-or-the-egg” question. Imaging studies in adolescents who are at risk for drug use but have not yet begun to use drugs will be critical for interpreting the findings in adult brains. The approach in this overview is to highlight several recent findings from brain imaging in addicted adults that may provide clues about the vulnerability to addiction in adolescence. Using the framework of reward and inhibition, this overview will also identify gaps in our current knowledge and potential implications of the brain findings for treatment and prevention.

Differences in Reward Systems of Addicted Individuals

The brain's reward circuitry is composed of an ancient network of interconnected structures whose evolutionary function is to ensure pursuit of the natural rewards necessary for daily survival (food) and for survival of the species (sex). For survival, it is not sufficient simply to appreciate the natural rewards whenever they happen to occur; it is critical to learn which cues in the environment signal the critical rewards, so that the rewards can be accessed again and again. The learned signals for reward, such as the sight of a desired food or reproductive partner, have a powerful “pull” or incentive value. As previously discussed, drugs of abuse activate the brain's circuitry for natural rewards. However, reward center activation by addictive drugs greatly exceeds that produced by natural rewards, which explains why drugs like cocaine can produce euphoria that is outside the range of normal human experience. The intense subjective effects of these drugs (which, in the case of cocaine and heroin, are likened to “orgasm, but much stronger”) in turn give rise to powerful reactions to drug cues.

Although many chemical messenger systems are involved in the brain circuitry for reward and reward signals, the neurotransmitter dopamine has been the focus of most research in human brain imaging (Volkow et al., 1990; Volkow, Wang, Fischman, et al., 1997; Volkow, Wang, et al., 1999). This focus is due in part to the large number of animal studies that implicate a role for dopamine in reward function (Di Chiara, 1999; Di Chiara, Acquas, Tanda, & Cadoni, 1993; Koob & Nestler, 1997; Roberts & Ranaldi, 1995; Schultz, 2002; Wise, 1996). The focus on dopamine is also due to a current research limitation: there are several dopamine-related tracers available for human imaging research, but very few are available for the other transmitter systems. As previously noted, most drugs of abuse acutely increase the level of dopamine in the nucleus accumbens and other reward-related brain regions. This allows more dopamine to bind specialized dopamine receptors, increasing transmission of the dopamine message. Increased dopamine neurotransmission may be associated with an increase in positive mood, energy, arousal, and motor activity, all of which are effects that have been linked to the dopamine system.
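The relationship between synaptic dopamine level and receptor binding can be made concrete with the standard occupancy equation from receptor theory (a textbook formalization offered here for orientation, not a model proposed in this chapter; the symbols are illustrative):

\[
\theta = \frac{[\mathrm{DA}]}{[\mathrm{DA}] + K_d}
\]

where \(\theta\) is the fraction of dopamine receptors occupied, \([\mathrm{DA}]\) is the synaptic dopamine concentration, and \(K_d\) is the dissociation constant of the receptor. When a drug blocks dopamine reuptake or stimulates dopamine release, \([\mathrm{DA}]\) rises, \(\theta\) increases, and more of the dopamine message is transmitted.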

Low D2 Dopamine Receptors

In terms of addiction vulnerability, one might expect that individuals with more dopamine receptors would potentially experience a greater (positive) drug effect and might therefore be more likely to become addicted. However, brain imaging research suggests the opposite may be true. Cocaine-addicted adults with long histories of addiction had low numbers of dopamine (type D2) receptors in the striatum (a critical way-station in the reward circuitry), compared with controls who had no history of any substance abuse (Volkow et al., 1990, 1993).

For some years, the finding of low D2 dopamine receptors in cocaine patients was regarded as a possible consequence of the cocaine use. This interpretation was based on knowledge (from animal studies) that the increased flood of dopamine caused by cocaine or other drugs of abuse can often trigger adaptive and compensatory responses in the brain. In the case of an excessive dopamine message, as occurs during drug intoxication, reductions in dopamine synthesis, dopamine release, or dopamine receptor number could help reduce the transmission of the message and help bring the dopamine system back into homeostatic balance. Dramatic recent findings from imaging studies suggest that low D2 receptors may also predate drug use and may constitute a vulnerability factor in their own right. In a study of normal controls without addiction, those individuals within the group who “liked” an infusion of the stimulant methylphenidate had D2 receptor levels that were as low as those in cocaine patients addicted for many years (Volkow, Wang, et al., 1999). In the same study, individuals with a higher level of D2 receptors rated stimulant administration as “too much” and downright unpleasant. The study suggests that a higher level of D2 dopamine receptors may actually be protective against stimulant addiction by reducing the pleasurable effects of the powerful stimulant.

The potential protective effect of higher dopamine D2 receptors, and the interaction of environmental experience with this effect, was dramatically demonstrated in recent imaging studies with nonhuman primates given the opportunity to administer cocaine (Morgan et al., 2002). Individually housed male monkeys were imaged, and some were then group housed, allowing dominance hierarchies to be established. Alpha-male monkeys, who had achieved dominance in the group-housing situation, showed a significant increase in dopamine D2 receptors in the striatum and did not find cocaine initially appealing. However, the subordinate monkeys, who had low D2 dopamine receptors, avidly self-administered cocaine (Morgan et al., 2002).

These imaging findings suggest that a genetically determined trait, the initial level of D2 dopamine receptors in the striatal portion of the reward system, may be one vulnerability factor for enjoyment of drugs, drug taking, and eventual addiction. The findings equally demonstrate the critical role of the environment in determining whether a genetic vulnerability is expressed, or even in reshaping the trait itself. For example, the human control subjects with low D2 receptors (those who liked drugs in the methylphenidate study) had survived adolescence and early adulthood without developing addiction. The mastery experiences of the alpha-male monkeys apparently reshaped a biological risk factor for addiction into one of protection.

We have no imaging studies of D2 dopamine receptor function in adolescents. This represents an important gap in our knowledge. Consequently, we do not yet know whether adolescents with low D2 dopamine receptor levels will show a greater preference for stimulants and an enhanced vulnerability to future addiction. The D2 dopamine receptor imaging technique is unlikely to be used in research with adolescents and children because it requires administration of a radioactive tracer, albeit in minute amounts. However, other “surrogate” measures of dopamine receptor function may be obtained without radioactive imaging, e.g., by measuring the subjective response to a stimulant challenge and/or by testing the impact of a known dopaminergic agent within a nonradioactive imaging modality such as functional magnetic resonance imaging (fMRI). In addition, some cognitive tasks are sensitive to dopaminergic manipulations, and an adolescent's performance on these (within or outside an imaging setting) could be used to assess tonic dopamine function indirectly.

For those at risk, an implication of these findings for prevention and treatment might be to reset D2 receptor numbers to a more protective level. The teaching of social and behavioral coping tools to increase mastery and control over stressors could help turn a vulnerable individual (with low D2 dopamine receptors) into one who is more like the alpha monkey, ready to take on challenges and challengers. Once these monkeys had established dominance, they were much less attracted to cocaine. Alternatively, a medication could be used to reset the reward system to a more protective level. Agents that occupy the dopamine D2 receptors but block their action should, over time, lead to a compensatory increase in D2 receptors. Unfortunately, the chronic administration of dopamine-blocking drugs (e.g., the typical antipsychotic neuroleptic medications such as chlorpromazine and haloperidol) often has prohibitive side effects, including sedation and a Parkinson-like neurological syndrome, that make these medications undesirable for long-term treatment. Medications that reduce the activity of the dopamine system but do not completely block it are better tolerated. For example, GABA agonists reduce dopamine neurotransmission without producing the side effects associated with neuroleptics and might theoretically produce a gradual (compensatory) increase in D2 dopamine receptors. Consistent with this prediction, the GABA-B agonist baclofen has shown some early promise in the treatment of cocaine (Ling, Shoptaw, & Majewska, 1998), alcohol (Addolorato et al., 2000), and opiate (Akhondzadeh et al., 2000) dependence (trials in nicotine dependence are just beginning). Whether GABA-B agonists could also have a prophylactic effect in those at risk for addiction has not yet been tested, but this benefit might be predicted by the adult imaging findings with D2 dopamine receptors.

Brain Response to Drugs of Abuse and Drug-Related Cues

As previously described, drugs of abuse increase dopamine in critical parts of the reward circuitry, and this increase is most robust for psychomotor stimulants. Animal research also shows that the learned signals, or “cues,” for these drugs (as well as for natural rewards) also increase dopamine release in these same brain regions. In humans, drug cues trigger strong craving and arousal and may precede relapse. The brain responses to drugs, and to the cues that signal drug availability, thus represent two additional sources of potential addiction vulnerability within the reward system.

Research in animals has shown that under certain circumstances, the brain response to drugs of abuse (as measured by either brain dopamine release or behavioral activation) can “sensitize,” or increase, with repeated exposures to the drug. This might lead to the prediction that chronic drug use in humans would similarly lead to an increased brain response compared with that of those who have not previously used the drug. Contrary to this expectation, imaging studies have shown that the brain dopamine response to administration of a stimulant is actually lower in chronic cocaine users than in non–drug users (Volkow, Wang, et al., 1997). Though this lower brain response can be interpreted as evidence for tolerance (a reduced response to a drug with repeated administrations), we do not yet know whether the response is indeed an effect of cocaine exposure or (as with lower D2 receptors) possibly a preexisting neurochemical condition that predated chronic cocaine use. How could a lower brain response to rewards be a risk factor in adolescence? One possibility is that a lower brain dopamine response to natural rewards would mean that these rewards are insufficiently engaging, whereas the powerful, supranormal stimulation by drugs of abuse might be experienced as “just right.” Some theories of sensation seeking and thrill seeking take this view. For sensation seekers, the arousal produced by natural rewards may be low, and thus high-intensity, high-arousal experiences are pursued and experienced as pleasurable (Zuckerman, 1986; Zuckerman & Kuhlman, 2000). In contrast, for those with a normal response to natural rewards, the same high-intensity (often higher-risk) experiences (parachuting, bungee jumping, etc.) could be experienced as overwhelming and unpleasant.

We do not yet know whether adolescents at risk for substance abuse have a blunted brain response to natural rewards or to drugs of abuse. Although imaging studies that probe dopamine tone require small amounts of radioactive tracers and thus would not be permitted in adolescents, other nonradioactive imaging techniques could be used to measure the response to the presentation of common rewards (money, food, etc.). Nonradioactive techniques such as fMRI use magnetic fields to map regional changes in brain blood flow, an index of increased brain activity. This technique is currently being used with adults to map the normal response of the brain to monetary (Elliott, Newman, Longe, & Deakin, 2003), food (Small, Zatorre, Dagher, Evans, & Jones-Gotman, 2001), or sexual stimuli (Karama et al., 2002). These studies demonstrate that research on the reward circuitry could be conducted in adolescents.

As previously described, animal research has shown activation of the brain reward circuitry by both drugs of abuse and the cues signaling these drugs. The drug and the cues for the drug lead to dopamine increases at important nodes in the reward circuitry. In humans, cues regularly associated with drug use (e.g., the sight of a drug-using friend, dealer, location, or drug paraphernalia) can come to trigger profound craving and motivation for the drug of choice, potentially leading to drug use and relapse in the clinical setting (Childress, Franklin, Listerud, Acton, & O'Brien, 2002). Brain imaging studies of this conditioned motivational state in addicted adults have shown activation of several way stations in the motivational/reward circuitry, including those linked to attention, affect, autonomic arousal, and the rapid assignment of emotional valence to incoming stimuli (Childress, Mozley, et al., 1999; Childress et al., 2002). Studies also demonstrate significant similarity in the brain regions activated by the cues for cocaine (Bonson et al., 2002; Childress, Mozley, et al., 1999; Garavan et al., 2000; Grant et al., 1996; Kilts et al., 2001; Maas et al., 1996), heroin (Daglish et al., 2001; Sell et al., 1999), alcohol (Schneider et al., 2001), and cigarettes (Brody et al., 2002). Similar actions by diverse drugs on motivational circuitry provide biological evidence that supports the commonality of substance abuse disorders. This circuitry also normally manages the motivation for natural rewards, as demonstrated by human brain imaging studies using food (chocolate) (Small et al., 2001) or sexual (Karama et al., 2002) stimuli. Addicted adults often report that their craving for drugs exceeds their desire for natural rewards. A very recent fMRI study in adolescents with alcohol use disorder indeed found that the brain response (which included regions in the reward circuitry) to visual cues of their preferred alcoholic beverage was larger than the response to pictures of a nonalcoholic beverage (Tapert et al., 2003).

Most substance-dependent individuals find that behavioral techniques are difficult to apply when they are already in the throes of a full-blown craving episode. Therefore, medications that help bring the powerful brain reward system into a more manageable range are much needed. The GABA-B agonist baclofen, described above as having the potential to reset dopamine receptors, has also shown promise in blunting the response to cocaine (Brebner, Childress, & Roberts, 2002; Roberts, Andrews, & Vickers, 1996) and heroin (Di Ciano & Everitt, 2003) cues in animals, and it also blunts the craving and brain activation produced by cocaine cues in humans (Brebner et al., 2002; Childress, McElgin, et al., 1999). Other candidate medications for reducing the brain response to drug cues are discussed in Chapter 18.

Conclusions on Neurobiology

A large body of neuroscience research, only partially reviewed in this section, supports the notion that addiction is a disease that disrupts brain pleasure centers, including the extended amygdala and its numerous connections with other reward-related systems. Neurobiological research has provided an understanding of brain mechanisms that can guide medication development and potentially improve outcome. While the anatomy and circuitry of reward neurocircuits have been largely delineated, we know little about the molecular changes within these regions that mediate the transition into addiction, enhance relapse vulnerability, and produce hedonic dysregulation. Nevertheless, the disease concept of addiction is supported by its strong biological basis, which is conclusively demonstrated by several lines of animal and human research. Although addictive drugs produce pleasure by activating brain reward circuits, their long-term effect is to inhibit these regions, leading to hedonic dysregulation and unpleasant emotional states. The short-term fix of more drug use provides temporary relief but then merely worsens this vicious cycle. Animal models of addiction have identified specific neurochemical alterations in reward-related and stress-related systems that contribute to the dysphoric motivational states associated with drug abstinence, and the pharmacological reversal of these neuroadaptations is a promising strategy to improve outcome in clinical practice. Human studies likewise demonstrate functional and structural brain abnormalities associated with addiction, especially in the prefrontal cortex and amygdala, although the issue of causality has not been adequately addressed. Are these abnormalities produced by repeated drug administration, or do they predate and even contribute to addictive vulnerability? Can they be normalized with abstinence or through specific interventions? Will brain abnormalities identified through imaging techniques eventually serve to identify individuals who are most at risk of developing addiction? The issue of vulnerability is particularly important for identifying adolescents who might benefit from specific interventions, be they preventive or therapeutic. Unfortunately, prodigious gaps exist in our knowledge of the neurobiology of addiction in adolescents, which represents an important area for future research.

THE ROLE OF GENETICS

Overview of Genetic Models

Research using both animal and human models is advancing our understanding of the role of genetic factors in substance use. Animal models of drug addiction can manipulate genetic factors through selective breeding or “knock-outs” (mice that are lacking a critical gene) to explore general and specific genetic effects on behavioral responses to drugs and propensity to self-administer drugs. As reviewed by Ponomarev and Crabbe (2002), this line of research has generated important knowledge about the role of genetic factors in initial sensitivity to drugs, neuroadaptive changes from chronic exposure, withdrawal syndromes, and reinforcing effects. Of particular relevance to adolescent substance use, animal research is elucidating how the adolescent brain may be especially vulnerable to the stimulating effects of both novel environments and drugs of abuse (Laviola, Adriani, Terranova, & Gerra, 1999). This work suggests that adolescence is a critical period during which exposure to drugs may interfere with more adaptive coping strategies and produce lifelong patterns of substance abuse.

In humans, twin models have been used to explore the relative contributions of genetic and environmental factors to substance use and dependence. This is accomplished by comparing concordance rates for a particular trait in monozygotic twins, who share all of their genes, with those in dizygotic twins, who share roughly 50% of their genes (Kendler, 2001). As described in greater detail below, this methodology has been used to study the role of heritable factors in smoking, alcohol use, and use of illegal drugs.
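The logic of the classical twin design can be summarized with the standard ACE decomposition from behavioral genetics (a textbook sketch; the numerical values below are illustrative and not taken from any study cited here). If \(r_{MZ}\) and \(r_{DZ}\) are the trait correlations for monozygotic and dizygotic pairs, and the trait variance is partitioned into additive genetic (\(a^2\)), shared environmental (\(c^2\)), and unique environmental (\(e^2\)) components, then

\[
r_{MZ} = a^2 + c^2, \qquad r_{DZ} = \tfrac{1}{2}a^2 + c^2,
\]

so that

\[
a^2 = 2(r_{MZ} - r_{DZ}), \qquad c^2 = 2r_{DZ} - r_{MZ}, \qquad e^2 = 1 - r_{MZ}.
\]

For example, observed correlations of \(r_{MZ} = .60\) and \(r_{DZ} = .40\) would imply a heritability of \(a^2 = .40\), a shared environmental component of \(c^2 = .20\), and a unique environmental component of \(e^2 = .40\).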

Once a particular behavioral trait (also referred to as a “phenotype”) has been established as heritable, molecular genetic approaches are used to identify the specific genetic variants that may be responsible. One such approach identifies candidate genes based on neurobiological or biochemical pathways (e.g., dopamine or serotonin genes) and uses a case–control study design to compare the frequency of genetic variants (alleles) in these pathways among persons with and without the phenotype (e.g., nicotine-dependent persons vs. nondependent persons) (Sullivan, Jiang, Neale, Kendler, & Straub, 2001). Several studies employing the candidate gene approach to investigate substance abuse genes are described below. The role of specific genetic variants can be also investigated through family-based designs that examine allele sharing or allele transmission for candidate genes within families (Spielman et al., 1996). This latter approach controls for potential bias due to ethnic admixture, but has less statistical power and is more costly to implement.
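As a minimal sketch of the case–control arithmetic just described (the allele counts below are hypothetical, chosen only to illustrate the comparison, and the availability of scipy is assumed):

```python
# Case-control comparison of allele-carrier frequencies, as described above.
# All counts are hypothetical and chosen for illustration only.
from scipy.stats import fisher_exact

# 2x2 table: rows = carries at least one copy of the candidate allele (yes/no),
# columns = phenotype (e.g., nicotine-dependent cases vs. nondependent controls)
table = [
    [86, 50],    # carriers:    cases, controls
    [114, 150],  # noncarriers: cases, controls
]

odds_ratio, p_value = fisher_exact(table)
print(f"carrier odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```

With these illustrative counts, carriage of the allele is associated with roughly a twofold increase in the odds of being a case, on the order of the DRD2 effects described later in this section.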

In contrast to these hypothesis-driven approaches, genetic linkage analysis can be used to search for as yet unidentified genetic variants that may be linked with substance use phenotypes. In this approach, families or relative pairs (e.g., sibling pairs) are used to look for linkage with anonymous markers across the genome. Because the effect size of any individual gene conferring susceptibility to a behavioral trait is expected to be small (Comings et al., 2001), this approach requires a large number of family members. As described below, the results of such studies are beginning to reveal regions of interest in the genome; however, it is likely to take several years before specific loci are identified and validated as being important in substance use and dependence.
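The underlying test can be stated simply (this is standard affected-sib-pair reasoning, given here as background rather than as the method of any particular study cited below): under the null hypothesis of no linkage, an affected sibling pair shares 0, 1, or 2 marker alleles identical by descent (IBD) with probabilities

\[
P(\mathrm{IBD} = 0, 1, 2) = \left(\tfrac{1}{4}, \tfrac{1}{2}, \tfrac{1}{4}\right),
\]

for an expected sharing proportion of one half. Evidence for linkage at a marker consists of affected pairs sharing significantly more than half of their alleles IBD at that location.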

Below, we summarize the literature on the heritability and specific genetic effects for tobacco use, alcohol use, and use of illegal drugs. Although most of these studies used adult populations, we highlight investigations that included adolescent participants. The results of both adolescent and adult studies provide insights into the biobehavioral basis of substance use and its relevance to prevention in high-risk youth.

Genetic Contributions to Tobacco Use

Abundant data from twin studies provide evidence for the heritability of cigarette smoking. Using the Australian twin registry, Heath and Martin (1993) found that inherited factors accounted for 53% of the variance in smoking initiation. More recent data suggest that the heritability of a diagnosis of nicotine dependence is even higher (Kendler et al., 1999; True et al., 1999). Sullivan and Kendler (1999) summarized data from a large number of twin studies indicating that additive genetic effects account for 56% of the variance in smoking initiation and 67% of the variance in nicotine dependence. Significant genetic influences have also been documented for age at smoking onset (True et al., 1999) and for smoking persistence (Madden et al., 1999).

Genes in the dopamine pathway have been studied most extensively with respect to tobacco use and addiction. It is speculated that individuals with low-activity genetic variants may experience greater reinforcement from nicotine because of its dopamine-stimulating effects. In support of this hypothesis are three studies showing a higher prevalence of the more rare A1 or B1 allele of the dopamine 2 receptor (DRD2) gene among smokers than among nonsmokers (Comings et al., 1996; Noble et al., 1994; Spitz et al., 1998). However, a small, family-based analysis did not provide evidence for significant linkage of smoking to the DRD2 locus (Bierut et al., 2000).

In a case–control study of smokers and nonsmokers, Lerman and colleagues (1999) found that DRD2 interacted with the dopamine transporter (DAT) gene in its effects on smoking behavior. The DAT polymorphism is of particular interest because the 9-repeat allele has been associated with a 22% reduction in dopamine transporter protein (Heinz et al., 2000). Since a reduction in dopamine transporter level would result in less clearance and greater bioavailability of dopamine, it is speculated that individuals who have the 9-repeat may have less need to use nicotine to stimulate dopamine activity. The association of the DAT gene with smoking behavior has been supported in one study (Sabol et al., 1999), but not replicated in two other studies (Jorm et al., 2000; Vandenbergh, 2002). Thus, the role of DAT in smoking behavior remains unclear.

The serotonin pathway is also under investigation in genetic studies of smoking behavior. Candidate polymorphisms (genetic variants) include those in genes that are involved in serotonin biosynthesis (e.g., tryptophan hydroxylase, TPH) and serotonin reuptake (serotonin transporter, 5HTTLPR). Two recent studies have shown that individuals who are homozygous for the more rare A allele of TPH are more likely to initiate smoking and to start smoking at an earlier age (Lerman et al., 2001; Sullivan et al., 2001). Although 5HTTLPR was not associated with smoking status (Lerman et al., 1998), there is evidence from two studies that this polymorphism modifies the effect of anxiety-related traits on smoking behavior (Hu et al., 2000; Lerman et al., 2000).

While genes in the dopamine and serotonin pathways may have generalized effects on risk for substance abuse, genes that regulate nicotine metabolism should be specifically relevant to smoking behavior. One hypothesis is that slower metabolizers of nicotine may be less prone to initiate smoking because they may experience more aversive effects (Pianezza, Sellers, & Tyndale, 1998). Once smoking is initiated, slower metabolizers may require fewer cigarettes to maintain nicotine titers at an optimal level (Benowitz, Perez-Stable, Herrera, & Jacobs, 2002). Initial support for this premise was provided in a study of the P450 CYP2A6 gene, which encodes the key enzyme involved in the metabolism of nicotine to inactive cotinine (Pianezza et al., 1998). However, later studies did not support this finding and suggested that the CYP2A6 variant is much rarer than originally reported (London, Idle, Daly, & Coetzee, 1999; Oscarson et al., 1998; Sabol et al., 1999).

Although genes regulating nicotine receptor function would be prime candidates for smoking risk, data on functional genetic variation in humans are not yet available. In two recent studies of the β2 nicotinic receptor, several single nucleotide polymorphisms (of unknown functional significance) were identified, but none were associated with smoking behavior (Lueders et al., 2002; Silverman et al., 2001).

As mentioned above, linkage analysis can also be used to scan the genome for regions that may harbor nicotine dependence susceptibility genes. There are, however, only a limited number of reports using this approach. Straub and colleagues (1999) performed a complete genomic scan to search for loci that may confer susceptibility to nicotine dependence. Using a sample of affected sibling pairs, linkage analysis provided preliminary evidence for linkage to regions on chromosomes 2, 4, 10, 16, 17, and 18. However, these results were not statistically significant, and the sample size in this study (130 families) may not have been large enough to identify genes with small effects. Two other studies used families from the Collaborative Study on the Genetics of Alcoholism (COGA) and reported evidence for linkage of smoking behavior to chromosomes 5 (Duggirala et al., 1999), 6, and 9 (Bergen et al., 1999). Notably, the regions identified in the different studies do not overlap. This may be attributable to the fact that regions identified in the COGA sample may harbor loci predisposing to addiction to both alcohol and tobacco.

Genetic Contributions to Alcohol Use

As with tobacco, twin studies of alcohol use provide consistent evidence for significant genetic effects. Estimates of the proportion of variance accounted for by genetic factors range from about 30% to 70%, depending on whether the studies used population-based or treatment samples (Kendler, 2001) and on the specific phenotype examined (van den Bree, Johnson, Neale, & Pickens, 1998). One study of over 1,500 twin pairs, ages 20 to 30 years, reported that 47% of the variance in use (vs. abstinence) in males was attributable to genetic factors, with 48% of the variance being due to shared environment (and the remainder due to individual environmental effects; Heath & Martin, 1988). The comparable figures for females were 35% and 32%, respectively. Heritability estimates in other studies ranged from about 50% for alcohol dependence to 73% for early age of onset of alcohol problems (McGue, Pickens, & Svikis, 1992; Pickens & Svikis, 1991; Prescott & Kendler, 1999). Physical symptoms of alcohol dependence also appear to have a significant heritable component (e.g., binge drinking, withdrawal), although the potential behavioral consequences appear to be less heritable (e.g., job trouble, arrests; Slutske et al., 1999). Of particular relevance to the biobehavioral model of substance abuse, there is evidence for shared genetic influences on tobacco and alcohol consumption (Swan, Carmelli, & Cardon, 1996).

The search for specific genetic effects on alcohol use has led to the discovery of genes in key neurotransmitter pathways and genes that influence the metabolism of alcohol. Once again, the dopamine pathway has been a central focus of this research. An initial study relating the DRD2 A1 allele to alcoholism attracted a great deal of attention (Blum et al., 1990); however, several studies failed to replicate this initial result (Bolos et al., 1990). Noble (1993) reviewed nine independent studies including 491 alcoholics and 495 controls. Across these studies, the more rare A1 allele of the DRD2 gene was carried by 43% of alcoholics, compared with 25% of nonalcoholic controls. When only severe alcoholics were examined, the prevalence of the A1 allele was 56%. Hill, Zezza, Wipprecht, Locke, and Neiswanger (1999) used the more conservative family-based approach to test for linkage between DRD2 and alcoholism. Although an overall association with alcoholism was not supported, there was evidence for linkage when only severe cases were examined. Studies examining other genes within the dopamine pathway for association with alcoholism have yielded mostly negative results (Parsian, Chakraverty, Fishler, & Cloninger, 1997).
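In odds-ratio terms, the pooled figures just cited imply (a worked computation from the reported percentages, assuming comparable group sizes):

\[
\mathrm{OR} = \frac{0.43 / 0.57}{0.25 / 0.75} \approx 2.3,
\]

that is, roughly a doubling of the odds of carrying the A1 allele among alcoholics, consistent in magnitude with the DRD2 findings for mixed substance abuse described later in this section.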

Genes in the serotonin pathway are also plausible candidates for alcohol dependence because of the effects of alcohol on brain serotonin levels (Lesch & Merschdorf, 2000). The low activity S allele of the serotonin transporter gene (5HTTLPR) has been linked with alcoholism in one family-based study (Lichtermann et al., 2000). Although the prevalence of this variant has not been found to differ significantly in case–control studies comparing alcoholics and nonalcoholics, there is evidence that it increases risk for particular alcoholism subtypes, including binge drinking (Matsushita et al., 2001) and early-onset alcoholism with violent features (Hallikainen et al., 1999). Similarly, the TPH gene has been linked with alcoholism with comorbid impulse control problems, such as antisocial behavior or suicidal tendencies (Ishiguro et al., 1999; Nielsen et al., 1998).

The most consistent evidence for genetic effects on alcoholism has been generated from studies of genes that regulate the metabolism of alcohol. Alcohol is converted to its major metabolite, acetaldehyde, by the enzyme alcohol dehydrogenase (ADH). Accumulation of acetaldehyde produces aversive effects of alcohol consumption, such as flushing and toxicity. A variant allele of the ADH2 gene (ADH2*2) is found more commonly in Asian populations and has been shown to be protective against alcohol dependence in Chinese (Chen et al., 1999) and European (Borras et al., 2000) populations. There is some evidence that the genetic effect is stronger for males than for females (Whitfield et al., 1998). The ADH2*2 allele is also found more commonly in Ashkenazi Jewish populations and has been associated with reduced alcohol consumption among Jewish college students (Shea, Wall, Carr, & Li, 2001).

The opioid system has also been implicated in the reinforcing effects of alcohol as well as other drugs of abuse (see below). With respect to alcoholism, the results of initial studies have been mixed. Two studies have suggested that variants of the μ-opioid receptor gene may be associated with a general liability to substance dependence, including alcohol (Kranzler et al., 1998; Schinka et al., 2002). However, another, larger study did not find significant differences in allele frequencies between dependent and nondependent individuals (Gelernter, Kranzler, & Cubells, 1999).

Genetic Contributions to Illegal Substance Use

The Harvard Twin Study is one of the most extensive investigations of the role of heritable factors in drug use (Tsuang, Bar, Harley, & Lyons, 2001; Tsuang et al., 1999). Summarizing the results from 8,000 twin pairs, Tsuang and colleagues (2001) reported heritability estimates ranging from .38 for sedative drugs to .44 for stimulant drugs. Interestingly, the variance in illicit drug use attributed to shared environmental influences tended to be much smaller than that due to individual environmental effects. Somewhat higher estimates of heritability were generated from a study of twins ascertained through alcohol and drug programs, who thus exhibited more severe forms of substance abuse disorders (van den Bree et al., 1998). Among males, heritability estimates for substance dependence were 58% for sedatives, 57% for opiates, 74% for cocaine, 78% for stimulants, and 68% for marijuana. With the exception of dependence on stimulants, estimates were significantly lower for females. In general, the genetic variance appeared to be greater for heavy use or abuse than for ever using (Kendler & Gardner, 1998).

Of particular relevance to youth substance abuse is the finding that the transitions in survey drug use categories (never used to ever used to regular use) have a significant heritable component. For example, genetic variance for the transition from never to ever using was reported to be 44% for marijuana, 61% for amphetamine, and 54% for cocaine (Tsuang et al., 1999). For the transition to regular use, the comparable figures were 30%, 39%, and 34%. As was shown for tobacco and alcohol, family studies showed evidence for common genetic variance underlying dependence on illegal drugs (Pickens, Svikis, McGue, & LaBuda, 1995; Tsuang et al., 2001).

Because many drugs of abuse increase levels of dopamine (Dackis & O'Brien, 2001; Shimada et al., 1991), initial genetic investigations have focused on this pathway. Genetic variations affecting mesocorticolimbic function might affect drug-induced reward and thereby contribute to addiction vulnerability. Uhl, Blum, Noble, and Smith (1995) summarized data from nine studies of mixed groups of substance abusers and reported a 2-fold increase in risk among individuals who have at least one copy of the DRD2 A1 allele. The risk ratio was nearly 3-fold for more severe substance abuse. A high-activity allele of the catechol-O-methyltransferase gene, which codes for a dopamine-metabolizing enzyme, has also been associated with polysubstance abuse (Vandenbergh, Rodriguez, Miller, Uhl, & Lachman, 1997).

Several twin, family, and adoption studies have concluded that the vulnerability to develop heroin dependence is partially inherited. Twin studies have reported significantly higher concordance rates for identical twins than for nonidentical twins (Tsuang et al., 1996), and an estimated heritability of .34 has been published for heroin-dependent males (van den Bree et al., 1998). One study of male and female subjects who were adopted away from their natural parents found that opioid dependence correlated with genetic loading for antisocial personality and alcoholism, and with environmental factors such as divorce and turmoil in the adoptive family (Cadoret et al., 1986). Another study reported that subjects with opioid dependence had an 8-fold increase in addiction prevalence among their first-degree relatives, independent of alcoholism and antisocial personality disorder, with evidence of specificity for familial opioid dependence (Merikangas et al., 1998). Therefore, a family history of addiction appears to be a potent risk factor for the development of opioid dependence. Specific genes associated with increased vulnerability to heroin dependence have not been identified, although animal models demonstrate that genes encoding the μ-opioid receptor might influence an animal's opioid preference, as evidenced by its willingness to self-administer morphine (Berrettini, Alexander, Ferraro, & Vogel, 1994). However, human studies of the gene (OPRM1) that encodes the human μ-opioid receptor are mixed with regard to opioid dependence vulnerability (Crowley et al., 2003; Hoehe et al., 2000).

Key Findings from Research on Genetics of Substance Use

There is no “gene for addiction.”

Although heritable factors are clearly important in substance abuse and dependence, such effects involve a complex interaction between multiple genes in different biological pathways. Some genetic variants may result in a more generalized predisposition to substance use and dependence, while other variants may influence risk for dependence on specific substances. These genetic effects interact with environmental factors, and any individual genetic variant is likely to account for only a small proportion of the overall variance in a substance use behavior.

Findings on the effects of specific genetic variants are not consistent.

The use of different study designs and methods of subject ascertainment, the focus on polymorphisms of unknown functional significance, and ethnic admixture have resulted in inconsistent findings in this field. Very large studies using both population-based and family-based designs are needed to validate specific genetic effects and to identify the set(s) of genetic variants that predispose to general addiction potential and dependence on specific substances.

Genetic effects on substance abuse are mediated by personality traits.

Such traits particularly involve the drive for sensation and novelty and deficits in impulse control. Individuals exhibiting these traits may be more prone to drug use, and as such, these traits may serve as liability markers for susceptibility to substance use and dependence. Whether these trait markers provide greater predictive value than the underlying genetic markers remains to be determined.

A complete understanding of specific genetic influences on substance dependence will reveal only part of the picture. On average, genetic influences account for roughly one-half of the variance in specific substance use behaviors. Such effects occur in the context of complex socioenvironmental and psychological influences. Even the best panel of genetic tests to identify individuals predisposed to substance abuse will have low sensitivity and specificity unless nongenetic influences are incorporated into the model. Increased understanding of the role of genetic factors in addiction will never diminish the importance of behavioral and social influences.
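The limits of genetic screening follow directly from the arithmetic of predictive value (the figures below are purely illustrative, not estimates from any cited study). With a prevalence of substance dependence of 10% and a genetic test panel achieving 60% sensitivity and 70% specificity, the positive predictive value would be

\[
\mathrm{PPV} = \frac{0.60 \times 0.10}{0.60 \times 0.10 + 0.30 \times 0.90} \approx 0.18,
\]

so fewer than one in five screen-positive individuals would actually be at risk, underscoring why nongenetic influences must be incorporated into any predictive model.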

Notes:

1. A full listing of publications from MTF may be found at www.monitoringthefuture.org

2. In the past, estimates that emanate from the National Household Surveys have been lower than those that derive from the school surveys, possibly because of greater concealment in the household setting or other sampling or measurement differences (e.g., see Harrison, 2001). Since MTF covers a considerably longer time period, more classes of drugs, and more information on related attitudes, beliefs, and environmental factors than the YRBS, it will be used as the source for most of the estimates presented here. The YRBS, begun in 1991, generates prevalence estimates that tend to be slightly higher than those from MTF, but shows trends that tend to be quite similar.

3. A listing of publications on the NHSDUH may be found at http://www.samhsa.gov/oas/nhsda.htm

4. A full list of publications from the National Comorbidity Study may be found at http://www.hcp.med.harvard.edu/ncs/publicationbyyear.htm

5. It should be noted that data based on school surveys, such as MTF or YRBS, of necessity omit the out-of-school segment of the youth population. How great an omission this is varies with age, of course. At 8th grade, relatively little dropping out has occurred; more has occurred before the end of 10th grade, although most states have compulsory attendance laws through the age of 16. By 12th grade, it is estimated from Census data that perhaps 15% of an age cohort has permanently dropped out. (See Johnston et al., 2003b, Appendix A, for a more detailed discussion of this issue and for numerical estimates of the degree of underreporting likely to occur as a result of dropping out and absenteeism.)

6. Pairs of years have been combined to increase the reliability of the estimates.

7. It should be noted that with regard to use of the legal drugs, alcohol and tobacco, the United States has enjoyed one of the lowest prevalence proportions of any of the countries compared, a fact of considerable consequence for the long-term health and longevity of this generation of youth.