Source
Sackler Institute for Developmental Psychobiology, Weill Cornell Medical College, New York, NY 10065, USA. [email protected]
Abstract
OBJECTIVE:
Adolescence is a developmental period that entails substantial changes in risk-taking behavior and experimentation with alcohol and drugs. Understanding how the brain changes during this period relative to childhood and adulthood, and how these changes vary across individuals, is key to predicting risk for later substance abuse and dependence.
METHOD:
This review discusses recent human imaging and animal work in the context of an emerging view of adolescence as characterized by a tension between early emerging “bottom-up” systems that express exaggerated reactivity to motivational stimuli and later maturing “top-down” cognitive control regions. Behavioral, clinical, and neurobiological evidence is reported for dissociating these two systems developmentally. The literature on the effects of alcohol and its rewarding properties in the brain is discussed in the context of these two systems.
RESULTS:
Collectively, these studies show curvilinear development of motivational behavior and the underlying subcortical brain regions, with a peak inflection from 13 to 17 years. In contrast, prefrontal regions, important in top-down regulation of behavior, show a linear pattern of development well into young adulthood that parallels that seen in behavioral studies of impulsivity.
CONCLUSIONS:
The tension or imbalance between these developing systems during adolescence may leave cognitive control processes more vulnerable to incentive-based modulation and increase susceptibility to the motivational properties of alcohol and drugs. As such, behavioral challenges that require cognitive control in the face of appetitive cues may serve as useful biobehavioral markers for predicting which teens may be at greater risk for alcohol and substance dependence.
Introduction
Adolescence is a transitional period of development marked by many concomitant changes, including physical maturation, a drive for independence, increased salience of social and peer interaction, and brain development 1–3. This developmental period is also characterized by an inflection in risky behaviors, including experimentation with drugs and alcohol, criminal activity, and unprotected sex. Understanding the neural basis of these risky behaviors is key to identifying which teens may be at risk for poor outcomes, such as substance abuse and dependence.
A number of hypotheses have been postulated for why adolescents may engage in impulsive and risky behaviors. Traditional accounts of adolescence suggest that it is a period of development associated with progressively greater efficiency of cognitive control capacities. This efficiency is described as dependent on maturation of the prefrontal cortex, as evidenced by imaging 4–7 and postmortem 8–10 studies showing continued structural and functional development of this region well into young adulthood.
Improved cognitive control with development of the prefrontal cortex is consistent with a linear increase in this ability from childhood to adulthood. Yet the suboptimal choices and actions observed during adolescence represent an inflection in development 11 that is distinct from either childhood or adulthood, as evidenced by National Center for Health Statistics data on adolescent behavior and mortality 12. If immature cognitive control and an immature prefrontal cortex alone were the basis for suboptimal choice behavior, then children should look remarkably similar to, or presumably worse than, adolescents, given their less developed prefrontal cortex and cognitive abilities 2. This review addresses the primary question of how the brain is changing during adolescence in ways that may explain inflections in risky and impulsive behavior. In addition, we provide examples of how alcohol and drug use during this period of development may further exacerbate these changes and lead to subsequent abuse and dependence.
To accurately capture cognitive and neurobiological changes during adolescence, this period must be treated as a transitional one rather than a single snapshot in time 3. In other words, to understand this developmental period, transitions into and out of adolescence are necessary for distinguishing its distinct attributes relative to other points in development. Therefore, empirical data that establish developmental trajectories from childhood to adulthood for cognitive and neural processes are essential for characterizing these transitions and, more importantly, for constraining interpretations about changes in brain or behavior during adolescence.
Second, accurate depictions of adolescence require a refinement in the phenotypic characterization of this period. For example, on a behavioral level, adolescents are often characterized as impulsive and as greater risk-takers, with these constructs used almost synonymously. Yet these constructs are distinct, and appreciating this distinction is important for describing their developmental trajectories and neural underpinnings. We provide behavioral, clinical, and neurobiological evidence suggesting that risk-taking is more tightly coupled with sensitivity to environmental incentives (sensation-seeking), whereas impulsivity is associated with poor top-down cognitive control.
To theoretically ground the empirical findings, we provide a plausible neurobiological model of adolescence and suggest how development during this time may heighten vulnerability to alcohol and drug abuse. The intention of this review is not to psychopathologize adolescence, but rather to explain why some teens, and not others, are vulnerable to substance abuse. As such, we attempt to identify potential biological and behavioral markers for early identification and for outcome assessments of interventions.
Neurobiological Model of Adolescence
A neurobiological model of adolescent development 2 that builds on rodent models 13, 14 and recent imaging studies of adolescence 6, 7, 15–20 is depicted in Figure 1. This model illustrates how subcortical and prefrontal top-down control regions must be considered together as a circuit. The cartoon shows different developmental trajectories for signaling within these regions, with limbic projections developing earlier than prefrontal control regions. According to the model, the adolescent is biased by functionally mature subcortical circuitry relative to less mature cortical circuitry (i.e., an imbalance in reliance on these systems), compared to children, for whom this frontolimbic circuitry is still developing, and compared to adults, for whom these systems are fully mature. With development and experience, the functional connectivity between these regions is strengthened and provides a mechanism for top-down modulation of the subcortical systems 7. Thus it is the frontostriatal circuitry, along with functional strengthening of connections within this circuitry, that may provide a mechanism to explain changes in both impulsivity and risk-taking observed across development.
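The model’s core prediction can be made concrete with a toy simulation. The sketch below is purely illustrative and not part of the original model: it assumes hypothetical logistic maturation curves, with an earlier midpoint for subcortical signaling than for prefrontal control, and shows that their difference (the “imbalance”) peaks in mid-adolescence even though each system matures monotonically.

```python
import numpy as np

def logistic(age, midpoint, steepness=0.5):
    """Hypothetical maturation curve: fraction of adult-level function."""
    return 1.0 / (1.0 + np.exp(-steepness * (age - midpoint)))

ages = np.arange(5, 31)
subcortical = logistic(ages, midpoint=12)  # earlier-maturing limbic/striatal signaling (assumed)
prefrontal = logistic(ages, midpoint=19)   # later-maturing top-down control (assumed)
imbalance = subcortical - prefrontal       # relative dominance of bottom-up systems

peak_age = ages[np.argmax(imbalance)]
print(f"Imbalance peaks at roughly {peak_age} years")  # mid-adolescence for these parameters
```

With these arbitrary midpoints (12 and 19 years), the imbalance peaks around age 15, squarely within the 13 to 17 year window cited throughout this review; any behavior driven by the difference of two monotonic maturation curves is necessarily curvilinear.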
This model is consistent with previous ones 21–24 in providing a basis for the nonlinear inflections observed in behavior from childhood to adulthood, attributable to earlier maturation of subcortical projections relative to less mature top-down prefrontal ones. Specifically, the triadic model 21 proposes that motivated behavior is governed by three distinct neural circuits (approach, avoidance, and regulatory). The approach system is largely controlled by the ventral striatum, the avoidance system by the amygdala, and the regulatory system by the prefrontal cortex 25. The current model differs from others chiefly in being grounded in empirical evidence for brain changes not only in the transition from adolescence to adulthood, but also in the transition from childhood into adolescence and later out of adolescence into adulthood. Moreover, the model does not suggest that the striatum and amygdala are specific to approach and avoidant behavior, given recent studies showing valence independence of these structures 26, but rather treats them as systems important for detecting motivationally and emotionally relevant cues in the environment that can bias behavior. In this review, we describe the most recent evidence from behavioral and human imaging studies of adolescence in the context of our model, illustrating the transition from childhood to adulthood.
Phenotypic Characterization of Adolescence
The ability to resist temptation in favor of long-term goals is a form of cognitive control, and lapses in this ability have been suggested to be at the very core of adolescent risky behavior 27. Cognitive control, which includes resistance to temptation and delay of immediate gratification, has been studied in the context of social, developmental, and cognitive psychology. Developmentally, this ability has been measured by assessing how long a toddler can resist an immediate reward (e.g., one cookie) in favor of a larger reward later (e.g., two cookies) 28. Although individuals vary in this ability even as adults, developmental studies suggest windows of development during which an individual may be particularly susceptible to temptations. This ability has been described as a form of impulse control 29; it is multi-faceted 30, 31, but can be operationally defined as the ability to accomplish goal-directed behavior in the face of salient, competing inputs and actions 32.
Historically, developmental studies have shown a steady improvement in cognitive control capacity from infancy to adulthood 33. This observation is supported by a wealth of behavioral evidence from controlled laboratory paradigms, such as the Go/NoGo task, the Simon task, and task-switching paradigms, that require participants to override a prepotent response in order to achieve a correct one 32, 34. However, when it is advantageous to suppress a response to incentive-related cues, cognitive control suffers 20. This reduced control is especially evident during adolescence, when suboptimal choices in sexual and drug-related behaviors peak 3, 11, 12, 14. These observations imply that developmental trajectories in cognitive control are complex and can be modulated by emotionally charged or reinforcing contexts (e.g., social and sexual interactions), in which cognitive control demands interact with motivational drives or processes.
Motivation can modulate cognitive control in at least two ways. First, being rewarded for performance on a given task can make people work harder and ultimately perform better than when not rewarded 17. Second, the capacity to exert control can be challenged when one must suppress thoughts and actions toward appetitive cues 20. Recent studies of adolescent development have begun to compare cognitive control capacity in relatively neutral versus motivational contexts. These studies suggest a change in sensitivity to environmental cues, especially reward-based ones, at different points in development, and a unique influence of motivation on cognition during the adolescent years.
In the following section, we highlight some of the most recent studies of how adolescent behavior is differentially biased in emotionally charged contexts relative to that of adults.
For example, Ernst and colleagues 35, 36 examined performance on an antisaccade task in which accurate performance was financially rewarded on some trials but not others. The promise of a reward facilitated cognitive control more in adolescents than in adults, a finding that has been replicated 17 and recently extended to social rewards (e.g., happy faces) 20.
While the previous examples provide instances of incentives enhancing performance in teens, rewards can also diminish performance when responses to cues signaling high gain must be suppressed. For example, using a gambling task in which reward feedback was provided immediately during decision-making (“hot” trials, which heightened task-elicited affective arousal) or withheld until after the decision (“cold,” deliberate decision-making trials), Figner and colleagues 37 showed that adolescents made disproportionately more risky gambles than adults, but only in the “hot” condition. Using a similar task, the Iowa Gambling Task, Cauffman and colleagues 38 have shown that this sensitivity to rewards and incentives actually peaks during adolescence, with a steady increase from late childhood to adolescence in the tendency to play from the more advantageous decks of cards and a subsequent decline from late adolescence to adulthood. These findings illustrate a curvilinear function, peaking roughly between ages 13 and 17 and then declining 27. While prior findings with the Iowa Gambling Task have shown a linear increase in performance with age 39, those studies did not treat age continuously, nor did they examine only trials with advantageous decks of cards.
Recent studies suggest that social contexts, particularly peers, may also serve as a motivational cue and can diminish cognitive control during adolescence. The degree to which an adolescent’s peers use substances is directly proportional to the amount of alcohol or illegal substances that the adolescent will use 40. Using a simulated driving task, Gardner and colleagues 41 have shown that adolescents make riskier decisions in the presence of peers than when alone, and that these risky decisions decrease linearly with age 23, 40.
Taken together, these studies suggest that during adolescence, motivational cues of potential reward are particularly salient and can lead to improved performance when provided as a reinforcer or rewarded outcome, but to riskier or suboptimal choices when provided as a cue. In the latter case, the motivational cue can diminish effective goal-oriented behavior. Furthermore, these studies suggest that sensitivity to rewards and sensation-seeking behavior are distinct from impulsivity, with very different developmental patterns (a curvilinear versus a linear function, respectively). This distinction is further evident in a recent study by Steinberg et al. 42 using self-report measures of sensation-seeking and impulsivity. They tested whether the often-conflated constructs of sensation-seeking and impulsivity develop along different timetables in nearly 1000 individuals between the ages of 10 and 30. Age differences in sensation-seeking followed a curvilinear pattern, with sensation-seeking increasing between 10 and 15 years and declining or remaining stable thereafter. In contrast, age differences in impulsivity followed a linear pattern, with impulsivity decreasing steadily with age (see Figure 2, panel A). These findings, together with the laboratory-based findings, suggest that heightened vulnerability to risk-taking in adolescence “may be due to the combination of relatively higher inclinations to seek excitement and relatively immature capacities for self-control that are typical of this period of development” 42.
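The curvilinear versus linear distinction can be made operational with a simple model comparison. The sketch below uses synthetic data rather than Steinberg’s actual ratings: it fits a quadratic in age to each measure, where a substantial quadratic coefficient signals an inverted-U (curvilinear) trajectory and a near-zero one is consistent with linear change. The scores, coefficients, and threshold here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
ages = rng.uniform(10, 30, size=1000)  # age range from the Steinberg et al. sample

# Synthetic scores mimicking the reported patterns (not real data):
# sensation-seeking peaks mid-adolescence; impulsivity declines linearly.
sensation = -0.08 * (ages - 15) ** 2 + 10 + rng.normal(0, 1, ages.size)
impulsivity = -0.3 * ages + 12 + rng.normal(0, 1, ages.size)

for name, scores in [("sensation-seeking", sensation), ("impulsivity", impulsivity)]:
    b2, b1, b0 = np.polyfit(ages, scores, deg=2)  # fit: b2*age^2 + b1*age + b0
    shape = "curvilinear" if abs(b2) > 0.01 else "approximately linear"
    print(f"{name}: quadratic coefficient = {b2:+.3f} ({shape})")
```

In practice such comparisons are made with nested regression models and formal tests, but the sign and magnitude of the quadratic term capture the essential contrast between the two constructs.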
Neurobiology of Adolescence
As denoted in our model of adolescence, two key regions implicated in cognitive and motivational behavior are the prefrontal cortex, known to be important for cognitive control 43, and the striatum, critical for detecting and learning about novel and rewarding cues in the environment 44. We highlight recent animal and human imaging work on the neurobiological changes supporting these motivational and cognitive systems across development, in the context of the behavioral findings above on the development of sensation-seeking and impulsivity. We ground the findings in the previously described imbalance model: linear development of top-down prefrontal regions relative to a curvilinear function for the development of bottom-up striatal regions involved in detecting salient cues in the environment. Examining circuitry rather than isolated regional change, especially within the frontostriatal circuits that underlie different forms of goal-oriented behavior, is key. This perspective moves the field away from examining how each region matures in isolation and toward how regions interact in the context of interconnected circuits.
Seminal animal and human work has shown how striatal and prefrontal cortical regions shape goal-directed behavior 7, 27, 37, 38, 44. Using single-unit recordings in monkeys, Pasupathy and Miller 45 demonstrated that during flexible learning of a set of reward contingencies, very early activity in the striatum provides the foundation for reward-based associations, whereas later, more deliberative prefrontal mechanisms are engaged to maintain the behavioral outputs that optimize the greatest gains; these findings have been replicated in lesion studies 46–48. A role for the striatum in early temporal coding of reward contingencies, prior to the onset of activation in prefrontal regions, has also been extended to humans 49. These findings suggest that understanding the interactions between regions (along with their component functions) within frontostriatal circuitry is critical for developing a model of cognitive and motivational control in adolescence.
Frontostriatal circuits undergo considerable elaboration during adolescence 50–53, with changes that are particularly dramatic in the dopamine system. The density of the dopamine receptors D1 and D2 peaks in the striatum early in adolescence, followed by a loss of these receptors by young adulthood 54–56. In contrast, the prefrontal cortex does not show peaks in D1 and D2 receptor density until late adolescence and young adulthood 57, 58. Similar developmental changes have been shown in other reward-related systems, including cannabinoid receptors 59. It remains unclear how changes in the dopamine system relate to motivated behavior, as controversy remains over whether reward sensitivity is modulated by dopamine systems (e.g., 60, 61) and whether it results from less active or hypersensitive dopamine systems (e.g., 62, 63). However, given the dramatic changes in dopamine-rich circuitry during adolescence, these changes are likely related to a sensitivity to rewards distinct from that of childhood or adulthood 50, 64. Beyond the significant changes in dopamine receptors, dramatic hormonal changes during adolescence lead to sexual maturity and influence functional activity in frontostriatal circuits 65; a detailed discussion is beyond the scope of this paper (see 66, 67 for reviews).
Human imaging studies have begun to provide support for a strengthening of connections within dopamine-rich frontostriatal circuitry across development. Using diffusion tensor imaging and functional magnetic resonance imaging (fMRI), Casey and colleagues 68, 69 and others 70 have shown greater strength in distal connections within these circuits across development and have linked connection strength between prefrontal and striatal regions with the capacity to effectively engage cognitive control in typically and atypically developing individuals 68, 69. These studies illustrate the importance of signaling within corticostriatal circuitry, which supports the capacity to effectively engage cognitive control.
Likewise, there is mounting evidence from human functional neuroimaging studies on how subcortical systems like the striatum interact with the prefrontal cortex to give rise to the risky behavior observed in adolescents 71. The majority of imaging studies have focused on one region or the other, showing that the prefrontal cortex, thought to subserve age-related improvement in cognitive control 72–78, undergoes delayed maturation 4, 79, 80, while striatal regions sensitive to novelty and reward manipulations develop earlier 74, 81. Several groups have shown that adolescents show heightened activation of the ventral striatum in anticipation and/or receipt of rewards compared to adults 6, 15, 17, 18, although others report hyporesponsiveness 82.
One of the first studies to examine reward-related processes across the full spectrum of development, from childhood to adulthood, was completed by Galvan and colleagues 6 in 6- to 29-year-olds. They showed that ventral striatal activation was sensitive to varying magnitudes of monetary reward 49 and that this response was exaggerated during adolescence relative to both children and adults 6 (see Figure 3), indicative of signal increases 6 or more sustained activation 83. In contrast to the pattern in the ventral striatum, orbital prefrontal regions showed protracted development across these ages (Figure 2, panel B).
But how does this enhanced signaling in the ventral striatum relate to behavior? In a follow-up study, Galvan and colleagues 16 examined the association between ventral striatal activity to large monetary reward and trait measures of risk-taking and impulsivity. Anonymous self-report ratings of risky behavior, risk perception, and impulsivity were acquired in a sample of 7- to 29-year-olds. Galvan et al. showed a positive association between ventral striatal activity to large reward and the likelihood of engaging in risky behavior (see Figure 3). These findings are consistent with adult imaging studies linking ventral striatal activity with risky choices 84, 85.
To further test the association between adolescents’ risky behavior and sensitivity to reward, as indexed by an exaggerated ventral striatal response, Van Leijenhorst and colleagues 18 used a gambling task. The task included Low-Risk gambles, with a high probability of obtaining a small monetary reward, and High-Risk gambles, with a smaller probability of obtaining a larger monetary reward. The fMRI results confirmed that High-Risk choices were associated with ventral striatal recruitment, whereas Low-Risk choices were associated with activation in ventromedial prefrontal cortex. These findings are consistent with the hypothesis that risky behavior in adolescence reflects an imbalance caused by the different developmental trajectories of subcortical reward and prefrontal regulatory brain regions, in line with our neurobiological model of adolescence.
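The logic of the Low-Risk/High-Risk contrast is easiest to see in expected-value terms. The numbers below are hypothetical, not Van Leijenhorst’s actual payoffs: the two gambles are constructed here to have the same expected value, so that a preference for the High-Risk option reflects tolerance of outcome variance (risk) rather than simple value maximization.

```python
# Hypothetical gambles illustrating the Low-Risk/High-Risk design
# (probabilities and payoffs are invented, not the task's actual values).
gambles = {
    "Low-Risk":  {"p_win": 0.80, "payoff": 2.0},  # likely win, small reward
    "High-Risk": {"p_win": 0.20, "payoff": 8.0},  # unlikely win, large reward
}

for name, g in gambles.items():
    ev = g["p_win"] * g["payoff"]                          # expected value
    var = g["p_win"] * (1 - g["p_win"]) * g["payoff"] ** 2  # outcome variance
    print(f"{name}: EV = {ev:.2f}, variance = {var:.2f}")
```

Both gambles here yield an expected value of 1.60, but the High-Risk option carries far greater variance; under such matching, systematically choosing it indexes risk preference.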
While there appears to be an association between risk-taking behavior and ventral striatal activation, the Galvan study 16 reported no correlation between ventral striatal activity and impulsivity. Rather, impulsivity ratings were correlated with age, consistent with numerous imaging studies showing linear development with age in prefrontal cortical recruitment during impulse control tasks 7, 75, 77 (and see reviews by 34, 86). Moreover, recent studies have shown that impulsivity ratings inversely correlate with the volume of the ventromedial prefrontal cortex in a sample of healthy boys (7–17 years) 87. Finally, studies of clinical populations characterized by impulse control problems, such as ADHD, show impaired impulse control and reduced activity in prefrontal regions compared to controls 88, 89, but do not show heightened responses to incentives 90.
These findings provide empirical neurobiological support for dissociating the constructs of risk-taking and reward sensitivity from that of impulsivity, with the former showing a curvilinear developmental pattern and the latter a linear one (see Figure 2, panel B). Thus adolescent choices and behavior cannot be explained by impulsivity or protracted development of the prefrontal cortex alone. Rather, motivational subcortical regions must be considered to elucidate why adolescent behavior differs not only from that of adults but from that of children as well. The ventral striatum appears to play a role in levels of excitement 82, 91 and positive affect 15 when receiving rewards, as well as in the propensity for sensation-seeking and risk-taking 16, 91. More importantly, these findings suggest that during adolescence some individuals may be more prone to engage in risky behaviors, due to developmental changes acting in concert with an individual’s predisposition to engage in risky behavior, rather than to simple changes in impulsivity.
A scientific area that has received less attention is how cognitive control and motivational systems interact over the course of development. As mentioned earlier, Ernst and colleagues 35, 36 showed that the promise of a monetary reward facilitated cognitive control more in adolescents than in adults. Geier et al. 17 recently identified the neural substrates of this reward-related up-regulation using a variant of an antisaccade task during functional brain imaging. In both adolescents and adults, trials for which money was at stake speeded performance and improved accuracy, but this effect was larger in adolescents. Following a cue that the next trial would be rewarded, adolescents showed exaggerated activation in the ventral striatum while preparing for and subsequently executing the antisaccade. An exaggerated response was also observed in adolescents within prefrontal regions along the precentral sulcus, important for controlling eye movements, suggesting a reward-related up-regulation of control regions as well.
Rewards, as suggested above, can enhance as well as diminish goal-directed behavior. The observation that adolescents take more risks when appetitive cues are present than when they are absent during gambling tasks makes this point (e.g., 37). In a recent imaging study 20, Somerville et al. identified the neural substrates of this down-regulation of control regions by appetitive cues. They tested child, adolescent, and adult participants on a go/nogo task with appetitive social cues (happy faces) and neutral cues. Performance on neutral cues showed steady improvement with age on this impulse control task. However, on trials in which the individual had to resist approaching appetitive cues, adolescents failed to show the expected age-dependent improvement. This performance decrement during adolescence was paralleled by enhanced activity in the striatum. Conversely, activation in the inferior frontal gyrus was associated with overall accuracy and showed a linear pattern of change with age for nogo versus go trials. Taken together, these findings implicate an exaggerated ventral striatal representation of appetitive cues in adolescents in the absence of a mature cognitive control response.
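The key dependent measure in this kind of task is the false-alarm rate: responses emitted on nogo trials, compared across appetitive and neutral cues. A minimal scoring sketch follows, with an invented toy trial list; the cue labels and data layout are assumptions for illustration, not the format used by Somerville et al.

```python
from collections import Counter

# Toy trial records for a go/nogo task with appetitive (happy) and neutral
# nogo cues; each trial is (cue_type, trial_type, responded).
trials = [
    ("happy", "nogo", True), ("happy", "nogo", False),
    ("neutral", "nogo", False), ("neutral", "nogo", False),
    ("happy", "go", True), ("neutral", "go", True),
]

counts = Counter()
for cue, kind, responded in trials:
    if kind == "nogo":
        counts[(cue, "total")] += 1
        if responded:  # pressing on a nogo trial is a false alarm
            counts[(cue, "fa")] += 1

for cue in ("happy", "neutral"):
    rate = counts[(cue, "fa")] / counts[(cue, "total")]
    print(f"{cue} nogo false-alarm rate: {rate:.2f}")
```

Computed per age group, an elevated false-alarm rate for happy relative to neutral cues in adolescents, against an otherwise age-improving baseline, is the behavioral signature the study reports.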
Collectively, these data suggest that although adolescents as a group are considered risk-takers 41, some adolescents will be more prone than others to engage in risky behaviors, putting them at potentially greater risk for negative outcomes. These findings underscore the importance of considering individual variability when examining complex brain-behavior relationships related to risk-taking and impulsivity in developmental populations. Further, these individual and developmental differences may help explain why some individuals are vulnerable to risk-taking, which is associated with substance use and, ultimately, addiction 64.
Substance Use and Abuse in Adolescents
Adolescence marks a period of increased experimentation with drugs and alcohol 92, with alcohol being the substance most commonly abused by teens 11, 93, 94. Early use of these substances is a reliable predictor of later dependence and abuse 95. Given the surge in alcohol dependence between adolescence and adulthood, unequaled at any other developmental stage 96, we focus here on a selective review of alcohol use and abuse in adolescents and of alcohol’s motivational properties.
Alcohol, like other substances of abuse including cocaine and cannabinoids, has reinforcing properties. These substances influence mesolimbic dopamine transmission, acutely activating neurons in dopamine-rich frontolimbic circuitry, including the ventral striatum 97–99. As suggested by Hardin and Ernst 92, the use of these substances may exacerbate an already enhanced ventral striatal response, heightening or strengthening the reinforcing properties of the drug. Robinson and Berridge 61, 63, 100 have suggested that drugs of abuse can “hijack” systems associated with drug incentives, like the ventral striatum, thereby down-regulating top-down prefrontal control regions.
The majority of empirical work on adolescent alcohol use has been done in animals, given the ethical constraints on performing such studies in human adolescents. Animal models of ethanol exposure also provide the strongest evidence for differential effects of alcohol in adolescents relative to adults, and they are consistent with human findings of adolescents’ relative insensitivity to ethanol’s effects. Spear and colleagues have shown that adolescent rats, relative to adults, are less sensitive to the social, motor, sedative, acute withdrawal, and “hangover” effects of ethanol 101–103. These findings are significant in that many of these effects serve as cues to limit intake in adults 11. Moreover, at the same time that adolescents are insensitive to cues that may help limit their alcohol intake, positive effects of alcohol such as social facilitation may further encourage its use 104. Most risky behaviors in humans, including alcohol abuse, occur in social situations 23, potentially pushing adolescents toward greater use of alcohol and drugs when this behavior is valued by their peers.
How is the brain altered by alcohol use and abuse in adolescence relative to adulthood? Whereas adolescents may be less sensitive to some of the behavioral effects of alcohol, they appear to be more sensitive to some of its neurotoxic effects 94. For example, physiological studies (e.g., 105) show greater ethanol-induced inhibition of NMDA-mediated synaptic potentials and long-term potentiation in hippocampal slices from adolescents than from adults. Repeated exposure to intoxicating doses of ethanol also produces greater hippocampus-dependent memory deficits 106, 107, and prolonged ethanol exposure has been associated with increased dendritic spine size 108. These latter findings of dendritic spine changes are suggestive of modifications of brain circuitry that may stabilize addictive behavior 94.
Data from brain imaging studies provide parallel evidence of alcohol’s neurotoxic effects on the human brain. A number of studies have reported altered brain structure and function in alcohol-dependent or alcohol-abusing adolescents and young adults compared to healthy individuals, including smaller frontal and hippocampal volumes, altered white matter microstructure, and poorer memory 109–113. Moreover, these studies show positive associations between hippocampal volumes and age of first use 109, suggesting that early adolescence may be a period of heightened risk for alcohol’s neurotoxic effects. Duration of use, which was negatively correlated with hippocampal volume, may compound this effect.
Currently, only a few studies have examined functional brain activity in response to drug- or alcohol-related stimuli (e.g., pictures of alcohol) in adolescents 114, although this is an area of future research (see 115). Studies of high-risk populations (e.g., those with a familial load of alcohol dependence) suggest that impairments in frontal functioning are apparent prior to drug use exposure (e.g., 116, 117) and can predict later substance use 118, 119. However, in an early behavioral study of the effects of alcohol in 8- to 15-year-old boys at low and high familial risk 120, the most significant finding was little if any behavioral change or impairment on tests of intoxication, even after doses that had been intoxicating in an adult population. These neurotoxic effects, together with increased sensitivity to the motivational effects of alcohol and evidence of poorer top-down prefrontal control apparent even prior to drug use exposure 116, may set up a long-term course of alcohol and drug abuse well beyond adolescence 118, 119.
Conclusions
Together, the studies described support a view of adolescent brain development as characterized by a tension between early emerging “bottom-up” systems that express exaggerated reactivity to motivational stimuli and later maturing “top-down” cognitive control regions. The bottom-up system, which is associated with sensation-seeking and risk-taking behavior, gradually loses its competitive edge with the progressive emergence of top-down regulation (e.g., 2, 7, 15, 23, 64, 121–123). The imbalance between these developing systems during adolescence may lead to heightened vulnerability to risk-taking behaviors and increased susceptibility to the motivational properties of substances of abuse.
This review provides behavioral, clinical, and neurobiological evidence for dissociating these subcortical and cortical systems developmentally. Behavioral data from laboratory tasks and self-report ratings administered to children, adolescents, and adults (e.g., 18, 20, 37, 42) suggest curvilinear development of sensation-seeking, with a peak inflection roughly between 13 and 17 years, while impulsivity decreases linearly from childhood to young adulthood. Human imaging studies show patterns of activity in subcortical brain regions sensitive to reward (ventral striatum) that parallel the behavioral data: these regions show a curvilinear pattern of development, and the magnitude of their response is associated with risk-taking behaviors. In contrast, prefrontal regions, important in the top-down regulation of behavior, show a linear pattern of development that parallels that seen in behavioral studies of impulsivity. Moreover, clinical disorders with impulse control problems show less prefrontal activity, further linking neurobiological substrates with the phenotypic construct of impulsivity.
The tension between subcortical and prefrontal cortical regions during this period may serve as a possible mechanism for the observed heightened risk-taking, including use and abuse of alcohol and drugs. The majority of adolescents have tried alcohol 93, but this does not necessarily lead to abuse. Individuals with less top-down regulation may be particularly susceptible to alcohol and substance abuse, as suggested by studies of high-risk populations showing impairments in frontal functioning prior to alcohol and drug exposure (e.g., 116, 117). In the context of our neurobiological model of adolescence, these individuals would have an even greater imbalance in cortico-subcortical control. These findings also accord with clinical findings in ADHD populations, who show reduced prefrontal activity and are four times as likely as healthy controls to develop a substance use disorder 124. This imbalance in cortico-subcortical control would be further compounded by adolescents’ insensitivity to the motor and sedative effects of alcohol that might otherwise help limit intake, and by alcohol’s positive effects on social facilitation, which may further encourage its use 104. As shown by Steinberg and colleagues 23, 41, most risky behaviors, including alcohol and substance abuse, occur in social situations. Thus, the use of alcohol and drugs may be encouraged and maintained by peers when this behavior is valued.
One of the challenges in addiction-related work is the development of biobehavioral markers for early identification of risk for substance abuse and/or for outcome assessments of interventions and treatments. Our findings suggest that behavioral challenges requiring cognitive control in the presence of tempting appetitive cues may be useful potential markers. Examples of such behavioral assays include the gambling tasks with high- and low-risk or “hot” and “cold” conditions described in this review 18, 37, or simple impulse control tasks that require suppressing a response to an appetitive, tempting cue 20. These tasks are reminiscent of the delay of gratification task developed by Mischel 125. In fact, performance on simple impulse control tasks such as these in adolescents and adults has been associated with their performance as toddlers on the delay of gratification task 28, 29. Mischel and colleagues have shown the high stability and predictive value of this task for later life; relevant to substance abuse, they showed that the ability to delay gratification as a toddler predicted less substance abuse (e.g., cocaine) later in life 126. In our current work, we are beginning to use a combination of these tasks to identify the neural substrates of this ability, to further understand potential risk factors for substance abuse.
Collectively, these data suggest that although adolescents as a group are considered risk-takers 41, some adolescents will be more prone than others to engage in risky behaviors, putting them at potentially greater risk for negative outcomes. However, risk-taking can be quite adaptive in the right environments. Rather than trying to eliminate adolescent risk-taking, an enterprise that has not been successful to date 23, a more constructive strategy may be to provide access to risky and exciting activities under controlled settings (e.g., after-school programs with indoor wall climbing) and to limit harmful risk-taking opportunities. Because the adolescent brain is a reflection of experience, such safe risk-taking opportunities allow the teenager to shape long-term behavior by fine-tuning the connections between top-down control regions and bottom-up drives as this circuitry matures. Other successful strategies include cognitive behavioral therapies that focus on refusal skills, or cognitive control, to reduce risky behaviors 127. These findings underscore the importance of considering individual variability when examining complex brain-behavior relationships related to risk-taking and impulsivity in developmental populations. Further, these individual and developmental differences may help explain why some individuals are vulnerable to risk-taking associated with substance use and, ultimately, addiction.
Acknowledgements
This work was supported in part by NIDA R01 DA018879, NIDA Pre-Doctoral training grant DA007274, the Mortimer D. Sackler family, the Dewitt-Wallace fund, and by the Weill Cornell Medical College Citigroup Biomedical Imaging Center and Imaging Core.