Prevention of Schizophrenia

DOI: 10.1093/9780195173642.003.0008

Precursors of schizophrenia that are apparent during adolescence are now recognized. A wide variety of early-intervention techniques have been developed that draw on knowledge of these precursors to identify individuals at risk for the illness and to prevent the predisposition toward schizophrenia from developing into the full disorder. Unfortunately, most of the research that has enabled the identification of these precursors and the development of these intervention techniques has been performed retrospectively in adults with schizophrenia, with little specific research attention directed toward forms of schizophrenia that manifest during adolescence. In addition, prevention efforts have necessarily lagged behind studies of risk factors, detection, and early intervention in the disease. Yet a great deal has already been learned about risk profiling and early intervention in schizophrenia generally, and those aspects that may be useful in understanding the adolescent forms of the illness are discussed below.

What Guides the Development of Early Intervention and Prevention Efforts?

Traditionally, prevention efforts have been classified at three levels: (1) primary prevention, which is practiced prior to the onset of the disease; (2) secondary prevention, which is practiced after the disease is recognized, but before it has caused suffering and disability; and (3) tertiary prevention, which is practiced after suffering or disability has been experienced, to prevent further deterioration. The primary/secondary/tertiary classification scheme is attractive and simple, but it does not serve to distinguish between preventive interventions that have different epidemiological justifications and require different strategies for optimal utilization. In particular, the classification into primary, secondary, and tertiary prevention focuses on intended outcomes rather than on target populations or prevention strategies.

More recently, the terms universal, selective, and indicated have been adopted as a valuable way to distinguish preventive interventions. All three of these preventive intervention strategies refer to the target population. Universal preventive interventions are applied to whole populations, and aim at reducing risk and promoting protective factors. Because obstetric complications have been linked to the subsequent onset of schizophrenia in several studies (Zornberg, Buka, & Tsuang, 2000), one potentially effective universal prevention strategy would be to focus on lowering the incidence of such complications through improved pre-, peri-, and postnatal care.

In contrast to universal prevention strategies, selective and indicated interventions target specific subgroups for intervention. Selective interventions target those who are at elevated risk based on group-level characteristics that are not directly related to etiology. Because schizophrenia is a familial and heritable disorder (Gottesman, 1991), a selective prevention program for schizophrenia might focus on asymptomatic children with first-degree affected relatives or, more specifically, on those with particular combinations of schizophrenia-risk–specific gene variants, as they become known.

Finally, an indicated intervention involves targeting individuals who either have signs of the disorder but are currently asymptomatic, or are in an early stage of a progressive disorder. Because there are no universal signs of schizophrenia, indicated interventions for this disorder have a somewhat broad definition. Two lines of research that may lead to indicated interventions for schizophrenia include the study of individuals with prodromal signs of schizophrenia (Eaton, Badawi, & Melton, 1995) and the characterization of individuals with schizotaxia, which can be defined as the underlying predisposition to schizophrenia that may or may not be expressed as prodromal symptoms (Tsuang, Stone, Tarbox, & Faraone, 2002).

In order to develop and refine selective and indicated prevention efforts for schizophrenia, the disorder itself (as well as its precursors) must be thoroughly understood. Some of the risk factors for schizophrenia, such as birth complications and a family history of the disorder, are widely recognized. Others are just becoming known or are still being validated. When a wide variety of schizophrenia-specific precursors are available, these features can be used to maximize the efficiency and effectiveness of preventive efforts by narrowly specifying the characteristics of at-risk individuals, allowing only those who would benefit from intervention to be selected to receive it.

PREMORBID ASPECTS OF SCHIZOPHRENIA

The etiology of schizophrenia is complex, most likely involving a range of genetic effects and gene–environment interactions that are well summarized as the “epigenetic puzzle” (Gottesman & Shields, 1982; Plomin, Reiss, Hetherington, & Howe, 1994), as discussed in Chapter 5. The schizophrenia syndrome—the delusions, hallucinations, thought disorder, negative features, and cognitive dysfunction—is manifest at some stage during the lives of around 1 in 100 people. Figure 7.1 shows that the occurrence begins to take off in the early teenage and adolescent years, being rare before puberty and becoming less common in the second half of life. However, important events may occur in the period leading up to illness and in the early years of development, the so-called prodromal and premorbid periods.

Figure 7.1 Age at onset distribution of schizophrenia [from Hafner, H., Maurer, K., Loffler, W., & Riecher-Rossler, A. (1993). The influence of age and sex on the onset and early course of schizophrenia. British Journal of Psychiatry, 162, 80–86, used with permission].

Prodromal and Premorbid Phases of Schizophrenia

In most cases, schizophrenia does not come totally out of the blue; there are important changes that occur before the psychotic syndrome. Fragmentary psychotic symptoms, depression, changes in behavior, attenuated general functioning, and other nonspecific features commonly occur in the weeks, months, and sometimes years before the first psychotic break. This period before the schizophrenia syndrome is established is known as the prodrome, and it is a change that can frequently be identified either by the affected individual or by the family.

The prodrome is a period of considerable interest from a clinical and theoretical point of view because it may be possible to intervene early during this time and thus prevent the onset of psychosis or improve its outcome. This exciting prospect of early intervention, considered elsewhere in this volume (Chapter 6), is technically complex because of the nonspecific nature of some of the symptoms in the prodrome. Schizophrenia or other psychoses are by no means inevitable in a group of adolescents who show apparently prodromal features; looking back at adolescents who have developed schizophrenia, such psychological difficulties are, of course, much more evident. Much research is aiming to understand the biology underlying this period just before and around the onset of schizophrenia, when important neuropsychological and structural changes may be occurring (Pantelis et al., 2003; Wood et al., 2003). There is general agreement, however, that earlier-onset cases such as these occurring in childhood or adolescence are likely to have more severe premorbid abnormalities (Nicolson & Rapoport, 1999; Nicolson et al., 2000).

There are other differences and abnormalities that occur well before the period of risk shown in Figure 7.1 begins. They are not just in a psychological domain and show no obvious continuity with the schizophrenia syndrome. Rather than being changes from the preexisting state that herald the illness during a prodrome, these differences are more a long-term part of the person, his or her personality, and early development.

These differences are known as premorbid features. The distinction from the prodrome is not always clear, particularly in younger people, but it may have theoretical importance because premorbid features seem to point toward early vulnerability or predisposing factors, rather than to events that occur as an illness is triggered or precipitated. The existence of premorbid abnormalities and differences in those who will develop schizophrenia years later suggests that parts of the epigenetic puzzle are put in place in very early life. In childhood-onset cases, the distinction may be almost impossible to draw because of the severity and insidious onset of schizophrenia before age 13 (Alaghband-Rad et al., 1995).

Why look in early life for premorbid differences and causes of schizophrenia?

From its first descriptions, schizophrenia has had a longitudinal dimension. Thomas Clouston (Clouston, 1892; Murray, 1994; Murray & Jones, 1995) recognized a syndrome that he called “developmental insanity” in which developmental physical abnormalities were associated with early-onset psychotic phenomena, particularly in adolescent boys. When defining the schizophrenia syndrome more clearly, both Kraepelin (1896/1987) and Bleuler (1908/1987, 1911/1950) noted that many of the people who developed the psychotic syndrome had been different from their peers long before the psychosis began. Here is a description from one of Bleuler's (1911/1950) early accounts of what has become known as schizophrenia:

It is certain that many a schizophrenia can be traced back into the early years of the patient's life, and many manifest illnesses are simply intensifications of an already existing character. All ten of my own school comrades who later became schizophrenics were quite different from the other boys.

If some of the seeds of schizophrenia are sown in early life, then there ought to be other evidence. The excess of minor physical abnormalities (Green, Satz, Gaier, Ganzell, & Kharabi, 1989; Gualtieri, Adams, Shen, & Loiselle, 1982; Guy, Majorski, Wallace, & Guy, 1983; Lane et al., 1997; Lohr & Flynn, 1993; Sharma & Lal, 1986), and the dermatoglyphic or fingerprint abnormalities in people with schizophrenia (Bracha, Torrey, Gottesman, Bigelow, & Cunniff, 1992; McGrath et al., 1996) are seen as “fossilized” reminders of insults very early in life, during the first or second trimester of pregnancy, such as infections and nutritional problems (reviewed in Tarrant & Jones, 1999). These factors and some of the neuropathological data are probably best explained in terms of developmental processes having gone awry (Weinberger, 1995).

However, these processes are difficult to observe directly. Genetic high-risk studies, in which the offspring of people with schizophrenia are followed up, have shown subtle differences in neurological development between these children at special risk and those not known to be at such risk (Erlenmeyer-Kimling et al., 1982; Fish, 1977; Fish, Marcus, Hans, Auerbach, & Perdue, 1992; Walker & Lewine, 1990). Genetic studies such as these are discussed in Chapter 5.

What Are the Premorbid Differences Seen in Schizophrenia?

Bleuler wasn't very precise when he mentioned that many of the people he'd known who developed schizophrenia as adults were different from other boys as children. It's certainly interesting that he mentions boys specifically, because tightly defined schizophrenia does seem to be more common in men than in women, and the early developmental differences are often more obvious in boys than in girls. This may be partly an artifact of some research designs, as well as an effect of differences in the wiring of male and female brains.

Many aspects of development can be seen to be slightly different in children who will later develop schizophrenia. Often these differences are subtle and would not be noticed at the time by parents or professionals. Usually, differences can be noted in characteristics that are developing rapidly according to the age of the child, things that are on the cusp of the developmental wave, and the child appears to catch up later on. Following are some examples.

Early Milestones and Motor Development

Direct evidence of neurodevelopmental differences is available (Weinberger, 1995). One source is a remarkable piece of opportunistic research by Walker and colleagues (Walker, Grimes, Davis, & Smith, 1993; Walker & Lewine, 1993), who studied home movies of families in which one child later developed schizophrenia. Facial expression of emotion and general motor functions were rated blind to that child's identity among siblings. The preschizophrenic children were distinguished on both counts, some with fairly gross but transitory motor differences. These may point to the basal ganglia of the brain as being involved in the underlying mechanism, reminding us that subtle motor disturbances are apparent at the beginning of schizophrenia, before any treatment (Gervin et al., 1998).

Such developmental differences have now been demonstrated in large, population-based or epidemiological samples. In the British 1946 birth cohort, a group of several thousand people born in 1 week in March 1946 have been studied regularly throughout their lives. Their mothers were asked about development when the children were age 2 years, before anyone knew what would happen later on. All the milestones of sitting, standing, walking, and talking were slightly though clearly delayed in those who developed schizophrenia as adults, but there was nothing that would have alarmed parents at the time. There were other indications that language acquisition was different before onset of schizophrenia. Nurses were more likely to notice a lack of speech by 2 years in the children who developed schizophrenia as adults, and school doctors noted speech delays and problems in them throughout childhood.

Developmental differences have been replicated in similar cohort studies in other domains, such as bladder control, fine motor skill, and coordination during late childhood and adolescence (Cannon et al., 1999; Crow, Done, & Sacker, 1995). The motor and language delays were replicated and extended in a birth cohort study from Dunedin, New Zealand (Cannon et al., 2002), where over a thousand children have been followed during childhood. Those who indicated in their mid-20s that they had experienced symptoms suggestive of schizophrenia, mania, and other disorders were compared with those who said that they had never had such phenomena.

Figure 7.2 shows how a summary motor performance score was lower throughout most of childhood for those who experienced a schizophreniform disorder than for the other groups. Figure 7.3 indicates that there was also a receptive language problem in those who later had hallucinations, delusions, and thought disorder.

Figure 7.2 Mean standardized scores for motor performance at four ages during childhood for adults who indicated symptoms of schizophreniform disorder (36), mania (20), or anxiety/depression (278), compared with controls (642) [from Cannon, M., Caspi, A., Moffitt, T.E., Harrington, H., Taylor A., & Murray, R.M. (2002). Evidence for early-childhood, pan-developmental impairment specific to schizophreniform disorder: results from a longitudinal birth cohort. Archives of General Psychiatry, 59, 449–456. Copyright © 2002, American Medical Association. All rights reserved, used with permission].

Figure 7.3 Mean standardized scores for expressive and receptive language performance at four ages during childhood for adults who indicated symptoms of schizophreniform disorder (36), mania (20), or anxiety/depression (278), compared with controls (642) [from Cannon, M., Caspi, A., Moffitt, T.E., Harrington, H., Taylor A., & Murray, R.M. (2002). Evidence for early-childhood, pan-developmental impairment specific to schizophreniform disorder: results from a longitudinal birth cohort. Archives of General Psychiatry, 59, 449–456. Copyright © 2002, American Medical Association. All rights reserved, used with permission].

Developmental differences before onset of schizophrenia were observed during the first year of life in the North Finland 1966 birth cohort. This comprises about 12,000 babies due to be born in this geographical area during 1966 (Rantakallio, 1969). Their early development was charted in the first year of life and later linked to information about who had developed schizophrenia through adolescence and into the early 30s (Isohanni et al., 2001). Figure 7.4 shows the incidence of schizophrenia in male subjects according to how quickly the little boys learned to stand without support or “toddle” during the first year. The figure for girls was similar.

Figure 7.4 Relationship between age at standing without support, or toddling, and later schizophrenia in boys from the 1966 North Finland birth cohort (Isohanni et al., 2001). The later boys could stand during the first year of life, the greater the risk of schizophrenia, even when the milestone was passed within normal limits [from Isohanni, M., Jones, P.B., Moilanen, K., Rantakallio, P., Veijola, J., Oja, H., Koiranen M., Jokelainen, J., Croudace, T.J., & Järvelin, M-R. (2001). Early development milestones in adult schizophrenia and other psychoses. A 31-year follow-up of the north Finland 1966 birth cohort. Schizophrenia Research, 52, 1–19, used with permission from Elsevier].

It is clear that not only was there an effect whereby the later a boy learned to toddle, the greater his chance of developing schizophrenia in later life, but also that this effect seemed to hold true throughout the range of variation in reaching this milestone, all of which might be considered normal. If one were looking only for very late developers, then one might be more likely to find them within the preschizophrenia group than in those who did not develop the illness. However, this approach would completely obscure the widespread nature of this association, the meaning of which is considered later on in this chapter.

There is another finding apparent from Figure 7.4. For the boys who passed the milestone early, in the 9-month and 10-month categories, the relatively few individuals who developed schizophrenia all did so in their mid-teens to mid-20s; their period of risk seems fairly short. For those who were later developers, the period of risk is longer; these groups are still accruing cases of schizophrenia into their early 30s and beyond. It may be that the overall risk period for schizophrenia is shorter when neurodevelopment is more efficient, and longer when it is less efficient.

Behavioral Development

Bleuler's description of schizophrenia most obviously implies differences in behavior and temperament. Studies in this area have also moved on through retrospective research methodologies to cohort designs. Sophisticated rating scales for the retrospective assessment of behavior and personality demonstrate differences prior to psychosis, with the most common being characteristics of a shy, “schizoid” habit (Ambelas, 1992; Cannon-Spoor, Potkin, & Wyatt, 1982; Foerster, Lewis, Owen, & Murray, 1991; Gittleman-Klein & Klein, 1969).

Robins (1966) undertook a pioneering, historical cohort study in which she followed a group of boys who had been referred to a child guidance clinic in St. Louis, Missouri. Here antisocial behavior was associated with later schizophrenia. Watt and Lubensky (1976; Watt, 1978) traced the school records of people with schizophrenia who came from a geographically defined neighborhood in Massachusetts. Girls who were to develop schizophrenia were introverted throughout kindergarten into adolescence. Boys who were to become ill were more likely to be rated as “disagreeable,” but only in the later school grades (7 to 12). This pattern has been identified (Done, Crow, Johnson, & Sacker, 1994) in a British cohort using a similar set of behavioral ratings and in the Dunedin cohort mentioned above (Cannon et al., 2002). The 1946 British birth cohort contained children's own ratings of their behavior at age 13 years and teachers' ratings 2 years later. These data showed no evidence of antisocial traits in the preschizophrenia group, but a strong association with shy, “schizoid” behaviors at both ages. The two views gave a very similar picture; the shyer someone seemed as a child, the greater the risk. Other studies, however, remind us of the varied childhood psychiatric conditions that predate schizophrenia (Kim et al., in press).

The behavioral differences seem to persist toward the prodrome, though they are independent of it. Malmberg, Lewis, David, and Allebeck (1998) studied a sample of some 50,000 men conscripted into the Swedish army at age 18 to 20 years, when they underwent a range of tests and assessments. Four behavioral variables at age 18 were particularly associated with later schizophrenia: having only one or no friends, preferring to socialize in small groups, feeling more sensitive than others, and not having a steady girlfriend. Cannon et al. (1997) noted the same relationship.

Another twist to the story about premorbid behavioral differences comes from the recent recognition that some of the individual parts of the schizophrenia syndrome, such as hallucinations or delusions, can exist in otherwise well-functioning individuals in the population. However, they are indeed associated with greater risk of occurrence of subsequent schizophrenia whether they occur in early adolescence (Poulton et al., 2000) or adulthood (Myin-Germeys, Krabbendam, Delespaul, & Van Os, 2003).

Thus, there seems to be a consistency over childhood and adolescence and across several types of study regarding the presence of premorbid behavioral differences. People who will develop schizophrenia as adolescents and adults are different from their peers in terms of behavior in childhood, just as Bleuler noted a century ago; the effects may be even more widespread than he thought.

Cognitive Function and IQ

This aspect of psychological function also shows differences in the premorbid period. Aylward, Walker, and Bettes (1984) have provided a comprehensive review of intelligence in schizophrenia. They concluded that intellectual function is lower in prepsychotic individuals than in age-matched controls. Linking the prepsychotic deficit to outcome, they raised the question as to whether IQ may be an independent factor that can protect otherwise vulnerable individuals, or whether the deficits are part of that vulnerability.

Once again, the birth cohort studies shed light on the question. Cannon et al. (2002) showed that mean IQ test scores were consistently lower during childhood in those children who developed schizophreniform disorder (Fig. 7.5). This mean shift in premorbid IQ was also seen in two British cohorts (Jones & Done, 1997). When the childhood IQ data from the 1946 cohort (Pidgeon, 1964, 1968) are studied in greater detail, it is clear that the lower mean premorbid IQ is not due to a subset of people with very low scores; rather, the whole distribution of those who develop schizophrenia when they reach adolescence or adulthood is shifted down—most children seem not to be doing as well as they might have been expected to perform (Jones, Rodgers, Murray, & Marmot, 1994). This is a similar situation to the motor findings in the Finnish cohort (Cannon et al., 1999). It is not that there is a group of very abnormal individuals driving the findings; the effects are seen across the normal range.

Figure 7.5 Mean standardized scores for IQ performance at four ages during childhood for adults who indicated symptoms of schizophreniform disorder (36), mania (20), or anxiety/depression (278), compared with controls (642) [from Cannon, M., Caspi, A., Moffitt, T.E., Harrington, H., Taylor A., & Murray, R.M. (2002). Evidence for early-childhood, pan-developmental impairment specific to schizophreniform disorder: results from a longitudinal birth cohort. Archives of General Psychiatry, 59, 449–456. Copyright © 2002, American Medical Association. All rights reserved, used with permission].

David, Malmberg, Brandt, Allebeck, and Lewis (1997; see above) replicated this result in the Swedish conscript study, although the measures were later in life at age 18. There was no evidence of a threshold effect below or above which this relationship did not hold. Very bright individuals can develop schizophrenia, but they are less likely to than those who are less able. Put another way, any individual is more likely to develop schizophrenia than someone who is more able in terms of IQ, although the effect is small. Recent interest in the cognitive aspects of schizophrenia (David & Cutting, 1994; Green, 1998) suggests a parsimonious conclusion that prepsychotic IQ deficits (and perhaps social characteristics) may be manifestations of the same abnormal cognitive processes that later result in psychosis.

What Do Premorbid Abnormalities Mean?

The range of differences in the developmental histories of people who develop schizophrenia when they are older suggests that something to do with the causes of this syndrome is active long before the characteristic features begin (Marenco & Weinberger, 2000). There is evidence for many such early factors, including genetic effects (Jones & Murray, 1991; Fish et al., 1992), obstetric complications (Cannon et al., 2002), psychosocial stresses, famine, infections, and other toxic events during brain development (see Jones, 1999, for review).

It seems that many events that may lead to early brain development being suboptimal may increase the risk of later schizophrenia. There may be specific causes or combinations of causes, such as gene–environment interactions, that make people vulnerable to developing the schizophrenia syndrome, perhaps after later, necessary events that act as triggers. These may include normal (Weinberger, 1987, 1995) or abnormal brain development (Feinberg, 1982a, 1997; Pogue-Geile, 1997), as well as traditional precipitants such as psychosocial stressors or drugs (see Chapter 6).

The behavioral, motor, language, and cognitive differences shown in the premorbid period may be manifestations of vulnerability or predisposition to schizophrenia; they may not be risk modifiers in themselves. These indicators seem remarkably homogeneous—in retrospect, like a final common pathway. The idea of only a subgroup of individuals having this manifest vulnerability, as suggested in the seminal views of developmental aspects of schizophrenia (Murray & Lewis, 1987), is not supported by recent research. Most people or even every person who develops the syndrome may have had a degree of developmental vulnerability, although this will not have been obvious at the time.

The early motor findings in the Finnish birth cohort (Fig. 7.4) are consistent with the vulnerability being due to developmental processes being generally less efficient, the formation or enhancement of functional neural networks, for instance. The greater the inefficiency, the greater the risk of schizophrenia when that same inefficiency is played out in the formation of complex and integrative systems later in adolescence and adult life (Chapter 5).

There are several candidates to explain this unifying vulnerability. These include hormonal events (Walker & Bollini, 2002) that are able to tie together motor and other system abnormalities in early life, and links with psychosocial stress in models of predisposition and precipitation (Walker, Lewis, Loewy, & Palyo, 1999). Molecular biology and the investigation of not only the presence but also the functional activity of genes and the proteins they encode may yield other dimensions of vulnerability. For instance, Tkachev et al. (2003) showed that expression of genes associated with oligodendrocytes, the glial cells that provide nutritional support to nerve cells, and with myelin, the insulating sheath these cells form around neurons, was down-regulated in the frontal cortex of postmortem brains of people who had suffered from schizophrenia. Altered gene expression seems a very good candidate for the homogeneous vulnerability factor posited in this account of premorbid abnormalities before schizophrenia, and may be an endophenotype or hidden manifestation of the disorder. The deficient gene expression remains to be demonstrated before onset of schizophrenia, and will itself have its own prior causes.

As mentioned at the beginning of this section, premorbid features of schizophrenia are not yet of use in terms of prediction and early intervention. They occur in multiple domains, but many of the effects we can measure are subtle and leave individuals remaining well within the wide range of normality. Premorbid features tell us a great deal about what we should be looking for in terms of underlying mechanisms and causes of schizophrenia and when these may operate; these features are signposts toward these mechanisms. As we learn about the processes that underpin the behavioral, cognitive, and motor differences that we can measure in the premorbid phase of schizophrenia, we may become able to identify those individuals who are vulnerable with enough precision to be able to do something useful for them.

Developmental Precursors of Adolescent-Onset Schizophrenia

There are precursors of schizophrenia prior to the first onset of psychosis in many, but not all, adolescents who develop schizophrenia. As will be seen below, the precursors of schizophrenia can be subtle changes in basic brain functions such as motor function, attention, and memory, certain behavior problems, or attenuated schizophrenic symptoms. Identifying the developmental precursors of adolescent-onset schizophrenia has important implications both for enhancing our understanding of the underlying neurobiology of schizophrenia and for developing preventive interventions for schizophrenia.

Neurobiological factors present in individuals at high risk for developing a schizophrenic disorder, prior to the onset of frank psychotic symptoms, may represent potential etiological factors for schizophrenia. A number of brain systems known to be disturbed in schizophrenia, including prefrontal and medial temporal lobes (Selemon & Goldman-Rakic, 1999; Weinberger, 1986), may underlie certain neurocognitive impairments in children at risk for schizophrenia (Asarnow, 1983; Cannon et al., 1993). Determining how these neurobiological factors evolve when a schizophrenic disorder develops could provide important clues about how the diathesis for schizophrenia is potentiated into the overt disorder. A combination of disease-related progressions and maturational changes are hypothesized to exacerbate these dysfunctions when individuals at risk for the disorder convert to having the disorder.

Research Methods Used to Identify Developmental Precursors

Two broad classes of methods have been used to identify developmental precursors of schizophrenia. The first class of methods is prospective studies of children. A common feature of prospective methods is identifying, then characterizing, a group of children and following them up to determine which children subsequently develop a schizophrenic disorder. One important prospective method is to study children who are at increased statistical risk of developing a schizophrenic disorder. The lifetime risk for schizophrenia in the general population is less than 1%. Very large samples are required to prospectively identify the precursors of schizophrenia by following up children drawn from the general population. Given the population base rate of schizophrenia (<1%), one would need to start off with at least 2,500 children (without accounting for subjects being lost to follow-up) to identify the developmental precursors of schizophrenia in 25 individuals. High-risk studies ascertain individuals with an increased lifetime risk for schizophrenia for inclusion in prospective, longitudinal studies. This is typically accomplished by studying the children of parents with schizophrenia. The lifetime risk for schizophrenia for children of one parent with schizophrenia is approximately 10% to 12%, an approximately 10-fold increase in risk for the disorder. High-risk studies frequently measure putative etiological factors for schizophrenia prior to the onset of the disorder. In this way, studies of children at risk for schizophrenia provide a vehicle for testing hypotheses about etiological factors in schizophrenia.
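
The base-rate arithmetic in this paragraph can be made concrete with a short calculation. The sketch below, in Python, is purely illustrative: the function names are invented for this example, the risk figures (a lifetime risk of roughly 1% in the general population and roughly 10% for children of a parent with schizophrenia) are those quoted in the text, and attrition from loss to follow-up is ignored.

import math

def expected_cases(cohort_size: int, lifetime_risk: float) -> float:
    # Expected number of cohort members who will develop schizophrenia.
    return cohort_size * lifetime_risk

def cohort_size_needed(target_cases: int, lifetime_risk: float) -> int:
    # Smallest cohort expected to yield the target number of cases,
    # ignoring subjects lost to follow-up.
    return math.ceil(target_cases / lifetime_risk)

# General-population sampling: lifetime risk of about 1%
# (the text says "less than 1%", so this is a best case).
print(cohort_size_needed(25, 0.01))   # 2,500 children for ~25 cases
print(expected_cases(2500, 0.01))     # ~25 expected cases

# High-risk sampling: children of a parent with schizophrenia,
# lifetime risk of roughly 10% (figure quoted in the text above).
print(cohort_size_needed(25, 0.10))   # 250 children for ~25 cases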

Most (85% to 90%) patients with schizophrenia do not have parents with a schizophrenic disorder. This has raised the concern that findings from “genetic high-risk” samples may not accurately describe the developmental precursors of schizophrenia in the much larger number of individuals who develop schizophrenia but do not have a schizophrenic parent. Recognition of this problem has led to an interest in complementary strategies for identifying developmental precursors of schizophrenia. Birth cohort studies are prospective studies that can provide information on precursors of schizophrenia but do not have some of the ascertainment biases inherent in high-risk studies. In contrast to studies of children at risk for schizophrenia, birth cohort studies follow up large, representative samples of entire birth cohorts. Birth cohort studies are designed to provide information about a wide range of medical, psychiatric, and social conditions, so they use very large samples, literally thousands of subjects. For example, the 1946 British birth cohort study that provided important data on developmental precursors of schizophrenia studied almost 5,400 children born during the week of March 9, 1946, and systematically followed them up, identifying 30 who developed schizophrenia as well as a broad range of other psychiatric and medical outcomes.

By studying children prior to the onset of the disorder it becomes possible to identify the precursors or antecedents of the disorder, not the consequences of the disorder—for example, the initiation of antipsychotic drug treatment. We will review some of the key findings that have emerged from three decades of studies of children at risk for schizophrenia and from birth cohort studies.

A second class of methods involves the collection of information on the premorbid development of individuals, usually adults, who have been diagnosed with schizophrenia. Some of the earliest studies of this type relied on retrospective reports from informants who knew the patient as a child. This approach has obvious limitations, among them being that recollections of the past may be subject to bias. The follow-back method features the ascertainment of individuals with schizophrenia and then, using different types of archival material, characterization of them prior to the onset of psychosis. Since the focus of this section is on adolescent-onset schizophrenia, we will emphasize the few studies that ascertained adolescent-onset schizophrenics.

Follow-back studies vary in the type of archival material used to describe the premorbid characteristics of individuals who develop schizophrenia. There is wide agreement (see Watt, Grubb, & Erlenmeyer-Kimling, 1982) about the advantages of using contemporaneous childhood records over retrospective interviews to reconstruct the premorbid histories of individuals who develop schizophrenia. The major limitation of follow-back studies is that the childhood evaluations were not guided by specific hypotheses about the age-specific manifestations of schizophrenia, and as a consequence, the most informative measures may not have been collected. These studies also have ascertainment biases, the nature of which varies depending on how the sample of schizophrenia patients was identified.

Birth cohort and follow-back studies can show associations between childhood characteristics and the development of schizophrenia because in both types of studies, individuals with schizophrenia have been identified. These associations are prospective in birth cohort studies, and retrospective in follow-back studies. Because the data used to describe childhood risk factors in birth cohort and follow-back studies were not collected with the intent of testing hypotheses about schizophrenia, the measures may not be sensitive to some of the more subtle manifestations of liability to schizophrenia. In contrast, the measures included in more recent studies of children at risk for schizophrenia were specifically designed to tap vulnerability to schizophrenia. Most studies of children at risk for schizophrenia, while intended to be longitudinal, were not able to follow up subjects through the age of risk to determine which high-risk subjects developed a schizophrenic disorder. Consequently, although there are extensive cross-sectional comparisons of children at risk for schizophrenia to controls, there are far fewer data on the long-term predictive validity of childhood risk factors identified in high-risk studies.

If the results of follow-back studies of adolescent-onset schizophrenia patients converge with those of studies of children at risk for schizophrenia and of birth cohort studies, then the findings will have greater generalizability and validity for future research and treatment.

A Developmental Perspective on Risk Factors

There are relatively age-specific manifestations of liability to schizophrenia (see J. Asarnow, 1988; R. Asarnow, 1983; Erlenmeyer-Kimling et al., 2000; and Walker, 1991, for reviews); that is, the manifestations of liability to schizophrenia are somewhat different at different ages. For example, one of the interesting findings that emerges from a review of developmental precursors of schizophrenia is that some deficits observed during infancy that are frequently found in high-risk, birth cohort, and follow-back studies are not found in later stages of development. Another important reason to attend to the developmental progression of risk factors is that, from the point of view of targeting individuals for prevention, risk factors more proximal to the period of time when schizophrenia develops may have better diagnostic accuracy than, for example, infancy predictors.

Table 7.1 summarizes some of the major findings concerning precursors of schizophrenia at three different developmental periods: infancy, early childhood, and middle childhood and early adolescence. Table 7.1 is not an exhaustive summary of the results of high-risk, birth cohort, and follow-back studies. Rather, Table 7.1 presents the characteristics that best differentiate high-risk children from controls or predict later development of schizophrenia that have thus far been identified in the literature. Cited below are comprehensive reviews of the results of high-risk, birth cohort, and follow-back studies.

Table 7.1 Developmental Precursors of Schizophrenia Identified by Means of Three Different Research Strategies

Infancy (0–2 years)

Children at risk for schizophrenia (high-risk) studies. CNS functioning: impaired motor and sensory functioning; high or variable sensitivity to sensory stimulation; abnormal growth patterns; short attention span; low IQ. Symptoms and behaviors: difficult temperament; passive, low energy, quiet, inhibited; absence of fear of strangers; low communicative competence in mother–child interaction, less social contact with mothers.

Birth cohort studies. CNS functioning: delays in motor milestones; speech problems or delays; delayed bladder control (potty training).

Follow-back studies. CNS functioning: abnormal motor functioning; impaired language.

Early childhood (2–4 years)

Children at risk for schizophrenia (high-risk) studies. CNS functioning: low reactivity; poor gross and fine motor coordination; inconsistent, variable performance on cognitive tests. Symptoms and behaviors: depression and anxiety; angry and hostile disposition; schizoid behavior (i.e., emotionally flat, withdrawn, distractible, passive, irritable, negativistic).

Birth cohort studies. CNS functioning: speech problems; motor problems. Symptoms and behaviors: solitary play.

Follow-back studies. CNS functioning: impaired language; neuromotor impairments. Symptoms and behaviors: more likely to receive a diagnosis of developmental disorder.

Middle childhood/early adolescence (4–14 years)

Children at risk for schizophrenia (high-risk) studies. CNS functioning: neurological impairment (poor fine motor coordination, balance, sensory-perceptual signs, delayed motor development); attentional impairment under overload conditions; variance (scatter) on intellectual tests. Symptoms and behaviors: poor affective control (emotional instability, aggressive, disruptive, hyperactive, impulsive); poor interpersonal relationships, withdrawn; cognitive slippage, thought disturbance; mixed internalizing–externalizing symptoms, fearful; ADD-like syndrome.

Birth cohort studies. CNS functioning: twitches, grimaces; poor balance, clumsiness; poor academic achievement. Symptoms and behaviors: solitary play; less socially confident; "schizoid" social development.

Follow-back studies. CNS functioning: reduced general intelligence; poor academic achievement; poor attention; neuromotor impairments. Symptoms and behaviors: passive; socially isolated; poor social adjustment; anxious/depressed; ADD.

ADD, attention-deficit disorder; CNS, central nervous system.

The format of Table 7.1 was modeled after a review by J. Asarnow (1988), and the entries for studies on high-risk children come from reviews by J. Asarnow (1988), Erlenmeyer-Kimling (2000, 2001), R. Asarnow (1983), and Cornblatt and Obuchowski (1997). The entries for birth cohort studies are based on reviews by Jones, Rodgers, Murray, and Marmot (1994) and by Jones and Tarrant (1999). The data for entries of follow-back studies of adolescent-onset schizophrenia come from Watkins, Asarnow, and Tanguay (1988) and Walker, Savoie, and Davis (1994). Watt and Saiz (1991) provided a broad review of follow-back studies of adult-onset schizophrenia.

Two types of risk characteristics are differentiated into separate columns in Table 7.1: endophenotypes versus clinical and behavioral features. Endophenotypes are putative reflections of the underlying schizophrenic genetic diathesis. Most of the putative endophenotypes employed in high-risk studies are neuromotor or neurocognitive functions (e.g., language, attention, and memory) believed to tap central nervous system disturbances that reflect liability to schizophrenia. In contrast, clinical and behavioral features are either nonschizophrenic psychiatric symptoms or behavior problems which, while they may reflect the underlying genetic diathesis, are much more proximal to the overt symptoms of schizophrenia. The reason for making this distinction is that these two different classes of risk characteristics have somewhat different implications as targets for prevention.

High-Risk Studies

The results of high-risk studies have to be considered in the context of a major limitation: there are limited data on how well the cross-sectional differences between children at risk for schizophrenia and matched controls predict the later onset of schizophrenia. Only six studies of children at risk for schizophrenia have obtained diagnostic evaluations in adulthood or late adolescence: (1) the New York High-Risk study (Fish, 1984); (2) the Copenhagen High-Risk project (Cannon et al., 1993; Mednick & Schulsinger, 1968); (3) the Israeli High-Risk study (Ingraham, Kugelmass, Frankel, Nathan, & Mirsky, 1995); (4) the New York High-Risk project (Erlenmeyer-Kimling et al., 2000); (5) the Swedish High-Risk study (McNeil, Harty, Blennow, & Cantor-Graae, 1993); and (6) the Jerusalem Infant Development study (Hans et al., 1999). The New York High-Risk project studied the largest number of subjects for the longest period of time and therefore provides the most extensive data on the diagnostic accuracy of childhood and adolescent predictors of schizophrenia-related psychoses. None of these studies focused on the prediction of adolescent-onset schizophrenia. Indeed, there are very few cases of adolescent-onset schizophrenia in the entire high-risk literature. As a consequence, we are making the assumption that the factors that predict adult-onset schizophrenia are germane to the prediction of adolescent-onset schizophrenia.

Infancy.

In most but not all studies (see Walker & Emory, 1985, for review), during infancy neurological signs or neuromotor dysfunctions are found more frequently in children at risk for schizophrenia than in controls. In these studies neuromotor anomalies were assessed by observation during a pediatric neurological examination or by performance on standardized tests of infant development (e.g., the Bayley). Neurological signs and neuromotor dysfunctions are not specific to infants at risk for schizophrenia and are not rare events in the general pediatric population. Neurological abnormalities in neonates typically tend to improve. In contrast, it appears that these abnormalities in children at risk for schizophrenia persist, and may worsen over time. Infants with neurological or neuromotor abnormalities are the high-risk infants most likely to develop schizophrenic disorders in adolescence and early adulthood (Fish, 1987; Marcus et al., 1987; Parnas, 1982). Neurologic dysregulation in infancy predicts the development of schizophrenia spectrum disorders (Fish, 1984). Impaired performance on tasks with extensive motor demands during middle childhood also predicts the presence of schizophrenia spectrum disorders during adolescence (Hans et al., 1999).

Disturbances in early social development are found more frequently in children at risk for schizophrenia than in controls. Depending on the study, these disturbances are manifested in difficult temperament, apathy or withdrawal, being inhibited, less spontaneous, and less imitative, reduced social contact with mothers, and absence of fear of strangers. The absence of fear of strangers during infancy could be an indication that the child does not differentiate between familiar adults to whom the child is attached (for example, the parents) and others. This absence of the fear of strangers may reflect inadequately developed attachment. These disturbances are not specific to children at risk for schizophrenia. They are also associated with broad risk factors such as socioeconomic status, general maternal distress, early trauma or neglect, and poor quality of parenting. Although there are scant data on how well these disturbances in infant social development predict the development of schizophrenia, many of these findings are related to the development of social competencies (Watt & Saiz, 1991).

Early childhood.

During early childhood (2 to 4 years of age), children at risk for schizophrenia are more likely than controls to show poor fine and gross motor coordination and low reactivity. Although poor fine and gross motor coordination was found in a sample of children that was different from the samples of infants at risk for schizophrenia who showed a variety of neurological signs and neuromotor dysfunction, these data suggest that the dysfunctions observed in infancy are persistent.

In early childhood there is an increased occurrence of internalizing symptoms (depression and anxiety), angry and hostile dispositions, and schizoid behavior (emotionally flat, socially withdrawn, passive and distractible) in children at risk for schizophrenia. Again, these characteristics are not specific to children at risk for schizophrenia and there is no evidence that these characteristics are strongly predictive of the later development of schizophrenia.

Middle childhood and early adolescence.

Neuromotor impairments, including gross motor skills (Marcus et al., 1993), are found more frequently in children at risk for schizophrenia than in controls during middle childhood and early adolescence (4 to 14 years of age). One of the most robust cross-sectional findings during middle childhood and early adolescence is the presence of neurocognitive impairments, especially on measures with high attention demands. A subgroup of children at risk for schizophrenia showed impairments on some of the same tasks for which patients with schizophrenia show impairments (Asarnow, 1988). The neurocognitive tasks for which children at risk for schizophrenia show impairments include measures of sustained attention (various continuous performance tests) and secondary memory (e.g., memory for stories). For example, children at risk for schizophrenia, as well as acutely disturbed and partially remitted schizophrenia patients, performed poorly on a partial-report-span-of-apprehension task (Asarnow, 1983) in the high attention/processing demand condition. The span of apprehension measures the rate of early visual information processing (Fig. 7.6).

Figure 7.6 Span of apprehension data [from Asarnow, Steffy, MacCrimmon, & Cleghorn, 1978].

There are some data on the predictive validity of the neurocognitive impairments identified during middle childhood and early adolescence. In the New York High-Risk project, the presence of impairments on a number of attentional tasks (an Attentional Deviance Index) given in middle childhood predicted 58% of the subjects who developed schizophrenia-related psychoses by mid-adulthood (Erlenmeyer-Kimling et al., 2000). Attentional impairments in middle childhood were also associated with anhedonia (Freedman, Rock, Roberts, Cornblatt, & Erlenmeyer-Kimling, 1998) in adolescents prior to the onset of schizophrenia and with social deficits during early adulthood (Cornblatt, Lenzenweger, Dworkin, & Erlenmeyer-Kimling, 1992; Freedman et al., 1998). Neuromotor dysfunction during childhood (assessed by the Lincoln-Oseretsky Motor Development Scale) identified 75% of the high-risk children who developed schizophrenia-related psychoses during adulthood (Erlenmeyer-Kimling et al., 2000). A verbal short-term memory factor that included a childhood digit span task and a complex attention task predicted 83% of the New York high-risk children who developed schizophrenia-related psychoses during adulthood and showed high specificity to those psychoses (Erlenmeyer-Kimling et al., 2000). If replicated, these findings would suggest that the combination of genetic risk (being the child of a parent who has schizophrenia) and neurocognitive impairments during middle childhood might identify individuals with a greatly increased risk for developing schizophrenia. The sensitivity (correctly predicting the onset of schizophrenia-related psychoses) was higher for the verbal memory (83%) and motor skills (75%) factors than for the attentional factor (58%). Conversely, the false-positive rate (incorrectly predicting that a child would develop schizophrenia) was lower for the Attentional Deviance Index (18%) than for the memory factor (28%) and motor factor (27%).
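
To illustrate what sensitivities and false-positive rates of this size could mean for individual prediction, the sketch below combines them with an assumed base rate of roughly 10% (the lifetime risk quoted earlier in this chapter for children of a parent with schizophrenia) to estimate positive predictive values with a simple Bayes calculation. This is not an analysis reported by the New York High-Risk project; the function name and the base-rate assumption are illustrative only.

def positive_predictive_value(sensitivity, false_positive_rate, base_rate):
    # P(schizophrenia-related psychosis | childhood predictor positive),
    # from a simple two-by-two (Bayes) calculation.
    true_pos = sensitivity * base_rate
    false_pos = false_positive_rate * (1.0 - base_rate)
    return true_pos / (true_pos + false_pos)

# Assumed base rate: ~10% lifetime risk for children of a parent with
# schizophrenia (figure cited earlier in this chapter).
BASE_RATE = 0.10

# Sensitivity / false-positive pairs reported above for the New York
# High-Risk project predictors.
predictors = {
    "verbal memory factor": (0.83, 0.28),
    "motor skills factor": (0.75, 0.27),
    "Attentional Deviance Index": (0.58, 0.18),
}

for name, (sens, fpr) in predictors.items():
    print(name, round(positive_predictive_value(sens, fpr, BASE_RATE), 2))
# Under these assumptions each factor alone gives a positive predictive
# value of only about 0.24-0.26, one reason combinations of genetic risk
# and neurocognitive predictors are of such interest.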

The short-term follow-up in the Jerusalem High-Risk study provides an important link between the attentional impairments frequently observed in children at risk for schizophrenia during adolescence and the motor impairments found during infancy and early childhood. The children who showed impaired neuromotor performance during childhood were the subjects most likely to show impairments on a variety of measures of attention and information processing during early adolescence (Hans et al., 1999).

During middle childhood, children at risk for schizophrenia receive an increased frequency of a variety of psychiatric diagnoses, including an attention deficit disorder (ADD)-like syndrome. Poor affective control, including emotional instability and impulsivity, as well as aggression and disruptive behaviors are found more frequently in children at risk for schizophrenia than in controls. Early precursors of thought disorder may be reflected in the presence of cognitive slippage. Poor peer relations are one of the more frequently found behavioral characteristics during middle childhood and early adolescence. None of these symptoms is specific to children at risk for schizophrenia. For example, poor affective control is found in children who subsequently develop an affective disorder.

Birth Cohort Studies

A British birth cohort study of almost 5,400 people born the week of March 9, 1946, complements the studies of individuals at risk for schizophrenia by virtue of being representative of the general population. Thirty cases of schizophrenia were identified among individuals between the ages of 16 and 43 in this cohort, which reflects the population base rate of the disorder. A 1966 birth cohort study in Northern Finland (Isohanni et al., 2001) yielded 100 cases of DSM-III schizophrenia.

Across the major birth cohort studies, including the British birth cohorts of 1946 (Jones, Rodgers, Murray, & Marmot, 1994) and 1958 (Done, Crow, Johnson, & Sacker, 1994; Jones & Done, 1997) and the Northern Finland 1966 birth cohort (Isohanni et al., 2001), a number of developmental precursors of schizophrenia have been identified. Neurologic signs, reflected in various forms of motoric dysfunction ranging from tics and twitches, poor balance, coordination problems, and clumsiness, to poor hand skill, are consistently identified as developmental precursors in individuals who later go on to develop schizophrenia (Done et al., 1994; Jones et al., 1994). There was an increased frequency of speech problems up to age 15 in patients who subsequently developed schizophrenia. Low educational test scores at ages 8 and 11 were also risk factors (Jones et al., 1994).

During early and late middle childhood, individuals who subsequently developed a schizophrenic disorder could be differentiated from their peers by their preference for solitary play, poor social confidence, and a “schizoid” social development.

In general, birth cohort studies suggest that there appear to be “consistent dose-response relationships between the presence of developmental deviance and subsequent risk” (Jones & Tarrant, 1999). The more deviant an individual is toward the “abnormal” end of a population distribution, the greater the risk of the disorder.

There is considerable overlap between the developmental precursors of affective disorders and schizophrenia (Van Os, Jones, Lewis, Wadsworth, & Murray, 1997). For example, lower educational achievement is associated with affective disorders in general, whereas delayed motor and language milestones are associated with childhood onset of an affective disorder. As in schizophrenia, there is evidence of persistence of motor difficulties, with an excess of twitches and grimaces noted in adolescents.

Follow-back Studies

With regard to endophenotypic characteristics, during infancy, children who subsequently develop schizophrenia as adolescents are characterized by the presence of abnormal motor functioning and impaired language. Neuromotor and language impairments and decreases in positive facial emotion are also present in early childhood (Walker et al., 1993). During middle childhood the language impairments fade; however, the neuromotor impairments persist. In addition, during middle childhood, children who subsequently develop a schizophrenic disorder are characterized by poor academic achievement, poor attention, and reduced general intelligence.

During middle childhood, children who subsequently develop a schizophrenic disorder are characterized as being passive and socially isolated, with poor social adjustment. They frequently present with symptoms of attention-deficit hyperactivity disorder (ADHD) and/or anxiety and depression.

A novel approach to using archival data to characterize the premorbid histories of individuals who develop schizophrenia is the use of home movies to identify infant and childhood neuromotor dysfunctions (Walker et al., 1994). In such studies, ratings were made of neuromotor functioning in children who subsequently developed schizophrenia, in their healthy siblings, in children who subsequently developed affective disorders, and in those children's healthy siblings. The preschizophrenia subjects showed poorer motor skills, particularly during infancy, than did their healthy siblings, the preaffective-disorder participants, and those participants' siblings. The abnormalities included choreoathetoid movements and posturing of the upper limbs, primarily on the left side of the body.

Consistency of Findings Across Methods

Endophenotypes.

Table 7.1 shows a consistency across studies of children at risk for schizophrenia, birth cohort studies, and retrospective studies in the presence of motor and language problems during infancy. This consistency is particularly impressive given the considerable variation across studies in the ways in which motor functioning and language were assessed.

During early childhood (2 to 4 years of age), neuromotor problems are observed in all three types of studies. In birth cohort studies and retrospective studies, impaired language is noted. In high-risk studies children at risk for schizophrenia are noted as being depressed, anxious, angry, and schizoid, whereas in birth cohort studies they are noted as preferring solitary play.

During middle childhood (4 to 14 years of age) there is a persistence of neurologic impairments reflected in poor motor functioning in high-risk, birth cohort, and follow-back studies. High-risk studies, unlike birth cohort studies and retrospective studies, included laboratory measures of attention and information processing. On these tasks, children at risk for schizophrenia showed attentional impairment under conditions of high processing demands. This may be related to the poor academic achievement observed in birth cohort studies and retrospective studies during middle childhood, as well as to the frequent diagnosis of ADHD. In contrast to the persistence of neuromotor problems, language problems tend to diminish over time, so by middle childhood they are rarely noted across the three classes of studies. In adolescents who develop a schizophrenic disorder, language functions are relatively preserved compared to visual-spatial and motor functioning (Asarnow, Tanguay, Bott, & Freeman, 1987).

The results of this brief review suggest a developmental pathway from precursors first identified in infancy to the development of schizophrenia-related psychoses in late adolescence and early adulthood. Neurologic signs or neuromotor dysfunctions are present in infancy and persist through early and middle childhood and early adolescence. Neuromotor dysfunction in early childhood predicts the presence of attentional impairments under high processing demands during early adolescence. Neuromotor dysfunctions and attentional impairments during adolescence predict the development of schizophrenia-related psychoses. Because the characterization of key points in this developmental sequence is based on only one or two studies, clearly this model needs to be tested in future research.

The developmental pathway sketched here has potentially interesting implications for our understanding of the neurobiology of schizophrenia. What brain systems are involved in the control of simple motor functions and attention? The developmental link between early neuromotor dysfunction and later attentional impairments may implicate cortical-striatal pathways that support both motor functions and attentional control mechanisms. Striatal dysfunction results in impaired sequential motor performance and chunking of action sequences. Impairments in a variety of attentional functions, including set shifting and self-monitoring, are also associated with striatal dysfunction (Saint-Cyr, 2003).

Clinical and behavioral characteristics.

In high-risk studies, the precursors of later difficulties in developing social relations can first be detected in infancy. In some studies, children at risk for schizophrenia have less social contact with their mothers and less fear of strangers, as well as having a difficult temperament.

Poor peer relations are one of the most frequently found behavioral characteristics during middle childhood and early adolescence. A preference for solitary play, poor social confidence, and a generally “schizoid” social development are frequent precursors of schizophrenia. Studies of children at risk for schizophrenia, birth cohort studies, and retrospective studies all find an increased frequency of nonpsychotic symptoms, particularly internalizing symptoms and poor affective control (including emotional instability and impulsivity) during middle childhood and early adolescence. However, none of these symptoms are specific to children at risk for schizophrenia. Many of these symptoms and behavioral characteristics are found in children who subsequently develop an affective disorder.

Limitations: What We Don't Know

Neuromotor and attentional dysfunctions appear to be putative developmental precursors to schizophrenia. They consistently appear with increased frequency in high-risk, birth cohort, and follow-back studies. In a number of high-risk studies, infancy and childhood neuromotor impairments predicted the later onset of schizophrenia-related psychosis. Attentional impairments during middle childhood and early adolescence in the New York High-Risk project predicted the development of schizophrenia-related psychosis.

The endophenotypic indices that appear to have the greatest predictive validity are neuromotor dysfunction and impaired performance on measures that tap processing under high attention demands, or measures of secondary memory. It remains unclear whether these measures tap schizophrenia-related processes specifically. A number of the measures (including continuous performance tests, partial-report span-of-apprehension tasks, and secondary verbal memory tests) that are sensitive to subtle neurocognitive impairments in children at risk for schizophrenia in middle childhood and adolescence also detect neurocognitive impairments in children with ADHD and learning disabilities. It is unlikely that these impairments have cross-sectional diagnostic specificity.

Although the ability of childhood and adolescent measures of attention to predict schizophrenia-related psychosis in the New York High-Risk project is promising, those results need to be replicated in an independent sample. Future studies will need to determine the extent to which childhood and adolescent neurocognitive measures predict schizophrenia-related psychosis conditioned on the presence of a second risk factor—having a parent who is schizophrenic. In effect, the analyses reported by the New York High-Risk project contained two risk factors that predicted schizophrenia-related psychosis: being the child of a schizophrenic parent and having attentional, verbal short-term memory, or neuromotor impairments. These factors did not predict the onset of schizophrenia in the children of parents with an affective disorder nearly as well as they did in the children of parents with schizophrenia. As noted above, children with other, more common psychiatric diagnoses show deficits on these types of tasks. More research is needed on the diagnostic accuracy of these measures when they are used in the general pediatric population before they can be used to screen children for precursors of schizophrenia. At present, all that we know is that these measures have some promise in predicting which children who have a parent with schizophrenia are likely to develop a schizophrenic disorder themselves.

What is needed in the next generation of studies is not merely the demonstration of group mean differences between high-risk and control groups. If endophenotypic measures are to be used as candidates for preventive intervention programs, then diagnostic accuracy analyses that specify the sensitivity and specificity of tasks at various cutting scores need to be developed. Cutting scores can be set, depending on the purpose, to optimize sensitivity (detecting true positives) or specificity (correctly excluding true negatives, thereby minimizing false positives). For example, if the intervention can produce significant adverse events, it might be desirable to set a cutting score that minimizes false positives.
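
A minimal sketch of this kind of cutting-score analysis is shown below. The task scores are invented for illustration (lower scores indicate poorer performance on a hypothetical attention measure), and the cut points are scanned simply to show how sensitivity and specificity trade off against each other.

    # Invented scores; lower = poorer performance on a hypothetical attention measure.
    affected = [12, 15, 18, 20, 22]        # children who later developed schizophrenia-related psychosis
    unaffected = [19, 23, 25, 27, 28, 30]  # children who did not

    for cut in range(14, 28, 2):
        flagged_affected = sum(score <= cut for score in affected)      # true positives
        flagged_unaffected = sum(score <= cut for score in unaffected)  # false positives
        sens = flagged_affected / len(affected)
        spec = (len(unaffected) - flagged_unaffected) / len(unaffected)
        print(f"cut point {cut}: sensitivity {sens:.2f}, specificity {spec:.2f}")
    # A stringent (low) cut point minimizes false positives at the cost of missed
    # cases; a lenient (high) cut point catches more true cases but flags more
    # children who would never develop the disorder.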

Poor peer relations, a preference for solitary play, a “schizoid” social development, various nonpsychotic symptoms (particularly internalizing symptoms), and poor affective control occur frequently during middle childhood and early adolescence in high-risk, birth cohort, and follow-back studies. While these behavior problems and symptoms are precursors of schizophrenia, they are not diagnostically specific. Many of these symptoms are associated with other psychiatric disorders. For example, poor affective control is both a symptom of and precursor to affective disorders. Poor peer relationships are associated with the presence of both externalizing and internalizing disorders. There are relatively few data on the diagnostic accuracy (i.e., specificity and sensitivity) of symptoms and behavior problems detected in middle childhood and early adolescence as predictors of schizophrenia-related psychoses.

The behavior problems and symptoms that are putative precursors of schizophrenia are associated with psychiatric disorders (e.g., depression and ADHD) that are much more common than schizophrenia in the general population. Thus high false-positive rates may result if these symptoms are used in the general pediatric population in an attempt to identify individuals likely to develop schizophrenia.
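
The base-rate problem can be made explicit with a short calculation. The prevalence, sensitivity, and specificity figures below are assumptions chosen only to illustrate the arithmetic; they are not estimates from any of the studies cited.

    # Assumed figures, chosen only to illustrate the base-rate arithmetic.
    def positive_predictive_value(prevalence: float, sensitivity: float, specificity: float) -> float:
        true_pos = prevalence * sensitivity
        false_pos = (1 - prevalence) * (1 - specificity)
        return true_pos / (true_pos + false_pos)

    # General pediatric population (lifetime risk assumed ~1%):
    print(positive_predictive_value(0.01, 0.80, 0.80))  # ~0.04: roughly 25 false positives per true positive

    # Offspring of a parent with schizophrenia (risk assumed ~12%):
    print(positive_predictive_value(0.12, 0.80, 0.80))  # ~0.35

Under these assumed figures, the same screen that is reasonably informative within a genetically high-risk sample would be dominated by false positives in the general pediatric population.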

Implications for Preventive Intervention

There is great interest in developing preventive interventions for schizophrenia, in part because of the belief that once the disorder emerges, a neurodegenerative process is initiated that can only be partially forestalled by currently available treatments. The neurocognitive impairments, nonschizophrenic symptoms, and behavior problems that are putative developmental precursors of schizophrenia may have important implications for the development of preventive interventions for this disorder. These precursors could be used to identify children who might benefit from preventive intervention and serve as targets of interventions.

The neurocognitive impairments that are putative developmental precursors of schizophrenia have potential utility in identifying candidates for preventive interventions. Depending on the risk profile of the intervention, cutting scores on neurocognitive indices could be constructed to either maximize sensitivity or minimize false positives. However, as noted above, before the cutting scores for putative neurocognitive precursors of schizophrenia can be applied to the general pediatric population, additional research is required to evaluate the diagnostic efficiency of these measures in populations without a genetic risk. The neurocognitive precursors of schizophrenia seem to be unlikely targets for preventive interventions. There is no evidence that mitigating attentional, memory, and neuromotor impairments forestalls the development of schizophrenia-related psychoses. Identifying neurocognitive precursors of schizophrenia does advance attempts to develop new somatic treatments for schizophrenia by helping to elucidate the dysfunctional neural networks that underlie this complex disorder.

The diagnostic accuracy of the behavior problems and symptoms that are putative precursors of schizophrenia thus far identified in high-risk, birth cohort, and follow-back studies has not been carefully examined. Given the nonspecificity of these behavior problems and symptoms, it seems likely that they would yield high rates of false positives if used to identify candidates for preventive interventions for schizophrenia. It may be that clinical features more proximal to the onset of schizophrenia-related psychoses, such as prodromal signs and symptoms, have greater diagnostic accuracy in predicting which children will develop schizophrenia. A number of research groups are currently addressing this question.

The behavior problems and symptoms that are putative precursors of schizophrenia are potentially interesting targets for interventions. To the extent that poor peer relations, the presence of internalizing symptoms, and poor affective control pose difficulties for the child and parent, they become worthy targets of therapeutic interventions. Behavioral (e.g., social skills training) and pharmacological (mood stabilizing drugs) interventions for these problems are based on symptomatic presentations. The nonspecificity of these problems is not particularly problematic in this case. While there is no reason to believe that successfully enhancing social skills and controlling affective symptoms will forestall the development of schizophrenia, there is good reason to believe that enhancing social skills and controlling affective symptoms will enhance the current quality of life and may also improve the postpsychotic episode adaptation. The best predictor of postpsychotic psychosocial functioning is the level of premorbid social competencies.

What Are the Precursors of Schizophrenia?

One of the best ways to develop early-intervention efforts for schizophrenia is to start by identifying key features of those individuals who are or will become schizophrenic and to determine how these features differ from those seen in individuals who are not ill and are unlikely ever to develop the illness. Several research designs can accomplish this goal. For example, cross-sectional studies of patients and control subjects can be used to characterize each group on as many potentially meaningful variables as possible, including behavior, personality, social activity, neuropsychological abilities, brain structure and function, and genetics. One problem with this method, however, is that any differences observed between the two groups cannot necessarily be assigned a causal role in the development of disease. For example, if total brain volume were lower among a group of schizophrenic patients than it was among a group of well-matched controls, this might indicate that low brain volume is a precursor or predictor of the development of schizophrenia. However, from such a cross-sectional design, it is unclear whether the brain volume deficit in the patient group actually preceded the onset of schizophrenic illness. In fact, it is possible that it did, but it is also possible that the onset of schizophrenia caused a decline in brain volume due to some degenerative process. Alternatively, other factors, such as treatment with antipsychotic medication, may have precipitated the decline in brain volume. It is further possible that the brain volumetric decline in the patient group was concurrent with the onset of illness but causally unrelated to it.

Numerous cross-sectional studies have unearthed a wealth of information regarding the ways in which schizophrenic patients are different from patients with other psychiatric illnesses and from normal control subjects. However, because of the limitations on causal inference that exist in these types of studies, their results can only guide further research; they are not powerful enough to dictate a specific pattern of behavioral, neuropsychological, or biological characteristics that would be useful for identifying individuals for targeted prevention efforts. As already mentioned, studies of individuals with prodromal signs of schizophrenia and individuals with schizotaxia provide more insight into those traits that precede the disorder than do cross-sectional studies. Thus, great efforts have been made to enable identification of individuals in the earliest stages of the illness or even in the premorbid period so that they may be targeted for intervention.

By characterizing the prodromal phase of schizophrenia, subtle changes in behavior have been noted in those who are beginning to deteriorate into the early stages of the disease, and these changes are now being used to identify other clinically at-risk individuals for inclusion in early intervention programs. Some of the more pronounced changes observed during the prodrome occur in domains of thought, mood, behavior, and social functioning (Phillips, Yung, Yuen, Pantelis, & McGorry, 2002). Specifically, difficulties in concentration and memory may emerge, as well as preoccupations with odd ideas and increased levels of suspiciousness. Mood changes may include a lack of emotionality, rapid mood changes, and inappropriate moods. Beyond simply odd or unusual behavior, the prodrome may also be characterized by changes in sleep patterns and energy levels. Social changes can be quite marked, with withdrawal and isolation being the most predominant features. These characteristics may be particularly informative about the disease process in schizophrenia, because they are by definition not related to the effects of medication or the degenerative effects of being ill for a prolonged period.

Perhaps the most powerful window into the premorbid changes in preschizophrenic individuals comes from the longitudinal study of children and adolescents who are genetically at high risk for the illness. By studying the biological children of schizophrenic parents, the clinical, behavioral, and biological features of schizotaxia can be revealed. Longitudinal studies of individuals such as these, who harbor the latent genetic liability toward schizophrenia, can be extremely informative for early intervention and prevention efforts because they can track the emergence of schizophrenia precursors before any signs of illness are apparent. Thus, any differences observed between children of schizophrenic patients and children of control subjects can be definitively attributed to factors other than the effects of antipsychotic medication, the degenerative effects of the illness, or any other factors that are subsequent to disease onset. The observed differences can be viewed as antecedents to the illness, which is as close to a causal relationship as can be ascribed in human research studies in which group membership cannot be experimentally assigned.

Studies of children of patients with schizophrenia have yielded a variety of findings of altered behavioral, neuropsychological, and biological processes. The richness and diversity of measures taken on these subjects can make profiling the premorbid genetic susceptibility to schizophrenia difficult. Such studies have also produced some surprisingly uniform findings, which simplify our understanding of what may be the most central or universal deficits among those who are at the highest risk for schizophrenia.

Certain personality characteristics seem to reliably differentiate children of schizophrenic parents from children of control subjects (Miller et al., 2002). For example, schizotypal personality features, including social withdrawal, psychotic symptoms, socioemotional dysfunction, and odd behavior, have been shown to precede the onset of psychosis among genetically high-risk children. Deficits of social functioning are also commonly observed in this group (Dworkin et al., 1993). Specifically, children of schizophrenic patients are more likely than children of controls to have more restricted interests, significantly poorer social competence (especially in peer relationships and hobbies and interests), and greater affective flattening. Some neuropsychological deficits have also been reliably observed in these high-risk individuals (Asarnow & Goldstein, 1986; Cosway et al., 2000; Erlenmeyer-Kimling & Cornblatt, 1992; Schreiber et al., 1992). For example, several studies have replicated a pattern of impaired discrimination, sustained attention, and information processing on the visual continuous performance test among children of schizophrenic patients. These high-risk individuals also exhibit marked impairments on memory for verbal stimuli and in executive functioning, as well as neuromotor deficits such as soft neurological signs, gross and fine motor impairments, and perceptual-motor delays.

Perhaps underlying these personality, social, and neuropsychological deficits, children of schizophrenic patients have also been shown to have altered brain structure and function compared to that of children of control subjects (Berman, Torrey, Daniel, & Weinberger, 1992; Cannon et al., 1993; Liddle, Spence, & Sharma, 1995; Mednick, Parnas, & Schulsinger, 1987; Reveley, Reveley, & Clifford, 1982; Seidman et al., 1997; Weinberger, DeLisi, Neophytides, & Wyatt, 1981). The most commonly observed structural brain abnormality among children of schizophrenic patients is a reduced volume of the hippocampus and amygdala region. Loss of volume in the thalamus has also been observed in these children, and there has been some support for enlarged third ventricular volume and smaller overall brain volume in this group. Children of schizophrenic patients also have been found to exhibit linear increases in cortical and ventricular cerebrospinal fluid to brain ratios with increasing genetic load—that is, children with the greatest number of affected biological relatives showed the highest ratios.

Ultimately, these clinical, behavioral, social, and biological profiles of risk for emergent schizophrenia will be augmented by information on specific genes that increase susceptibility. Recently, genes coding for neuregulin 1 (NRG1; Stefansson et al., 2002), nitric oxide synthase (NOS1; Shinkai, Ohmori, Hori, & Nakamura, 2002), and dystrobrevin-binding protein 1 (DTNBP1; Straub, Jiang, et al., 2002) have been reported to have an association with schizophrenia, but these findings will require verification. Many other polymorphisms have shown a positive association with the disorder, but attempts to replicate these findings have often failed. For several of these widely studied polymorphisms, meta-analysis has been used to determine whether a true allelic association with the disorder exists when individual studies are ambiguous or conflicting. In fact, using this approach, some candidate genes, including those that code for the serotonin 2A receptor (HTR2A) and the dopamine D2 (DRD2) and D3 (DRD3) receptors, have already been shown to have a small, but reliable, association with the disorder (Dubertret et al., 1998; Glatt, Faraone, & Tsuang, 2003; Williams, McGuffin, Nothen, & Owen, 1997). Eventually, other gene variants, including perhaps NRG1, NOS1, and DTNBP1, will be found to be reliably associated with schizophrenia. This may make it possible to create a genetic risk profile that will be predictive of future onsets of schizophrenia, especially in combination with other known risk indicators.
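
For readers unfamiliar with the pooling step, the sketch below shows a standard fixed-effect, inverse-variance meta-analysis of odds ratios, the general kind of calculation used in such studies. The odds ratios and standard errors are invented for illustration; they are not the published HTR2A, DRD2, or DRD3 estimates.

    import math

    # Invented study estimates; NOT published gene-association results.
    studies = [  # (odds ratio, standard error of the log odds ratio)
        (1.20, 0.08),
        (1.10, 0.10),
        (1.25, 0.12),
        (1.05, 0.09),
    ]

    weights = [1.0 / se ** 2 for _, se in studies]            # inverse-variance weights
    log_ors = [math.log(odds_ratio) for odds_ratio, _ in studies]
    pooled_log_or = sum(w * lor for w, lor in zip(weights, log_ors)) / sum(weights)
    pooled_se = 1.0 / math.sqrt(sum(weights))
    lower = math.exp(pooled_log_or - 1.96 * pooled_se)
    upper = math.exp(pooled_log_or + 1.96 * pooled_se)
    print(f"pooled OR {math.exp(pooled_log_or):.2f} (95% CI {lower:.2f}-{upper:.2f})")
    # A pooled confidence interval that excludes 1.0 supports a true, if small,
    # allelic association even when individual studies appear to conflict.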

Together, the various abnormal features of children of schizophrenic patients provide a “composite sketch” of the underlying premorbid susceptibility toward schizophrenia. Because the probability of developing schizophrenia among children with one or two affected parents (12% and 46%, respectively) is far greater than that probability among children of unaffected control subjects (about 1%), these abnormalities signal the subsequent development of schizophrenia with a relatively high degree of sensitivity and reliability. However, it is also clear that these trends are not absolute, and many children of schizophrenic patients will not exhibit these signs, nor will they ever develop schizophrenia.

Do Early Intervention and Prevention Efforts Work?

It has been recognized for some time that the duration of untreated illness in schizophrenia is correlated with the prognosis for the disease, such that those with the longest period of untreated psychosis experience the least favorable outcomes (Browne et al., 2000). Recently, it has also been discovered that outcome correlates with the duration of illness as measured from the onset of the prodrome rather than only from the onset of frank psychosis. From this line of evidence, the rationale for early-intervention efforts was born. It was reasoned that, if early treatment of the illness led to a more favorable outcome, early intervention even before the onset of the illness might further inhibit the progression of the illness, either delaying its onset, decreasing its severity, or both.

A fundamental question in designing early intervention protocols is, what will be the target of the intervention? There is no single best answer to this question, which may be why various targets are being used in current early intervention efforts. The earliest interventions might realize the greatest opportunities to divert high-risk individuals from the subsequent development of schizophrenia, but the ability to predict schizophrenia accurately might be greatest in the period closest to disease onset. For example, targeting attention problems in young children of schizophrenic parents might allow the identification of the children who are at highest risk of transitioning to psychosis and afford ample time to intervene in that process; yet because of the restricted sensitivity and specificity of this deficit, targeting attention problems may also cause some high-risk children to be excluded from the protocol while, inevitably, some of the children who were included in the protocol would not go on to develop the illness. Targeting the changes of the prodrome, by contrast, such as the emergence of odd behaviors or increased suspiciousness, might lower false-positive and false-negative classification errors, but the ability of the intervention protocol to influence the course of the illness might be relatively restricted compared to earlier interventions. Thus, a balance must be maintained between the potential effectiveness of the intervention and the specificity of the intervention to the target population.

Another key question in developing early intervention protocols is, at what level should the intervention be administered? Again, this is a question without a simple answer. Universal and selective interventions will have the greatest likelihood of reaching those individuals most in need of intervention—that is, they will have the greatest sensitivity. However, these may also be too expensive to implement successfully. Indicated interventions will be more feasible, simply because of their more restricted nature, but this will prevent such protocols from reaching some individuals who may benefit from them. In fact, interventions administered at multiple levels may work better than protocols designed to intervene only at a single level.

Perhaps the least consensus in the design of early intervention trials is on the form of the intervention. The effectiveness of various early-intervention programs is currently an active area of research and, quite fortunately, multiple types of interventions have shown promise for keeping at least some high-risk individuals from developing schizophrenia. In fact, educational programs, as well as psychosocial and psychotherapeutic interventions, have all shown some degree of promise in either reducing the duration of untreated psychosis or postponing the onset of schizophrenia, which suggests that these methods may also be useful in decreasing the likelihood of schizophrenic illness altogether. In Norway, for example, the establishment of a comprehensive, multilevel, multitarget psychosis education and early detection network reduced the average duration of untreated psychosis in the catchment area by approximately 75% over a 5-year period (Johannessen et al., 2001).

The preventive effects of various psychotherapeutic techniques, such as individual cognitive behavior therapy or family-based cognitive remediation, have yet to be evaluated with great rigor, but pharmacologic intervention has received a fair amount of empirical support for efficacy in preventing or delaying the transition from prodrome to psychosis. A variety of psychopharmacological compounds may have efficacy in suppressing schizophrenia, including second-generation antipsychotic drugs such as risperidone, antidepressants such as the selective serotonin reuptake inhibitors, mood stabilizers such as lithium and valproate, and anxiolytics such as benzodiazepines, but few of these have been tested for such a role. Of these, the novel antipsychotic risperidone has shown tremendous promise in preventing descent into schizophrenia among prodromal individuals when compared with needs-based therapy alone, even up to 6 months after discontinuation of treatment (McGorry & Killackey, 2002). Risperidone has also been shown to improve neuropsychological functioning among the nonpsychotic, nonprodromal schizotaxic relatives of schizophrenic patients (Tsuang et al., 2002).

In light of these successes, it is not so troubling that consensus is difficult to reach on which form of intervention is most appropriate; the particular method of intervention appears to matter less than ensuring that some intervention is provided at all. There are, however, a number of problems with current early-intervention efforts. For example, because our screening criteria cannot definitively identify individuals who are at risk for developing psychosis, early-intervention efforts are sometimes administered to individuals who do not need them or cannot benefit from them. Alternatively, because the warning signs of psychotic decompensation sometimes go unrecognized, some individuals who should have received intervention do not. Furthermore, little is known about the potential harm that may be caused by informing individuals that they are at risk for schizophrenia, but presumably there may be some negative consequences of receiving this knowledge. In addition, the benefits of some of our most promising early-intervention and prevention protocols (pharmacotherapies) may be offset by the potential side effects of individual compounds.

A careful analysis of the benefits and the risks of early intervention has led to the general consensus that intervention in the prodrome of schizophrenia is warranted. There is less agreement about the feasibility of selective and indicated intervention in the premorbid phase of schizotaxic individuals, who may or may not ultimately develop a schizophrenia-spectrum illness. Studies have shown that pharmacological intervention can improve the subclinical deficits experienced by some nonschizophrenic, genetically at-risk individuals; however, at such an early stage of research and with a limited understanding of schizotaxia, it is not yet clear whether these benefits outweigh the associated risks when the selection of proper candidates for intervention may still be suboptimal. As the phenomenology and time course of schizotaxia become better understood, criteria for inclusion in preventive and early-intervention efforts will improve, along with the efficiency of such protocols in treating only those individuals who will receive maximal benefit while sustaining little harm.