Risk-based decision making increasingly has global dimensions, extending from the international management of chemical risks to the sustainable development of our planet. Environmental risk assessment is firmly based on toxicological sciences with input from other public health disciplines. Increasing understanding of how the human genotype and phenotype affects absorption, distribution, metabolism, and excretion of external agents, including foods, is providing insight into answers to the oldest human question about disease: ‘Why me?’ The risk paradigm components of hazard assessment, dose–response analysis, exposure assessment, and risk characterization, and the toxicological concepts on which they are based, have proven durable in approaching increasingly complex environmental hazards. Newer approaches to managing risk, such as the precautionary principle, and newer challenges ranging from nanotechnology to the health impacts of global climate change, are necessitating more systematic thinking on how best to protect human health and the environment.
The goal of this chapter is to synthesize toxicology and risk assessment as a basis for comprehension of human health risks posed by chemical and physical agents in the environment. Disciplines other than toxicology, such as epidemiology and exposure assessment, also are of signal importance to understanding risk, and for many specific chemical and physical agents will provide the major basis for the information underlying risk assessment and risk management.
Environmental risk analysis is a broad field, encompassing risks to ecosystems and materials as well as to human health. Only human health risks will be considered in this chapter. However, the unity of human and environmental health is unambiguous. Risk to ecosystems can often serve as a warning about human health risk. As just one example, the concern about the impact of acid deposition on trees and lakes preceded by about two decades the recognition that relatively low atmospheric concentrations of the same fine particulates responsible for acid deposition are a human health risk.
Toxicology has two important roles in environmental risk management: Ascertainment of cause and effect relationships linking chemical and physical agents to adverse effects in humans or the general environment, and the development of techniques capable of preventing these problems. Toxicologists usually approach questions of disease causation by starting with the chemical or physical agent and studying its effects in laboratory animals or in test tube systems. One of the more exciting aspects of modern toxicology is the development of tools, primarily through molecular biology, capable of probing the extent to which a given disease in an individual is caused by a chemical or other environmental factor. This reversal of approach, in which we start with disease and move toward determining the cause, is enabled by the increasing ability of epidemiology to link subtle biological markers indicative of early effects to biological markers indicative of exposure.
Toxicology is also an important discipline in the primary and secondary prevention of human health effects. Understanding the mechanisms by which chemical agents cause biological effects can result in toxicological tests useful for preventing the development of harmful chemicals or for the early detection of potential adverse effects.
General concepts of toxicology relevant to risk assessment
Toxicology is the science of poisons. Knowledge about poisons extends back to the beginning of history as humans became aware of the toxicity of natural food components. The Bible contains injunctions concerning poisons, including how to avoid them. Greek and Roman history gives evidence of the use of poisons as an instrument of statecraft, an approach that was extended in the Middle Ages with such notable practitioners as the Borgias. Toxicologists tend to view Paracelsus, a sixteenth-century alchemist and a bit of a charlatan, as their ancestor, crediting him with the first law of toxicology: That the dose makes the poison. There are two other major maxims that underlie modern toxicology: That chemicals have specific biological effects, a maxim that has been credited to Ambroise Paré (Goldstein & Gallo 2001); and that humans are members of the animal kingdom.
The ‘laws’ of toxicology
The following section discusses ‘laws’ and general concepts of toxicology pertinent to understanding how a chemical or physical agent acts in a biological system (Table 8.7.1). The focus will be on the biological response, rather than on the intrinsic properties of the agent.
Table 8.7.1 The three ‘laws’ of toxicology
♦ The dose makes the poison
♦ Chemicals have specific effects
♦ Humans are animals
The dose makes the poison
Central to toxicology is the exploration of the relation between dose and response. As a generalization, there are two types of dose–response curves (see Fig. 8.7.1). One is an S-shaped curve characterized by no observed effect at the lowest doses and, as the dose increases, the gradual development of an increasing response. This is followed by a linear phase of increase in response in relation to dose and, eventually, a dose level at which no further increase in response is observed. Of particular pertinence to environmental toxicology is that this curve presumes a threshold level below which no harm whatsoever is to be expected. There is an ample scientific base for the existence of thresholds for specific effects. For example, one drop of undiluted sulphuric acid splashed on the skin is capable of producing a severe burn, yet one drop of pure sulphuric acid in a bathtub of water is sufficiently dilute to be without effect. Thresholds for an adverse effect will differ among individuals based upon a variety of circumstances, some of which are genetically determined while others may represent stages of life or specific circumstances. In the example of sulphuric acid on the skin, there are genetically determined differences in susceptibility related to the protective presence of skin hair; babies will be more susceptible than adults; and skin that is already damaged will be at particular risk. This S-shaped dose–response curve is assumed to fit all toxic effects except those produced by direct reaction with genetic material.
The second general type of dose–response curve covers those endpoints caused by persistent changes in the genes. This occurs in cancer, in which a somatic mutation occurring in a single cell results in a clone of cancer cell progeny, or in inherited mutations of the genetic components of cells involved in reproduction. It is believed that a single change in DNA can alter the genetic code in such a way as to lead to a mutated cell. It therefore follows that any single molecule of a carcinogenic chemical, or packet of physical energy such as ionizing radiation, that can alter DNA is theoretically capable of causing a persistent mutation. The presumption that every single molecule or ionizing ray has the possibility of changing a normal cell to a cancerous cell implies that there is no absolutely safe dose. The resultant dose–response curve starts at a single molecule, i.e. it has no threshold below which the risk is zero. As a further simplification, the shape of the curve can be linearly related to dose, in that the risk of two molecules of a DNA-altering chemical causing a mutation is conceivably twice that of one molecule, and so on until a dose level is reached that results in dead cells.
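The two curve shapes described above can be sketched numerically. A minimal illustration, assuming a Hill-type form for the S-shaped curve and a simple proportional model for the non-threshold case (all parameters are illustrative, not values for any real chemical):

```python
def sigmoid_response(dose, threshold=1.0, slope=2.0, max_response=1.0):
    """S-shaped (threshold-type) curve: little response at the lowest doses,
    a steep middle phase, and a plateau where no further increase occurs."""
    if dose <= 0:
        return 0.0
    return max_response * dose**slope / (threshold**slope + dose**slope)

def linear_no_threshold(dose, potency=0.01):
    """Linear non-threshold model: risk proportional to dose at every dose,
    so no dose is entirely risk-free (capped at certainty)."""
    return min(potency * dose, 1.0)

for dose in (0.0, 0.1, 1.0, 10.0, 100.0):
    print(f"dose={dose:6}: S-shaped={sigmoid_response(dose):.3f}  "
          f"LNT={linear_no_threshold(dose):.3f}")
```

Note that the Hill form approaches zero smoothly rather than showing a literal cut-off, so it should be read only as a stylized picture of the threshold-type curve, not a mechanistic model.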
Specificity of effects
That chemical and physical agents have specific effects is in essence no different from recognizing that possession of a gun does not make one a murder suspect if the victim has been stabbed to death. The law of specificity is well understood by the general public in terms of drugs: Aspirin will help with your headache but is useless for constipation, while laxatives have the opposite effect. However, various surveys suggest that the selectivity of effects of environmental chemicals is not well understood by the lay public; many believe that a chemical that can cause cancer in a particular organ can cause cancer and other diseases anywhere in the body.
The specificity of effects is due both to chemistry and to biology. Understanding the relationship between chemical structure and biological effect has been central to both pharmacology and toxicology. Structure–activity relationships (SAR) are often used as a means to design a chemical with a specific effect that might be useful as a therapeutic agent. SAR is also used to predict whether a new chemical being readied for manufacture might be of potential harm. While SAR is a useful tool which is being improved through modern computational approaches, its predictive value remains too limited to be used without recourse to additional testing of a potentially toxic agent. For example, only one simple methyl group separates toluene from benzene, with only the latter causing bone marrow damage and leukaemia; ethanol from methanol, the latter causing metabolic acidosis and renal failure; and n-hexane from either n-heptane or n-pentane, with only n-hexane being responsible for peripheral nerve damage. These examples of specificity reflect both the formation of toxic metabolites, such as active species derived from the metabolism of benzene, and the interaction of a chemical or its metabolite within specific biological niches, such as the diketone metabolite of n-hexane within neuronal axons.
Specificity of effects is also conferred by cellular processes that lead certain cells to be more of a target to environmental agents. For example, red blood cells have an iron-containing protein known as haemoglobin that is responsible for the delivery of oxygen. Toxicity through alteration of efficient oxygen delivery occurs through certain specific mechanisms. One is through the oxidation of the reduced ferrous form to the ferric form of iron, known as methaemoglobin, which can no longer carry oxygen. This occurs with a limited number of agents or their metabolites that once within the red blood cell are capable of oxidizing intracellular iron. Another specific mechanism of interference with oxygen delivery by haemoglobin is exemplified by carbon monoxide. This otherwise relatively inert gas has a physical chemistry that sufficiently resembles oxygen so that it is able to tightly combine with the oxygen combining site of haemoglobin, thereby displacing oxygen. There are many other examples in which in essence a normal body process is disrupted by an exogenous chemical through a specific chemical alteration, such as oxidation or covalent addition, or by fitting into a niche designed through evolution to accommodate a necessary internal chemical which it superficially resembles.
Humans are animals
The conceptual foundation for extrapolating from animals to humans is a central facet of modern toxicology. The basic principles of cell function are common to all of biology. All cells must obtain energy, build structure, and release waste. Cell function in complex organisms such as humans is highly specialized, but there is still a great deal of similarity in cellular and organ function among mammals facilitating extrapolation of effects from one species to another. In general, the specificity of toxic effects is relatively similar across mammals, e.g. a kidney poison in one species is likely to be a kidney poison in another, although there are certainly exceptions. However, dose–response considerations often vary substantially, reflecting differences in absorption, distribution, metabolism, excretion, function, and target organ susceptibility among species. Understanding the factors responsible for inter-species differences greatly facilitates extrapolation from animals to humans. Once elucidated, the role of different absorption rates, metabolism, or other factors can be taken into account, often through a mathematical approximation that has come to be called physiologically based pharmacokinetics (or toxicokinetics). One of the greatest threats to the public health value of toxicological sciences comes from animal rights activists who intentionally ignore the major positive impact of animal toxicology on the well-being and life-span of animals, including pet dogs and cats.
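The kinetic reasoning behind such extrapolation can be illustrated with a deliberately minimal one-compartment model. Full physiologically based toxicokinetic models assign compartments to individual organs and blood flows, but the principle shown here is the same: identical external exposure can produce very different internal doses when elimination rates differ between species. All rate constants below are assumed for illustration only:

```python
import math

def body_burden(dose_rate, elimination_rate, t):
    """Body burden over time for a one-compartment model with constant
    intake (dose_rate, mg/h) and first-order elimination (k, per hour):
    C(t) = (dose_rate / k) * (1 - exp(-k * t))."""
    k = elimination_rate
    return (dose_rate / k) * (1.0 - math.exp(-k * t))

# Identical intake, but species-specific elimination half-lives
# (hypothetical values) yield very different near-steady-state burdens:
for species, half_life_h in [("mouse", 2.0), ("human", 12.0)]:
    k = math.log(2) / half_life_h
    print(species, round(body_burden(1.0, k, t=96.0), 2))
```

The steady-state burden is simply dose_rate/k, so a six-fold slower elimination implies a six-fold higher internal dose for the same exposure, one of the factors a dose–response extrapolation must correct for.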
Pathways of exogenous chemicals within the body
The four major processes governing the impact of an exogenous chemical within the human body are absorption, distribution, metabolism, and excretion. All can vary greatly among different individuals, and within the same individual depending upon stage of life and circumstances. These variations are among the major reasons for differences among humans in susceptibility to risks due to exposure to chemical and physical agents. The increased understanding of how the human genotype and phenotype affects absorption, distribution, metabolism, and excretion of external agents, including foods, is providing insight into answers to the oldest human question about disease: ‘Why is this happening to me?’ (Omenn 2000).
Absorption of a chemical into the body occurs through ingestion, inhalation, and across the skin. Depending upon the specific chemical, the route of exposure can have major implications for the extent of absorption and the resultant toxicity. For example, almost 100 per cent of inhaled lead-containing fine particles are absorbed into the body, as compared to a much smaller percentage of ingested lead. Internal factors also can affect absorption, particularly from the gastrointestinal tract. In the case of lead absorption, iron and calcium deficiencies, which are common in children in inner city areas where lead is prevalent, both produce an increase in absorption of ingested lead. The matrix of the exposure agent also may have an effect. For example, the rate at which benzene in gasoline is absorbed through the skin will likely be increased by oxygenated components of the gasoline mixture; and the absorption of dioxins from contaminated soil can vary enormously (Umbreit et al. 1986). Often, a single route of absorption is dominant. But, in many instances, more than one route is important. For example, exposure to chlorinated disinfection products in drinking water systems, or gasoline contamination of well water through a leaky underground storage tank, is usually thought of solely in terms of the ingestion of water. However, during showering there is likely to be both inhalation and transdermal absorption, and if groundwater is contaminated there can be offgassing from soil into the home. Epidemiological studies of the potential adverse consequences of water contamination need to take all of these exposure routes into account (Arbuckle et al. 2002).
Once inside the body, distribution of the chemical occurs through different pathways. In part, this depends upon the route of absorption. Most compounds absorbed in the gastrointestinal tract go directly to the liver and may go no further, while inhaled agents first go to the lung or other parts of the respiratory tract. Distribution also depends upon the chemical and physical properties of the agents. Small particles tend to be distributed deep within the respiratory tract while larger particles get filtered out in the nose or upper respiratory tract. Chemicals that are poorly soluble in water, e.g. oils, usually distribute within fatty tissues. Only certain types of compounds are able to penetrate from the blood to the brain. Distribution will often depend upon organ-specific factors, such as a specific pump located in the thyroid gland that facilitates uptake of iodine and which makes the thyroid particularly vulnerable to the adverse impact of radioactive iodine.
Metabolism in the narrowest sense of the term refers to alteration of chemicals by the body. The major metabolic function of the body is to alter food into energy or structural materials. Metabolism of xenobiotics is often protective, converting unwanted absorbed materials into chemical forms that are readily excretable. Thus, a fat soluble agent can often be converted into water-soluble agents capable of being excreted in the urine. However, for certain classes of chemicals, metabolism is central to toxicity through conversion of relatively inactive compounds into harmful agents. Various carcinogens, including polycyclic organic hydrocarbon components of soot and the leukemogen benzene, require metabolic activation.
All organs appear to have metabolic capability, often related both to organ function and to susceptibility to toxic agents. Understanding the specifics of the enzyme and enzyme families responsible for metabolism within cell types is important to the question of why chemicals have specific effects in specific organs.
The application of genomics and proteomics to metabolism is often known as 'metabolomics'. In the case of benzene, about 50 per cent of the body burden is exhaled unmetabolized and about 50 per cent is metabolized into potentially toxic metabolites. Slowing down benzene metabolism leads to an increase in the relative amount that is exhaled rather than metabolized, and thus a decrease in bone marrow toxicity. An apparent genetically determined increase in benzene metabolism to toxic metabolites, or a decrease in the detoxification of these metabolites, increases haematological risk in humans—with both polymorphisms together appearing to be at least additive and perhaps multiplicative in increasing risk (Rothman et al. 1997; Kim et al. 2007).
Excretion from the body can occur through a variety of different routes, primarily the gastrointestinal tract for unabsorbed compounds and for compounds dissolved in bile; and the urine for water soluble agents of appropriate molecular weight and charge. Significant loss of volatile compounds can occur through the respiratory tract. Other routes of excretion include sweat and lactation, the latter unfortunately putting the infant at risk.
Risk assessment has evolved from two separate streams of toxicological reasoning: Originally for toxic agents implicitly or explicitly assumed to have a threshold; and then for carcinogens. The safety assessment of chemicals developed from simplified approaches such as studies on laboratory animals in which the dose capable of killing 50 per cent of the animals (the LD50) was determined. This observed dose was used as a basis for extrapolating to permissible levels in humans, often using three separate ten-fold 'safety factors'. These protective factors were based on the concern that humans could be more sensitive as a species than were the laboratory animals; that there was a greater variability in sensitivity among humans than among genetically inbred laboratory animals all raised in a similar environment; and that there would be adverse non-lethal effects that should be avoided. More recently, a presumptive ten-fold safety factor has been added specifically to protect children in recognition of their greater susceptibility to certain chemicals (National Research Council 1993).
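The stacked ten-fold safety factors described above amount to simple division. A sketch, using a hypothetical animal no-effect dose as the starting point:

```python
def acceptable_dose(animal_no_effect, interspecies=10.0, intraspecies=10.0,
                    severity=10.0, children=1.0):
    """Divide an animal-derived no-effect dose (mg/kg/day) by stacked
    ten-fold factors: animal-to-human extrapolation, human variability,
    avoidance of non-lethal effects, and (optionally) child protection."""
    return animal_no_effect / (interspecies * intraspecies * severity * children)

# Hypothetical no-effect dose of 50 mg/kg/day:
print(acceptable_dose(50.0))                 # 50 / 1000 = 0.05 mg/kg/day
print(acceptable_dose(50.0, children=10.0))  # 50 / 10000 = 0.005 mg/kg/day
```

The choice of factors is policy as much as science: each ten-fold divisor encodes one of the protective concerns listed in the text, not a measured ratio.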
The inherent assumption in the ‘safety factor’ approach is that there is a threshold dose level below which there are no adverse effects. As discussed above, increased understanding of the mechanisms of carcinogenesis has led to the recognition that a single mutation could be the basis for the entire cancer process. As each molecule of a carcinogen at least theoretically could cause this mutation, a threshold could not be assumed, and, as a simplification, there is no level of exposure that is without risk, however tiny.
Almost all DNA damage is repaired by efficient cellular processes. Some unrepaired mutations are lethal to the cell—as dead cells do not reproduce they cannot be the basis for cancer or for inherited abnormalities. The majority of mutations are silent in that they have no discernible effects. Accordingly, the risk of any one molecule actually causing cancer is infinitesimally small—literally trillions of molecules of carcinogens are inhaled with every cigarette, yet only a minority of cigarette smokers develop cancer. Yet, the assumption that the risk is not zero has a major impact on communicating to the public about cancer risk due to chemical and physical carcinogens.
There are circumstances in which cancer causation does depend upon exceeding a threshold level of a chemical (e.g. the mechanism by which saccharin causes bladder cancer in laboratory animals appears to proceed through the precipitation of saccharin in the bladder, which requires a dose sufficient to exceed the physicochemical processes determining saccharin solubility). However, the prudent management of cancer risk usually treats a carcinogen as 'guilty until proven innocent': it is assumed to have no risk-free level. In essence, the burden of proof is on industry to demonstrate scientifically that its cancer-causing chemical does have a threshold.
In the late 1970s, the US Environmental Protection Agency (EPA) established a Carcinogen Assessment Group that developed many of the basic approaches to cancer risk assessment now in use. Three concurrent and related events led to the adoption of formal approaches to risk assessment by EPA and other federal agencies. Some of these agencies, such as the Food and Drug Administration (FDA), had their own risk assessment processes, with FDA's exploration of the safety of food additives under the leadership of Dr Arnold Lehman being particularly of note (Stirling & Junod 2002). The three events were:
(1) In 1980, a US Supreme Court decision narrowly overturned the attempt by the Occupational Safety and Health Administration (OSHA) to develop a more stringent benzene standard following OSHA's recognition that benzene was a carcinogen. The court's decision called for a risk assessment as a means to determine the extent of harm on which the agency should base its decision.
(2) The original appointee of President Reagan to head the EPA resigned in disgrace in 1983, in part because of the perception that she had distorted scientific findings in order to favour her political positions.
(3) The National Academy of Sciences (NAS) released its report, known as the Red Book, which laid out a formal process for assessing risks (NRC 1983).
The definitions of the four major components of risk assessment are shown in Table 8.7.2.
Table 8.7.2 Components of risk assessment
(1) Hazard identification: The determination of whether a specific chemical or physical agent is causally linked to a specific endpoint of concern; i.e. specificity, or the second law of toxicology.
(2) Dose–response evaluation: The determination of the relation between the magnitude of exposure and the probability of occurrence of the specific endpoint of concern; i.e. the dose makes the poison, or the first law of toxicology.
(3) Exposure evaluation: The determination of who and how many people will be exposed; through which routes; and the magnitude, duration, and timing of the exposure.
(4) Risk characterization: The description of the nature and often the magnitude of the human risk, including attendant uncertainty.
Hazard is an intrinsic property of a substance or situation. Risk depends on both hazard and exposure; e.g. rattlesnakes are intrinsically hazardous to humans, garter snakes are not—but if one is living in an area with no rattlesnakes there is no exposure and therefore no risk. Hazard for a specific endpoint can be related to exposure of the target organ; e.g. asbestos is a hazard for lung cancer as inhaled fibres get into the lung, but asbestos is not thought to be a hazard for liver cancer as asbestos fibres are unlikely to penetrate the GI tract to the liver.
A weight of evidence approach is often used by regulatory and quasi-scientific bodies to identify a hazard. In essence, a panel of scientists is asked to judge whether sufficient evidence exists to identify an agent or condition as having a risk of a specific effect in humans, or in some other target such as an ecosystem. The US approach to permitting the marketing of a new chemical is to have an internal EPA scientific group review the chemical structure and other data submitted by industry. Similarly, FDA relies heavily on an advisory committee process to review evidence of efficacy and toxicity before approving a new pharmaceutical agent or medical device.
Formal weight-of-evidence approaches have been particularly useful in evaluating potential human carcinogens. One well-known process for hazard identification of a carcinogen is that of the International Agency for Research on Cancer (IARC) of the World Health Organization. IARC convenes expert panels that during a week-long meeting evaluate the evidence for carcinogenicity of specific chemical compounds or defined mixtures (e.g. diesel fuel, wood dust). The effort is focused on the weight of the evidence for carcinogenicity based upon carefully framed criteria considering animal toxicology, epidemiology, mechanistic information, and exposure data—but not on the potency of the compound as a cancer-causing agent. IARC has recently increased the weight it places on understanding toxicological mechanisms in assigning its score (Cogliano 2004). This information is used internationally to decide governmental regulatory approaches at the workplace or in the general environment. In the United States, the IARC ranking has no official status but carries much weight with US regulators, as does a similar process used by the National Toxicology Program for its biennial Report on Carcinogens (National Toxicology Program 2007).
Note that relatively few chemicals are capable of causing cancer in humans. Of the perhaps 70 000–100 000 chemicals in commerce, well under one hundred are known human carcinogens. To a large extent this represents the success of environmental health science in providing tools that guide chemical manufacturers away from new chemicals that are potentially carcinogenic. Early application in the chemical development process of simple test batteries evaluating the potential for mutagenesis or other predictors of cancer causation provides a responsible chemical industry with the means to avoid producing carcinogens or other potentially harmful products—and the means to avoid the regulatory and toxic tort consequences of harming the public. The value of this primary preventive approach depends upon the availability of effective toxicological test batteries. Such tests are based upon a basic understanding of the chemical and biological processes underlying toxic effects. Unfortunately, the investment in using standardized test batteries for high production volume chemicals, and the major increase in such investments due to the new EU REACH legislation (see below), has not been accompanied by recognition of the need to develop better and more effective tests to protect the public. Newer advances in molecular toxicology provide many opportunities to improve these test batteries (NAS 2007).
A key issue facing risk management is how to proceed in the presence of exposure data with no evidence of toxicity. Rapid advances in analytical chemistry, coupled with the fragmentation of authority among different environmental agencies, at least in the United States, have made this issue particularly challenging. The CDC National Center for Environmental Health Division of Laboratory Sciences is perhaps the world's leading analytical laboratory, capable of detecting ever smaller amounts of an increasing number of chemicals in blood and other biological fluids. This analytical capacity for blood and urine samples has been coupled to the National Health and Nutrition Examination Survey (NHANES), which also provides medical and sociodemographic information (Centers for Disease Control and Prevention 2005). This work provides an opportunity for surveillance of environmental exposures that is of major public health significance. Environmental organizations have emphasized the importance of body burden to risk management (e.g. see Environmental Working Group 2007). However, the health impact, if any, of our body burden for most of these chemicals remains unknown. The disclaimer language of the CDC report puts it very well:
The measurement of an environmental chemical in a person’s blood or urine does not by itself mean that the chemical causes disease. Advances in analytical methods allow us to measure low levels of environmental chemicals in people, but separate studies of varying exposure levels and health effects are needed to determine which blood or urine levels result in disease (Centers for Disease Control and Prevention 2005).
Unfortunately, there appears to be little coordination with the National Institute of Environmental Health Sciences or with the US Environmental Protection Agency to obtain these needed studies, reflecting the fragmentation of the US federal approach to environmental risk.
The key issues in dose–response evaluation involve how to extrapolate from the high doses at which an effect is observed in an animal or epidemiological study, to the usually much lower levels of risk which are of public or policy concern. Crucial to extrapolation are assumptions about the shape of the dose–response curve, i.e. threshold, linear non-threshold, sublinear, or supralinear. It must be emphasized that the levels of risk desired by our society, for example in the range of one in ten thousand to one in one million lifetime, are usually too low to be scientifically verifiable. This is particularly true as the endpoints of concern cannot be solely attributed to the environmental hazard under consideration. The following example is given in part to justify my contention that environmental risk assessment primarily is aimed at approaching problems of broad societal concern at risk levels too low to be scientifically verifiable. It also provides an opportunity to give an example of a risk assessment.
Based upon extrapolation from both epidemiologic and animal studies, the potency of benzene is estimated by the USEPA to result in a 2.2–7.8 in one million increase in the lifetime risk of leukaemia for an individual exposed for a lifetime to 1 µg/m3 benzene in air (USEPA 2007). A reasonable average outdoor benzene level for the US population is approximately 3 µg/m3, which would predict a lifetime risk of 6.6–23.4 in one million caused by this benzene exposure. Regulatory approaches that decrease that outdoor background level by two-thirds to 1.0 µg/m3 benzene nationwide would be estimated to decrease the risk of benzene-induced leukaemia by two-thirds. This would mean that nationwide there would be 4.4–15.6 fewer lifetime cases of leukaemia for every one million Americans, or approximately 10 in one million. Assuming a 70-year lifetime, and 350 million Americans, one can estimate that there would be 50 fewer cases of leukaemia a year nationwide as a result of a two-thirds decrease in outdoor benzene levels. This is a very small percentage of the 24 800 new cases of leukaemia in 2007 estimated by the American Cancer Society. While preventing that number of leukaemia cases is socially desirable, there are no current epidemiological or animal toxicology methods that could scientifically validate these assumptions. Note the further complication that our unregulated and highly variable indoor exposure to benzene, as well as to many other volatile organic compounds, far exceeds outdoor exposure for most of the US population. In fact, the major reason for the decrease in personal benzene exposure in the United States has been the decline in cigarette smoking for smokers, and its restriction from public places for non-smokers.
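The arithmetic in the example above can be checked directly; the unit risk range, background concentration, population, and lifetime are the values quoted in the text:

```python
unit_risk_low, unit_risk_high = 2.2e-6, 7.8e-6  # lifetime risk per ug/m3
background = 3.0        # average outdoor benzene, ug/m3
population = 350e6      # US population assumed in the text
lifetime_years = 70

risk_low = unit_risk_low * background    # 6.6 per million lifetime
risk_high = unit_risk_high * background  # 23.4 per million lifetime

# A two-thirds cut in outdoor benzene is assumed to cut risk by two-thirds:
avoided_low, avoided_high = risk_low * 2 / 3, risk_high * 2 / 3
avoided_mid = (avoided_low + avoided_high) / 2  # ~10 per million lifetime

cases_per_year = avoided_mid * population / lifetime_years
print(f"avoided: {avoided_low * 1e6:.1f}-{avoided_high * 1e6:.1f} per million")
print(f"~{cases_per_year:.0f} fewer leukaemia cases per year nationwide")
```

This reproduces the 4.4–15.6 per million and roughly 50 cases per year figures, while carrying all of the unverifiable assumptions (linearity, no threshold, uniform exposure) discussed in the text.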
The challenge posed by extrapolation from animals to humans is increasingly being met by advances in understanding of the dynamics of absorption, distribution, metabolism, and excretion of external chemicals in humans, including the use of ‘metabolomics’. The ability to use metabolomics and toxicokinetics to increase understanding of the relevance of animal data to human dose–response evaluation enhances the value of animal toxicology for dose–response evaluation.
Exposure evaluation is central to the management of environmental risks. Understanding the pathways of exposure allows interdiction of the exposure pathway—prevention of human exposure to a harmful chemical is synonymous with prevention of human risk. New advances in the field of exposure science are beginning to have a major impact on our understanding and prevention of risk. These advances are particularly crucial to understanding aggregate and cumulative risk (USEPA 2003; International Life Sciences Institute 1999). Aggregate risk takes into account the different pathways of exposure for the same chemical (see ‘Distribution’, above). Cumulative risk describes the multiple effects of different agents through different routes, in essence an assessment of the impact of the soup of external synthetic and natural chemicals in which we all live. Cumulative risk assessment is particularly pertinent to environmental justice considerations.
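The distinction between aggregate and cumulative risk can be illustrated with a toy calculation (all doses, reference doses, and chemical names here are hypothetical, and the hazard-index combination assumes simple dose additivity, one of several possible combination models):

```python
# Hypothetical illustration of aggregate vs. cumulative exposure assessment.

# Aggregate exposure: one chemical, summed across its exposure pathways.
# Daily doses (mg/kg/day) of a single hypothetical pesticide:
pathways = {"food": 0.002, "drinking_water": 0.0005, "residential": 0.001}
aggregate_dose = sum(pathways.values())  # total dose of the one chemical

# Cumulative assessment: multiple chemicals combined. One crude approach is a
# hazard index: the sum of (dose / reference dose) across chemicals, assuming
# dose additivity. An index below 1 is conventionally read as low concern.
chemicals = {
    # chemical: (daily dose, reference dose), all values hypothetical
    "pesticide_A": (aggregate_dose, 0.01),
    "pesticide_B": (0.004, 0.02),
}
hazard_index = sum(dose / rfd for dose, rfd in chemicals.values())

print(round(aggregate_dose, 4))  # 0.0035
print(round(hazard_index, 2))    # 0.55
```

Real cumulative assessments must also confront synergistic and antagonistic interactions that simple additivity ignores, which is part of why the ‘soup’ of chemical exposures remains so difficult to evaluate.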
The central importance of exposure assessment in understanding risk is exemplified by investigations of potential adverse health consequences due to inhalation of pollutants resulting from the World Trade Center terrorist event. Careful evaluation of disease endpoints in relation to exposure pathways is central to unravelling the highly political and litigious issue of whether responders or the general public have an increased incidence of disease from inhaling the dust generated by the explosion or the clean-up. New protocols and tools to assess exposure resulting from man-made or natural disasters are being developed (Lioy et al. 2006).
Many challenges are presented by the seemingly straightforward process of characterizing the risk estimated through the hazard identification, dose–response evaluation, and exposure evaluation steps. First, those doing the characterization are given an opportunity to put their ‘spin’ on the findings, e.g. the public is likely to respond differently to the characterization that something is ‘99 per cent free of risk’ than to the numerically equivalent characterization that there is a ‘one per cent likelihood of a serious consequence including death’. (For a detailed discussion of risk communication, see Chapter 8.8 of this volume, by Baruch Fischhoff.) There is also the challenge of characterizing who is at risk—reporting the risk in terms of the entire exposed general public can trivialize the risk to a highly sensitive subpopulation, such as asthmatics. Further, risk can be displaced from one country to another, as was recently observed when hazardous waste illegally sent from Europe to the Ivory Coast was reported to have caused 9000 acute illnesses and 6 deaths in Abidjan (Greenpeace 2006). Compounding the issue is that the Ivory Coast, not having its own expertise, had to use its scanty funds to hire a European company to retrieve, ship, and process the toxic waste (United Nations Environmental Programme 2006).
There is also a long-standing debate on the extent to which numerical uncertainty, rather than a simple qualitative statement of the major sources of uncertainty, should be a routine part of risk characterization. Those in favour of routinely providing numerical boundaries that quantify the extent of uncertainty point out that the many estimates and default assumptions in a risk assessment provide wide ranges of uncertainty that should be presented to the risk manager and the general public. Those in favour of a more restricted use of quantitative uncertainty analysis, including this author, point out that most risk assessments are scoping activities aimed at considering alternatives or developing priorities. Further, major societal decisions are made on numerical estimates for which no uncertainty factors are given (e.g. the gross domestic product, unemployment estimates). There is no disagreement that the qualitative issues underlying uncertainty in a risk analysis should always be transparent to the risk manager and to the affected stakeholders.
The future of risk assessment
Risk assessment as a formal process to evaluate environmental agents has been evolving, particularly during the last few decades, in which more sophisticated approaches to cancer risk assessment and to cumulative and aggregate risk have been developed. Using molecular toxicology to replace standard default assumptions is particularly promising (NAS 2007). Just as in other natural sciences, newer advances in data handling and informatics provide the opportunity to assess larger and more complex databases. Advances in epidemiologic methodology using biological indicators of exposure and effect based upon ecogenetics and other molecular biological techniques should be particularly fruitful (Omenn 2000). Conceptually, our genetic make-up is what loads the gun—but it is the environment that pulls the trigger. Identification of subpopulations sensitive to environmental factors will challenge regulatory and legal interpretation of the many environmental health laws that are aimed at protecting susceptible populations. Global harmonization of risk assessment has been under way for decades and will particularly be needed to avoid the use of environmental health principles as a façade for trade barriers. Attempts are under way to apply to toxicology the formal evidence-based approaches now coming into use in medicine. It will be a challenge to use such processes for risk assessments that depend heavily on extrapolation to levels of risk below those that are readily observable, i.e. the evidence will be indirect. Reviews of recent directions in risk assessment can be found in a series of articles commemorating the twentieth anniversary of the NAS Red Book (Johnson & Reisa 2003) and in Goldstein (2005).
The regulatory approach to protecting worker health from toxic chemicals is often based upon both a measurable workplace standard and a subtle measure of effect. Thus for benzene, there is a 1 ppm time-weighted average workplace air standard as well as a requirement for routine blood counts. The latter informs the former both in terms of whether unmeasured exposures may be occurring, and whether a reconsideration of the allowable external standard is needed. In contrast, environmental standards are almost always measures of external pollutant emissions or ambient levels. Such standards are surrogates for the desired goal of avoiding adverse consequences to human health and the environment. Achieving a level of scientific knowledge that would permit the direct evaluation of subtle biological precursors of adverse effects would be a desired route to develop emission standards that are truly protective. Much work in this area is in progress under the rubric of environmental health indicators.
Human history of protecting against the consequences of environmental agents in essence is the history of catching up on the adverse effects of otherwise beneficial new technology—starting with the human use of fire. One of the more challenging new technologies with potential for beneficial and adverse consequences is that of nanotechnology. Decreasing the size of particles can result in unexpected new physicochemical properties, in part due to a very high surface-to-volume ratio (Helland et al. 2007). The debate is unresolved about whether current toxicological testing schemes and regulatory processes are adequate to protect against the potential harm of nanotechnology products.
The precautionary principle and/or/versus risk assessment
The precautionary principle has been advanced as a new approach to environmental risk that, at least to some of its advocates, is a replacement or at least a supplement for risk assessment. Impetus to utilize the precautionary principle was given by the 1992 Rio Declaration on the Environment and Development which provided the definition shown in Box 8.7.1 (United Nations Environmental Programme 1992).
There are many variants of this definition and an extensive literature devoted to developing a more rigorous definition of the precautionary principle. To some, the precautionary principle is merely a means to build more public health protection into quantitative risk analysis, with additional prudent defaults and safety factors to protect at-risk populations, and a further focus on uncertainty. To others, the precautionary principle is a new way of addressing environmental risk which is more democratic than risk assessment, better allows for dealing with complexity and uncertainty, and is more likely to provide timely and preventive interventions (Martuzzi 2007; Tickner & Ketelson 2001). Some of the important approaches advocated by the precautionary principle, such as transparency and involvement of stakeholders, have also been advocated by many under the rubric of risk assessment and management. For example, the Presidential/Congressional Commission on Risk Assessment and Risk Management (1997), mandated under the 1990 US Clean Air Act amendments, developed a framework for environmental health risk management that has six steps: formulation of the problem within the context of public or ecosystem health; analysis of the risks; determination of options for risk management; decisions among the options; action on the decisions; and evaluation of the outcome—all to be performed collaboratively and transparently with stakeholders. Of note is that the precautionary principle puts the burden of proof on corporations to prove the safety of their products rather than on governments or concerned citizens to prove harm. The European Union’s advocacy of this principle includes incorporation of a statement in support of the precautionary principle in its founding documents, although without any definition being advanced.
Two examples that provide practical insight into the often confusing debate about the precautionary principle and/or/versus risk assessment are the 1990 US Clean Air Act amendments concerning hazardous air pollutants (HAPs), and the new EU legislation that is remaking how the EU, and in practice much of the rest of the world, tests the safety of chemicals, known as Registration, Evaluation, Authorisation and Restriction of Chemical Substances (REACH).
The 1990 amendments to the US Clean Air Act provided a complete overhaul of the regulation of hazardous air pollutants (HAPs; in essence, all air pollutants other than those for which outdoor air pollutant standards are set, such as ozone and particulates). Previously, the burden of proof was on the government to show that a specific HAP was harmful. Frustration with the tortuous process that had regulated relatively few air pollutants led to Congress providing a list of 189 air pollutants that were to be regulated unless an industry was able to prove safety to the satisfaction of the EPA, also after a tortuous regulatory process—a switch in the burden of proof consistent with the precautionary principle. Further, the 1990 Clean Air Act amendments replaced risk assessment as the guiding principle for regulatory control of emissions with a requirement for maximum achievable control technology for all sources—irrespective of the extent of risk imposed. Although clearly consistent with the precautionary principle, the principle was not mentioned in the debates.
REACH was developed after a long and often rancorous debate, with its proponents focusing on the precautionary principle as a rationale for the new legislation. REACH imposes significant burdens on industry to develop data, assess risk, and provide information about virtually all chemicals in use, including constituents of product mixtures. No distinction is made between newly developed chemicals and those long available in commerce—a contrast with the US Toxic Substances Control Act, whose weakness in this regard has led to the inadequate testing of compounds such as methyl tert-butyl ether (MTBE) before their inappropriate release into the environment (Goldstein & Erdal 2000). REACH is heavily dependent upon developing a toxicological database on virtually every compound and constituent to which exposure might occur. The cost of obtaining the data and registering the compounds is estimated at about US$3–6 billion over the first 11 years. Risk assessment, based on both toxicity and exposure, is used extensively throughout the process, including in setting priorities for data needs and in making decisions on regulatory approaches. Unfortunately, there is no provision for obtaining the research needed to improve the underlying science on which effective toxicological testing is based.
These two examples suggest both that the precautionary principle is a significant aspect of US regulation and that even a risk management approach derived as directly from the precautionary principle as REACH can be firmly based on risk assessment. Hammitt et al. (2005) reviewed EU and US regulations and concluded that, despite the great attention given to the precautionary principle in the EU, there is no significant difference in the extent of precaution in US and EU regulations.
In addition to definitional issues, there are other major concerns about the precautionary principle. These are summarized in Table 8.7.3. First, what does the precautionary principle add to standard public health concepts? The precautionary principle is very welcome as an enthusiastic restatement of these concepts which provides an impetus and rallying point for actions that protect public health and the environment, even if nothing new is added to our understanding of the forces responsible for public health action and inaction. Perhaps of greater potential consequence is the treatment of science by some of the major advocates of the precautionary principle, a treatment which at times borders on deconstructionism (Martuzzi 2007; Goldstein 2007). It is true, but also trite and usually trivial, that scientists have values which inform their activities.
Table 8.7.3 Questions about the precautionary principle
♦ What does it add to standard public health concepts?
♦ Is it true that complex scientific questions are unsolvable and, if so, is the precautionary principle needed to act in the face of scientific uncertainty?
♦ In view of its frequent use to justify trade barriers, is it still possible to advocate the precautionary principle as an antidote to biased decision-making?
Even in the face of uncertain science, there is no evidence that the precautionary principle is needed to decrease risk. One example of scientific uncertainty often said to make the precautionary principle pertinent is that of the health and environmental risks of endocrine disruptors, a particularly challenging problem in view of the need to consider the interactive effects of multiple chemicals with a wide range of additive, synergistic, and antagonistic interactions (Kortenkamp 2007). Yet the United States banned the production of PCBs in 1976 despite the opposition, then and now, of industry on the grounds of uncertain science. The continual decline in body burdens of PCBs and dioxins has been accomplished through regulatory decisions that were made without recourse to the precautionary principle. Advancing the science needed for decision-making must remain a major goal for environmental public health, including for actions taken under the precautionary principle (Foster et al. 2000; Goldstein & Carruth 2003; Grandjean et al. 2004).
Unfortunately, the precautionary principle has been tainted through misuse by the European Union to justify agricultural trade barriers. The examples go well beyond the US, Canada, and other countries winning well-publicized World Trade Organization judgements against the EU on hormone-treated beef or genetically modified foods—judgements that specifically took the EU to task for using the precautionary principle to replace or ignore public health science. Perhaps most egregious is the EU’s use of the precautionary principle to tighten its aflatoxin standards to well below those of any other nation or international organization (Goldstein 2007). Aflatoxin is produced by a fungus that grows on agricultural produce, particularly when wet. To the benefit of EU agricultural interests, US$700 million/year worth of agricultural produce from sub-Saharan Africa, home to the world’s poorest nations, has been excluded. The FAO/WHO Codex Alimentarius Commission’s Joint Expert Committee on Food Additives found no significant health benefit from the more stringent EU aflatoxin standard which, based upon risk analysis, would decrease the amount of cancer in 500 million Europeans by one case every other year. The recent failure of the Doha round of trade talks, a particular loss to the sustainable development of less developed countries, has been ascribed in part to the unwillingness of the rest of the world to trust the EU not to use the precautionary principle to manufacture reasons to avoid free trade in agricultural products. Clearly, the precautionary principle is not a protection against the misuse of science.
The primary missing ingredient in the approach to ever more complex environmental challenges, including such broader issues as global warming, is a systems-based approach incorporating the best science focusing on the most important questions. Unfortunately, the fragmented national and international approaches to environmental issues are producing piecemeal efforts that are falling further behind in protecting public health and the environment. Perhaps the need to respond to the challenges of global climate change will lead to a more systematic international effort.
Understanding the web of environmental cause and effect relations is an increasing challenge in a shrinking globe. Advances in toxicology, filtered through an appropriate appreciation of the optimal approaches to analyse and present risks to an involved public, are crucial to protecting public health and the environment.
Arbuckle, T.E., Hrudey, S.E., Krasner, S.W. et al. (2002). Assessing exposure in epidemiologic studies to disinfection by-products in drinking water: Report of an international workshop. Environmental Health Perspectives, 110(Suppl 1), 53–60.
Centers for Disease Control and Prevention (2005). Third national report on human exposure to environmental chemicals. Atlanta, GA.
Cogliano, V.J. (2004). Current criteria to establish human carcinogens. Seminars in Cancer Biology, 14, 407–12.
Environmental Working Group (2007). Body burden. http://www.ewg.org/featured/15. Accessed December 9, 2007.
Foster, K.R., Vecchia, P., and Repacholi, M.H. (2000). Science and the precautionary principle. Science, 288, 979–81.
Goldstein, B.D. (2003). Risk characterization and the red book. Journal of Human and Ecological Risk Assessment (August 2003 special issue to commemorate the 20th anniversary of the NRC Red Book), 9, 1283–9.
Goldstein, B.D. (2005). Advances in risk assessment and communication. Annual Review of Public Health, 26, 141–63.
Goldstein, B.D. (2007). Problems in applying the precautionary principle to public health. Occupational and Environmental Medicine, 64, 571–4.
Goldstein, B.D. and Carruth, R.S. (2003). Implications of the precautionary principle to environmental regulation in the United States: Examples from the control of hazardous air pollutants in the 1990 Clean Air Act Amendments. Law & Contemporary Problems, 66, 247–61.
Goldstein, B.D. and Carruth, R.S. (2003). Implications of the precautionary principle: Is it a threat to science? European Journal of Oncology, 2, 193–202.
Goldstein, B.D. and Erdal, S. (2000). MTBE as a gasoline oxygenate: Lessons for environmental public policy. Annual Review of Energy and the Environment, 25, 765–802.
Goldstein, B.D. and Gallo, M.A. (2001). Paré’s law: The second law of toxicology. Toxicological Sciences, 60, 194–5.
Grandjean, P., Bailar, J.C., Gee, D. et al. (2004). Implications of the precautionary principle in research and policy-making. American Journal of Industrial Medicine, 45, 382–5.
Greenpeace (2006). Toxic waste in Abidjan. http://www.greenpeace.org/international/news/ivory-coast-toxic-dumping/toxic-waste-in-abidjan-green
Hammitt, J.K., Wiener, J.B., Swedlow, B. et al. (2005). Precautionary regulation in Europe and the United States: A quantitative comparison. Risk Analysis, 25, 1215–28.
Helland, A., Wick, P., Koehler, A. et al. (2007). Reviewing the environmental and human health knowledge base of carbon nanotubes. Environmental Health Perspectives, 115, 1125–31.
International Life Sciences Institute (1999). A framework for cumulative risk assessment: Workshop report. ILSI Risk Science Institute, Washington, DC.
Johnson, B.L. and Reisa, J.J. (2003). Essays in commemoration of the 20th anniversary of the National Research Council’s Risk assessment in the federal government: Managing the process. Human and Ecological Risk Assessment, 9, 1093–9.
Kim, S., Lan, Q., Waidyanatha, S. et al. (2007). Genetic polymorphisms and benzene metabolism in humans exposed to a wide range of air concentrations. Pharmacogenetics and Genomics, 17, 789–801.
Kortenkamp, A. (2007). Ten years of mixing cocktails: A review of combination effects of endocrine-disrupting chemicals. Environmental Health Perspectives, 115(Suppl 1), 98–105.
Lioy, P., Pellizzari, E., and Prezant, D. (2006). The World Trade Center aftermath and its effects on health: Understanding and learning through human exposure science. Environmental Science and Technology, 40, 6876–85.
Martuzzi, M. (2007). The precautionary principle: In action for public health. Occupational and Environmental Medicine, 64, 569–70.
National Research Council (1983). Risk assessment in the federal government: Managing the process. National Academy Press, Washington, DC.
National Research Council (1993). Pesticides in the diets of infants and children. National Academy Press, Washington, DC.
National Research Council (2007). Toxicity testing in the 21st century. National Academy Press, Washington, DC.
National Toxicology Program (2007). Report on carcinogens. http://ntp.niehs.nih.gov/index.cfm?objectid=72016262-BDB7-CEBA-FA60E922B18C2540 Accessed December 1, 2007.
Omenn, G.S. (2000). Public health genetics: An emerging interdisciplinary field for the post-genomic era. Annual Review of Public Health, 21, 1–13.
Presidential/Congressional Commission on Risk Assessment and Risk Management (1997). Framework for environmental health risk management. Final report, Vol. 1.
Rothman, N., Smith, M.T., Hayes, R.B. et al. (1997). Benzene poisoning, a risk factor for hematological malignancy, is associated with the NQO1 609C–>T mutation and rapid fractional excretion of chlorzoxazone. Cancer Research, 57, 2839–42.
Stirling, D. and Junod, S. (2002). Profiles in toxicology: Arnold J. Lehman. Toxicological Sciences, 70, 159–60.
Tickner, J. and Ketelson, L. (2001). Democracy and the precautionary principle. Science and Environmental Health Network, 6, 1–6. http://www.sehn.org/Volume_6-3.html. Accessed November 14, 2007.
Umbreit, T.H., Hesse, E.J., and Gallo, M.A. (1986). Bioavailability of dioxin in soil from a 2,4,5-T manufacturing site. Science, 232, 497–9.
United Nations Environmental Programme (1992). Rio Declaration on environment and development, Principle 15. http://www.unep.org/Documents.Multilingual/Default.asp?DocumentID=78&ArticleID=1163 Accessed December 22, 2007.
United Nations Environmental Programme Press Release (2006). Liability for Cote d’Ivoire hazardous waste clean-up. http://www.unep.org/Documents.Multilingual/Default.asp?DocumentID=485&ArticleID=5430&l=en Accessed December 22, 2007.
University of Pittsburgh European Union Center of Excellence and Graduate School of Public Health (2007). REACH, a new EU approach to chemical safety: Lessons for the US. Conference Chair, B.D. Goldstein. http://www.ucis.pitt.edu/euce/events/policyconf/07/PDFs/ReachReport.pdf Accessed December 22, 2007.
US Environmental Protection Agency (2007). Integrated Risk Information System: Benzene (CASRN 71-43-2). http://www.epa.gov/iris/subst/0276.htm#carc Accessed November 28, 2007.
US Environmental Protection Agency (2003). Framework for cumulative risk assessment. EPA/630/P-02/001F. Risk Assessment Forum, Washington, DC. http://www.epa.gov/fedrgstr/EPA-PEST/1999/November/Day-10/6043.pdf
Woodruff, T.J., Axelrad, D.A., Kyle, A.D. et al. (2003). America’s children and the environment: Measures of contaminants, body burdens and illnesses. US Environmental Protection Agency, EPA 240-R-03-001.