Section 5: Potential of high throughput in vitro approaches
For decades we have endured a poor chemicals policy that has created a situation where too many chemicals in the marketplace have too few data to judge their safety. The lack of adequate government authority to require testing, along with its costs and concerns about use of laboratory animals, has meant that chemical safety assessment has not even come close to keeping up with the ever-expanding number of chemicals and their myriad uses in products and materials. As a result, we are unnecessarily exposed to harmful or untested chemicals and have to spend significant resources ameliorating problems that could and should have been prevented.
Scientific research has made significant strides in elucidating the many ways chemicals can affect human health and the environment. We now know that some chemicals present at very low doses in our bodies can have adverse health impacts, that the timing of exposure is critical in defining an ultimate health effect, and that diversity in the population—such as genetics, age, and gender—makes us differentially susceptible and vulnerable to chemical exposures.
Inadequate chemicals policy, alongside new insights gleaned from scientific research, is driving EPA to develop new approaches—reflecting a 21st-century understanding of the biological activity of chemicals—to fill data gaps on chemicals. High-throughput in vitro testing (HT testing), such as that being developed in EPA’s ToxCast program, is one of the technological solutions being most intensively pursued. HT testing is discussed in greater detail in Section 2 of this primer, and ToxCast is discussed in Section 3 of this primer.
HT testing holds great promise but, like all testing approaches, has limitations as well as strengths, and if used inappropriately could actually set us back, not forward. We discuss here some of the key potential benefits associated with HT testing; Section 6 of this primer discusses some of the key limitations and challenges of HT testing.
Conventional toxicity testing methods generally involve dosing laboratory animals with a chemical of interest and after some period of time—hours to days to months to years—looking to see whether an adverse outcome, for example, a tumor, has developed. Here the scientist is observing the ultimate downstream consequence of what is presumed to be a chemical’s interference with the proper function of one or more biological pathways. In contrast, HT methods primarily focus on the pathways themselves and aim to “catch” an early indicator of hazard: the perturbation of the pathway, rather than the ultimate consequence of that perturbation. This approach requires much less time—a matter of seconds to minutes. Not only does the effect happen sooner after exposure, but it can often be observed in something less than the whole animal, e.g., in a culture of cells or even a solution of cell components.
Because these HT tests can be done in a very small volume of solution, it also means that many chemicals can be put through a battery of HT assays simultaneously and at relatively low cost. Moreover, the process can be automated, indeed even carried out by robots. Hence, thousands of chemicals can be analyzed in hundreds of assays all in a period of time far shorter than would be required to detect most adverse outcomes in intact laboratory animals. Given the massive backlog of chemicals with little or no safety data, the speed of HT tools could be very valuable, at least in screening and prioritizing chemicals by level of potential concern.
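The screening-and-prioritization idea described above can be sketched in a few lines of code. This is a purely hypothetical illustration—the chemical names, AC50 values, battery size, and scoring formula are all invented for this sketch and do not reflect actual ToxCast data or EPA’s prioritization methodology:

```python
# Hypothetical sketch: ranking chemicals for further testing by how many
# HT assays they perturb and at how low a concentration. All names and
# numbers are illustrative, not real ToxCast data.

# Assay results: chemical -> AC50 values (µM) for the assays it activated;
# a lower AC50 means activity at a lower concentration (higher concern).
results = {
    "chem_A": [0.5, 2.0, 10.0],   # active in 3 assays, some at low doses
    "chem_B": [50.0],             # active in 1 assay, only at a high dose
    "chem_C": [],                 # inactive in every assay tested
}

TOTAL_ASSAYS = 100  # size of the hypothetical assay battery


def priority_score(ac50s, total=TOTAL_ASSAYS):
    """Crude score: fraction of assays hit, weighted by average potency."""
    if not ac50s:
        return 0.0
    hit_fraction = len(ac50s) / total
    potency = sum(1.0 / ac50 for ac50 in ac50s) / len(ac50s)
    return hit_fraction * potency


# Order chemicals from highest to lowest screening priority.
ranked = sorted(results, key=lambda c: priority_score(results[c]), reverse=True)
print(ranked)
```

The point of such a score is not to declare a chemical safe or hazardous, but to triage a large backlog: chemicals active in many assays at low concentrations rise to the top of the queue for more thorough evaluation.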
Because of the ethical problems associated with human testing, as well as the simple fact that we live so long, traditional toxicological methods use laboratory animals to assess the toxicity of a chemical and predict its effects on people. According to the seminal National Academy of Sciences report “Toxicity Testing in the 21st Century: A Vision and a Strategy,” use of such animal “models” is possible because, in general, human biology is similar to that of test animals. Nonetheless, while animal studies have served as important and useful tools, uncertainty is introduced by the need to extrapolate estimates of risk to humans from animal data. Moreover, there are some chemicals that elicit toxic effects in one species and not in another. For example, thalidomide is toxic to human fetuses, but rats are resistant to its effects.
In addition to testing chemicals on cells originally derived from laboratory animals, EPA’s ToxCast HT assays can be conducted on human cells, which are grown in culture. This may improve the ability of such tests to predict whether a chemical will be toxic to humans. Ideally, this could help lower the likelihood of a cross-species “false negative,” that is, missing an effect because it happens not to occur in the lab animal chosen for a given test but would occur in a human – or, conversely, a “false positive,” that is, seeing an effect in the animal model that for some reason would not occur in people.
Multiple cell types and life stages
HT testing methods offer the potential to look for effects of chemicals on different cell types (e.g., liver cells, kidney cells, etc.). This may help discern different kinds of toxicity, including those resulting from a chemical’s ability to disrupt a process that only takes place in certain cell types or organs. Some HT assays even use combinations of cell types taken directly from human tissues, in an effort to mimic responses of, and interactions between, cell types that are involved in the body’s reaction to a particular disease or disorder (e.g., asthma).
HT testing methods also hold potential to look for effects of chemicals at different life stages. A particularly exciting potential application of HT tests is in evaluating chemical effects on early life stages, including fetal development. For example, the Texas-Indiana Virtual STAR Center is using mouse embryonic stem cells to determine how chemicals may affect key biological pathways during early fetal development. This kind of research is still at an early stage, but holds the promise of providing a means to rapidly screen a large number of chemicals for developmental toxicity.
Chemical testing in laboratory animals is typically done at high doses to ensure that, if an adverse effect occurs, it can be detected in a relatively small number of animals in a relatively short period of time. These concentrations are often orders of magnitude higher than what a person would actually experience. Methods are then used to extrapolate any observed effects from such high-dose exposure to lower concentrations more representative of “real-world” exposure. Extrapolation from high- to low-dose effects raises often contentious questions about the appropriate dose-response relationship and whether low-dose effects differ from those seen at high doses. An advantage of HT methods is that a wide range of doses, including very low doses, can be tested. Such capabilities may also assist in resolving disputes around chemicals’ ability to cause different effects at low doses than they cause at high doses.
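The dose-response reasoning above can be made concrete with a simple curve. The sketch below uses a standard Hill-type concentration-response function; the parameter values (AC50, slope) and dose range are hypothetical, chosen only to show how an HT assay can characterize responses across many orders of magnitude of concentration, including the low doses animal studies rarely test:

```python
# Illustrative sketch (not EPA methodology): a Hill-type concentration-
# response curve of the kind an HT assay can characterize directly across
# a wide dose range, rather than extrapolating down from high doses.

def hill_response(conc, top=100.0, ac50=10.0, slope=1.0):
    """Percent of maximal assay response at a given concentration (µM)."""
    if conc <= 0:
        return 0.0
    return top * conc**slope / (ac50**slope + conc**slope)


# A single HT plate can span many orders of magnitude in concentration.
doses = [0.001, 0.01, 0.1, 1.0, 10.0, 100.0]  # µM, hypothetical
for d in doses:
    print(f"{d:>8} µM -> {hill_response(d):6.2f}% of max response")
```

Because the assay measures the response at each tested concentration directly, questions about whether low-dose behavior departs from the curve fitted at high doses can, in principle, be examined empirically rather than assumed.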
Human biomonitoring data confirm common sense in that they reveal that we are exposed to a complex mixture of chemicals. Such exposures extend from early fetal development through adulthood. High-throughput assays offer a means to test exposures to multiple chemicals and at multiple doses. Attempting to do so in traditional animal tests would prove prohibitive in terms of time, cost and use of laboratory animals.
In crisis situations where there is limited time to evaluate a chemical or mixture before its use, batteries of quick high-throughput tests might help inform decisions. However, even in these situations care should be taken to clearly communicate any limitations and uncertainties associated with these decisions and the data informing them. For example, EPA used some of its ToxCast assays to examine potential endocrine-disrupting effects of dispersant chemicals used to clean up the BP oil spill. Given the crisis at hand (and putting aside the fact that the testing came after millions of gallons of the dispersants had already been used, raising the question of why more thorough testing hadn’t been conducted well before then), this information was helpful. However, EDF expressed concern regarding the poor communication of the results of such assays, which was perceived as having effectively exonerated these chemicals from having any endocrine-disrupting activity – let alone other effects – despite the significant limitations of the available assays. The important lesson here is that new technologies should not be credited with capabilities beyond those they actually possess, in any situation, crisis or otherwise.
High-throughput technologies hold promise for informing safer chemical design and selection. These assays could flag potential toxicity concerns for new chemicals during early research, design, and development phases. Many of the HT technologies used in ToxCast and related programs in fact originate from the pharmaceutical industry, where they have been used for many years in drug discovery to screen out drug candidates that appear ineffective or show indications of hazard, and to push drugs that show more potential toward further development and evaluation. Efforts are already underway to integrate HT tools into green chemical design. A workshop held in September 2011 brought expert scientists together to discuss how HT and other computational technologies can be used for safer chemical design.
To learn about some of the barriers to full implementation of this technology, proceed to Section 6: Challenges and Limitations of High-Throughput In Vitro Testing