Kirk Hartley

New Science, TSCA, and Chemical Regulation

Here’s a pop quiz: of the 83,000 or so chemicals in commerce, how many chemicals or chemical classes has EPA regulated under TSCA?

Drum roll, please. The answer is:

5 (I did not forget any numbers).

Surprised? Me too. But that’s what you learn from this post on the Mass Tort Defense blog, and from the December 2, 2009 testimony of John Stephenson, Director, Natural Resources and Environment, GAO. He said, at page 9:

“In fact, since Congress passed TSCA in 1976 – over 33 years ago – EPA has issued TSCA regulations on only five existing chemicals or chemical classes.”

That result seems especially pathetic when one considers that the EU is busy implementing its comprehensive REACH program of chemical regulation, as the EU explains here.

Will things change here? That’s not my area of expertise, so I will not offer a prediction. But the testimony is also noteworthy for its focus on how science now examines cellular-level events, which is lessening the dependence on epidemiology. The topic was covered by Linda Birnbaum, Ph.D., Director, National Institute of Environmental Health Sciences, National Institutes of Health, and Director, National Toxicology Program, U.S. Department of Health and Human Services. Pasted below are some key excerpts from her testimony, beginning with the conclusion:

“We are poised to move forward into an era of a new kind of toxicological testing that is less expensive and also gives us an improved understanding of the actual effects on humans. Toxicology is advancing from a mostly observational science using disease-specific models to a better predictive science focused upon a broad inclusion of target-specific, mechanism-based, biological observations. This means using alternative assays targeting the key pathways, molecular events, or processes linked to disease or injury, and incorporating them into a research and testing framework. The NTP is laying the foundation for this testing paradigm in partnership with the National Human Genome Research Institute and the EPA. They are using quantitative high throughput screening assays to test a large number of chemicals. The resulting data are being deposited into publicly accessible relational databases. Analyses of these results will set the stage for a new framework for toxicity testing.” (emphasis added)

She also said:

“Environmental health science has made tremendous strides since the original passage of the Toxic Substances Control Act, or TSCA. Our understanding of chemical toxicity has been challenged by the new science of epigenetics, which is the study of changes to the packaging of the DNA molecules that influence the expression of genes, and hence the risks of diseases and altered development. Studies indicate that exposures that cause epigenetic changes can affect several generations. This new understanding heightens the need to protect people at critical times in their development when they are most vulnerable to this kind of toxicity. (emphasis added)

“The concept of ‘windows of susceptibility’ is an important area. Research has revealed the heightened vulnerability of fetal, infant and child developmental processes to disruption from relatively low doses of certain chemicals. Established first for neurodevelopmental toxicants like PCBs, and lead and other metals, this concept also applies to hormonally active agents (endocrine disrupting chemicals). In our NIEHS Breast Cancer and Environment Research Program, co-funded with the National Cancer Institute, researchers are investigating whether periods of susceptibility exist in the development of the mammary gland, when exposures to environmental agents may impact the breast and endocrine systems that can influence breast cancer risk in adulthood. *** There are other susceptibilities to consider. For some types of chemicals and health effects, there may be excess risk from specific genes or chronic diseases. For example, the level of a person’s risk of bladder cancer from smoking has been shown to depend in part on whether or not that individual’s genome contains variants in specific detoxification enzymes. The existence of these subtle variations in susceptibility must be factored into overall toxicity assessments.

“Scientists believe that other chemicals such as some PCBs and furans may cause cancer in a similar manner. The question for public health officials was how health standards could be adjusted to take into account the fact that people are always exposed to mixtures of dioxin-like compounds, not just one at a time. To address this problem, a large body of work led to the development of a method to estimate toxicity of mixtures of dioxin-like compounds based upon toxic equivalency factors, or TEFs. To estimate the overall toxicity of a mixture, the contaminants’ weighted contributions are added together, adjusting for the fact that some compounds are more toxic than others. The additive methodology has been tested and confirmed by studies done by the NTP, EPA, and others. TEF methodology has also been extended to other health endpoints, including reproductive and developmental, immune, and neurological.

Differences in routes of exposure must also be considered. For example, hexavalent chromium compounds have been shown to cause lung cancer in humans when inhaled, but it was not known how these compounds behaved when ingested. Hexavalent chromium was tested by the NTP because of concerns over its presence in drinking water. The NTP studies showed that a compound containing hexavalent chromium causes cancer in laboratory animals following oral administration in drinking water, confirming the need to protect people from oral routes of exposure.”
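The TEF additivity Dr. Birnbaum describes reduces to a weighted sum: each compound’s concentration is multiplied by its toxic equivalency factor, and the products are added into a single toxic-equivalent (TEQ) value. Here is a minimal sketch in Python; the compound names, concentrations, and all TEF weights other than TCDD’s reference value of 1.0 are illustrative assumptions, not regulatory numbers:

```python
# Sketch of the TEF additivity method for mixtures of dioxin-like compounds.
# TCDD is the reference compound (TEF = 1.0); the other weights and all
# concentrations below are illustrative, not regulatory values.

def toxic_equivalents(mixture, tefs):
    """Return the mixture's TEQ: the sum of each compound's
    concentration multiplied by its toxic equivalency factor."""
    return sum(conc * tefs[name] for name, conc in mixture.items())

tefs = {"TCDD": 1.0, "PCB-126": 0.1, "PeCDF": 0.3}       # weighting factors
mixture = {"TCDD": 2.0, "PCB-126": 10.0, "PeCDF": 5.0}   # concentrations, pg/g

print(toxic_equivalents(mixture, tefs))  # 2.0 + 1.0 + 1.5 = 4.5
```

Because the weights are multiplicative, a compound assigned a TEF of 0.1 contributes one-tenth as much per unit of concentration as the reference compound, which is how the method adjusts for "the fact that some compounds are more toxic than others."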

