I have previously written about how guidelines are not able to protect human health and the environment from the harmful effects of pollution. The cost of assessment, the poor correlation between guidelines and risk, the unpredictability of risks when organisms are exposed to multiple chemicals at once, and the difficulty of assessing chemicals at environmentally relevant concentrations are some of the reasons why this is the case. Chemicals such as atrazine, phthalates, and 17ß-estradiol are not accounted for sufficiently within Australian guidelines to prevent harmful environmental effects, and are therefore unlikely to be accounted for in sampling programs that aim to assess such pollution.
But if guidelines do not suffice, what does? A lot of time and effort goes into determining the thresholds and limits outlined within environmental guidelines, and these limits vastly speed up the assessment process. How can pollution be assessed in a time- and cost-effective manner if such guidelines are not used? Experts around the world have engaged actively with this problem for decades. In recent years, some important developments have occurred and are now available to assess the impacts of pollution in ways that are not possible using current Australian guidelines. This blog summarises some of the most important.
Metabolomics is the study of organisms’ metabolic activity and, in the field of environmental chemistry and ecotoxicology, allows for the direct assessment of chemical impacts on organisms using biological evidence. This differs from comparing chemical concentrations against guidelines, which is only a proxy for the risk of harm to human health and the environment; metabolomics assesses the biological evidence of harm, or of precursors to harm such as dysfunctional biological processes. This evidence is collected from the tissues or biofluids of organisms ranging from bacteria through to animals by analysing these materials for biochemicals (i.e., metabolites), the chemicals produced by biological activity. The drivers of different metabolite concentrations (e.g., inflammation, decreased lipid transport) are then assessed against potentially toxic chemicals also detected in the organisms to constrain evidence of harm.
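To make the idea concrete, a common first-pass screen in metabolomics is to compare metabolite levels between exposed and control organisms and flag those that have shifted substantially. The sketch below is illustrative only: the metabolite names, intensity values, and two-fold threshold are hypothetical, not from any real study.

```python
import math

# Hypothetical metabolite intensities (arbitrary units) from control and
# exposed organisms; names and values are illustrative only.
control = {"lactate": 12.0, "glucose": 30.0, "cholesterol": 8.0}
exposed = {"lactate": 48.0, "glucose": 28.0, "cholesterol": 2.0}

def log2_fold_changes(exposed, control):
    """Return log2(exposed/control) for each metabolite."""
    return {m: math.log2(exposed[m] / control[m]) for m in control}

def flag_dysregulated(fold_changes, threshold=1.0):
    """Flag metabolites whose |log2 fold change| exceeds the threshold
    (i.e., more than a two-fold shift), a common first-pass screen."""
    return {m: fc for m, fc in fold_changes.items() if abs(fc) >= threshold}

fc = log2_fold_changes(exposed, control)
print(flag_dysregulated(fc))  # lactate shifted up, cholesterol down
```

Flagged metabolites would then be mapped to the biological processes they participate in, which is where the specialist interpretation described below comes in.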
Far from being an obscure science, metabolomics has been used in Australia to constrain the impacts of PFAS pollution on turtles in ways that were not possible using the traditional guideline approach. Metabolomics laboratories that can conduct the analysis described above are available in all major Australian cities, with services also available from research institutions such as CSIRO and Macquarie University. Interpreting metabolomics data is a specific skill set that few outside the field currently have, but such interpretation is available as part of the metabolomics analysis service.
Effects-based monitoring is similar to metabolomics in that both assess the impacts of pollutants on biological processes. Whereas metabolomics assesses biochemical changes within organisms exposed to pollutants, effects-based monitoring exposes cell lines to polluted media, or to an extract from polluted media, and then measures the specific impacts of that exposure. Both in vitro bioassays using mammalian cell lines and well-plate-based in vivo assays can be used as part of effects-based monitoring.
Say, for example, that the impact of a biosolid sample on the environment is under consideration via effects-based monitoring. Just as with chemical analysis, a liquid extract is taken from the biosolid sample. Instead of being sent to a GC- or LC-MS for quantification and identification of specific chemicals, however, the extract (measuring volumes of around 40 μL!) is introduced to the different bioassays specific to the effects under consideration. Such tests are available for a variety of effects, including hormone receptor-mediated effects (e.g., activation of the estrogen receptor, ER), activation of xenobiotic metabolism (e.g., the aryl hydrocarbon receptor, AhR), reactive toxicity (e.g., genotoxicity), and apical effects (e.g., cytotoxicity). Depending on the assay, the results can indicate estrogenic, androgenic, or glucocorticoid activity, or genotoxic effects, from sample media on the selected cells.
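Bioassay results of this kind are often expressed as a bioanalytical equivalent concentration: the sample's potency relative to a reference compound for that assay (e.g., estradiol equivalents, EEQ, for the ER assay). A minimal sketch of that calculation, assuming hypothetical EC50 values rather than measured data:

```python
# Bioanalytical equivalent concentration (BEQ) sketch: express a sample's
# effect in terms of a reference compound (e.g., 17ß-estradiol for the
# estrogen-receptor assay). EC50 values below are illustrative only.

def beq(ec50_reference, ec50_sample):
    """BEQ = EC50(reference) / EC50(sample). A sample that triggers the
    assay at a lower concentration than the reference is more potent."""
    return ec50_reference / ec50_sample

# Hypothetical assay results (ng/L): the reference hormone and a
# biosolid extract run on the same estrogen-receptor assay.
ec50_estradiol = 0.5   # reference compound
ec50_extract = 250.0   # sample extract

eeq = beq(ec50_estradiol, ec50_extract)  # estradiol equivalents
print(f"EEQ = {eeq:.3f}")
```

Expressing results this way lets very different mixtures be compared on a single effect scale, which is what makes the trigger values discussed next possible.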
While effects-based monitoring results show impact, the implications of this impact still need to be deduced. This is where effect-based trigger values, which reflect acceptable levels of harm, come in. Effect-based trigger values have been derived for end-points such as estrogenic activity in water, but are yet to be developed for soils or biosolids. While effects-based monitoring is not a standard service offering from commercial Australian chemical laboratories, this service line is considered relatively easy to develop. In the meantime, Griffith University offers effects-based monitoring services at costs ranging from $40 per sample for tests of bacterial toxicity (Microtox tests) or photosynthesis inhibition, through to $500 per sample for a full suite of biological impacts.
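In practice, applying a trigger value is a simple screening comparison: any sample whose measured effect exceeds the trigger is flagged for follow-up. The sketch below assumes hypothetical sample names and EEQ measurements, and an illustrative trigger value, since published values differ by assay and jurisdiction.

```python
# Screening sketch: compare measured effect concentrations (expressed as
# estradiol equivalents, EEQ, ng/L) against an effect-based trigger (EBT)
# value. The trigger and sample values below are illustrative only.

EBT_EEQ = 0.34  # ng/L, hypothetical trigger for estrogenic activity

samples = {"upstream": 0.05, "outfall": 1.20, "downstream": 0.40}

def exceeds_trigger(samples, trigger):
    """Return the samples whose measured EEQ exceeds the trigger value,
    i.e., those warranting follow-up chemical analysis."""
    return [name for name, eeq in samples.items() if eeq > trigger]

print(exceeds_trigger(samples, EBT_EEQ))  # outfall and downstream flagged
```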
A typical pollution or contamination assessment requires the assessor to identify which chemicals are to be assessed. Samples used to assess pollution impacts at a petrol station, for example, are likely to have chemicals such as benzene, petroleum hydrocarbons, and polycyclic aromatic hydrocarbons scheduled for analysis. The problem with this scheduling is that only the chemicals one is looking for are measured. If there had been an unpublicised fire at that petrol station, there is a good chance that other hazardous chemicals, like the PFAS in the fire-fighting foams used against petrol fires, will also be present. Because there is a lot of vehicular traffic through petrol stations, there is also the potential for tyre dust to be generated, along with associated toxic chemicals like 6PPD-quinone. And if the petrol station is in a rural area, any number of noxious weeds are likely to grow, with hormone-altering herbicides such as atrazine used as a result. Scheduling only petroleum hydrocarbon chemicals for analysis does not account for the presence of these other chemicals in samples from the petrol station: not the atrazine, the 6PPD-quinone, or the PFAS, let alone the next, currently unknown, pollutant! The associated pollution/contamination assessment therefore cannot constrain the risk of harm to human health from the other chemicals present in the soil or water, only from the chemicals for which analysis has been conducted.
This is where non-target analysis (NTA) comes in. This high-resolution mass spectrometry technique operates in an unbiased manner, allowing the most significant compounds in a sample to be determined. Non-target analysis can be used on individual samples or for site studies, and uses advanced statistical approaches to look for differences between contaminated and control samples, with tentative, probable, or confirmed detection of chemicals reported. Results can then be used to identify chemical and pollution sources, and to identify chemical exposure pathways. And while non-target analysis does not allow for quantification of chemical concentrations, results can be used to inform standard chemical analysis that does report concentrations. The Australian Laboratory for Emerging Contaminants at the University of Melbourne is available for collaborative research projects using NTA, and commercial samples can be analysed at Eurofins.
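The statistical comparison at the heart of NTA can be sketched simply: rank the thousands of mass-spectral features an instrument detects by how strongly they differ between contaminated and control samples, and send only the stand-outs for identification. The feature labels, intensities, and ratio threshold below are all hypothetical.

```python
import statistics

# Non-target analysis prioritisation sketch: rank mass-spectral features
# by how strongly they differ between contaminated and control samples.
# Feature labels (m/z @ retention time) and intensities are illustrative.

control = {"m/z 212.1 @ 4.2 min": [1000, 1100, 950],
           "m/z 299.0 @ 7.8 min": [200, 180, 220]}
contaminated = {"m/z 212.1 @ 4.2 min": [1050, 980, 1120],
                "m/z 299.0 @ 7.8 min": [9000, 8500, 9400]}

def prioritise(control, contaminated, min_ratio=10.0):
    """Flag features whose mean intensity in contaminated samples is at
    least min_ratio times the control mean: candidates for identification
    against spectral libraries (tentative/probable/confirmed levels)."""
    flagged = []
    for feature in control:
        ratio = statistics.mean(contaminated[feature]) / statistics.mean(control[feature])
        if ratio >= min_ratio:
            flagged.append((feature, round(ratio, 1)))
    return flagged

print(prioritise(control, contaminated))  # only the second feature stands out
```

Real NTA workflows use far more sophisticated statistics, but the principle is the same: let the data, not a pre-written schedule, decide which chemicals to pursue.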
We are exposed to synthetic chemicals from the moment of birth to the moment of death. It is inescapable, wherever you live on the planet. The exposome, a term coined almost 10 years ago, describes this lifetime of chemical exposure. The assessment of exposure to chemicals at the same time (i.e., combined exposure), or to chemicals while a response to other chemicals is still occurring (i.e., cumulative exposure), is considered within cumulative risk assessment. In such assessments, the reactions of chemicals with each other, and on different parts of organisms at the same time, are considered. While this concept is not new (the cumulative risks from metals to organisms are allowed for in the ASC NEPM, for example), current European guidelines present a framework for such assessment across the broad family of chemicals to which organisms are now exposed, and outline how such chemicals can be grouped to allow for realistic (i.e., combined and cumulative) assessment of hazard and risk.
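A widely used entry point to cumulative risk assessment is the hazard index: sum each chemical's ratio of measured concentration to its guideline value across a group of chemicals with a common effect. The sketch below uses hypothetical concentrations and guideline values to show the key point, that a mixture can pose a risk even when every chemical individually sits below its own guideline.

```python
# Cumulative risk sketch: the hazard index (HI) sums the hazard quotients
# (measured concentration / guideline value) for chemicals grouped by a
# common effect. All values below are illustrative only.

# Hypothetical measured concentrations and guideline values (µg/L).
chemicals = {
    "atrazine": {"measured": 8.0, "guideline": 13.0},
    "simazine": {"measured": 2.0, "guideline": 3.2},
    "diuron":   {"measured": 0.1, "guideline": 0.2},
}

def hazard_index(chemicals):
    """HI = sum of measured/guideline ratios. Each chemical can sit below
    its individual guideline while the mixture as a whole (HI > 1)
    indicates potential cumulative risk."""
    return sum(c["measured"] / c["guideline"] for c in chemicals.values())

hi = hazard_index(chemicals)
print(f"Hazard index = {hi:.2f}")  # every chemical passes alone, yet HI > 1
```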
Currently, pollution or contamination assessment involves sending a sample to a laboratory for quantification of the chemicals specified for analysis. The immediate future holds something quite different: metabolomics used to identify actual chemical impacts on biota; effects-based monitoring to screen for samples with the potential to cause such effects; non-target analysis to identify the chemicals that actually occur in those samples; and cumulative risk assessment to allow for the calculation of thresholds for chemicals that co-occur, or to which co-exposure will occur over organisms’ lifetimes. So, while current Australian guidelines have not protected humans or the environment from the sub-lethal and toxic effects of the chemicals being used all around the world every day, new techniques are available that go a long way towards doing just that.