Copyright:
Disclaimer: The designations employed and the presentation of material throughout this publication do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. The ideas and opinions expressed in this publication are those of the authors; they are not necessarily those of UNESCO and do not commit the Organization.
Citation: Ashbolt, N., Pruden, A., Miller, J., Riquelme, M.V. and Maile-Moskowitz, A. (2018). Antimicrobial Resistance: Fecal Sanitation Strategies for Combatting a Global Public Health Threat. In: J.B. Rose and B. Jiménez-Cisneros (eds), Water and Sanitation for the 21st Century: Health and Microbiological Aspects of Excreta and Wastewater Management (Global Water Pathogen Project). (A. Pruden, N. Ashbolt and J. Miller (eds), Part 3: Specific Excreted Pathogens: Environmental and Epidemiology Aspects - Section 2: Bacteria), Michigan State University, E. Lansing, MI, UNESCO. https://doi.org/10.14321/waterpathogens.29
Acknowledgements: K.R.L. Young, Project Design editor; Website Design: Agroknow (http://www.agroknow.com)
Last published: October 19, 2018
Over the past decades, deaths due to antimicrobial-resistant infections have grown to the point where they are beginning to rival those from traditional water, sanitation and hygiene (WaSH)-related diseases, such as diarrhea. Environmental pathways associated with water and sanitation systems are an important dimension of the global effort to control antimicrobial resistance (AMR). Yet, as discussed in other chapters, control of enteric pathogens should remain the primary focus of any sanitation system. Here, we describe the global occurrence of AMR bacteria within human and animal excreta, the environmental amplification and fate of AMR bacteria within sanitation systems, and techniques for the assessment of AMR. Antibiotic resistance genes (ARGs) may be passed on and taken up by virtually all bacteria, via free DNA (transformation), bacteriophage infection (transduction) and cell-to-cell transfer (conjugation), with the most acute concern arising when they are associated with infectious pathogens. No accepted AMR target for environmental monitoring is in routine use, but various promising ‘AMR indicator targets’ are discussed, including extended-spectrum beta-lactamase-producing E. coli and an important mobile genetic element used by bacteria for ARG uptake, the class 1 integron. In general, treatment reduction of AMR follows the reduction of bacterial pathogens, yet often to a lesser degree. This leads to the potential for ARGs to spread more broadly across bacterial species within environmental niches. Hence, it is important to reduce general loads of bacteria, co-selecting chemical stressors (e.g. antibiotics, biocides), ARGs, and mobile genetic elements in final products, not just pathogens, to reduce the potential uptake and spread of AMR.
Antimicrobial resistance (AMR) is one of the greatest human health challenges of our time; it is predicted to result in more deaths than diarrheal illnesses within the next ten years (WHO, 2016) and may become the leading cause of death by 2050 (O’Neill, 2016a). “Antimicrobials” is a broad term encompassing any agent that kills or inhibits microbes, including bacteria, viruses, and parasites. Some antimicrobials, including heavy metals, quaternary ammonium compounds, and other sanitizers, may be used topically or for general disinfection and hygiene purposes, whereas others are formulated specifically as pharmaceuticals. “Antibiotics” are a subset of antimicrobial pharmaceuticals that specifically kill or inhibit bacteria; the term traditionally referred to natural compounds, although it is now also commonly applied to synthetic forms. Antibiotics, in particular, have come to be relied upon globally as critical life-saving drugs that cure deadly bacterial infections. A wide range of classes of antibiotics have been developed and marketed since penicillin was first discovered in the 1920s, ranging from broad-spectrum antibiotics that target various classes of Gram-negative and Gram-positive bacteria, to narrow-spectrum antibiotics, which ideally target only the pathogen of interest. Resistance occurs when bacteria develop mutations and/or share their antibiotic resistance genes (ARGs) with other bacteria through a process called horizontal gene transfer (HGT). Bacteria that carry ARGs are better able to survive antibiotic therapy, while their non-ARG competitors are diminished. This makes antibiotic treatment a double-edged sword: it can provide a vital cure for bacterial illnesses, while use, overuse, and misuse contribute to increasing rates of antibiotic resistance and failure of these drugs to work. Compounding the issue of AMR is that virulence factors are often associated with ARGs and transferred together via HGT (Giraud et al., 2017). Generally, antibiotic resistance has been observed to emerge in pathogenic bacteria within a few years of new antibiotics being released onto the market, with resistance rates steadily climbing. Table 1 provides key examples of bacterial pathogens and antibiotic resistance trends. Several countries and global entities, such as the World Health Organization (WHO), monitor antibiotic resistance trends in the clinical setting. A list of relevant surveillance programs and databases (as of December 2016) is provided in Table A.1.
To date, there is no global surveillance database for monitoring trends in environmental ARGs. This is, in part, the result of the constant stream of newly discovered genes, but also reflects the inability to conduct molecular analyses in many areas of the world. There are, however, numerous reviews and case studies reporting ARG incidence within clinical isolates (e.g., Poirel et al., 2005; Kazmierczak et al., 2016). For example, Kazmierczak et al. (2016) reported on a global survey of metallo-beta-lactamase (MBL)-encoding genes among carbapenem-resistant bacteria isolated from clinical samples from 40 countries (2012-2014). The distribution of NDM-, VIM-, IMP-, and SPM-type MBL enzymes was 44.2%, 39.3%, 16.5%, and 0%, respectively, among MBL-positive Enterobacteriaceae. In contrast, the distribution of NDM-, VIM-, IMP-, and SPM-type MBL enzymes was 1.0%, 87.7%, 11.3%, and 0%, respectively, among MBL-positive Pseudomonas aeruginosa. The authors reported geographic variations in prevalence as well, with NDM-types more common in the Balkans, Middle East, and Africa; VIM-types more common in Europe and Latin America; and IMP-types more common in Asia-Pacific. To date, no MBL-positive isolates have been detected in Ireland, Denmark, the Netherlands, Sweden, or Israel. Given the rapidly changing scene in AMR detected in various countries, Table 1 simply provides a snapshot of antibiotic resistance rates associated with human infections across a range of regions. Overall, there is a growing pattern of novel AMR pathogens first reported in a single country, with varying rates (rapid or slow) of transfer by human/food carriers to other parts of the world. With respect to environmental surveillance, the WHO, EU, and selected countries in Asia and Africa have initiated a pilot program that targets extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli screened from routinely cultured E. coli identified in water quality studies (Matheu et al., 2017).
The rise of antibiotic resistance has become a well-recognized global public health threat, with several countries and international bodies beginning to maintain surveillance databases (Table A1) and develop strategies for combatting its spread (WHO, 2014; Office of the President, 2015; O'Neill, 2016b). In particular, global organizations, such as the WHO, have emphasized the need for concerted and coordinated efforts aimed at surveillance that include environmental pathways (WHO, 2015). Such surveillance can aid in our understanding of the main causes of resistance and identify management options to limit its spread across international borders, particularly via travel, import/export of food products, and movement of people and their excreta. Of particular concern are sub-lethal doses given to humans and animals that select for resistant strains (Andersson and Hughes, 2014) and residual antimicrobials and ARGs that are released into the environment (Grenni et al., 2018). Hence, the European Union has taken one key step by banning the use of antibiotics in livestock for purposes other than direct disease treatment, though it is clear that such bans alone will not stop the spread of antibiotic resistance (Kalmokoff et al., 2011; Marshall and Levy, 2011; Massow and Ebner, 2013; Bondarczuk et al., 2016; Di Cesare et al., 2016). In particular, enforcement of policy, offering viable alternatives to antibiotics, and identifying practices to prevent livestock illness in the first place are key to reducing antimicrobial use. Here we emphasize the need to consider strategies to contain the spread of AMR that are synergistic with other general environmental and pathogen reduction benefits when developing and implementing sanitation technologies, which in many regions may also include animal manures.
Tables 2 to 5 summarize ARGs conferring clinically concerning resistance to last-resort antibiotics; however, the evolution of antibiotic resistance is dynamic and this list is by no means exhaustive. The advantage of targeting these genes is that they raise a red flag of direct concern to human health, as treatment failure is more likely when pathogens carry these types of resistance. Overall, each of the ARGs corresponding to the WHO (2017) list of AMR bacteria of medium to high concern has been reported in human/animal excreta. Therefore, the environmental release of these genes may provide an effective pathway of transmission unless adequate sanitary management is in place (see Table 4 for a summary of treatment efficacies).
Sanitation is a logical critical control point to aid in reducing the spread of antibiotic resistance. Human and animal waste-streams contain antibiotic resistant bacteria (ARBs), ARGs, antibiotics, metals, and other potential agents that could exert selection pressure for AMR. Depending on how the waste is treated and handled, resistance levels can increase or decrease (Marti et al., 2013; Larson, 2015; Bengtsson-Palme et al., 2016; Bondarczuk et al., 2016; Holman et al., 2016; Qian et al., 2016). Ideally, sanitation technologies can be adapted to serve their intended purpose of minimizing human exposure to fecal pathogens, while also reducing the potential spread of ARGs to human pathogens or to the reservoir of resistance in the natural human, aquatic, and soil microbiomes. However, in order to synergistically achieve these goals, it is critical to understand the nature of risk posed by antibiotic resistance and how it differs from pathogens and fecal indicators that have traditionally served as treatment targets.
Figure 1 illustrates how fecal contamination pathways in the environment may also serve as dissemination routes for the spread of AMR. The spread of AMR is distinct, however, as the DNA that confers resistance (i.e., ARGs) can be spread among different (including non-pathogenic) species of bacteria by HGT mechanisms, including conjugation (mating between bacteria) and transduction (via bacteriophage infection). ARGs from dead organisms, existing as free DNA in the environment, may also be assimilated by downstream bacteria by a process called transformation; hence, disinfection of excreta alone may not be totally effective at preventing the spread of AMR. Also, natural and engineered stressors, such as disinfectants and disinfection by-products (Zhang et al., 2017), can induce mutations, select for biocide resistance, and co-select for ARB, leading to AMR (Baharoglu et al., 2013; Culyba et al., 2015). Sanitation technologies should ideally aim to reduce the conditions for selection and HGT (including to clinical strains) of ARGs and to physically destroy ARGs where possible (Bouki et al., 2013; Al-Jassim et al., 2015; Bengtsson-Palme et al., 2016). Mixing pathogenic bacteria within environments containing high densities of active bacteria and in the presence of selective and stress agents, such as antibiotics and metals, may increase the potential for horizontal transfer of ARGs (Abraham, 2011; Andam and Gogarten, 2011). Mixing waste streams with high concentrations of antibiotics, such as those from pharmaceutical manufacturing facilities or feedlot manures where sub-therapeutic concentrations of antimicrobials are used, with streams containing human pathogens, such as domestic waste, is not recommended (Sidrach-Cardona et al., 2014). Segregated treatment of hospital waste has also been suggested as a “hot spot” control strategy (Rodriguez-Mozaz et al., 2015).
Figure 1. Environmental pathways of AMR showing sanitation as critical control points (red arrows) for dissemination of ARBs and ARGs. Also highlighted are likely hotspots for horizontal gene transfer (HGT). Environmental reservoirs include drinking water sources (groundwater, shallow wells, surface water), recreation/bathing water sources, irrigation (crop, turf), and biosolid/compost/manure storage or land application
An important avenue for focused scientific effort is in the development of human health risk assessment models specifically tailored to antibiotic resistance. Microbial risk assessment, including quantitative microbial risk assessment (QMRA), serves to estimate the probability of human infection, given a defined exposure dose and exposure route(s) (Ashbolt et al., 2013). However, new models are needed that consider HGT and the fact that resistant infection following exposure may not be immediate. For example, elevating resistance levels among non-pathogenic environmental bacteria (e.g., through ineffective sanitation measures or those using high microbial activity) could increase the probability of transferring ARGs to native bacteria and human pathogens in the environment (especially if waste streams are mixed) or potentially to pathogens on human skin or within the gut microbiota itself. The ultimate “risk” then is defined not just as an infection itself, but as failure of antibiotics to cure an infection, or “treatment failure”.
Developing risk models with the goal of informing the management of antimicrobial resistance will take time and will require elements of dynamic disease transmission modeling not traditionally used in QMRA. Thus, we are wise to proceed in parallel with the advancement of mitigation technologies that conservatively target both pathogen and ARG reduction and ideally are low-cost and work within the framework of existing sanitation goals (Pruden et al., 2013).
Environmental and clinical reservoirs of resistance are linked, and both can present conditions that exert selection pressure or that are conducive to HGT, thereby exacerbating the spread of antibiotic resistance. A significant body of scientific literature has grown in the last decade documenting how human activities, along with animal manure management, can serve to increase background levels of resistance in soil and water environments (Singer et al., 2006; Cantas et al., 2013; Rizzo et al., 2013b; Blaak et al., 2015a; Sharma et al., 2016; Singer et al., 2016; Zhu et al., 2017). Together, there is substantial evidence that environmental routes of resistance dissemination can contribute to the evolution of resistant pathogens that ultimately appear in clinics and hospitals (Taylor et al., 2011; Hölzel et al., 2012; Ma et al., 2016a).
In practice, it can (fortunately) be difficult to detect clinically-relevant genes in environmental matrices, which can make them poor targets for certain applications, such as assessing the likely benefits of various sanitation technologies for mitigating the spread of ARGs (Table 2). For this reason, more commonly detected genes in the environment, such as the sulfonamide and tetracycline ARGs, are popular among researchers (Bengtsson-Palme et al., 2016; Pei et al., 2016). While resistance to these antibiotics is rarely a serious clinical concern because their corresponding resistance determinants have become widespread, they can provide informative targets for predicting how ARGs may respond to treatments or behave in the environment. For example, Pruden et al. (2012) reported a near perfect correlation between the sul1 sulfonamide ARG and upstream densities of livestock operations and wastewater treatment plants. Therefore, such commonly occurring genes may serve as “AMR indicator genes”. HGT markers or determinants (Table 5), are not technically ARGs, but are considered to be indicative of the potential for ARGs to be transferred among bacteria, which is arguably the ultimate concern (Gillings, 2014; Culyba et al., 2015; Sharma et al., 2016). If ARGs stay confined within a non-pathogenic host, then this is not as much of a concern as if they are transferred, or have the potential to be transferred, to a pathogen. Targets include gene markers for plasmids, particularly the highly transferrable plasmids such as those within certain incompatibility “inc” groups, integrons, transposons and other mobile genetic elements, all of which have been noted in some cases to carry several ARGs (Chang et al., 2016; Folster et al., 2016; Saito et al., 2016).
Recently it was reported that, similarly to sul1, the intI1 gene encoding the class 1 integrase is a strong indicator of “pollution” (Gillings et al., 2015), including resistance to fluoroquinolones, trimethoprim/sulfamethoxazole, amoxicillin/clavulanate, and piperacillin/tazobactam, and the presence of multidrug-resistant E. coli (Kotlarska et al., 2015). Also, the European COST Action group recommended a strategy of monitoring a mixture of clinical ARGs, indicator ARGs, and gene transfer markers, and an international cross-comparative study led by the NORMAN network is currently underway (Berendonk et al., 2015; COST, 2017; NORMAN, 2017).
Special consideration is needed for the monitoring of antibiotic resistance, particularly for assessing the effectiveness of sanitation technologies and tracking any significant change in the spread of resistance via environmental routes. Monitoring methods largely fall into two classes: 1) culture-based methods and 2) molecular methods. The pros and cons of these methods for tracking antibiotic resistance in the environment have been extensively reviewed (Luby et al., 2016; McLain et al., 2016). Here we provide a brief overview and highlight some key points in the context of local sanitation systems.
When monitoring for AMR it is critical to recognize that, just as antibiotics are largely natural or naturally-derived compounds, there is a ubiquitous background level of antibiotic resistance for certain ARGs (Rothrock et al., 2016). Microbes have evolved the ability to both produce antibiotics (e.g., to ward off competitors), as well as the ability to resist antibiotics (Davies, 2006; Martinez, 2008; Forsberg et al., 2012; Culyba et al., 2015; Westhoff et al., 2017). While it is true that antimicrobial resistance is a natural phenomenon, what has changed in the modern era are the sheer concentrations and loadings of antibiotics and other selective agents to which microbes are being exposed. Elevated levels of antibiotics are a direct result of mass industrial production, use in humans, companion animals and livestock, and corresponding release and excretion into the environment. Thus, ideally, culture-based and molecular-based monitoring technologies are designed to identify changes in the kinds and levels of these resistance indicators against a relevant background.
In terms of culture-based techniques, some consensus is emerging around E. coli as a highly suitable target (Blaak et al., 2015b; Liang et al., 2015), although many other potentially useful bacterial targets, such as Klebsiella spp. (Berendonk et al., 2015), fecal enterococci (Berendonk et al., 2015), and bacteria that grow in aquatic/soil environments such as Pseudomonas aeruginosa (Santoro et al., 2015) or various Aeromonads (Varela et al., 2016) exist. However, E. coli is a practical choice given that it is already the most widely monitored target as an indicator of fecal pollution and thus methodologies are already standardized and infrastructure is more likely to be in place to implement monitoring campaigns based on E. coli (Matheu et al., 2017).
Minimum inhibitory concentrations (MICs) for most antibiotics are well defined for susceptible E. coli, making it relatively straightforward to either incorporate antibiotics into E. coli-selective media or perform MIC/breakpoint assays on isolated bacteria. The latter can be accomplished by broth microdilution in 96-well trays or by the Kirby-Bauer disk diffusion assay (Bauer et al., 1966; CLSI, 2015). This enables assessment of antibiotic resistance under defined conditions: using a viable strain phenotypically expressing resistance in a manner that can be directly compared to known MICs. A further advantage is that E. coli is generally a fecal-associated organism, thus maintaining relevance to tracking fecally-derived sources of antibiotic resistance (Ashbolt et al., 2001). Importantly, some E. coli strains are known pathogens and many strains are also known to be capable of receiving and transferring genes within or between species (Kotlarska et al., 2015).
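As a simple illustration of how breakpoint interpretation works, the sketch below classifies hypothetical E. coli isolates as susceptible, intermediate, or resistant by comparing measured MICs to breakpoint values; the breakpoints and MICs shown are illustrative placeholders only, and in practice the current CLSI or EUCAST tables for the relevant antibiotic and species should be consulted.

```python
# Minimal sketch of MIC breakpoint interpretation for cultured isolates.
# Breakpoint values below are illustrative placeholders; consult current
# CLSI/EUCAST tables for the antibiotic and species of interest.

# Illustrative breakpoints (mg/L): MIC <= "S" is susceptible, MIC >= "R" is resistant.
BREAKPOINTS = {
    "ampicillin":    {"S": 8.0,  "R": 32.0},
    "ciprofloxacin": {"S": 0.25, "R": 1.0},
}

def classify(antibiotic: str, mic_mg_per_l: float) -> str:
    """Return 'S', 'I' (intermediate) or 'R' for a measured MIC."""
    bp = BREAKPOINTS[antibiotic]
    if mic_mg_per_l <= bp["S"]:
        return "S"
    if mic_mg_per_l >= bp["R"]:
        return "R"
    return "I"

# Example: hypothetical MICs (mg/L) for two environmental E. coli isolates
isolates = {
    "isolate_01": {"ampicillin": 64.0, "ciprofloxacin": 0.125},
    "isolate_02": {"ampicillin": 4.0,  "ciprofloxacin": 2.0},
}

for name, mics in isolates.items():
    calls = {ab: classify(ab, mic) for ab, mic in mics.items()}
    print(name, calls)
```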
There are numerous resistant pathogens of major global concern (WHO, 2017), several of them summarized in Table 1; it is unknown to what extent the behavior of resistant E. coli is representative of other resistant pathogens, particularly those that grow well in water/sanitation environments, such as Aeromonas spp., Arcobacter spp., and P. aeruginosa. A further general downside of culture-based techniques is that they will not provide information about the broader microbial ecological behavior of ARGs, given that environmental samples will typically contain billions of microbes and their mobile genetic elements, with culture-based techniques capturing only a small fraction. Methods such as heterotrophic plate counts incorporating antibiotics into their culture media can provide insight into the behavior of broader groups of bacteria than group-selective media, but still will capture only culturable bacteria, a tiny fraction of the true bacterial community (Bartram et al., 2004). All will suffer from not knowing the identities of the isolated bacteria and thus not being able to differentiate acquired resistance from intrinsic resistance (Cox and Wright, 2013; e.g., a Gram-positive organism growing in the presence of an antibiotic targeting Gram-negatives is “intrinsically resistant”). Also, culture-based methods are generally extremely laborious and time-consuming and thus not ideally suited for extensive monitoring or certain research applications.
Molecular-based methods present the advantage of directly targeting ARGs as the presumptive agents of resistance while also circumventing biases associated with culture-based techniques. However, the simple presence of a gene does not mean it is functional or capable of being expressed. ARGs can also be transferred horizontally, thus transcending their bacterial hosts. Further, given that they strongly correlate with anthropogenic inputs (Gaze et al., 2011; Pruden et al., 2013; Rizzo et al., 2013b; Ahammad et al., 2014; Graham et al., 2014; Singer et al., 2016), ARGs have been described as “pollutants” in their own right (Pruden et al., 2006). Several available molecular methods for antibiotic resistance monitoring are summarized in Table 6. Just as there are tens of thousands of species of bacteria in an environmental sample, there appear to be thousands of different types of detectable ARGs. A potential problem in only targeting ARGs via molecular methods is that such genes may not be expressed and/or passed on to pathogens of concern. Expression in cultured isolates provides more definitive information on the functionality of ARGs within a viable host (Wichmann et al., 2014; Ma et al., 2016a; Bengtsson-Palme et al., 2017; Surette and Wright, 2017; Zhu et al., 2017). This raises the question of which ARGs and/or mobile genetic elements to monitor.
As indicated in Table 6, there are several available methods, including qPCR and numerous other assays, that are used for ARG targets. In general, there are three categories of relevant gene targets: 1) ARGs of direct clinical concern; 2) indicator ARGs; and 3) determinants for gene mobilization. ARGs of clinical concern include those encoding resistance to last-resort antibiotics, such as vancomycin, carbapenems or colistin (Hocquet et al., 2016; Mediavilla et al., 2016; Sharma et al., 2016; EFSA, 2017; Al-Tawfig et al., 2017).
Ideally, all ARGs would be monitored in terms of the types present, their relative abundances, their propensity to be horizontally transferred (i.e., occurring on a mobile genetic element such as a plasmid or transposon), and the types of bacterial hosts in which they are present. This is precisely what the new and emerging field of metagenomics seeks to achieve (Pal et al., 2016). Through application of next-generation DNA sequencing technology (e.g., Illumina sequencing, pyrosequencing) and, most recently, third-generation DNA sequencing technology (e.g., PacBio or MinION), DNA extracted from environmental samples can be fragmented and directly sequenced. Through bioinformatics pipelines, the DNA reads can then be compared against available databases of known ARGs, such as RESFINDER for BLAST analysis (Zankari et al., 2012; Zankari et al., 2013), MGMAPPER for mapping of reads (https://cge.cbs.dtu.dk/services/MGmapper/) (Petersen et al., 2017), MEGARes (Lakin et al., 2017), Antibiotic Resistance Gene-ANNOTation (ARG-ANNOT) (Gupta et al., 2014), the Antimicrobial Resistance Database Project (Liu and Pop, 2009), the Comprehensive Antimicrobial Resistance Database (McArthur et al., 2013; Jia et al., 2017), the Structured Antibiotic Resistance Gene Database (ARGs-OAP) (Yang et al., 2016), or deepARG (Arango-Argoty et al., 2017). In this manner, a profile of the types and numbers of ARGs detected in a sample can be obtained and compared with other samples using various bioinformatics techniques and graphical representations. Metagenomics has been successfully applied in this manner for monitoring ARGs in wastewater treatment plants (Schluter et al., 2008; Wang et al., 2013; Yang et al., 2014; Li et al., 2015a; Munck et al., 2015; Zhang et al., 2015a; Bengtsson-Palme et al., 2016; Hu et al., 2016; Karkman et al., 2016; Ma et al., 2016b), biosolids (McCall et al., 2016; Tao et al., 2016; Rowe et al., 2016; Tang et al., 2016), manure (Agga et al., 2015), soil (Yan et al., 2016), rivers (Garner et al., 2016; Rowe et al., 2016), sediments (Cummings et al., 2011), and estuaries (Port et al., 2012). However, next-generation DNA sequencing technologies are still costly and require a high level of expertise, currently restricting metagenomic analysis to the realm of research, although that may soon change.
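As a minimal sketch of the downstream bioinformatics step, the example below tallies ARG hits from a tabular (outfmt 6) BLAST or DIAMOND search of metagenomic reads against one of the ARG databases listed above; the example hits, identity and alignment-length cutoffs, and read count are illustrative assumptions, and production pipelines such as ARGs-OAP or deepARG implement more rigorous filtering and normalization.

```python
# Minimal sketch: tally ARG hits from a tabular BLAST/DIAMOND search of
# metagenomic reads against an ARG reference database. All values below
# (hits, cutoffs, read count) are illustrative assumptions.
from collections import Counter
import io

MIN_IDENTITY = 80.0            # percent identity cutoff (illustrative)
MIN_ALN_LEN = 50               # minimum alignment length (illustrative)
READS_IN_SAMPLE = 10_000_000   # total reads sequenced, used for normalization (illustrative)

# A few illustrative rows in standard tabular (-outfmt 6) format:
# query, subject, %identity, alignment length, ... (remaining columns ignored here)
example_hits = io.StringIO(
    "read_0001\tsul1\t98.5\t120\n"
    "read_0002\ttet(W)\t91.2\t110\n"
    "read_0003\tsul1\t72.0\t115\n"      # fails identity cutoff
    "read_0004\tblaNDM-1\t99.0\t45\n"   # fails alignment-length cutoff
)

arg_counts = Counter()
for line in example_hits:
    query, subject, identity, aln_len = line.rstrip("\n").split("\t")[:4]
    if float(identity) >= MIN_IDENTITY and int(aln_len) >= MIN_ALN_LEN:
        arg_counts[subject] += 1

# Report hits per million reads so profiles can be compared across samples
for gene, count in arg_counts.most_common():
    print(f"{gene}\t{count / READS_IN_SAMPLE * 1e6:.2f} hits per million reads")
```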
To cut costs, metagenomic studies often sequence multiple samples per lane; for example, a shallow sequencing approach multiplexing ten wastewater activated sludge samples per Illumina lane or flow cell was still able to detect and compare ARG profiles (Cai and Zhang, 2013). However, deep sequencing (e.g., one sample per Illumina lane), which is even more costly, may be required to filter out dominant and housekeeping genes and identify rare ARGs of interest. Deep sequencing is also typically necessary to link ARGs with host bacteria and genetic elements, along with sophisticated genome assembly techniques, which require a high level of expertise, are not standardized, and remain error-prone. DNA sequencing costs are predicted to decrease significantly in the coming years and new user-friendly technologies are currently in the pipeline (Schmidt et al., 2017). In particular, third-generation DNA sequencing technologies reduce cost and produce longer reads, which will facilitate assembly and thus the identification of which hosts carry ARGs and whether they are associated with mobile genetic elements. Thus, the metagenomic approach may soon become a widely accessible gold standard for ARG monitoring.
While metagenomic methods are still under development, quantitative polymerase chain reaction (qPCR) has become a well-established tool for monitoring ARG targets of interest. Using one of several available fluorescence-based assays, a real-time PCR instrument, and appropriate standard curve, it is possible to precisely quantify the ARG/determinant of interest. Such quantitation has been of value for quantifying anthropogenic inputs of ARGs to the environment (Pruden et al., 2012; Graham et al., 2016) and assessing the effectiveness of waste treatment technologies (Ma et al., 2011; Narciso-da-Rocha and Manaia, 2017). The disadvantage of qPCR is that, while it is less time-demanding than culture-based techniques, it is realistically only possible to include a handful of ARGs in any monitoring scheme. This necessitates selecting appropriate ARG monitoring targets.
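A minimal sketch of this quantitation step is shown below: a standard curve is fit by linear regression of quantification cycle (Cq) against log10 gene copies, and unknown samples are then interpolated. The Cq values and dilution series are illustrative only, and in practice amplification efficiency, inhibition controls, and normalization (e.g., to 16S rRNA gene copies) also need to be considered.

```python
# Minimal sketch of qPCR quantification of an ARG (e.g., sul1) from a
# plasmid standard curve. Cq values and copy numbers below are illustrative.
import numpy as np

# Dilution series of a plasmid standard: known log10 copies per reaction vs. measured Cq
log10_copies = np.array([7, 6, 5, 4, 3, 2], dtype=float)
cq_standards = np.array([14.2, 17.6, 21.1, 24.5, 27.9, 31.4])

# Fit Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(log10_copies, cq_standards, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ideal amplification efficiency is ~1.0 (100%)
print(f"slope = {slope:.2f}, efficiency = {efficiency * 100:.0f}%")

def copies_from_cq(cq: float) -> float:
    """Interpolate gene copies per reaction from a measured Cq."""
    return 10 ** ((cq - intercept) / slope)

# Example: an unknown effluent sample measured in triplicate
sample_cqs = [25.3, 25.5, 25.1]
mean_copies = sum(copies_from_cq(c) for c in sample_cqs) / len(sample_cqs)
print(f"~{mean_copies:.0f} gene copies per reaction")
```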
Antibiotic resistance has existed on earth for millennia and evolved with bacteria; thus, it is important to benchmark the success of AMR control/mitigation with respect to an appropriate background (Rothrock et al., 2016). Background distributions of various levels of ARGs exist, along with the various mechanisms described above for their selection and transfer. Therefore, the concern addressed in this chapter with respect to environmental reservoirs of ARGs is the intensification (‘hot-spot’ development) of the resistome and the potential for vertical (within a species) or horizontal (between species) gene transfer within the environment (von Wintersdorff et al., 2016), and ultimately to clinically-relevant bacteria, associated with sanitation-related technologies.
While evidence for the role of environmental pathways in AMR of clinical relevance exists today (Quintela-Baluja et al., 2015), it has not yet become a high priority for healthcare professionals. This apparent lack of awareness has also limited the financing of studies to clarify the role of the environment, with the few funded projects thus far focusing on food, livestock wastes, and companion animals (e.g., EU Project EFFORT, http://www.effort-against-amr.eu/page/activities.php, Songe et al., 2017). Therefore, we present examples of AMR environmental reservoirs that particularly highlight the scientific plausibility of concern should mitigation/reduction approaches not be incorporated into sanitation systems. While most understanding of AMR is associated with bacteria and their bacteriophages, enteric viral, parasitic protozoan and helminth pathogens could also develop antimicrobial resistance, but they are not capable of horizontal gene transfer in the sanitation environment the way that bacteria are. Table 4 provides a comprehensive summary of the likely efficacy of AMR reduction by sanitation systems at the time of writing this chapter. However, it is important to note that some studies report on a culture basis while others use molecular methods, and information about the efficacies of these treatments is evolving.
Ample evidence exists for native (autochthonous) bacteria in the environment taking up and maintaining ARGs (Walsh et al., 2011; Cantas et al., 2013), such as vancomycin-resistant Enterococcus faecium (VREfm) (Sacramento et al., 2016). Indeed, the study of antibiotic-resistance mechanisms in environmental bacteria is shedding light on novel pathways of resistance found in pathogens (Spanogiannopoulos et al., 2014). Particular concern may come from spore-forming clostridia, given the persistence of their spores in soil systems (Gondim-Porto et al., 2016) and the increasing resistance within pathogens like Clostridium difficile (Zaiss et al., 2010; Garner et al., 2015). However, as with a range of bacterial pathogens, non-pathogenic sub-species or clades are likely to exist in the environment that are not only poorly documented, but would also confound the relevance of detections, as is the case for C. difficile carrying ARGs in the environment (Janezic et al., 2016). Furthermore, ongoing genetic studies are leading to bacterial reclassifications, with C. difficile now assigned to a new genus, as Clostridioides difficile (Lawson et al., 2016).
Amongst the various determinants associated with ARG capture, uptake and transfer within bacteria (Singer et al., 2016), class 1 integrons (e.g. intI1, integrase of class 1 integrons) are often involved (Stalder et al., 2012). Class 1 integrons routinely contain mobile antibiotic and biocide-resistance genes (Stokes and Gillings, 2011) and are described as part of the “mobilome” (Tian et al., 2016). For example, class 1 integrons were validated as a proxy for anthropogenic ARG inputs to the Thames River basin by Amos et al. (2015), who modeled various contributing factors impacting environmental resistome presence and determined that wastewater effluent was the major source. Class 1 integrons may also reflect the history of ARG input to soil, as seen in sludge amended soils (Burch et al., 2014), which may respond in a similar way to soils impacted by open defecation or applied excreta following a range of treatment options.
Given that ARGs, their corresponding bacteria, and the environments in which they have been identified are quite numerous and have been fairly widely surveyed at this point, here we focus on exemplar scenarios (e.g., the worldwide spread of mcr-1 gene resistance within a year (Liakopoulos et al., 2016)) and opportunities to limit the potential enrichment of “hot-spots” for ARG amplification (Pruden et al., 2013). The efficacy of such interventions could then be tracked with respect to the prevalence of AMR surrogates, such as class 1 integrons or other “indicator” ARGs identified in Table 3 (e.g., Spanogiannopoulos et al., 2014; Blaak et al., 2015a), alongside efforts to minimize the environmental release of antibiotics, biocides and metals that are known to increase selection for AMR (Di Cesare et al., 2016; Singer et al., 2016).
In addition to whole cells containing ARGs, there is a need to consider extracellular ARGs, as focusing only on genes within allochthonous (i.e., introduced) bacteria (or other cellular pathogens) may miss the development or release of important ARGs. Hence, in addition to the use of molecular methods to assess the environmental resistome, as described above, we need to consider extracellular ARG uptake by naked DNA (transformation) and bacteriophage (transduction) mechanisms. While novel gene uptake by transduction is generally considered important (Ross and Topp, 2015), there are mixed views as to the significance of ARGs within environmental bacteriophages for the development of environmental AMR, due to misinterpretation of sequence information (Enault et al., 2017) and given the high concentration of active host cells generally needed for such interactions, as seen in clinical environments (Stanczak-Mrozek et al., 2015). Nonetheless, environmental transduction has been demonstrated (Anand et al., 2016), and the persistence of ARGs is clearly influenced by the greater persistence of bacteriophages in the environment versus ARB (Calero-Cáceres and Muniesa, 2016) or by novel superspreaders (Keen et al., 2017). Therefore, sanitation processes that are focused on enteric virus nucleic acid elimination may also be effective in reducing the release and presence of bacteriophage/plasmid-mediated environmental ARGs.
Furthermore, naked DNA uptake of ARGs (transformation) is also possible during or after inactivation of pathogens and their subsequent release of ARGs (genomic or plasmid-borne). For example, advanced oxidation processes generate reactive oxygen species (ROS), which can damage cell membranes and elicit cellular SOS responses. The SOS response has been shown to increase integrase activity and the rate of gene recombination, increase the rate of HGT (Beaber et al., 2004; Guerin et al., 2009; Baharoglu et al., 2012), and increase competence which in turn may promote plasmid transformation in wastewater treatment (Ding et al., 2016). Other environmental stresses (Aertsen and Michiels, 2006), such as heat shock (Layton and Foster, 2005), starvation (Bernier et al., 2013), high hydrostatic pressure (Aertsen et al., 2004), and high pH, as well as the presence of antimicrobials, disinfection chemicals or UV have also been shown to induce the SOS response (Poole, 2012).
Given the above general discussion of likely mechanisms for environmental ARG amplification and spread, some guidance is presented below to highlight possible management options to reduce environmental AMR spread via sanitation systems.
In general, manures and sewage sludge (biosolids) are recognized as the matrices with the highest concentrations of ARGs and antimicrobials, possibly up to 1000-times the concentrations present in wastewater effluents (Munir et al., 2011). Therefore, it is most important to control ARG release from these excreta-related solids. Significant reductions in ARGs are possible via bio-drying sludge (a 10 to 15-day process) compared to traditional composting (30 to 50 days). For example, Zhang et al. (2016a) demonstrated by molecular methods some 0.4 to 3.1 log10 reductions in ARGs and a similar level of reduction in mobile genetic elements with bio-drying. The success in reductions was related to changes in the microbial communities that developed (microbiomes), which largely reflected physiochemical changes, such as pH, available nutrients, temperature, and moisture content (Zhang et al., 2016a). Hence, manipulation of the microbiome, as also seen in anaerobic digestion and composting (Youngquist et al., 2016), influences the fate of ARGs. With regard to persistent spore-forming bacteria as indicators, it seems that the fecal indicator Clostridium perfringens may be a conservative indicator for ARG-containing C. difficile spores with regard to thermal (composting) treatment (Xu et al., 2016).
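For reference, the log10 reduction values quoted throughout this section are calculated as in the short sketch below; the influent and effluent gene abundances shown are illustrative numbers only, and in practice ARG copies are often normalized (e.g., to 16S rRNA gene copies or dry solids mass) before treatment performance is compared.

```python
# Minimal sketch: computing a log10 reduction value (LRV) for an ARG across
# a treatment step. The abundances below are illustrative only.
import math

def log10_reduction(before: float, after: float) -> float:
    """LRV = log10(abundance before treatment / abundance after treatment)."""
    return math.log10(before / after)

# Example: hypothetical tet(W) copies per gram of dry solids before and after bio-drying
before = 5.0e8
after = 4.0e6
print(f"LRV = {log10_reduction(before, after):.1f} log10 units")   # ~2.1
```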
A recent review by Youngquist et al. (2016) suggests that mesophilic anaerobic digestion virtually eliminates ARB when assayed using culture-based methods (Beneragama et al., 2013). However, ARGs readily move among viable bacteria in the community, most of which are unlikely to be culturable on standard agar plates. This highlights the importance of utilizing direct measures (such as sequence-based resistome or qPCR assays) to detect ARGs. While most of these molecular-based methods fail to discriminate between dead and living targeted cells, quantitative changes can still be followed. For example, Christgen et al. (2015) demonstrated, in an evaluation of six different treatment trains for treating domestic wastewater, that a combination of anaerobic digestion followed by aerobic polishing provided the greatest reduction in ARGs identified by sequencing. Nonetheless, while the anaerobic-aerobic sequencing treatment of domestic wastewater effectively reduced aminoglycoside, tetracycline, and β-lactam ARG levels relative to anaerobic units, sulfonamide and chloramphenicol ARG levels were largely unaffected by any treatment, and there was also a general increase in multi-drug resistance presence in all effluents (Christgen et al., 2015). Hence, further treatment or containment of effluent would be necessary to minimize potential AMR issues, as subsequent soil application may not result in effective removal across the range of ARGs and their determinants (Burch et al., 2014). Despite the genetic burden of carrying a functional integrase, modeling indicates that the presence of this gene enables a population to respond rapidly to changing selective pressures, so maintenance of class 1 integrons is no surprise (Engelstadter et al., 2016).
In summary, ARG transfer and potential increase within the native microbiota is very likely in any sanitation system where microbial activity is encouraged (such as anaerobic digestion, trickling filters, aerobic reactors, compost, stored urine or wastewater lagoons), and in general in sludge/biosolids/biofilms, which support high-density growth because of their higher solids content, including microorganisms. Key factors for AMR transfer include selecting factors (antimicrobial, biocide and heavy metal concentrations), biotic processes (biofilm growth, high bacteriophage density, mobile genetic elements, etc.), and certain abiotic conditions (pH, temperature, moisture content, sunlight) that favor microbial activity. Specific issues with different treatment (unit) processes are discussed next.
Treatment technologies that provide benefits for inactivating bacterial pathogens and which also may help to minimize the spread of antibiotic resistance.
Most sanitation processes involve bacterial activity, and given the above discussion on the inevitable mobilization of ARGs to members of the resident microbial community, we need to focus on actions documented to reduce ARGs or influential mobilome elements, as recently reviewed (Bouki et al., 2013; Rizzo et al., 2013b; Sharma et al., 2016). Common unit processes are briefly reviewed below so as to give a sense of which issues to consider, in addition to the traditional focus on pathogens.
If lime or similar alkaline compounds (e.g., fly ash) are added to dry sanitation systems and the pH exceeds 10, then much of the above discussion on pH and ammonia effects would be expected to be applicable in terms of reducing ARG occurrence. Desiccation may also be important via inactivation of microbial processes and, in general, 12 months of storage time is recommended for pathogen control (Schönning et al., 2007).
For collected urine (yellow water), there is a high likelihood of residual antimicrobial compound presence (i.e., selecting factors), along with antibiotic-resistant urinary tract bacterial pathogens (Ejrnaes, 2011). Hence, minimizing transfer to the highly active bacteria community within separated urine streams is important, but largely an unreported aspect to date (Pynnonen and Tuhkanen, 2014; Bischel et al., 2015). Current pathogen control regulations for source-diverted urine recommend around six months of storage for pathogen inactivation (Höglund et al., 2002; Tilley, 2016); however, reductions in antimicrobials may only be some 42-99% for anti-tuberculosis drugs and < 50% for some antivirals and antibiotics (Jaatinen et al., 2016). Therefore, additional treatments, such as UV alone or in combination with peroxydisulfate, are recommended to further eliminate antimicrobials in collected urine (Zhang et al., 2016b). However, based on the principles described above, the native microbiota within stored urine would be expected to accumulate ARGs, hence soil application or further treatment is recommended to reduce AMR issues.
Sediments within sanitation wetland/pond systems and receiving water sediments may be “hot-spots” for AMR development (Cummings et al., 2011), due to increased microbial activity and influx of wastewater-borne ARGs compared to free-waters above. Nonetheless, constructed wetlands have been shown to effectively reduce ARGs (log10 reductions of 0.26-3.3) and antimicrobials (Huang et al., 2015; Chen et al., 2016) and thus could provide a net protective effect prior to effluent reuse applications in agriculture.
While conventional WWTPs do not appear to reduce the (normalized) integron copy number, they do reduce the diversity of gene cassette arrays measured in the raw wastewater (Stalder et al., 2014), the plasmid resistome (Szczepanowski et al., 2009), and ARGs generally by some 33-98% (Tao et al., 2014). These findings imply aerobic treatment may be beneficial with respect to abating ARGs, but not a complete barrier to AMR. To reduce the cost of aeration, a combined anaerobic-aerobic system is also effective in reducing many but not all ARG types (Christgen et al., 2015), as discussed in Section 3.2.1.
High pH treatment is often used to sanitize biosolids and is also effective in reducing ARGs (Munir et al., 2011). This knowledge has also led to the application of anaerobic fermentation at pH 10 (Huang et al., 2016). Not only were ARGs reduced (compared to a neutral pH control) by some 0.4 to 1.4 log10 units, but the high pH also reduced potential ARG hosts within the microbial community, as well as ARG-associated naked DNA and bacteriophages (Huang et al., 2016). In a broad comparison of the effects of various wastewater biosolids stabilization technologies (air drying, aerobic digestion, mesophilic anaerobic digestion, thermophilic anaerobic digestion, pasteurization, and alkaline stabilization), alkaline stabilization was amongst the most effective for accelerating the decay of intI1, tet(X), tet(A), tet(W), sul1, erm(B), and qnrA following soil amendment (Burch et al., 2017). This provides another example of how changing the microbiome may also assist in reducing AMR risk, and could readily be applied via, for example, a lime-treatment stage for excreta-related solids prior to use.
Sanitation residuals applied to land, even following appropriate sludge/manure treatment to reduce AMR issues (see 3.2.1), could result in re-amplification of ARGs. Therefore, it is of interest to understand how to manage soil amendments containing sludge/manure to encourage further biodegradation. For example, when biosolids were added to sandy and silty-loam soils, the decay rates for a group of five ARGs and the integrase of the class 1 integron (intI1) were considerably slower than those reported for wastewater treatment unit operations such as anaerobic digestion, with half-lives ranging from 13 days (for erm(B), with 100 g of biosolids/manure per kg soil) to 81 days (for intI1 at 40 g.kg-1) (Burch et al., 2014; Fahrenfeld et al., 2014; Sharma et al., 2016).
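To illustrate what such half-lives imply for field management, the sketch below uses a simple first-order decay model to estimate the time required for a given log10 reduction of an ARG in amended soil; first-order kinetics is an assumption made here for illustration, and the half-life values are taken from the ranges quoted above.

```python
# Minimal sketch: first-order decay of ARGs in biosolids/manure-amended soil.
# Assumes first-order kinetics; half-lives are illustrative values from the
# ranges quoted above.
import math

def days_for_log10_reduction(half_life_days: float, target_log10: float) -> float:
    """Time needed to achieve a target log10 reduction under first-order decay."""
    k = math.log(2) / half_life_days          # first-order rate constant (1/day)
    return target_log10 * math.log(10) / k

for gene, half_life in [("erm(B)", 13.0), ("intI1", 81.0)]:
    t = days_for_log10_reduction(half_life, target_log10=1.0)
    print(f"{gene}: ~{t:.0f} days for a 1-log10 reduction (half-life {half_life:g} d)")
```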
Processes that stress, but do not kill, targeted bacteria provide a mechanism to select for stress-resistant biotypes, including those with enhanced uptake of ARGs. For example, using an E. coli model, Guo et al. (2015) demonstrated that moderate to high doses of UV (>10 mJ.cm-2) or chlorine (>80 mg Cl.min.L-1) greatly suppressed ARG transfer, but lower levels of chlorination (up to 40 mg.min.L-1) led to a 2 to 5-fold increase in conjugative ARG transfer. A similar increased risk of ARG transfer following chlorination has also been reported by others (Rizzo et al., 2013a). The other common oxidant, ozone, also appears less effective, requiring an exposure of 127.15 mg.min.L-1 for 4-log removal of the pB10 plasmid, which was 1.04- and 1.25-fold higher than the exposures required for ARB (122.73 mg.min.L-1) and a model non-antibiotic-resistant bacterial strain, E. coli K-12 (101.4 mg.min.L-1), respectively (Pak et al., 2016).
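The fold-differences quoted for ozone follow directly from the reported CT (concentration x time) exposures, as in the brief check below.

```python
# Quick check of the reported fold-differences in ozone CT exposure (mg.min.L-1)
# required for 4-log removal of the pB10 plasmid vs. ARB and E. coli K-12.
ct_plasmid, ct_arb, ct_k12 = 127.15, 122.73, 101.4
print(f"plasmid vs ARB:  {ct_plasmid / ct_arb:.2f}-fold")    # ~1.04
print(f"plasmid vs K-12: {ct_plasmid / ct_k12:.2f}-fold")    # ~1.25
```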
However, when using molecular methods to collectively assay an array of genes comprising the “resistome”, UV treatment has been reported to reduce tetX and 16S rRNA genes by only 0.58 and 0.60 log10 units, respectively, with other genes reduced by only 0.36 to 0.40 log10 units even when the dose was increased to 250 mJ.cm-2 (Zhang et al., 2015b). Hence, Zhang et al. (2015b) recommended a sequential UV/chlorination treatment to enhance ARG removal, which has also been shown to be effective with 0.05 to 2.0 mg.L-1 chlorination (Lin et al., 2016a).
Another biocide used in sanitation is ammonia nitrogen (NH3-N) (Fidjeland et al., 2015), which can also be used in combination with chlorination to enhance ARG removal (1.2-1.5 log10 reduction at a Cl2:NH3-N ratio over 7.6:1) (Zhang et al., 2015b).
Hence, with due consideration of modes of activity, both UV and chlorination can be effective in reducing ARGs and mobile genetic elements rather than co-selecting for them (Lin et al., 2016b). Overall, known benefits of such disinfection processes for pathogen reduction likely outweigh lesser established concerns regarding potential to enhance AMR.
An important consideration when using molecular methods to assess the effectiveness of disinfectants is that it is essential to employ as long a qPCR amplicon product as possible (e.g., 1,000 bp) in order to capture sufficient DNA damage and for the kinetics to be meaningful (McKinney and Pruden, 2012). Further, a re-growth step following disinfection and before molecular analysis can aid in determining what the net effect of disinfection will be downstream, in terms of selection of potentially more resistant strains.
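The reasoning behind preferring long amplicons can be illustrated with a simple model: if disinfection-induced DNA lesions are assumed to occur independently at a fixed rate per base (a Poisson assumption made here purely for illustration, not taken from the cited study), the probability that an amplicon contains at least one lesion, and therefore fails to amplify, grows with amplicon length, as sketched below.

```python
# Illustrative sketch: probability that a qPCR amplicon contains at least one
# disinfection-induced lesion, assuming lesions occur independently at a fixed
# per-base rate (Poisson model). The lesion density is an assumed value.
import math

lesions_per_base = 1e-3   # assumed lesion density after disinfection (illustrative)

for amplicon_bp in (100, 200, 500, 1000):
    p_damaged = 1.0 - math.exp(-lesions_per_base * amplicon_bp)
    print(f"{amplicon_bp:>5} bp amplicon: P(>=1 lesion) = {p_damaged:.2f}")
```

Under these assumptions a 100 bp amplicon registers damage only about 10% of the time, whereas a 1,000 bp amplicon does so about 63% of the time, which is why longer amplicons better reflect the loss of gene integrity caused by disinfection.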