WORLD BANK TECHNICAL PAPER NUMBER 51
Integrated Resource Recovery
Wastewater Irrigation in Developing Countries: Health Effects and Technical Solutions
Hillel I. Shuval, Avner Adin, Badri Fattal, Eliyahu Rawitz, and Perez Yekutiel
UNDP Project Management Report Number 6
A joint contribution by the United Nations Development Programme and the World Bank to the International Drinking Water Supply and Sanitation Decade

WORLD BANK TECHNICAL PAPERS
No. 1. Increasing Agricultural Productivity
No. 2. A Model for the Development of a Self-Help Water Supply Program
No. 3. Ventilated Improved Pit Latrines: Recent Developments in Zimbabwe
No. 4. The African Trypanosomiases: Methods and Concepts of Control and Eradication in Relation to Development
(No. 5.) Structural Changes in World Industry: A Quantitative Analysis of Recent Developments
No. 6. Laboratory Evaluation of Hand-Operated Water Pumps for Use in Developing Countries
No. 7. Notes on the Design and Operation of Waste Stabilization Ponds in Warm Climates of Developing Countries
No. 8. Institution Building for Traffic Management
(No. 9.) Meeting the Needs of the Poor for Water Supply and Waste Disposal
No. 10. Appraising Poultry Enterprises for Profitability: A Manual for Investors
No. 11. Opportunities for Biological Control of Agricultural Pests in Developing Countries
No. 12. Water Supply and Sanitation Project Preparation Handbook: Guidelines
No. 13. Water Supply and Sanitation Project Preparation Handbook: Case Studies
No. 14. Water Supply and Sanitation Project Preparation Handbook: Case Study
(No. 15.) Sheep and Goats in Developing Countries: Their Present and Potential Role
(No. 16.) Managing Elephant Depredation in Agricultural and Forestry Projects
(No. 17.) Energy Efficiency and Fuel Substitution in the Cement Industry with Emphasis on Developing Countries
No. 18. Urban Sanitation Planning Manual Based on the Jakarta Case Study
No. 19. Laboratory Testing of Handpumps for Developing Countries: Final Technical Report
No. 20. Water Quality in Hydroelectric Projects: Considerations for Planning in Tropical Forest Regions
No. 21. Industrial Restructuring: Issues and Experiences in Selected Developed Economies
No. 22. Energy Efficiency in the Steel Industry with Emphasis on Developing Countries
No. 23. The Twinning of Institutions: Its Use as a Technical Assistance Delivery System
No. 24. World Sulphur Survey
( ) Indicates number assigned after publication. (List continues on the inside back cover.)

Integrated Resource Recovery
UNDP Project Management Report Number 6
INTEGRATED RESOURCE RECOVERY SERIES GLO/80/004 NUMBER 6
This is the sixth in a series of reports being prepared by the Resource Recovery Project as part of a global effort to realize the goal of the United Nations International Drinking Water Supply and Sanitation Decade, which is to extend domestic and community water supply and sanitation services throughout the developing world during 1981 to 1990. The project objective is to encourage resource recovery as a means of offsetting some of the costs of community sanitation.
Volumes published to date include:
Recycling from Municipal Refuse: A State-of-the-Art Review and Annotated Bibliography
Remanufacturing: The Experience of the United States and Implications for Developing Countries
Aquaculture: A Component of Low Cost Sanitation Technology
Municipal Waste Processing in Europe: A Status Report on Selected Materials and Energy Recovery Projects
Anaerobic Digestion: Principles and Practices for Biogas Systems
Other proposed volumes in this series include reports on: Composting, Demand Analysis, Transferable Technologies, and Ultimate (marine) Disposal.
Cover photographs (clockwise from top): Reuse of effluents from Mexico City in the State of Hidalgo; more than 50,000 ha are irrigated with effluents, which makes it the largest such scheme in the world. High-quality treated effluent for irrigation from a waste stabilization pond complex in Amman, Jordan. Effluents are used in sprinkler irrigation of nonedible industrial crops in Israel. Aquaculture using wastewater yields about 8 tons of fish per ha per year in India. Treated effluent in Mexico City, where 4.5 m³/sec are reused for landscape irrigation, recreational impoundments, and industry. Seedlings used to reforest desert lands in coastal Peru are irrigated by treated effluents from the San Juan, Lima, waste stabilization ponds.
WORLD BANK TECHNICAL PAPER NUMBER 51
Wastewater Irrigation in Developing Countries: Health Effects and Technical Solutions
Hillel I. Shuval, Avner Adin, Badri Fattal, Eliyahu Rawitz, and Perez Yekutiel
The World Bank, Washington, D.C., U.S.A.
Copyright © 1986 The International Bank for Reconstruction and Development/THE WORLD BANK, 1818 H Street, N.W., Washington, D.C. 20433, U.S.A. All rights reserved. Manufactured in the United States of America. First printing May 1986.
This is a document published informally by the World Bank. In order that the information contained in it can be presented with the least possible delay, the typescript has not been prepared in accordance with the procedures appropriate to formal printed texts, and the World Bank accepts no responsibility for errors. The publication is supplied at a token charge to defray part of the cost of manufacture and distribution. The World Bank does not accept responsibility for the views expressed herein, which are those of the author(s) and should not be attributed to the World Bank or to its affiliated organizations. The findings, interpretations, and conclusions are the results of research supported by the Bank; they do not necessarily represent official policy of the Bank. The designations employed, the presentation of material, and any maps used in this document are solely for the convenience of the reader and do not imply the expression of any opinion whatsoever on the part of the World Bank or its affiliates concerning the legal status of any country, territory, city, area, or of its authorities, or concerning the delimitation of its boundaries or national affiliation.
The most recent World Bank publications are described in the annual spring and fall lists; the continuing research program is described in the annual Abstracts of Current Studies. The latest edition of each is available free of charge from the Publications Sales Unit, Department T, The World Bank, 1818 H Street, N.W., Washington, D.C. 20433, U.S.A., or from the European Office of the Bank, 66, avenue d'Iéna, 75116 Paris, France.
Hillel I. Shuval is director, and Badri Fattal and Perez Yekutiel are on the staff, of the Environmental Health Laboratory of the School of Public Health and Community Medicine at the Hebrew University-Hadassah Faculty of Medicine, Jerusalem. Avner Adin is with the Division of Human Environmental Sciences of the School of Applied Sciences and Technology at Hebrew University. Eliyahu Rawitz is with the Department of Soil and Water Sciences of the Faculty of Agriculture at Hebrew University.
Library of Congress Cataloging-in-Publication Data
Wastewater irrigation in developing countries. (World Bank technical paper, ISSN 0253-7494 ; no. 51) (UNDP project management report ; no. 6) (Integrated resource recovery ; no. 6) Bibliography: p. 1. Water reuse--Hygienic aspects--Developing countries. 2. Sewage irrigation--Hygienic aspects--Developing countries. 3. Water--Purification. I. Shuval, Hillel I., 1926- . II. Series. III. Series: UNDP project management report ; no. 6. IV. Series: Integrated resource recovery series ; no. 6. RA598.5.H43 1986 363.7'28 86-7742 ISBN 0-8213-0763-0
ABSTRACT
This report summarizes information on practices of wastewater reuse for agriculture in developing and developed countries around the world and reviews the public health and technological aspects of irrigation with wastewater. It evaluates the potential health effects from such reuse and proposes effective and economic methods of control that are particularly suited to developing countries. A theoretical model is developed, based on a review of available credible epidemiological studies and reports, to assist in predicting the degree of risk of disease transmission associated with various wastewater reuse practices. The empirical evidence and the model suggest that the highest risk of pathogen transmission, infection, and sickness is associated with the helminths, followed in order by bacterial infections and last by viral infections. The model provides a basis for evaluating control options. Although certain health risks are clearly associated with the use of raw wastewater in agriculture, the epidemiological evidence assembled for this study also suggests that the very stringent wastewater irrigation standards developed in many of the industrialized countries are overly restrictive. This study suggests a guideline for unrestricted wastewater irrigation based on an effluent with less than one nematode egg (Ascaris or Trichuris) per liter and a geometric mean fecal coliform concentration of 1,000 per 100 ml. Technological and policy options for reducing and controlling any health risks of wastewater reuse in agriculture are evaluated here. In particular, multicell stabilization ponds with 20 days' detention time effectively remove bacterial, viral, and helminth pathogens in a low-cost, robust, easy-to-operate system that is especially suitable for developing countries. Appropriate wastewater treatment in combination with controlled irrigation techniques and restrictive cropping practices represents an effective set of remedial measures. This study provides a rational basis for the development of a sound economic approach to wastewater irrigation in developing countries. Such an approach helps to conserve water and nutrient resources, promotes agricultural development, and contributes to pollution control.
- vii - FOREWORD In 1981, a three-year Global Research and Development Project on Integrated Resource Recovery (Waste Recycling) was initiated as Project GLO/80/004 by the United Nations Development Programme through its Division for Global and Interregional Projects. The World Bank, through its Water Supply and Urban Development Department (WUD), agreed to act as executing agency. The primary project goal is to achieve economic and social benefits through sustainable resource recovery activities in the developing countries by recycling and reusing solid and liquid wastes from municipal and commercial sources. Increasing recognition of the need for technical and economic effi- ciency in allocating and utilizing resources and the role that appropriate recycling can play in the water and sanitation sector have led this project to be included in the formal activities of the United Nations International Drinking Water Supply and Sanitation Decade. The reuse of domestic wastewater and the recycling of other human wastes in agriculture can produce significant economic benefits and help defray the large costs of municipal waste manage- ment. The recycling of human wastes to add nutrients to, and improve the physical quality of, the soil is an ancient practice. In its modern form, the reuse of wastewater effluents for irrigation of crops offers attractive benefits, such as increasing water supplies for productive agricultural use, adding valuable fertilizers and micronutrients to maintain soil fertility, and reducing pollution of surface water sources. Possible negative effects to people who consume edible crops con- taminated by uncontrolled wastewater irrigation practices or to farmers who are directly exposed to wastewater irrigation have to be carefully evaluated so that remedial measures can be taken to assure that the public reaps the full benefits from a water recycling development with a minimum risk. This report identifies the known, credible, quantifiable health effects, particularly for conditions relevant to developing countries, and presents the recommended specific remedial measures as the main operational outputs. The study has been carried out for the World Bank/UNDP by Hillel Shuval under the guidance of Charles Gunnerson. Comments and remarks on this report are most welcome. S. Arlosoroff, Chief Applied Technology (WUD) UNDP Projects Manager ix - CONTENTS List of Tables ...................... xv List of Figures . . .xix Preface and Acknowledgments......... xxv List of Terms and Abbreviations ..xxix Chapter 1 Historical, Present, and Potential Reuse of Wastewater in Agriculture ........... ........................................ 1 Early Major Wastewater Irrigation Projects ................ 1 Present Status of Interest in Wastewater Reuse ................. . 4 Examples of Current Wastewater Reuse Practices in Agriculture around the World ............. ................... 8 United Kingdom ................................................ 8 United States ..................* ........................................... 9 Israel ...................................................9 India ..... 11 Federal Republic of Germany .....12 Latin America ...... *... ....13 Republic of South Africa . . .14 North Africa and the Middle East . . 14 Central Africa . . .16 Southeast Asia ...17 Japan ...17 Soviet Union ....... . 17 China .................. 17 Australia ..17 Future Trends in Wastewater Reuse in Agriculture ....... . 
18 Chapter 2 Enteric Pathogens in Wastewater and Their Survival in Soil, Crops, and in the Air . ..... 27 Pathogens in Excreta ............27 Viruses .......... ............ 27 Bacteria ........... ................ 29 Protozoa ...*. ..e... ............. 31 Helminths ...... *.................... 31 Survival of Indicators and Pathogens ..33 In Feces, Night Soil, and Sludge . .37 In Water and Sewage. 37 In Soil . .. . . . . . .. . .. . . .. . . .. . . . . ... 38 On Crops ....... ................. ...* 38 Overall Pathogen Removal Efficiency of Wastewater Processes .... 38 Dispersion of Aerosolized Enteric Pathogens . .............. 43. General Conclusions ............ .. ... 44 x - Development of a Conceptual Epidemiological Approach ............ 45 Excreted Load ............................................. 46 Latency ............................................................ 46 Persistence ................................................... 47 Multiplication ............................ 000 47 Infective Dose ........ ............ 49 Host Response ..................... 50 Nonhuman Hosts .............. ................ 51 Categories of Excreta-Related Infections ........................ 52 Proposed Model to Predict the Relative Effectiveness of Pathogens in Causing Infections through Wastewater Irrigation 56 Chapter 3 Health Effects Associated with Wastewater Irrigation: Early Reports, Opinions, and Policies ........... ................ 58 The Nineteenth Century ......... . .. . . . . .. ..................... . ............... 58 First Half of the Twentieth Century ............................. 60 More Recent Opinions and Statements of Policy ................... 62 Chapter 4 Evaluation of Epidemiological Evidence of Human Health Effects Associated with Wastewater Irrigation ... ................ 66 Intervening Factors That Influence the Level of Environmentally Transmitted Disease Associated with Wastewater Reuse ......... 66 Criteria and Guidelines for Evaluating Epidemiological Studies .. 67 Epidemiological Studies on the Health Effects on the General Population Consuming Edible Crops, Dairy Products, or Meat Exposed to Wastewater Applications ......... 68 Ascariasis and Trichuriasis among Inmates in Tara Prison, Egypt ....... ................................. 68 Ascariasis in Darmstadt, Germany .............................. 71 Epidemiological Evidence for Helminth Transmission by Vegetables Irrigated with Wastewater in Jerusalem ............ 73 Cholera Outbreak in Jerusalem 1970--The Case for Transmission by Wastewater-Irrigated Vegetables ........................... 77 Typhoid Fever and Sewage Irrigation in Santiago, Chile ........ 81 Transmission of Disease to Humans by Meat or Dairy Products from Cattle or Sheep Grazing on Wastewater-Irrigated Fields .. 85 Evaluation of Epidemiological Studies on the Health Effects on Agricultural Workers Directly Exposed to Contact with Wastewater Irrigation ...... . . .......... .. . 87 Intestinal Parasitic Infections Associated with Sewage Farm Workers--India ................. ................. 87 The Epidemiological Significance of Urban Sewage in the Spread of Possible Zooparasitic Infections ............... .... 92 Sewage Workers' Syndrome ........................... ...... O... 92 Disease Rates among Copenhagen's Sewer Workers .... ............ 93 Health Risks of Human Exposure to Wastewater in Three Cities of the United States ......................... ..... ........ 
94 Evaluation of the Health Risks Associated with the Treatment and Disposal of Municipal Wastewater and Sludge at Muskegon, Michigan ... 96 The Health of Sewage Treatment Plant Workers in Canada ... 98 Cholera Outbreak in Jerusalem, 1970: The Effects on Wastewater-Irrigation Workers ... 100 Wastewater Used in Agriculture That Causes Disease or Infection in Nearby Nonagricultural Population Groups ... 100 The Use of Wastewater in Irrigation District 03, Tula, State of Hidalgo, Mexico ... 101 Use of Wastewater for Irrigation in Districts 03 and 88 and Its Impact on Human Health, Mexico ... 102 Health Effects of Aerosols Emitted from an Activated Sludge Plant, Skokie, Illinois ... 104 Acute Illness Differences with Regard to Distance from the Wastewater Treatment Plant in Tecumseh, Michigan ... 105 Health Effects from Wastewater Aerosols at a New Activated Sludge Plant (John Egan Plant), Schaumburg, Illinois ... 105 Wastewater Aerosols and School-Attendance Monitoring at an Advanced Wastewater Treatment Facility, Durham Plant, Tigard, Oregon ... 107 An Evaluation of Potential Infectious Health Effects from Sprinkler Application of Wastewater to Land, Lubbock, Texas ... 107 Risk of Communicable Disease Infection Associated with Wastewater Irrigation in Agricultural Settlements in Israel ... 109 Health Risks Associated with Wastewater Utilization in Agricultural Settlements in Israel: A Historical Epidemiological Study ... 110 A Prospective Epidemiological Study in Agricultural Communities Exposed to Aerosols from Sprinkler Irrigation in Israel ... 112 General Conclusions as to Quantifiable Health Effects Associated with Wastewater Irrigation, with Particular Reference to the Developing Countries ... 116 Potential Transmission of Other Diseases by Wastewater Irrigation ... 121 Discussion ... 123 Health and Economic Implications Associated with Diseases Found To Be Transmitted by Irrigation with Raw Wastewater ... 131 Ascariasis ... 131 Trichuriasis ... 132 Ancylostomiasis (Hookworm Disease) ... 133 Taeniasis and Cysticercosis (Beef and Pork Tapeworm Disease) ... 135 Cholera ... 135 Typhoid Fever ... 136 Shigellosis (Bacillary Dysentery) ... 136 Enteric Viruses ... 136 Conclusions ... 137 Chapter 5 Wastewater Characteristics and Treatment for Irrigation ... 138 Wastewater Characteristics and Problems in Irrigation Systems Associated with Water Quality ... 138 General Characteristics of Sewage ... 138 Irrigation System Problems Associated with Water Quality ... 139 Characteristics of Effluents from Conventional Wastewater Treatment Facilities ...
142 Stabilization Ponds, Subsequent Treatments, and Effluent Quality 143 Water Quality in Wastewater Reservoirs for Agricultural Irriga- tion (A Case Study: Naan Reservoir, Kibbutz Naan, Israel) ... 154 Parasite Removal through Wastewater Treatment Processes .... ..... 158 Concentration of Protozoans and Helminths in Wastewater .... ... 158 Removal of Parasites by Sedimentation ......................... 160 Removal of Parasites by Conventional and Polishing Treatment .. 163 Removal of Parasites by Stabilization Ponds ................... 164 Costs . ........................................................... 167 Waste Stabilization Pond Design ................................. 169 Anaerobic Ponds ............................................... 169 Facultative Ponds ....... ...................................... 173 Maturation Ponds .............................................. 173 Physical Design of Ponds ................ .. .................... 174 Night-Soil Ponds ........ .............. ........................ 176 Illustrative Example for Irrigation Purposes ....... .. ........... 177 Effluent Treatment for Drip Irrigation Systems ...... .. .......... 183 Typical Strainers ..................... ........................ 183 Clogging of Strainers ...... ............. ...................... 187 Granular Pressure Filters . .................. ..... ............. . 189 Chapter 6 Wastewater Irrigation Practice ............... ............ 190 Introduction .................................................... 190 Application Rates ..................... ........................ 190 Some Basic Principles of Irrigation .......... .. ............... 191 Similarities and Differences between Effluent and "Normal" Irrigation Water ........ ............. ........................ 192 Crop Selection Considerations and Criteria ........ .. ............ 202 Suitability of Crop to General Conditions ....... .. ............ 202 Constraints on Crop Growth ............... .. ................... 202 Public Health Constraints ........................... ......... . 204 Characteristics of Irrigation Systems Relevant to Effluent Irrigation .......................................... 205 Surface Irrigation ........... . ................... ......................... . 206 Sprinkler Irrigation ....... ............. ...................... 233 Drip Irrigation ............................................... 253 Comparative Costs of Various Irrigation Methods ....... .......... 265 - xlii - Chapter 7 Technical and Policy Options for Remedial Measures .... ... 268 General Approach ....................... ......................... 268 Agronomic Techniques ..................... ....................... 269 Restricting Crops ........ ............. ........................ 269 Modification and Control of Irrigation Techniques ..... ........ 269 Recommended Guidelines for Restrictions on Types of Crops Irrigated with Wastewater .................................... 270 Disinfection of Wastewater-Contaminated Crops ...... .. ........... 272 Point-of-Use Disinfection ...... .................................... 272 Central Market Disinfection Stations .......................... 272 Improving the Occupational Health of Sewage Farm Workers . ...... 273 Prophylactic or Chemotherapeutic Medical Treatment ...* .......... 273 Immunization ........................ .......................... 273 Chemotherapy ........................ ........................... 274 Nutritional Supplement of Iron ...... ... ............ 
274 Wastewater Treatment ... 275 Optimal Level of Treatment ... 275 Lower Levels of Treatment ... 278 Intermediate-Level Treatment ... 278 Palliative Measures and Stages ... 278 Evaluation of Technical and Policy Options for Remedial Measures ... 279 Advantages of Centrally Managed, Engineered Environmental Interventions ... 279 Other Options ... 280 Conclusions ... 280 Chapter 8 The Economic Evaluation of Wastewater Reuse in Irrigation ... 282 Analytical Framework ... 282 Economic Evaluation of Irrigation ... 283 Wastewater Characteristics and Treatment ... 284 Wastewater Irrigation ... 285 Wastewater Treatment Costs ... 287 Land Value Considerations ... 287 Hypothetical Model ... 289 Discussion ... 293 Recommendations ... 293 Chapter 9 Summary and Conclusions ... 296 Objectives ... 296 Benefits of Wastewater Utilization in Agricultural Irrigation ... 296 Negative Effects of Wastewater Utilization in Agricultural Irrigation ... 297 History of Wastewater Irrigation ... 297 Epidemiological Factors in Human Health Effects with Wastewater Irrigation ... 298 Pathogen Survival in the Environment ... 298 Intervening Factors ... 298 Evidence of Quantifiable Health Effects ... 299 Implications for Developing Countries ... 300 Other Pathogens Potentially Transmitted by Wastewater Irrigation ... 301 Wastewater Treatment Technology as a Remedial Measure in Reducing the Health Effects of Wastewater Irrigation ... 301 Agricultural Irrigation Methods ... 303 Technical and Policy Options for Remedial Measures ... 304 Conclusions ... 305 References ... 307 LIST OF TABLES Table 1-1 California State Department of Health standards for the safe and direct use of reclaimed wastewater for irrigation and recreational impoundments ... 6 Table 2-1 Viral pathogens excreted in feces ... 28 Table 2-2 Bacterial pathogens excreted in feces ... 30 Table 2-3 Protozoal pathogens excreted in feces ... 30 Table 2-4 Helminthic pathogens excreted in feces ... 34 Table 2-5 Survival times of excreted pathogens in feces, night soil, and sludge at 20-30°C ... 36 Table 2-6 Survival times of excreted pathogens in freshwater and sewage at 20-30°C ... 36 Table 2-7 Factors affecting survival time of enteric bacteria in soil ...
39 Table 2-8 Survival times of excreted pathogens in soil at 20°C ... 39 Table 2-9 Survival times of excreted pathogens on crops at 20-30°C ... 40 Table 2-10 Enteric pathogen removal efficiencies of wastewater treatment processes ... 40 Table 2-11 Possible output of selected pathogens in the feces and sewage of a tropical community of 50,000 in a developing country ... 42 Table 2-12 Environmental classification of excreted infections ... 48 Table 2-13 Basic epidemiological features of excreted pathogens by environmental category ... 54 Table 2-14 Epidemiological characteristics of enteric pathogens ... 56 Table 3-1 Suggested treatment processes to meet the given health criteria for wastewater reuse ... 64 Table 4-1 Prevalence of Intestinal Parasites in Sewage Farm Workers and Control Groups, India ... 88 Table 4-2 Relation of hookworm egg count to hemoglobin and hematocrit values at Halim, Indonesia ... 134 Table 5-1 Typical characteristics of sewage from Indian cities ... 140 Table 5-2 Typical domestic sewage characteristics in the US ... 141 Table 5-3 Relative efficiencies of sewage treatment operations and processes ... 144 Table 5-4 Calculated effluent characteristics from different operations and processes ... 145 Table 5-5 Imhoff tank effluent: chemical and bacteriological characteristics at Fort Devens, Massachusetts, land treatment site ... 146 Table 5-6 Bacterial removal during primary wastewater sedimentation ... 146 Table 5-7 Expected values of properly designed stabilization ponds in Southern Africa ... 147 Table 5-8 Experimental results of effluent from a series of five stabilization ponds in Brazil ... 148 Table 5-9 Experimental results of effluents from four facultative ponds in parallel in Brazil ... 149 Table 5-10 Mean experimental results of effluents from anaerobic ponds in Brazil ... 150 Table 5-11 Effluent quality at various treatment steps in the Dan Region Wastewater Reclamation Project ... 153 Table 5-12 Organic material concentration and its removal in Naan purification-storing system ... 156 Table 5-13 Removal efficiency of bacteria in oxidation pond-reservoir system ... 158 Table 5-14 Discrete gravitational settling of parasites in water ... 162 Table 5-15 A comparison of the removal of cysts and eggs of enteric parasites in various sewage treatment processes ... 165 Table 5-16 Ratio of construction cost of conventional plants to cost of a pond treatment plant of the same capacity ... 167 Table 5-17 Relative 1983 costs of wastewater treatment facilities in South Africa ... 168 Table 5-18 Approximate per capita requirements for a waste stabilization pond system serving a total population of 30,000-100,000 ... 170 Table 5-19 Annual costs of sewage treatment in India, 1970 ... 171 Table 5-20 Advantages and disadvantages of various sewage treatment systems ... 172 Table 5-21 Treatment pond example, phase 1 ...
180 Table 5-22 Treatment pond example, phases 2 and 3 ... 181 Table 5-23 Treatment pond example, cost estimates ... 182 Table 5-24 Aerated lagoon vs. oxidation ponds (complete systems)--area and cost estimates ... 184 Table 6-1 Limits of boron in irrigation water for crops of various sensitivities, based on toxicity symptoms of plants grown in sand culture ... 195 Table 6-2 Yield decreases of various crops to be expected due to salinity of irrigation water ... 203 Table 6-3 Comparison of boron and heavy metal concentration standards with effluent irrigation water, alfalfa fodder, and milk ... 206 Table 6-4 Investment per hectare and total annual cost per hectare of various irrigation methods in California and Israel ... 267 Table 7-1 Tentative Microbiological Quality Guidelines for Treated Wastewater Reuse in Agricultural Irrigation ... 277 Table 8-1 Approximate economic incremental NPV/ha - irrigation ... 284 Table 8-2 Impact of sewage disposal ... 285 Table 8-3 Wastewater agricultural nutrients ... 286 Table 8-4 Estimated present value treatment cost vs. land value ... 288 Table 8-5 Model assumptions -- general ... 289 Table 8-6 Hypothetical models of wastewater irrigation ... 292 LIST OF FIGURES Fig. 1-1 Location of sewage farms of Paris in 1904 ... 3 Fig. 1-2 Agricultural reuse of municipal wastewater in Israel, 1963-82 ... 10 Fig. 1-3 Generalized annual global precipitation (mm) ... 20 Fig. 1-4 Climate diagrams for areas with wastewater irrigation, North and South America ... 21 Fig. 1-5 Climate diagrams for areas with wastewater irrigation, Asia and Europe ... 22 Fig. 1-6 Climate diagrams for areas with wastewater irrigation, Africa and Australia ... 23 Fig. 1-7 Aridity index for Asia ... 24 Fig. 1-8 Aridity index for North and South America ... 25 Fig. 1-9 Aridity index for Africa and Australia ... 26 Fig. 2-1 Persistence of selected enteric pathogens in water, wastewater, soil, and on crops ... 41 Fig. 2-2 Minimal infective dose of selected enteric pathogens ... 50 Fig. 2-3 Involvement of other vertebrates in the transmission of human excreted infections ... 52 Fig. 3-1 Effect of water purification on death rate from typhoid fever in Detroit, Michigan, 1900-1933 ... 65 Fig. 4-1 Prevalence of parasitic infections in Tara Prison, Egypt, 1925 ... 69 Fig. 4-2 Prevalence of parasitic infections in two communities in Egypt, 1925 ... 70 Fig. 4-3 Wastewater irrigation of vegetables and Ascaris prevalence in Darmstadt, Berlin, and other cities in Germany in 1949 ... 72 Fig. 4-4 Municipal drainage areas of Jerusalem and plots irrigated with raw wastewater up to 1970 ... 74 Fig. 4-5 Relationship between Ascaris-positive stool samples in population of western Jerusalem and supply of vegetables and salad crops irrigated with raw wastewater in Jerusalem, 1935-1982 ... 75
Fig. 4-6 Weekly distribution of cholera cases in Jerusalem, August-October 1970 ... 78 Fig. 4-7 Hypothesized cycle of transmission of Vibrio cholerae from first cholera carriers introduced from outside the city, through wastewater-irrigated vegetables, back to residents in the city ... 80 Fig. 4-8 Seasonal variation in typhoid cases in Santiago and the rest of Chile (average rates, 1977-1981) ... 83 Fig. 4-9 Typhoid fever in Santiago and the rest of the country, 1973-1984 ... 83 Fig. 4-10 Prevalence of parasitic infections in sewage farm workers and controls from various regions of India ... 89 Fig. 4-11 Intensity of parasitic infection in sewage farm workers and controls in various regions of India ... 91 Fig. 4-12 Geographical distribution of Ancylostoma duodenale ... 125 Fig. 4-13 Geographical distribution of Necator americanus ... 126 Fig. 4-14 Geographical distribution of Taenia saginata ... 127 Fig. 4-15 Geographical distribution of Taenia solium ... 128 Fig. 4-16 Global spread of cholera, pandemic El Tor variety, from Celebes to Africa, 1961 to 1975 ... 129 Fig. 5-1 Schematic layout of Dan Region Reclamation Project - Stage I ... 152 Fig. 5-2 Schematic layout of Naan Wastewater Reservoir ... 155 Fig. 5-3 Relative and total removal of BOD prior to and in the Naan Wastewater Reservoir ... 157 Fig. 5-4 Removal of coliform bacteria in oxidation pond-reservoir system ... 159 Fig. 5-5 Schematic layout of oxidation ponds system for effluent irrigation (after Arthur 1983) and construction phases ... 178 Fig. 5-6 Suggested layout of equally sized oxidation ponds in series ... 179 Fig. 5-7 Strainers for irrigation systems, Type I ... 185 Fig. 5-8 Strainer for irrigation systems, Type II ... 186 Fig. 5-9 SANOMAT-FILTOMAT strainer, Type III ... 188 Fig. 6-1 Classification of irrigation water quality according to electrical conductivity and sodium hazard ... 197 Fig. 6-2 Wild flooding from field laterals ... 208 Fig. 6-3 Graded rectangular border checks for small grains, forage and row crops, and contour checks for orchards ... 210 Fig. 6-4 Various methods of delivering water from a farm ditch to border checks or irrigation furrows ... 211 Fig. 6-5 Alfalfa valve mounted on concrete riser supplied by underground pipeline ... 211 Fig. 6-6 Cross section of furrows showing flow path of water into ridges ... 215 Fig. 6-7 Orchard hydrant mounted on underground concrete pipeline ... 217 Fig. 6-8 Schematic view of the mass balance of an elementary water volume at any point along an advancing irrigation stream ... 219 Fig. 6-9 Sample map of a field-test layout ... 222 Fig. 6-10 Profiles of water layer on land surface and depth of penetration into soil at equal time intervals during the advance stage ... 223 Fig. 6-11 Example of a water advance curve in a furrow or border check ... 223 Fig. 6-12 Typical relation between soil infiltrability and time for initially dry soil ... 225
Fig. 6-13 Variation of infiltrability along a plot due to differences in intake opportunity time, just after advance stream has reached end of plot ... 225 Fig. 6-14 Plot of infiltration stage after advance stream has reached downstream end of field ... 226 Fig. 6-15 Complete advance-recession diagram, showing intake opportunity time (IOT) at various distances ... 226 Fig. 6-16 Effect of input stream on advance curves on an infinitely long land surface (q1>q2>q3) ... 228 Fig. 6-17 "Moving the set" of a lateral in a hand-move system ... 236 Fig. 6-18 "Moving the set" of a lateral in a tractor-tow system ... 237 Fig. 6-19 "Moving the set" of a lateral in a roll-move system ... 238 Fig. 6-20 A center-pivot irrigation system ... 239 Fig. 6-21 A typical medium-pressure impact sprinkler ... 240 Fig. 6-22 The "20 percent law" of pressure differential applied to a single sprinkler lateral, as used in a hand-move system, and to a number of laterals working simultaneously, as in a solid-set or tow-move system, or in a drip installation ... 243 Fig. 6-23 a. Effect of wind on water distribution from a single sprinkler and the resulting depth of wetting of the soil. b. Effect of wind velocity and direction relative to lateral direction on water distribution pattern and ground coverage of a solid-set or tow-move sprinkler set ... 245 Fig. 6-24 a. Typical application distribution of a sprinkler designed for a spacing between adjacent sprinklers that produces overlapping patterns. b. Two-dimensional schematic of how overlap between adjacent sprinklers along a lateral produces uniform water distribution ... 247 Fig. 6-25 Sample page from sprinkler catalog with recommended operating conditions ... 248 Fig. 6-26 Alternate-row placement of drip laterals in row crop ... 255 Fig. 6-27 Flow pattern of water in the soil, and typical shape of wetted soil volume produced by a single drip emitter placed on the soil surface next to a plant ... 256 Fig. 6-28 Comparison of soil moisture regimes produced by 14-day and 3-day irrigation frequencies ... 258 Fig. 6-29 Comparison of the water potential regimes corresponding to the water content regimes of Figure 6-28, and indication of the additional potential decrease due to soluble salts ... 258 Fig. 6-30 a. Longitudinal section of long-path, in-line drip emitter. b. Section through a narrow-orifice button emitter ... 262 Fig. 7-1 Generalized removal curves for BOD, helminth eggs, excreted bacteria, and viruses in waste stabilization ponds at temperatures above 20°C ... 276 Fig. 8-1 Model structure ... 295
PREFACE AND ACKNOWLEDGMENTS
The reuse of wastewater for agricultural irrigation offers many attractive benefits, including reduced pollution of water sources; increased water supplies for productive agricultural use; and the addition of valuable fertilizers and micronutrients to maintain soil fertility.
However, the possible negative effects to the health of farmers directly exposed to wastewater irrigation, to the public consuming edible crops contaminated by uncontrolled wastewater irrigation practices and those consuming milk and meat derived from animals exposed to wastewater-irrigated pasture lands, or to population groups residing near wastewater-irrigated fields have to be carefully evaluated so that remedial measures can be taken to ensure that the public reaps the full benefits of a water recycling project. This paper does not deal directly with the use of wastewater sludge or night soil in agriculture or aquaculture since these subjects are covered by two companion reports sponsored by the World Health Organization and the United Nations Environment Program through the International Reference Centre on Waste Disposal at Dübendorf, Switzerland (Blum and Feachem 1985; Cross and Strauss 1985). The purpose of this report is to identify the known, credible, and quantifiable health effects of wastewater reuse, particularly for the developing countries. The recommendation of specific remedial measures suitable to the developing countries is the main goal of this report. Note: A separate bibliography has been prepared--"Wastewater Reuse, Emphasizing Health Aspects: A Selective Bibliography"--which covers some 1,000 articles, books, and reports. Free copies can be obtained by writing to the Publications Group, Water Supply and Urban Development Dept. (N-713), The World Bank, 1818 H Street, N.W., Washington, D.C. 20433. This study has been carried out for the World Bank as Executing Agency for the United Nations Development Programme Integrated Resource Recovery Project (Waste Recycling) (GLO/80/004, July 1981) under the guidance and supervision of Charles Gunnerson, who has provided invaluable advice throughout the study. In addition, many other staff members of the World Bank provided important input to this report, particularly S. Arlosoroff and John Kalbermatten. Chapter 8, "The Economic Evaluation of Wastewater Reuse in Irrigation," was written by Frederick Wright and Edward F. Quicke of the World Bank (WUD) staff. Their vital contribution is hereby acknowledged. Support in gathering important data and assistance in arranging field visits were provided by Bernd Dieterich and his staff at the Environmental Health Division of the World Health Organization (WHO), Geneva. Many individuals in WHO Regional Offices and the WHO-International Reference Center on Waste Disposal (IRCWD), Dübendorf, Switzerland, also made valuable contributions to this study. Jacobo Finkelman, WHO, Mexico, provided a critical field evaluation of epidemiological studies in Mexico, and Richard Feachem and his colleagues at the London School of Hygiene and Tropical Medicine supplied vital, hard-to-obtain documents and reports. Furthermore, most of Chapter 2 is based on work by Feachem and his group as reported in their authoritative World Bank study, Sanitation and Disease: Health Aspects of Excreta and Wastewater Management (1983). Other agencies that provided valuable assistance include the United Nations Development Programme; the U.S. Environmental Protection Agency; the Water Research Centre of the United Kingdom; the National Environmental Engineering Research Institute, Nagpur, India; the Asian Institute of Technology, Bangkok, Thailand; the Pan American Center for Human Ecology and Health, WHO, Mexico; and the Gordon McKay Library of Applied Science, Harvard University.
The following individuals have reviewed this manuscript and have provided many useful comments and suggestions, which have been incorporated in the final text: A. Al-Khafaji, Deborah Blum, David Cook, B. Cvjetanovic, Richard Feachem, Fredrick L. Colladay, Bernhard H. Liese, Sr., James Listorti, D.O. Lloyd, Duncan Mara, Letitia Obeng, Carlo Rietveld, Gunner Shultzberg, J. Srivastava, Martin Strauss, P.M. Tamboli, A. Thys, and A. Zavala. This volume, together with the two reports prepared by the IRCWD mentioned earlier, was reviewed in depth and evaluated at a meeting of environmental specialists and epidemiologists convened by the World Health Organization and the World Bank in July 1985 (The Engelberg Report 1985). That group of experts endorsed in principle the findings and conclusions of these studies and supported the recommendations for control strategies. The meeting resulted in a high degree of coordination of policy and approach between the WHO and the World Bank on the matter of wastewater and sludge reuse in agriculture and aquaculture. The recommendations of the Engelberg Report have been incorporated in this volume. Some of the material in this report has been drawn from unpublished reports of studies supported by the U.S. Environmental Protection Agency that have not been subjected to the agency's peer and policy review. This material does not necessarily reflect the view of the Agency and no official endorsement should be inferred, specifically with respect to the information in Chapter 4 based on Camann et al. (1983), Fattal et al. (1981), Shuval et al. (1983), Fattal (1984), and Shuval and Fattal (1985). Tables and figures herein for which no sources have been cited are, for the most part, the work of the authors; a few figures have been taken from commercial catalogs. Special mention must be made of the contribution of my colleague Rachel Perlman Cohen--Librarian of the Environmental Science Library at the Hebrew University--in preparing the separate bibliography. Thanks are due to the Environmental Science and Engineering Program, Division of Applied Sciences, Harvard University, for the fine facilities they provided me with during my sabbatical, which enabled me to complete the writing and editing of this report in a quiet and scholarly environment. The untiring efforts of Christine Lawton and Miriam Hornoff in typing and proofreading major portions of this manuscript are greatly appreciated. The credit for the skillful job of editing and refining of the text goes to Deirdre Murphy, who served as production editor of this volume, and to the copy editor, Venka Macintyre. Their important contribution is gratefully appreciated. Hillel I. Shuval, Jerusalem, Israel
LIST OF TERMS AND ABBREVIATIONS
Activated Sludge: A biological conventional wastewater treatment process
Aerobic: With the presence of oxygen
Anaerobic: Without the presence of oxygen
B.O.D.: Biochemical oxygen demand; a measure of the organic strength of sewage
CDC: Center for Disease Control (USA)
CEC: Cation exchange capacity; a characterization of soil related to ability to absorb dissolved salts
COD: Chemical oxygen demand--a measure of the organic strength of sewage
Coliforms: Normal bacteria of the enteric tract of mammals used as an indicator of fecal pollution
E. coli: A more specific fecal indicator organism; see Coliforms
Economic incremental NPV/ha: Calculated by taking the net present value (NPV) at a specific discount rate (interest factor) for the stream of incremental (with project less without project) economic benefits less incremental costs (investment and operating) estimated over the life of the project. The result is then divided by the project area in hectares.
Endotoxins: Toxic compounds formed by bacteria
EPA: Environmental Protection Agency (USA)
Facultative: A system that functions both aerobically and anaerobically
Geometric Mean: See Log Mean
ha: Hectare; 10,000 m2 = 2.5 acres
HAV: Infectious hepatitis Type A; a virus disease
ID50: Infective dose, or the number of pathogens required to infect 50 percent of persons who ingest them
IRCWD: International Reference Center on Waste Disposal, Dübendorf, Switzerland
K2O: A potassium salt used as a chemical fertilizer
Kibbutz: Collective agricultural settlement in Israel; plural, kibbutzim
km: Kilometer
Log Mean: The mean value of a series of numbers based on calculating the mean of the logarithms of the numbers
Log10 Removal: Removal efficiency expressed in log10 units; e.g., a 4 log10 removal leaves 10^-4 of the original concentration, that is, 99.99 percent removal
m2: Square meter
MCM: Million cubic meters
MGD: Million gallons per day
MOH: Medical Officer of Health--a senior public health official
Morbidity Rate: The rate at which illness from a specified disease occurs in a community; usually expressed as cases/100,000 population
Mortality Rate: Death rate from a given disease; see Morbidity Rate
Maturation Ponds: See Polishing Ponds
N: Nitrogen
NEERI: National Environmental Engineering Research Institute, Nagpur, India
NH3: Ammonia
Night Soil: Human excreta--feces and urine
N-K-P: Nitrogen-potassium-phosphorus used as a chemical fertilizer
NPV: Net present value; can be defined as the present worth of benefits less the present worth of costs. The present worth of an amount in any specific year can be calculated using P = F/(1+i)^n, where F = future value, i = discount rate (interest factor), and n = the period in the future. The NPV of a stream therefore requires calculating the present worth for each period and then summing them (a worked example follows this list).
Oxidation Ponds: See Stabilization Ponds
P2O5: A phosphorus salt used as a chemical fertilizer
PAHO: Pan American Health Organization, the regional office of the WHO for the Americas
pH: A measure of acidity
Polishing Ponds: Additional stabilization ponds following a wastewater treatment system to provide additional treatment
Primary Treatment: Usually sedimentation of wastewater
Prospective Study: A method of studying the health effects of an environmental factor by making current observations for a given time period of the morbidity or mortality rates of exposed and control population groups (see Retrospective Study)
PVm3: Represents the discounting of a physical flow over a period of years; used to allow easier comparison of a stream of quantities by converting them into a single number
Retrospective Study: A method of studying the health effects of an environmental factor by observing the past morbidity or mortality rates of the exposed and control population groups (see Prospective Study)
Seroepidemiology: A study of the rate of infection of a disease in a population group by analyzing blood samples for antibodies to the disease
S.S.: Suspended solids
Secondary Treatment: Usually a biological wastewater treatment process following sedimentation
Stabilization Ponds: An open pond system used to treat wastewater where algae, bacteria, and sunlight provide natural purification
TDS: Total dissolved solids
TSS: Total suspended solids
UNDP: United Nations Development Programme
UNEP: United Nations Environment Programme
WHO: World Health Organization
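The present-worth and NPV terms defined above can be made concrete with a short calculation. The sketch below (in Python) is an illustration added for this edition, not material from the original report: the 10 percent discount rate and the stream of net benefits are hypothetical figures chosen only to show how P = F/(1+i)^n is applied period by period and the results summed, as the NPV entry describes.

    # Illustrative sketch of the present-worth and NPV definitions in the list above.
    # The discount rate and the benefit/cost stream are hypothetical, not data from this report.

    def present_worth(future_value, discount_rate, period):
        """Present worth of an amount received 'period' years from now: P = F / (1 + i)**n."""
        return future_value / (1.0 + discount_rate) ** period

    def net_present_value(net_benefits, discount_rate):
        """NPV of a stream of annual net benefits (benefits minus costs), year 0 first."""
        return sum(present_worth(value, discount_rate, year)
                   for year, value in enumerate(net_benefits))

    # Hypothetical project: 1,000 invested in year 0, then net benefits of 300 per year for 5 years.
    stream = [-1000.0, 300.0, 300.0, 300.0, 300.0, 300.0]
    print(round(net_present_value(stream, discount_rate=0.10), 1))  # prints 137.2

Dividing such an NPV by the irrigated project area in hectares gives the economic incremental NPV/ha defined above and used in Chapter 8.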
By 1875, there were some fifty land treatment sites in Britain (Jewell and Seabrook 1979). This policy was modified in 1898 by the newly appointed Sewage Disposal Commission of Great Britain, which recommended that "filters of artificial construction with proper safeguards . . . be relied upon to purify sewage without broad irrigation" (Fuller 1912). Broad irrigation also became popular in other parts of Europe during the late 1800s and early 1900s. Paris, for example, had sewage farms as early as 1868. These are described by M. Bechman, who was city engineer at the turn of the century. According to his account, experiments in farming with the sewage of Paris were initiated at the town of Gennevillier. The farmers of the area welcomed the use of sewage on their farms, and by 1872 some 900 ha (2,225 acres) were being irrigated with wastewater. In 1889, the Chamber of Deputies in the Senate passed a law permitting the system to be extended to the area of Acheres on the border of the forest of Saint-Germain. This project was put into operation in 1895, but the new purification fields, like the earlier ones at Gennevillier, were only able to receive a portion of the total sewage flow of Paris; most of the flow was still being discharged into the Seine River. By 1904, however, the great intercepting sewers of Paris had stopped discharging into the Seine altogether, and all the dry weather flow was applied to sewage farms, which by then had a total area of 5,300 ha. About one-third of the area was owned by the city and rented to tenant farmers on the stipulation that they receive fixed quantities of sewage at all times of the year and raise therir crops under permanent supervision and control, The remaining land was owned by individuals who received the sewage for agricultural purposes according to their needs (see Fig. 1-1). The city of Berlin established its first sewage farm in 1876 and gradually increased the total area so that by 1910 some 17,200 ha (43,000 aires) were devoted to sewage farming and the city was treating about 310,000 m (77 million U.S. gallons) a day (Rocchling 1911). Under Berlin's system, twelve pumping stations delivered the sewage to eight farms, three of which were to the south and five to the north of the city. The northern farms were 4-17 miles from the center of the city and the southern farms 8-17 miles away. Part of the area was used for farming operations managed by the city authorities, and a smaller portion was let to private farmers for the production of vegetable crops for the city markets (the more important crops were reported to be rye, wheat, barley, oats, corn, potatoes, beets, and carrots). In addition, cattle were grazed on grasslands irrigated with wastewater. Part of the effluent from the land filtration areas was conducted through ditches to fish ponds, which in 1910 had an area of 16 ha (40 acres). A major portion of the area was used for direct infiltration beds rather than sewage farming. According to Rocchling (1911), the overall operation was not a profitable one and had to be subsidized through city taxes. At a later stage, the wastewater was treated by conventional methods prior to agricultural irrigation. -3- Foutolge N _____ \ 6el ,t=ve9rc * H1erbl xdy *Le P1enids Boucherd Mece Abrenvil U Ie hdn | Irrigable Lands belonging to City. on C Irrigable Lands belonging to Individuals. ' Fig. 1-1. Location of sewage farms of Paris in 1904. 
The city of Melbourne, Australia, established its first large sewage farm--Werribbee Farm--in 1897 and successfully grazed sheep and cattle on the wastewater-irrigated pastures. The farm is still in operation and today irrigates some 10,000 ha with the effluent of its stabilization pond system, which is the largest in the world.

Another early example of a major program for the disposal of city wastewater through planned sewage farming comes from Mexico City, which in 1904 established an organized irrigation district nearby in the arid Valley of Mexico; it utilized the city's untreated wastewater to irrigate large areas. This project expanded through the years under careful government control, so that by 1984 the area being irrigated had grown to about 42,000 ha.

Sewage irrigation was also under way in the United States in these early years. It was practiced as early as 1871 in Lenox and Worcester, Massachusetts, and 1876 at the state asylum near Augusta, Maine (Chase 1964). According to Fuller (1912), by 1904 the country had some fourteen municipal sewage farms or broad irrigation projects serving a population of about 200,000, and a number of institutional plants of some size were in operation. Early municipal sewage irrigation projects near Chicago and Los Angeles had to be abandoned, however, because of the rapid growth of the two cities and their suburbs in the direction of sewage-irrigated lands. Apparently the health authorities intervened when the odor from these sites became a nuisance (Fuller 1912). Nonetheless, in March 1910, California's Monthly Bulletin of the State Board of Health recommended, "In California where water is so valuable for irrigation, the use of sewage for broad irrigation should be considered" (Ongerth and Jopling 1977). It seems that wherever the economic motivation for wastewater reuse has been strong, the practice has had a sounder basis for survival.

As in the United States, many of the early broad irrigation and sewage projects in Europe were eventually abandoned because urban development had encroached upon the sewage farm areas. The problems with odor and concerns about public health--particularly about the possible transmission of disease from vegetable crops irrigated with raw sewage--were largely responsible for the decline of sewage farming. Another disadvantage in temperate areas with plentiful rainfall was that with the cessation of sewage irrigation during heavy rain storms, raw sewage was frequently discharged into neighboring streams, or crops were injured from oversaturation of the irrigated land areas. This was a minor factor in the more arid western areas of the United States, however, and thus sewage farming has continued there right up to the present. When it was discovered that wastewater could be treated by biological processes that require much less land, broad irrigation and sewage farms fell into decline in both Europe and the United States.
Because of a combination of factors, most consulting engineers and public health authorities came to believe that sewage farming was an undesirable and unsanitary practice of the past: land-use patterns had changed as a result of urban growth; the alternative wastewater treatment technology was based on intensive civil engineering and mechanical components; more people became aware of the hygienic and public health considerations associated with unregulated sewage irrigation of vegetable crops--particularly salad crops usually eaten uncooked; and there was greater public sensitivity to "sanitary" nuisances, especially the unpleasant odors from poorly regulated, overloaded land infiltration areas and flood farming plots. By 1912, the trend away from sewage farming was already evident: "The present outlook is that broad irrigation or sewage farming is decidedly on the wane with little prospects of adoption even in the arid districts except perhaps for an occasional project where local conditions are unusually favorable" (Fuller 1912). Eventually, sewage farming was almost completely abandoned in most areas of the highly urbanized industrial countries of the western world. At the same time, reclamation and recovery became discredited, and few engineers or scientists showed any interest in the systematic study of the engineering, agronomic, microbiological, and public health aspects of wastewater reuse in agriculture. All this changed after World War II, however, when a new thrust of scientific and engineering interest in wastewater reuse developed in both the industrialized and the developing countries.

PRESENT STATUS OF INTEREST IN WASTEWATER REUSE

Following World War II, the possibilities of wastewater treatment and disposal through land application gained increasing attention from those who saw this as a method of preventing river pollution and of increasing water resources in areas suffering from insufficient overall water supplies. The more arid developing countries were particularly interested in the major economic benefits that could be gained by utilizing wastewater as a water resource for agricultural development.

Willem Rudolf and his group at Rutgers University provided a strong scientific basis for the renewed interest in the microbiological and public health aspects of wastewater irrigation in agriculture (Rudolf, Falk, and Ragatzkie 1950, 1951). Their field experiments on the survival of pathogens in the soil and on sewage-irrigated vegetable crops stimulated research scientists in many areas of the world to initiate investigations along similar lines. Their goal was to provide a rational basis for evaluating the health risks from the microbial contamination of crops. To this day, their widely circulated series of articles reviewing the literature on pathogen survival in soil and on agricultural crops remains the pioneering work in the field.

A major contribution was also made by a number of state health departments in the United States, which established guidelines and regulations to control the sanitary aspects of wastewater reuse in agriculture. The state of California's pioneering regulations in this field were first issued in 1918 and later modified and made more stringent (Ongerth and Jopling 1977).
They provided design engineers, public health authorities, and farmers in the United States and throughout the world with a carefully worked out, rational basis for reintroducing wastewater irrigation in agriculture as a socially acceptable and sanitary practice that could meet the strictest public health criteria (see Table 1-1). That California led the way in this field is not so surprising in that the climate and geographic features of this area made agricultural utilization of wastewater an attractive solution to town and city wastewater disposal problems. The seasonal rain distribution patterns in many parts of California and other western states make it necessary to use considerable supplemental irrigation during long periods of the year when climate conditions are appropriate for growing crops. These same arid and semiarid zones have few flowing streams with a sufficient capacity to serve as natural repositories for even well-treated wastewater effluent. Thus, land disposal through wastewater reuse in agriculture has provided almost the only feasible, relatively low-cost alternative for disposing of wastewater from municipal areas in a sanitary manner that would minimize the pollution of the region's waterways. All these factors, coupled with the rapid urban growth of the region and the need to increase agricultural production, helped to revive the interest of the agricultural community and municipal planners in sewage farms. Furthermore, the regulations developed by the State Health Department provided a strict code of practice that helped to reestablish the credibility of wastewater reuse in agriculture in the western part of the United States. Soon thereafter a similar trend developed in many of the rapidly developing countries faced with water shortages and having insufficient waterways to properly dilute and dispose of municipal wastewater.

TABLE 1-1 California State Department of Health standards for the safe and direct use of reclaimed wastewater for irrigation and recreational impoundments

Minimum required wastewater characteristics: primary a/; secondary and disinfected; or secondary, coagulated, filtered b/, and disinfected. Coliform limits are the median MPN/100 ml of daily samples.

Irrigation
  Fodder crops: no coliform requirement
  Fiber crops: no coliform requirement
  Seed crops: no coliform requirement
  Produce eaten raw, surface irrigated: 2.2
  Produce eaten raw, spray irrigated: 2.2
  Processed produce, surface irrigated: no coliform requirement
  Processed produce, spray irrigated: 23
  Landscapes, parks, etc.: 23

Creation of impoundments
  Lakes (aesthetic enjoyment only): 23
  Restricted recreational lakes: 2.2
  Nonrestricted recreational lakes: 2.2

a. Effluent not containing more than 1.0 ml/liter/hr settleable solids.
b. Effluent not containing more than 10 turbidity units.

Source: After Ongerth and Jopling in Shuval (1977), p. 230.

One of the basic steps that the California State Health Department took was to restrict the use of partially treated sewage to crops that are generally cooked before being consumed. (See Table 1-1 for a summary of the main elements of the California reuse standards.) As a result, the practice of growing salad crops such as lettuce, cucumbers, and tomatoes was effectively discouraged.
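The coliform limits in Table 1-1 are defined as the median MPN of daily samples. The short sketch below is purely illustrative and is not part of the California regulations; the sample values and the function name are hypothetical, and it shows only how a median-based check of daily laboratory results against the 2.2 and 23 MPN/100 ml limits might be carried out.

    # Illustrative only: checking a series of daily coliform results (MPN/100 ml)
    # against the median-based limits of Table 1-1. Data and names are hypothetical.
    from statistics import median

    def meets_coliform_limit(daily_mpn, limit):
        """True if the median of the daily MPN results does not exceed the limit."""
        return median(daily_mpn) <= limit

    # One week of hypothetical daily results for a disinfected effluent.
    daily_mpn = [2.2, 2.0, 5.1, 1.8, 2.2, 2.0, 1.9]

    print(meets_coliform_limit(daily_mpn, 2.2))   # limit for produce eaten raw
    print(meets_coliform_limit(daily_mpn, 23.0))  # limit for spray-irrigated processed produce

A single high daily result does not by itself cause a failure under such a median-based rule, which is why the limit is paired with the treatment requirements listed in the table.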
In addition, California introduced wastewater treatment standards of 2.2 coliform/100 ml, which could only be achieved with much difficulty, that is, through the complete biological treatment of the wastewater, followed by heavy chemical disinfection with agents such as chlorine. The microbiological quality of the effluent used for irrigating such crops would then be more or less parallel to that required for drinking water. In reality, such a standard was almost unattainable in most normal wastewater treatment systems. California's crop restrictions and wastewater treatment requirements were copied almost in entirety by many other states. Furthermore, these standards were either replicated or used as the basis for similar regulations in many of the developing countries that had gained their independence shortly after World War II and that had a strong interest in using wastewater for agricultural purposes and economic development.

During the past thirty-five years, investigations have been launched in all scientific aspects of wastewater reuse, with intensive studies being pursued by researchers at universities and governmental research institutes, and by consulting engineers and municipalities. During this period, some 1,000 research papers, reports, and monographs have been published on every aspect of wastewater, from its composition to its agricultural, municipal, industrial, and recreational value. A number of governments have even officially approved of wastewater land application or wastewater reuse as part of their water pollution control policy and water resources management program.

In 1952, the Israel Ministry of Health, for example, published health guidelines for wastewater reuse in agriculture that were based largely on the California standards (Shuval 1980). Then in 1956 Israel established wastewater reuse as a national policy of water resource conservation in its First National Water Plan. Under the 1956 Israel Water Law, this water-short country provided a legal basis for its wastewater reuse policy by nationalizing all water resources, including wastewater effluents from municipal and industrial sources. The Republic of South Africa has made water reuse for agricultural and municipal purposes an official policy of the Office of Commission for Water Research, and it has also established guidelines relating to the health aspects of reuse.

Of the developing countries, India was one of the first to recognize the importance of wastewater reuse in agriculture, both as a water pollution control strategy and as a way of increasing irrigated agricultural areas. Irrigation is badly needed to augment crop production in that country, which suffers from severe problems of malnutrition. Wastewater reuse for agriculture and for industrial purposes has been strongly promoted in India's governmental policy.

The United States, through the Clean Water Act of 1977, empowered its Environmental Protection Agency to promote wastewater treatment and disposal systems that utilize land treatment processes to reclaim and recycle wastewater. The government promoted this policy in part by providing financial incentives that would encourage land application of wastewater. A long-term goal of this legislation was to develop a national plan for wastewater treatment, including land application of wastewater, that was to achieve "zero" discharge of wastewater to rivers by 1985. The overall goal was to drastically reduce the pollution loads on the waterways of the country.
Thus, the United States began to act on the principles first set forth by the First Royal Commission on Sewage Disposal in England some 110 years earlier. Some of those principles had in the meantime been abandoned in their country of origin.

During the past 100 years, then, the concept of land application and wastewater reuse has gone through a complete cycle. Starting with official blessing and enthusiastic initiation of land application and sewage farming projects in England, Europe, and the United States, it soon became almost the sole method of disposing of municipal wastewater. In the early years of the twentieth century, however, projects were often ill-conceived, inadequately funded, and poorly regulated, and thus were eventually abandoned. Subsequently, the concept of reuse fell into disrepute. Today, wastewater reuse is becoming widely accepted once again, except that now it is based on more rational scientific and engineering principles. In some countries it is used to control water pollution, but more frequently it is seen as an economically feasible source of water in water-short areas.

EXAMPLES OF CURRENT WASTEWATER REUSE PRACTICES IN AGRICULTURE AROUND THE WORLD

United Kingdom

Although England served as the cradle of sewage farming and land application, the number of land application sites reached a peak of some 60 in 1870, and then, for the various reasons described earlier, dropped to only a few sites in the years up to 1955. By 1980, however, the number of land application sites had climbed back to the original peak figure as a result of renewed interest (Jewell and Seabrook 1979). The British Isles, despite their reputation for rainy weather, actually face serious shortages of water. This situation will become even more severe in the future as the country's population continues to grow. Therefore, water reuse--either indirectly, through the reuse of effluent disposed of in rivers, or directly for irrigation or industrial reuse--will undoubtedly increase (Eden, Bailey, and Jones 1977). Programs have been formulated for controlled intentional indirect reuse in some river basins such as the Mardyke River, and direct reuse for industrial purposes is widely practiced. In addition, major research efforts on various aspects of water reuse are now under way.

United States

In 1940 there were only about 150 wastewater land treatment plants in the United States, but by 1980 there were some 3,400 projects utilizing wastewater for agricultural, industrial, and recreational purposes such as irrigating golf courses and recreational lakes for boating (Jewell and Seabrook 1979). Most of these projects have received support through public funds. Although there is considerable interest in using wastewater for municipal purposes at a number of sites in the United States (Denver, Colorado is one, for example), no projects of this type have yet been approved. However, wastewater is being reused for groundwater recharge, particularly in California, where a portion of the water enters the drinking water supply after infiltration through the aquifer and dilution with groundwater.

Recycling efforts in California give an idea of the extent of wastewater reuse in the more arid areas of the United States. According to Ongerth and Jopling (1977), in the late 1970s California had 850 community sewage systems serving a population of 19.4 million.
The total volume of municipal and industrial wastewater was about 2.3 billion gallons per day (210 million cubic meters [MCM] per year), and almost 70 percent of the wastewater generated in the coastal areas was discharged into saline water. About 7 percent of the total wastewater flow was reused through planned reclamation operations. The mineral quality of roughly two-thirds of the total wastewater flow produced by the state is suitable for reuse, whereas the other one-third contains dissolved solids in excess of 1,500 milligrams per liter and would require demineralization for unlimited agricultural reuse.

According to a 1975 inventory of water reclamation operations in California conducted by the Department of Health, the state had a total of 200 reclamation facilities at that time. These facilities provided reclaimed water for a number of uses: nonfood crops (including fodder, fiber, and seed crops--142 projects); landscape irrigation (including parks, golf courses, and highway landscapes--42 projects); food crops (orchards and vineyards--32 projects); planned groundwater recharge (7 projects); ornamental lakes (5 projects); industrial use (8 projects); recreational lakes (5 projects); and wildlife habitats (3 projects). Other applications--such as groundwater recharge by injection, greenbelt irrigation, fire protection, and road compaction--have employed reclaimed water only to a limited extent. An additional seventy operations were either in the planning or construction stage or were land disposal systems that might be considered possible reclamation operations. The California survey provides detailed information on the scope of wastewater reuse in an area where climate conditions are optimal for such a practice.

Israel

One of the countries that has carried out a highly detailed survey of wastewater reuse projects is the State of Israel. Figure 1-2 provides information on the total volume of wastewater produced in the urban areas of Israel between 1963 and 1982. It shows the volume that is actually collected by central sewerage systems and the annual volume purified in treatment plants, as well as the volume utilized in agriculture (Office of the Water Commissioner 1983).

Fig. 1-2. Agricultural reuse of municipal wastewater in Israel, 1963-82 (total potential wastewater, sewered volume, treated volume, and volume reused in agriculture). Source: Office of the Water Commissioner 1983.

In 1963, the total quantity of sewage produced by the urban sector in Israel reached 120 MCM/yr. Seventy percent of this quantity was collected by central sewerage systems, 26 percent of the total was purified in treatment plants, and some 6 percent of the total was utilized in agriculture. By 1982, the total amount of potentially utilizable sewage produced in the urban areas of Israel had almost doubled at 211 MCM/yr, with 91 percent of that flow reaching central sewerage systems and 57 percent of the effluent treated by various types of treatment processes (mainly oxidation ponds). Some 50 MCM of the wastewater was utilized directly for agricultural purposes; this represents 24 percent of the potential wastewater produced by the urban sector of the country. The Office of the Water Commissioner has indicated that it intends to give first priority to the further development of wastewater reuse projects so that by the year 2000 some 80 percent of the total wastewater quantities produced will be fully utilized.
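The percentages quoted above follow directly from the volumes reported by the Office of the Water Commissioner. The brief sketch below simply restates that arithmetic and is illustrative only; the variable names are not from the source.

    # Arithmetic check of the Israeli reuse figures quoted above (volumes in MCM/yr).
    figures = {1963: (120, 0.06 * 120),   # total potential urban flow, volume reused in agriculture
               1982: (211, 50)}

    for year, (produced, reused) in figures.items():
        print(f"{year}: {reused:.0f} of {produced} MCM reused "
              f"({reused / produced:.0%} of the potential urban flow)")
    # 1963: about 7 MCM (6 percent); 1982: 50 MCM (about 24 percent), as stated in the text.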
The total population served by the municipal areas of Israel reached 3,270,000 in 1982. In the same year, the rural sector population reached about 580,000, and was thought to be capable of producing 40 MCM of sewage per year; 18 MCM/yr of that amount, or 46 percent, was collected by sewerage systems. Some 93 percent of the sewage collected in the rural sector is treated. Forty-one percent of the treated sewage effluent of the rural sector is used for irrigation purposes each year. In the urban sector, 42 percent of the treated sewage was utilized in agriculture in 1982.

The total area under wastewater irrigation in Israel in 1982 was about 10,000 ha, of which 87 percent was utilized for growing cotton, 7 percent for citrus, 3 percent for field crops, and 1.8 percent for fruit orchards. The remaining 2 percent of the area was used for growing dates, olives, and miscellaneous crops. The sprinkler method is the most common irrigation technique in Israel. Under the sewage irrigation regulations established by the Israel Ministry of Health, vegetable crops that are eaten uncooked cannot be irrigated with wastewater except with a special permit specifying the effluent quality required. In 1982, some 3 percent of the total amount of the water supplied in the country for all purposes was renovated wastewater.

India

Bombay initiated sewage farming in 1877 and Delhi in 1913 under the guidance of British engineers who introduced the practice to Asia (Jewell and Seabrook 1979). It has been estimated that the daily amount of sewage produced in India reaches 3.6 MCM of wastewater, of which about 50 to 55 percent is being utilized for crop irrigation (Shende and Sundaresan 1980). The major portion of the wastewater flow from Indian cities is disposed of without treatment. Some 105 of India's 3,113 cities (or 3.4 percent) have central sewerage systems. About 28 million of the 109 million persons living in those cities (or about 25 percent) are provided with sewerage systems. The remaining 3,000 urban settlements have no central sewerage systems at the present time.

Shende and Sundaresan estimate that in areas with sewerage, the potential amount of fertilizer nutrients is 225 tons per day of nitrogen, 67 tons per day of phosphorous, and 135 tons per day of potash. Chemical fertilizers produced in India during 1977-78 amounted to 2 million tons of nitrogen and 0.67 million tons of phosphorous (P2O5). According to estimates based on India's 1978 population of 638 million, the annual production of nitrogen in human waste equals 2.5 million tons, with 0.9 million tons of phosphorous (P2O5) and 0.8 million tons of potash (K2O). Thus, at least theoretically, if all the human excreta produced in India were used for fertilization purposes, they would be equivalent to the amount of chemical fertilizers used in agriculture at present. However, since only about 5 percent of the total population is connected to central sewage systems, the total fertilizer contribution of wastewater in India would meet only about 5 percent of the national fertilizer needs. In areas where sewage irrigation is feasible, the wastewater can meet most, if not all, of the nutrient requirements of crops. Only slight fertilizer additions would be required, depending on the type of crops grown and their specific fertilizer demands. Studies in India indicate that the amount of nitrogen supplied by wastewater irrigation is sufficient for, if not in excess of, most crop demands.
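The rough equivalence suggested above between the nutrients in excreta and India's chemical fertilizer production, and the much smaller share actually recoverable through sewerage, can be checked with a few lines of arithmetic. The sketch below is illustrative only; it merely restates the figures quoted from Shende and Sundaresan (1980), and the variable names are not from the source.

    # Nitrogen balance implied by the figures quoted above (million tonnes per year).
    n_in_excreta     = 2.5    # nitrogen in all human excreta produced annually
    n_chemical       = 2.0    # chemical fertilizer nitrogen produced in 1977-78
    sewered_fraction = 0.05   # share of the population connected to central sewerage

    n_recoverable = sewered_fraction * n_in_excreta
    print(f"Nitrogen recoverable through sewerage: {n_recoverable:.3f} million tonnes/yr")
    print(f"As a share of chemical nitrogen production: {n_recoverable / n_chemical:.0%}")
    # Roughly 0.125 million tonnes, or about 6 percent -- in line with the
    # "about 5 percent of the national fertilizer needs" cited in the text.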
In certain cases, supplemental doses of phosphorous are required to provide balanced fertilization of the crops, along with recommended doses of nitrogen, phosphorous, and potash (N-P-K) combined with good agricultural practice. In certain urban areas, wastewater has been reused for cooling water for industrial purposes as well as for certain types of process waters after receiving appropriate treatment to improve its quality.

Out of 246 towns with a population larger than 50,000, some 190 have sewers at present; however, in terms of population served, this amounts to about 40 million people, or only 7 percent of the total population (Arceivala 1977). These figures are somewhat higher than those given by Shende and Sundaresan (1980). Arceivala reports that over 132 farms covering more than 12,000 ha utilize upwards of 1 MCM of wastewater per day. In addition, several more farms receive industrial effluents, particularly from sugar refineries, food processing, and other industries. The authorities encourage the utilization of sewage for farming purposes by extending grants and loans on special terms for sewerage schemes that include sewage irrigation. Wastewater reuse for both agriculture and aquaculture is widely practiced and is backed by governmental policy.

Federal Republic of Germany

According to the estimates of Muller (1977), only about 8 percent of the Federal Republic of Germany needs supplemental irrigation. In 1958, about 1,200 MCM/yr of water was used for irrigation for an area of about 250,000 ha. Of this amount, about 100 MCM/yr was derived from wastewater and used on about 10 percent of the area. An additional 800,000 ha may need irrigation in the future, but the amount of irrigation with wastewater is not expected to increase. Spray irrigation is by far the most common method of wastewater irrigation used in Germany. The largest such project is at Braunschweig, which has served as an active center for research on the engineering, agronomic, and health aspects of wastewater reuse in agriculture. About 3 percent of the total quantity of wastewater collected in sewerage systems in the Federal Republic of Germany is disposed of by irrigation. Muller (1977) assumes that, because of the climatic conditions of the area, there will be no substantial increase in the disposal and use of wastewater discharges by irrigation. He also notes that much of the river water used for irrigation is partly made up of wastewater, since many of Germany's rivers carry heavy loads of partly or fully treated wastewater. Germany has introduced special regulations to protect the health of those living in areas adjacent to wastewater irrigation projects. In general, full biological treatment is required prior to irrigation and, in certain cases, particularly in spray irrigation, this is generally followed by chlorination of the effluent. Irrigation must be stopped fourteen days before grazing on the land or harvesting of the crops is permitted. Under these regulations, vegetables for human consumption may not be irrigated with wastewater.

Latin America

The type of wastewater irrigation currently practiced in Latin America can be illustrated by a few examples. Near Mexico City, the 03 Irrigation District receives about 40 m3/s of wastewater from a major portion of Mexico City and irrigates 41,500 ha with raw sewage; this sewage is held in a large reservoir having a storage capacity of many months where it undergoes sedimentation and partial treatment.
According to the federal authorities in Mexico, vegetable crops for human consumption are not permitted to be irrigated in this district. For the most part, the crops grown (e.g., grain, fodder) are not for human consumption. During a site visit to the project, it appeared that the governmental policy of regulating the types of crops grown with the partially treated wastewater was being observed by the farmers.

The raw wastewater from the city of Santiago, Chile, constitutes almost 100 percent of the dry weather flow in the Rio Mapocho River. Some 16,000 ha in the river valley adjacent to the city are irrigated with this wastewater. These areas grow vegetable crops and salad crops for the city markets. Public health officials have stated that the high rates of enteric disease in Santiago, particularly typhoid fever, can be attributed in part to the consumption of vegetables irrigated by raw wastewater.

Thirty-one reuse projects have been launched in the Peruvian desert coastal area, many of which use stabilization pond effluents. There is also uncontrolled use of wastewater around the capital city of Lima. The use of wastewater for agricultural irrigation is currently being studied by consulting engineers and governmental and university agencies in Lima. The plan is to irrigate some 5,000 ha of desert land after appropriate treatment. The feasibility of using part of the treated effluent for fish culture is also being studied under a research program at the San Juan de Miraflores lagoons near Lima. This program was initiated in 1964 with the construction of twenty-one experimental waste stabilization ponds. The studies include an engineering evaluation of pond operation, evaluation of the sanitary risks of treated effluent reuse for irrigation, and a preliminary evaluation of groundwater contamination from pond infiltration. The UNDP Resource Recovery Project (GLO/80/004), in cooperation with the Pan American Health Organization (PAHO) and local groups of researchers, is investigating the possibility of growing fish and giant prawns in the final ponds, and is also looking at pathogen uptake in the harvested species. The program includes studies on reforestation; the sociocultural aspects of reuse; the creation of greenbelts and recreational parks and resource recovery at a sanitary landfill site; and the growing of animal feed--such as papaya, bananas, corn, and livestock forage--for the zoological park. The growing of aquatic plants such as duckweeds for poultry feed is also under study (Yanez 1980). The San Juan reuse study program is one example of the integrated engineering, public health, and agronomic studies that are being carried out in association with rational, socially acceptable wastewater reuse projects in developing countries.

Republic of South Africa

South Africa is another country in which the reuse of effluent for agricultural, industrial, recreational, and indirect municipal purposes has become an integral part of overall water management. South Africa has an average annual rainfall of 487 mm. The total estimated surface and underground water supplies produce a total of some 60 MCM/day (Hart and Van Vuuren 1977). With the anticipated increasing demand for urban and industrial use, by the year 2000 the total water deficit will approach 20 MCM/day (i.e., some 25 percent of the total potential demand). Thus, wastewater reuse plays a major role in meeting the country's expected water deficits.
A survey of the sewage effluents available and their uses in 20 major cities and towns representing a total population of 5.8 million people indicates that 1.2 MCM of effluent is treated each day. In 1977, about 32 percent of this effluent was reused: 8.7 percent for power station cooling; 16 percent for irrigation of crops, parks, trees, and sports fields; and 7.1 percent for industrial purposes. In the Vaal River Triangle, often called the industrial powerhouse of South Africa, reuse is much more intense, mainly because the water resources of this area are limited. Of the 640,000 m3 of wastewater effluent available daily, 50 percent is reused: 14.4 percent for power station cooling, 26.8 percent for irrigation, and 8.8 percent for industrial purposes. The National Institute for Water Research of the Council for Scientific and Industrial Research in Pretoria has served as a major center for research and development in the fields of wastewater reuse and development of low-cost methods of wastewater treatment. A number of studies in South Africa have also looked into the possibility of reusing wastewater for direct or indirect municipal purposes.

The City of Johannesburg has been using sewage effluent for irrigation purposes since 1914, originally for disposal since irrigation was cheaper than installing purification plants. Today the council owns one of the major cattle herds in the Republic of South Africa; the herd totals 7,500 beef animals, of which 3,800 are breeding cows. Johannesburg has a northern farm in the Jukskei River catchment and a southern farm in the Clip River catchment. The total area of the farms exceeds 6,000 ha, of which about 1,800 ha are under irrigation. The irrigated land is divided according to four uses: winter grazing, summer grazing, summer hay production, and maize for silage. South Africa has also pioneered research and projects in the reuse of effluent for industrial purposes. Present plans indicate that South Africa will continue to emphasize the reuse of wastewater for multiple purposes as part of its program of national water resources conservation.

North Africa and the Middle East

Unpublished documents presented at the WHO Inter-Country Seminar on Wastewater Reuse held in Bahrain in 1984 provide detailed information on recent developments in the use of wastewater in North Africa and the Middle East. Water resources are very limited in the region, and in some areas water reuse has been practiced for many years. The use of modern technology to recycle water is consistent with traditional Islamic religious precepts (Farooq and Ansari 1981). However, there will undoubtedly be problems in introducing wastewater reuse in Islamic countries because of their strict prohibitions concerning contact with human excreta.

Cairo has had a sewage farm since 1915 (Jewell and Seabrook 1979), and wastewater reuse for vegetable irrigation was practiced in Port Said as early as 1924 (Khalil 1931). Wastewater irrigation is still practiced at the Gabel El Asfar municipal sewage farm in Cairo and in other areas. In the plans for the development of new cities in desert areas of Egypt, consideration is being given to the possibility of wastewater reuse in agriculture as a method of wastewater disposal, land reclamation, and of increasing crop production (the country suffers from severe shortages of food). A number of Egypt's laws regulating effluent irrigation forbid the cultivation of vegetables or fruit with effluent except for citrus.
Raising cattle for milk production is also forbidden. Irrigation is one of the possible methods of effluent disposal being considered for Alexandria.

Arid Tunisia, with 176 municipalities, has 28 wastewater treatment plants, most of them stabilization ponds, and has shown a strong interest in agricultural reuse of wastewater by sponsoring a workshop on this subject in 1983 (National Research Council 1983). There is a wastewater irrigation farm at La Soukra, an area of about 800 ha on the northern outskirts of Tunis, and plans to expand this farm to 2,600 ha are in progress. Another wastewater irrigation farm is located at Kairouon, where secondary effluent is used to grow vegetables, although the government's original plan was to use the effluent for grain, trees, and forage crops. The implementation of other land application projects is reported to be under way.

Riyadh in Saudi Arabia uses the effluent of its municipal sewage treatment plant for crop irrigation at a site with good soil conditions some 30 km from the city. At other plants, the effluent is primarily used to irrigate landscape gardening in and around the plant site. A Committee of the Ministries of Agriculture, Health, and Municipal and Rural Affairs, together with representatives of the water and wastewater authorities of the big cities, is developing national standards for effluent reuse.

The Sultanate of Oman is interested in effluent reuse in agriculture as a means of conserving scarce water supplies. At present, however, effluent there appears to be used mainly in drip irrigation systems for shrubs and ornamental trees in a manner that will ensure no human contact.

Kuwait City built a municipal treatment plant in 1956 and first used the effluent to irrigate crops at a small experimental farm. This project has since been expanded, and today the effluent of the 22-MGD activated sludge plant is used to irrigate a 9,000-ha sewage farm run by a private company but supervised by the health authorities. The main crops are alfalfa and other fodder crops, but some vegetables are grown under controlled conditions. The hygienic conditions to which farm laborers are exposed are strictly supervised, and residential areas are kept at a distance of 2-3 km. Other effluent reuse systems are being developed.

Bahrain is planning to use the effluent from the Tubli sewage treatment plant at various farms (total area of some 800 ha). One of the main problems to be overcome here is the high salinity of the effluent, which has a total dissolved solids (TDS) concentration in the neighborhood of 4,500. This will eventually be reduced as reverse osmosis desalination plants are brought on line. Both drip and sprinkler irrigation methods are also being planned.

Morocco has also initiated wastewater irrigation projects and, because of its high aridity and limited natural water supplies, it is giving serious consideration to expanding wastewater reuse. The City of Marrakesh is presently using treated effluent for groundwater recharge and salinity control in an area where irrigation with raw sewage used to be allowed.

Farmers in Teheran, Iran, use raw sewage from the city to grow salad and vegetable crops. A master plan for sewerage and sewage disposal prepared under a UNDP project proposed that all wastewater effluent be reused for unrestricted agriculture after complete treatment in a conventional activated sludge plant followed by disinfection. That plan has not yet been implemented.
Amman, the capital city of the Hashemite Kingdom of Jordan, disposes of its effluent into dry river beds, but part of it is used for agricultural purposes. The city authorities are considering taking steps to purify the wastewater; reuse in agriculture is one of the optional methods of disposal. Reuse in one form or another is being considered for all cities developing sewerage systems in the country.

At Khartoum in Sudan, the effluent of the algar sewage treatment plant, which serves about 15 percent of the population of the capital city, is used to irrigate a 2,800-ha greenbelt area south of the city. The purpose of the greenbelt is to reduce the effect of dust storms, to control desertification of the area, and to produce wood and wood products. Eucalyptus and Meshief trees are grown there, but no food crops will be planted.

There are also ongoing and proposed projects for wastewater irrigation at Akrotiri, Kamares, Famagusta, Limasol, Larnaca, and Nicosia on the island of Cyprus. In addition, many hotels recycle treated wastewater for lawn irrigation.

Central Africa

No detailed information is available on wastewater reuse in Central Africa. However, some reports indicate that wastewater is used for irrigation in a number of areas of Central Africa where the cities have central sewerage systems. As more central sewage systems develop, it should be possible to introduce wastewater reuse in agriculture in the more arid areas as well. Apparently local customs and beliefs in some areas of Africa militate against wastewater reuse.

Southeast Asia

India's use of wastewater has already been discussed. In other areas of Southeast Asia wastewater is used mostly for aquaculture. In Thailand, for example, experimental studies on wastewater reuse in fish ponds have been carried out at the Asian Institute of Technology, and in the Philippines a wastewater reuse project involving aquaculture is also under way.

Japan

The emphasis in Japan is on water reuse for industrial purposes, and thus direct agricultural use is not expected to develop to any great extent (Kubo and Sugiki 1977). As in Europe and the United States, however, indirect reuse of wastewater for irrigation is widespread, because much of the river water being used has high levels of wastewater flow. Many areas of Japan do not have central sewerage systems. Considerable research is therefore focused on the possibilities of safely treating night soil with heat and other processes for application in agriculture.

Soviet Union

Municipal effluent is widely used in irrigation in the Soviet Union. A sewage farm was established in Moscow back in 1900 (Jewell and Seabrook 1979), and Odessa is reported to have had a sewage farm in 1930 (Gromatschewskij 1930). Numerous papers and reports have been published on this subject. The Soviet government has issued regulations on the types of treatment that are required and the types of crops that may be grown with effluent. No details are available at this time as to the extent of wastewater reuse projects and plans for irrigation in the Soviet Union.

China

Although night soil has been used for agricultural fertilization in China for many centuries, little information is available on the use of wastewater for irrigation purposes. Most of the urban areas in China do not have central sewerage systems, which would facilitate the development of this practice. However, Shanghai and a number of other major cities do have sewerage systems in parts of their central areas.
Australia

As noted earlier, Australia has been one of the pioneers of wastewater irrigation. The Werribbee Farm established at Melbourne in 1898 has grown steadily over the years and is still in operation today; it covers a total area of some 10,000 ha consisting of wastewater-irrigated pastures grazed by some 20,000 cattle and 50,000 sheep (Croxford 1978). In a number of the more arid areas of central Australia, wastewater has been used to irrigate agricultural crops as well as recreational areas and golf courses. The state of Victoria developed a master plan for evaluating the potential of wastewater reuse to meet its increasing water shortages. No decision has yet been made as to the implementation of the wastewater reuse policies elucidated in that plan. In view of Australia's serious water problems, wastewater irrigation is expected to be one of the options considered for increasing the available supply of water in the future.

FUTURE TRENDS IN WASTEWATER REUSE IN AGRICULTURE

As the foregoing review has indicated, wastewater reuse has become an important strategy for conserving water resources in arid and semiarid areas of the world suffering from major water shortages. In temperate regions, land application as a wastewater treatment aimed mainly at reducing river pollution has played an important role in the past. Today, such treatment is hampered by the fact that land areas near the major urban centers with high water use have become limited. Moreover, there is widespread concern about sanitary problems during rainy periods. In arid or semiarid areas, where a watercourse would carry little more than sewage flow in the dry season, the advantages of wastewater irrigation are economically attractive. Even in humid areas with regular periods of drought, there are definite advantages to wastewater irrigation. The fertilizer value of the nutrients in wastewater is an additional economic and environmental incentive, particularly for developing countries, where the cost of imported chemical fertilizers makes it difficult to maintain high levels of agricultural productivity.

Areas of the world with 500 mm of rainfall/year or less (see Fig. 1-3) in particular can benefit from wastewater irrigation. The reuse of wastewater from urban centers can help to increase the amount of water available for agriculture in these areas. This is also an economically attractive option for areas that need supplemental irrigation because of the unequal seasonal distribution of rainfall. Figures 1-4 to 1-6 show rainfall distribution patterns and temperatures for representative sites in North and South America (Fig. 1-4), Europe and Asia (Fig. 1-5), and Africa and Australia (Fig. 1-6) where wastewater irrigation is practiced. Horizontal scales on the insert maps are in months (January to December in the Northern Hemisphere, and July to June in the Southern Hemisphere). In each case, summer is in the middle of the scale. Where the dry season coincides with the growing season or a portion of it, supplemental irrigation is usually of great agricultural benefit, and wastewater irrigation is often attractive. Since wastewater flows throughout the entire year, wastewater reservoirs storing water for use during the dry season could not only provide increased amounts of water for irrigation, but could also reduce the pollution in water courses (see Chap. 6 for a description of such reservoirs). Aridity index information is provided for Asia (Fig.
1-7), North and South America (Fig. 1-8), and Africa and Australia (Fig. 1-9). These maps show the ratios of potential evaporation to precipitation and indicate climate zones, particularly those subject to desertification where reuse of wastewater would be extremely important. Areas where the potential mean evaporation is greater than the mean precipitation are considered arid, and supplemental or year-round irrigation may be essential for agricultural production there. From these figures it can be seen that wastewater irrigation could be used to improve agriculture and to provide sanitary disposal of the wastewater in large areas in all continents of the world. These are the regions in which the major development of wastewater reuse in agriculture can be expected to take place. They include large sections of many of the developing countries of South America, Africa, and Asia.

Although the economic and environmental motivations for wastewater reuse in agriculture, particularly in water-short areas, are strong, social and cultural constraints in some countries may limit the possibilities for introducing the practice in those areas. Despite sound scientific arguments in favor of wastewater reuse, implementation may be held up because of local or regional religious, cultural, and social customs and beliefs. Some societies, for example, have taboos against the contact or use of human excreta. Even trained governmental officials in the western countries may have deep fears of communicable disease because of the overreactions to the admonitions against the practice of excreta reuse promoted by western public health officials in the early half of the century. The desire to conform to the strictest standards of the industrialized countries is often strong in developing countries, even when the standards are not applicable to local conditions or may not be justified in the first place. This important aspect of the problem has been dealt with in some detail by Piers Cross--a social anthropologist (Cross and Strauss 1985). The reader is referred to that document for a fuller elucidation of the subject.

Fig. 1-3. Generalized annual global precipitation (mm), in bands of <250; 250-500; 500-1,000; 1,000-2,000; and >2,000. Source: U.S. National Oceanic and Atmospheric Administration (NOAA), Boulder, Colo.; from U.S. Department of Commerce records. Taken from World Bank, Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 3, p. 21.

Fig. 1-4. Climate diagrams for areas with wastewater irrigation, North and South America (sites shown include Lima, Peru; Valparaiso, Chile; Kingston, Jamaica; Mexico City, Mexico; and Makaha Kai, Hawaii; shading distinguishes the dry season, the wet season, and the wet season with precipitation of more than 100 mm). Sources: Adapted from H. Walter and H. Lieth, Klimadiagram Weltatlas (Jena: VEB Gustav Fischer Verlag, 1960) and H. Walter, E. Harnickell, and D. Mueller-Dombois, Climate-diagram Maps (Berlin: Springer-Verlag, 1975). (c) 1960 Gustav Fischer Verlag and 1975 Springer-Verlag, respectively.
Fig. 1-5. Climate diagrams for areas with wastewater irrigation, Asia and Europe (sites shown include Berlin, German Dem. Rep.; Budapest, Hungary; Teheran, Iran; New Delhi, Bombay, and Madras, India; Paphos and Famagusta, Cyprus; and Beersheba, Israel). Sources: Adapted from H. Walter and H. Lieth, Klimadiagram Weltatlas (Jena: VEB Gustav Fischer Verlag, 1960) and H. Walter, E. Harnickell, and D. Mueller-Dombois, Climate-diagram Maps (Berlin: Springer-Verlag, 1975). (c) 1960 Gustav Fischer Verlag and 1975 Springer-Verlag, respectively.

Fig. 1-6. Climate diagrams for areas with wastewater irrigation, Africa and Australia (sites shown include Tunis, Tunisia; Fes, Morocco; Alexandria, Egypt; Khartoum, Sudan; and Windhoek, Namibia). Sources: Adapted from H. Walter and H. Lieth, Klimadiagram Weltatlas (Jena: VEB Gustav Fischer Verlag, 1960) and H. Walter, E. Harnickell, and D. Mueller-Dombois, Climate-diagram Maps (Berlin: Springer-Verlag, 1975). (c) 1960 Gustav Fischer Verlag and 1975 Springer-Verlag, respectively.

Fig. 1-7. Aridity index for Asia (zones shown: humid; savannah or steppe; desertification; desert). Isopleths show ratios of potential evaporation to average precipitation. Ratios are calculated by dividing mean annual net radiation by the product of mean annual precipitation and latent heat of vaporization. Sources: After Budyko-Lettau. Adapted from D. Henning, Atlas of Climate Aridity Indices (unpublished) and F. K. Hare, Climate and Desertification (Toronto: University of Toronto Press, 1976). Used by permission. Taken from World Bank, Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 7, p. 25.

Fig. 1-8. Aridity index for North and South America (zones shown: humid; savannah or steppe; desertification; desert). See note to Figure 1-7. Sources: Same as for Figure 1-7. Taken from World Bank, Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 8, p. 26.

Fig. 1-9. Aridity index for Africa and Australia (zones shown: humid; savannah or steppe; desertification; desert). See note to Figure 1-7. Sources: Same as for Figure 1-7. Taken from World Bank, Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 9, p. 27.
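Stated compactly, the dryness ratio mapped in Figures 1-7 to 1-9 is the one described in the note to Figure 1-7. The symbols below are introduced here only for clarity and do not appear in the source:

    \[
      D \;=\; \frac{R_n}{L\,P}
    \]

where R_n is the mean annual net radiation, P the mean annual precipitation, and L the latent heat of vaporization. Values of D greater than 1 correspond to the arid zones in which potential evaporation exceeds mean precipitation and in which supplemental or year-round irrigation may be essential.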
CHAPTER 2

ENTERIC PATHOGENS IN WASTEWATER AND THEIR SURVIVAL IN SOIL, CROPS, AND IN THE AIR

Many studies have been carried out on the concentration and survival of pathogenic microorganisms--including viruses, bacteria, protozoa, and helminths in excreta, wastewater, sludge, soil, and crops--and their removal or inactivation by wastewater treatment processes. The hundreds of published scientific papers and unpublished reports and monographs that have appeared on these subjects over the past eighty years provide a basis for estimating the potential public health problems associated with the irrigation of crops--either with raw wastewater or after various levels of treatment (Shuval 1977). Rather than present another independent review of this subject (which has been well covered by a number of previous reviews), we have decided to discuss selected highlights from the most recent authoritative and definitive analysis of this subject, prepared by Feachem et al. (1983) for the World Bank Studies in Water Supply and Sanitation. Thus the following sections are drawn from that source with the gracious permission of the authors. For detailed information on this subject, readers are referred to the original work.

PATHOGENS IN EXCRETA

This section presents a brief summary of the principal pathogenic disease agents found in human excreta.

Viruses

Numerous viruses may infect the intestinal tract and be passed in the feces, whereupon they may infect new human hosts by ingestion or inhalation. One gram of human feces may contain 10 infectious virus particles, regardless of whether or not the individual is experiencing any discernible illness. Although they cannot multiply outside a suitable host cell, the excreted viruses may survive for many weeks in the environment, especially if temperatures are cool (<15 C). Concentrations of 10 infectious particles per liter of raw sewage have been reported, and excreted viruses can be readily isolated from soil and natural waters at sites that have been exposed to fecal discharges. Five groups of pathogenic excreted viruses (see Table 2-1) are particularly important--adenoviruses, enteroviruses (including poliovirus), hepatitis A virus, reoviruses, and diarrhea-causing viruses (especially rotavirus). Other virus groups are also found in feces. Infections with all of these are often subclinical, especially in children.

TABLE 2-1 Viral pathogens excreted in feces

Virus                                           Disease                                        Can symptomless infections occur?   Reservoir
Adenoviruses                                    Numerous conditions                            Yes                                 Man
Enteroviruses
  Polioviruses                                  Poliomyelitis, paralysis, and other conditions Yes                                 Man
  Echoviruses                                   Numerous conditions                            Yes                                 Man
  Coxsackieviruses                              Numerous conditions                            Yes                                 Man
Hepatitis A virus                               Infectious hepatitis                           Yes                                 Man
Reoviruses                                      Numerous conditions                            Yes                                 Man and animals
Rotaviruses, Norwalk agent, and other viruses   Diarrhea                                       Yes                                 Probably man

Of the enteroviruses, most poliovirus infections do not give rise to clinical illness. Sometimes, however, infection can lead to mild influenza-like illness, to "virus meningitis," or to paralytic poliomyelitis, which may cause permanent disability or death. It is estimated that paralytic poliomyelitis occurs worldwide in only about 1 out of every 1,000 poliovirus infections, but most of these cases are children in developing countries. Consequently, the number of paralysis cases there can be high.
Echovirus and coxsackievirus infections can cause a wide range of diseases and symptoms, including simple fever, meningitis, respiratory illness, paralysis, and myocarditis. Rotaviruses and other viruses are found in the feces of a large number of young children suffering from diarrhea and are another important group of excreted viruses. Their precise causative role and epidemiology remain uncertain, but they are responsible for a substantial proportion of diarrhea episodes among young children in many countries.

Hepatitis A virus is the causative agent of infectious hepatitis. Infection may lead to jaundice but is often symptomless, especially in young children.

Bacteria

The feces of a healthy person contain large numbers of commensal bacteria of many species. The species of bacteria found in the normal stool and the relative numbers of different species will vary among communities. Because these bacteria are ubiquitous and numerous in the feces of healthy people, they have been used as indicators of fecal pollution. The most widely used indicator has been the fecal coliform Escherichia coli, but enterococci (or, more generally, fecal streptococci), another widespread commensal group, are also used as indicators. Anaerobic bacteria such as Clostridium, Bacteroides, and Bifidobacterium have also served as indicators, and their potential value in this regard is currently attracting increasing attention.

On occasion, some of these bacteria, or their particular strains, may give rise to disease, as may other groups of bacteria normally absent from the healthy intestine. These pathogenic, or potentially pathogenic, bacteria are listed in Table 2-2. They enter a new host most commonly by ingestion (via water, food, the fingers, or dirt), but some may also enter through the lungs (after inhalation of aerosol particles) or through the eye (after the eye has been rubbed with fecally contaminated fingers). At some time during the course of an infection, large numbers of the bacteria will be passed in the feces, and thus the infection is allowed to spread to new hosts. Diarrhea is a prominent symptom of many bacterial intestinal infections. The bacteria may invade the body from the gut and cause either generalized or localized infections. This invasion is characteristic of typhoid infections and other enteric fevers caused by salmonellae. When infections are restricted to the gut, bacteria are passed only in the feces. In typhoid, when invasion of the body has occurred, bacteria will also be found in the bloodstream and may be passed in the urine as well.

A carrier state exists in all the infections listed in Table 2-2. Thus, in communities where these infections are endemic, a proportion of perfectly healthy individuals will be excreting pathogenic bacteria. These carriers play a prominent role in transmitting the infection they carry because they are mobile and thus disperse their feces widely. Cholera provides an example of the problem. A patient with severe cholera will be in bed for most of the time he or she is excreting Vibrio cholerae. Those who nurse the patient clearly are at risk, but the patient is not disseminating bacteria in the community if his excreta are disinfected or properly disposed of in a sanitary manner. By contrast, a patient with a mild case, or carrier, may look relatively healthy and be mobile while excreting up to 100 cholera vibrios per gram of feces.
In some infections, the carrier state may last as long as the illness itself, but in others it may persist for months or even a lifetime. Some carriers may show symptoms of illness and continue to excrete the bacteria, whereas others may be healthy throughout infection. A carrier becomes especially dangerous when preparing or handling food or engaged in supplying water.

TABLE 2-2 Bacterial pathogens excreted in feces

Bacterium | Disease | Can symptomless infection occur? | Reservoir
Campylobacter fetus ssp. jejuni | Diarrhea | Yes | Animals and man
Pathogenic Escherichia coli (a) | Diarrhea | Yes | Man (b)
Salmonella:
  S. typhi | Typhoid fever | Yes | Man
  S. paratyphi | Paratyphoid fever | Yes | Man
  Other salmonellae | Food poisoning and other salmonelloses | Yes | Animals and man
Shigella spp. | Bacillary dysentery | Yes | Man
Vibrio:
  V. cholerae | Cholera | Yes | Man
  Other vibrios | Diarrhea | Yes | Man
Yersinia enterocolitica | Diarrhea and septicemia | Yes | Animals and man (c)

a. Includes enterotoxigenic, enteroinvasive, and enteropathogenic E. coli.
b. Although many animals are infected by pathogenic E. coli, each serotype is more or less specific to a particular animal host.
c. Of the thirty or more serotypes identified so far, a number seem to be associated with particular animal species. There is at present insufficient epidemiological and serological evidence to determine whether distinct serotypes are specific to primates.

TABLE 2-3 Protozoal pathogens excreted in feces

Protozoon | Disease | Can symptomless infection occur? | Reservoir
Balantidium coli | Diarrhea, dysentery, and colonic ulceration | Yes | Man and animals (especially pigs and rats)
Entamoeba histolytica | Colonic ulceration, amoebic dysentery, and liver abscess | Yes | Man
Giardia lamblia | Diarrhea and malabsorption | Yes | Man and animals

Source: Feachem et al. (1983).

Some of the pathogens listed in Table 2-2 are excreted entirely (or almost entirely) by man, but others are excreted by a wide range of animals. As a result, improvements in human excreta disposal alone cannot control these diseases, since they will not affect the transmission of pathogens from animal feces to humans. Three of the major infections listed in Table 2-2 (typhoid, shigellosis, and cholera), however, are assumed to be exclusively human infections that are spread from one person to another.

In summary, all the viral and bacterial pathogens listed in Tables 2-1 and 2-2 are passed in the feces of man or animals; they are not free living (except possibly Vibrio cholerae). Infection of a new host normally follows ingestion of the pathogens and, because transmission occurs primarily through the swallowing of minute quantities of infected feces, the sanitary disposal of all feces (both human and animal) and perfect personal hygiene would virtually eliminate these diseases. This has unfortunately proved to be an unattainable goal for many infections, even in the most affluent societies; a more modest target than eradication must be set: the reduction of transmission to a manageable level.

Bacteria of the genus Leptospira have been excluded from the discussion above because they do not fall within these generalizations. Although leptospirosis in the majority of human cases gives rise to a benign, self-limiting, febrile illness, it occasionally leads to severe, even fatal disease characterized by jaundice and hemorrhage (Weil's syndrome), whereupon death may result from kidney failure.
Leptospira are excreted in the urine of animal carriers, and usually reach new animal and human hosts through skin abrasions or mucous membranes contaminated by infected urine. Man may be an intermittent carrier for a few weeks (seldom months) after an acute infection. Leptospirosis is considered here because of the risk to workers who handle excreta, which may contain leptospires either from animal carriers (for example, the sewer rat, Rattus norvegicus) attracted to such environments or, occasionally, from infected people.

Protozoa

Many species of protozoa can infect man and cause disease. Several such species are harbored in the intestinal tract of man and other animals, where they may cause diarrhea or dysentery. Infective forms of these protozoa are often passed as cysts in the feces, and man is infected when he ingests them. Only three species of human intestinal protozoa are considered to be frequently pathogenic: Balantidium coli, Entamoeba histolytica, and Giardia lamblia (see Table 2-3). An asymptomatic carrier state is common in all three and, in the case of Entamoeba histolytica, carriers are primarily responsible for continued transmission.

Helminths

Many species of parasitic worms, or helminths, have human hosts. Some can cause serious illnesses, but a number generate few symptoms. Only those helminths whose eggs or larval forms are passed in the excreta are of concern to this study. Only Schistosoma haematobium (the agent of urinary schistosomiasis) is voided in the urine; the others discussed here are all excreted in the feces. Helminths (except for Strongyloides) do not multiply within the human host--this factor is of great importance in understanding their transmission, the ways they cause disease, and the effects of environmental changes on their control.

Helminthic disease is not an all-or-nothing phenomenon. In infections due to viruses, bacteria, and protozoa, where massive asexual reproduction occurs within the host, the severity of the infection cannot be related easily to the infecting dose of organisms. One either has measles or a common cold, or not; it is not meaningful to say that someone has "a lot of measles." By contrast, with helminthic infections, it is essential to think quantitatively. The question is not just whether or not someone has a hookworm infection, but how many worms he has (in other words, how "heavy" or "intense" the infection is). Some worm burdens can be determined by purging the patient immediately after an anthelminthic is given, but more usually the output of eggs in the excreta is determined and used as an index of the intensity of infection. Even though there is a good deal of variation from day to day, the relation is valid at the community level, and, in any case, the egg output is always a better measure of transmission and sometimes a better guide to pathology than the burden of adult worms.

Worm burdens and levels of egg output are not evenly distributed among their human hosts; within any sex and age group of an infected community, there will be a few people who are carrying a heavy worm burden and a much larger number with light intensities of infection. In general, the risk of illness and its severity increase with the worm burden. It is therefore common in helminth infections to find many of the community lightly infected, some people (often with heavy infections) ill, and a few dying.
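The simulation below is a minimal numerical illustration of this skew. It is not based on data from this study: the choice of a log-normal burden distribution and its parameters are arbitrary assumptions made purely to show how a heavily skewed distribution concentrates most of the worms--and therefore most of the egg output and most of the severe disease--in a small fraction of an infected community.

    import random

    random.seed(1)

    # Worm burdens for 10,000 infected people, drawn from an assumed, heavily
    # skewed (log-normal) distribution chosen purely for illustration.
    burdens = sorted((int(random.lognormvariate(2.0, 1.2)) for _ in range(10_000)),
                     reverse=True)

    total_worms = sum(burdens)
    heaviest_tenth = burdens[: len(burdens) // 10]

    print("mean worm burden:", round(total_worms / len(burdens), 1))
    print("median worm burden:", burdens[len(burdens) // 2])
    print("share of all worms carried by the most heavily infected 10 percent:",
          round(sum(heaviest_tenth) / total_worms, 2))

Under these assumptions the most heavily infected tenth of the community harbors roughly half of all the worms, even though the median burden is modest--the pattern described in the text.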
It is relatively easy to see the public health importance of the heavy infections but far harder to assess disability in the lightly infected majority, since the consequences are likely to be nonspecific and the effects cumulative with those from other infections. The number of heavy infections is not simply proportional to the prevalence of infection. At high prevalences, increased transmission will tend to push up the proportion of heavy infections, and, at low prevalences, there may be few people heavily infected and the number may change little with transmission. Where immunity acquired by the host is unimportant, a reduction in transmission due to control of excreta may reduce the number of heavy infections and so reduce the burden of disease, even if it has little effect on the prevalence of infection.

Because of this quantitative characteristic, the development of pathology in helminthic infections is usually the result of cumulative worm burdens, often carried over many years as a product of regular and repeated reinfections. In contrast, asexually replicating organisms may cause an overwhelmingly heavy infection and a state of gross disease within a few days or weeks after a single infective dose enters the body.

The excreted helminths are listed in Table 2-4. Their life cycles--that is, the developmental stages through which they pass before reinfecting man--may be very complex (as is shown in the table). The helminths are classified into two main groups: roundworms (nematodes), and worms that are flat in cross section. The flatworms, in turn, may be divided into two groups: the tapeworms (cestodes), which form chains of helminth "segments"; and the flukes (trematodes), which have a single flat, unsegmented body.

The roundworms may cause mechanical obstruction (Ascaris), rectal prolapse (Trichuris), itching around the anus (Enterobius), or anemia (hookworms). They also divert food to themselves and may produce abdominal pain, but in many cases there are no symptoms. Adult tapeworms create health problems mainly by depriving their host of nutrients. Of the trematodes, some inhabit and damage the liver (Clonorchis) or lungs (Paragonimus). The schistosomes live outside the intestine in small blood vessels; the eggs that fail to escape from the host may damage several organs. Intestinal flukes may occur in large numbers, are transmitted primarily through food, and cause relatively mild symptoms.

Most of the roundworms that infect man, and also the schistosome flukes, have separate sexes, with the result that transmission depends upon infection with both male and female worms and upon the meeting, mating, and egg production of these worms within the human body. Individuals may be infected with a single sex or with unmated worms, but these cases are not epidemiologically significant because they do not transmit infection.

SURVIVAL OF INDICATORS AND PATHOGENS

From the time of excretion, the concentration of all pathogens usually declines owing to the death or loss of infectivity of some of the organisms. Viruses and protozoa always decrease in numbers following excretion, but bacteria may multiply if they find themselves in a suitably nutrient-rich environment with a minimum of competition from other microorganisms. This can occur when salmonellae, for instance, contaminate certain foods, or when E. coli multiply in chlorinated sewage effluent from which many other bacteria have been eliminated.
Multiplication of bacterial pathogens is generally rare, however, and is unlikely to continue for very long. Intestinal helminths--except the trematodes, which have a multiplication phase in their molluscan intermediate hosts--will decrease in numbers following excretion. (The multiplication possibilities for the excreted pathogens are summarized in Table 2-13.)

The ability of an excreted organism to survive is referred to as its persistence. The natural death of organisms when exposed to a hostile environment is a factor of the utmost importance because the infectivity of excreta is reduced independently of any treatment process. In fact, some treatment processes have little effect on excreted pathogens and simply allow the necessary time for natural die-off to occur. Conventional sewage treatment has this kind of effect on protozoal cysts. Certain treatment processes, however, create conditions that are particularly hostile to excreted pathogens and promote their rapid death. The effects of activated sludge on fecal bacteria, or of thermophilic digestion on all organisms, fit this category.

TABLE 2-4 Helminthic pathogens excreted in feces

Helminth | Common name | Disease | Transmission | Distribution
Ancylostoma duodenale | Hookworm | Hookworm | Man -> soil -> man | Mainly in warm wet climates
Ascaris lumbricoides | Roundworm | Ascariasis | Man -> soil -> man | Worldwide
Clonorchis sinensis | Chinese liver fluke | Clonorchiasis | Man or animal -> aquatic snail -> fish -> man | Southeast Asia
Diphyllobothrium latum | Fish tapeworm | Diphyllobothriasis | Man or animal -> copepod -> fish -> man | Widely distributed, mainly in temperate regions
Enterobius vermicularis | Pinworm | Enterobiasis | Man -> man | Worldwide
Fasciola hepatica | Sheep liver fluke | Fascioliasis | Sheep -> aquatic snail -> aquatic vegetation -> man | Worldwide in sheep- and cattle-raising areas
Fasciolopsis buski | Giant intestinal fluke | Fasciolopsiasis | Man or pig -> aquatic snail -> aquatic vegetation -> man | Southeast Asia, mainly China
Gastrodiscoides hominis | n.a. | Gastrodiscoidiasis | Pig -> aquatic snail -> aquatic vegetation -> man | India, Bangladesh, Vietnam, Philippines
Heterophyes heterophyes | n.a. | Heterophyiasis | Dog or cat -> brackish-water snail -> brackish-water fish -> man | Middle East, southern Europe, Asia
Hymenolepis nana | Dwarf tapeworm | Hymenolepiasis | Man or rodent -> man | Worldwide
Metagonimus yokogawai | n.a. | Metagonimiasis | Dog or cat -> aquatic snail -> freshwater fish -> man | East Asia, Siberia (USSR)
Necator americanus | Hookworm | Hookworm | Man -> soil -> man | Mainly in warm wet climates
Opisthorchis felineus | Cat liver fluke | Opisthorchiasis | Cat or man -> aquatic snail -> fish -> man | USSR, Thailand
O. viverrini | n.a. | | |
Paragonimus westermani | Lung fluke | Paragonimiasis | Pig, man, dog, cat, or other animal -> aquatic snail -> crab or crayfish -> man | Southeast Asia, scattered foci in Africa and South America
Schistosoma haematobium | Schistosome | Schistosomiasis; bilharziasis | Man -> aquatic snail -> man | Africa, Middle East, India
S. japonicum | Schistosome | Schistosomiasis; bilharziasis | Animals and man -> snail -> man | Southeast Asia
S. mansoni | Schistosome | Schistosomiasis; bilharziasis | Man -> aquatic snail -> man | Africa, Middle East, Central and South America
Strongyloides stercoralis | Threadworm | Strongyloidiasis | Man -> man | Mainly in warm wet climates
Taenia saginata | Beef tapeworm | Taeniasis | Man -> cow -> man | Worldwide
T. solium | Pork tapeworm | Taeniasis | Man -> pig (or man) -> man | Worldwide
Trichuris trichiura | Whipworm | Trichuriasis | Man -> soil -> man | Worldwide

n.a. Not applicable.
The essential environmental factors in limiting pathogen persistence are time and temperature. The success of a given treatment process in reducing the pathogenicity of an effluent or sludge thus depends, in general, upon the retention time of the process and whether it creates an environment especially hostile to particular organisms. The sole environmental condition associated with night soil or a sewage treatment system that is highly fatal to all pathogens in a reasonably short time (a few hours) is raised temperature (in the range of 55-65° C). The only other low-cost process that causes 100 percent removal or destruction of most pathogens is the waste stabilization pond system with its long retention times, exposure to sunlight, and good sedimentation properties. Time lapse is a common factor in all treatment, disposal, and reuse technologies; in many cases, it is the feature that most determines the pathogen removal achieved. The rate of infectivity loss of an organism also depends very much on temperature: most organisms survive well at low temperatures (about 5° C) and die rapidly at high temperatures (>40° C). Except in sludge or night soil digestion or composting processes, temperatures approximate environmental temperatures--in most developing countries, the range is generally 15-35° C and commonly 20-30° C. It is therefore useful to know the persistence of pathogens at ambient temperatures in different environments so that the likely pathogen content of various fecal products can be predicted. This section reviews pathogen survival at ambient temperatures in various environments: in feces, night soil, and sludge; in water and sewage; in soil; and on crops.

The shape of the curve describing pathogen survival over time should determine the way in which survival is reported. Many bacterial populations decline exponentially, so that 90 or 99 percent of the bacteria are lost relatively quickly; a few organisms may persist for longer periods. Such a situation is best described by the probability of survival for a given time or by the half-life, which is the time required for half the population to die. For instance, 50 percent of fecal coliforms may die in water after 20 hours, whereas a few may persist for up to 50 days; the results obtained will depend heavily on sampling procedures. Most of the literature gives data on the persistence of the small proportion of long-term survivors, and only a few authors have reported the shape of the death curve or given the 50-90 percent destruction times. The discussion below therefore concentrates on the maximum persistence of a few organisms. This focus is epidemiologically appropriate for organisms that can replenish their numbers if they find themselves on food or other suitable substrates (e.g., shigellae, salmonellae, and pathogenic E. coli), or for organisms whose infective dose is believed to be low (such as the excreted viruses). It is less appropriate for cases in which regrowth is unlikely and infective doses may be high (for example, Vibrio cholerae); here what is important is the rapid decline of the bacteria to a level that no longer presents a major public health hazard. In organisms having several developmental stages outside the human host (such as hookworms and schistosomes), each stage will have its own survival pattern.
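Before continuing, a short numerical aside illustrates the exponential die-off and half-life concepts introduced above. It is a minimal sketch: the 20-hour half-life for fecal coliforms in water is the figure quoted in the text, while the choice of elapsed times and target reduction levels is purely for demonstration.

    import math

    def surviving_fraction(t_hours, half_life_hours):
        """Fraction of an exponentially declining population left after t_hours."""
        return 0.5 ** (t_hours / half_life_hours)

    def time_for_reduction(log10_reduction, half_life_hours):
        """Hours needed for a given log10 reduction (e.g., 2 = 99 percent destruction)."""
        return log10_reduction * half_life_hours * math.log(10) / math.log(2)

    HALF_LIFE = 20.0  # hours; fecal coliforms in water, as quoted in the text

    for days in (1, 5, 20, 50):
        frac = surviving_fraction(days * 24, HALF_LIFE)
        print(f"after {days:>2} days: {frac:.2e} of the initial population remains")

    print("time to 90 percent destruction:",
          round(time_for_reduction(1, HALF_LIFE) / 24, 1), "days")
    print("time to 99 percent destruction:",
          round(time_for_reduction(2, HALF_LIFE) / 24, 1), "days")

On these assumptions, 99 percent of the population is gone within about a week, and only a vanishingly small tail survives to the 50-day maxima reported in the literature--which is why maximum persistence and 50-90 percent destruction times tell rather different stories.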
When a developmental stage is under way yet depends on an unreplenished energy source (e.g., the schistosome miracidium seeking its snail host), the length of life may be precisely defined.

In Feces, Night Soil, and Sludge

There is less literature on the survival of pathogens in these media than in aqueous environments. Some sources refer to survival of pathogens in the sludges of sewage works, but survival in feces and night soil may be assumed to be broadly similar. Research on pathogen survival in these media may be summarized as shown in Table 2-5.

TABLE 2-5 Survival times of excreted pathogens in feces, night soil, and sludge at 20-30° C

Pathogen | Survival time (days)
Viruses
  Enteroviruses (a) | <100 but usually <20
Bacteria
  Fecal coliforms | <90 but usually <50
  Salmonella spp. | <60 but usually <30
  Shigella spp. | <30 but usually <10
  Vibrio cholerae | <30 but usually <5
Protozoa
  Entamoeba histolytica cysts | <30 but usually <15
Helminths
  Ascaris lumbricoides eggs | Many months

a. Includes polio-, echo-, and coxsackieviruses.

In Water and Sewage

Many studies have been conducted on the survival of excreted organisms in water and sewage. The data are summarized in Table 2-6.

TABLE 2-6 Survival times of excreted pathogens in freshwater and sewage at 20-30° C

Pathogen | Survival time (days)
Viruses (a)
  Enteroviruses (b) | <120 but usually <50
Bacteria (a)
  Fecal coliforms | <60 but usually <30
  Salmonella spp. | <60 but usually <30
  Shigella spp. | <30 but usually <10
  Vibrio cholerae (c) | <30 but usually <10
Protozoa
  Entamoeba histolytica cysts | <30 but usually <15
Helminths
  Ascaris lumbricoides eggs | Many months

a. In seawater, viral survival is less, and bacterial survival is very much less than in freshwater.
b. Includes polio-, echo-, and coxsackieviruses.
c. V. cholerae survival in aqueous environments is still uncertain.

For all organisms, survival is highly dependent on temperature, with greatly increased persistence at lower temperatures. Survival of bacteria also depends greatly on the presence of other microorganisms in the water that might provide competition or predation. Bacteria often survive longer in clean water than in dirty water, and the longest survival times are obtained by inoculating a single bacterial species into sterilized water. There is some evidence that virus survival is enhanced in polluted waters, presumably as a result of some protective effect that the viruses may receive when they are adsorbed onto suspended solid particles in dirty water.

Coliforms, in particular E. coli, have attracted the most interest because of their established role as indicator bacteria. Substantial regrowth of coliforms is possible in organically polluted waters, but this growth phase eventually gives way to a progressive die-off. Survival in excess of 50 days is most unlikely, and, at 20-30° C, 20 days is a more common survival time. Mixed fecal streptococci have a similar (perhaps somewhat longer) survival, but, if the streptococci are predominantly S. bovis or S. equinus, the survival times are substantially shorter. Salmonella survival has also been widely reported; survival of over 2 months has been recorded, but 1 month is a more common upper limit. Shigella spp. and Vibrio cholerae are less persistent, and survival of these bacteria for more than 20 days is seldom reported.

With the development of viral detection techniques in the 1950s it became possible to demonstrate the presence of excreted viruses in the environment.
The enteroviruses (polio-, coxsackie-, and echoviruses) have been frequently isolated from water and wastewater; the literature on this subject is growing rapidly. Viral survival may be longer than bacterial survival and is greatly increased at lower temperatures. In the 20-30° C range, 2 months seems a typical survival time, whereas at around 10° C, 9 months is a more realistic figure.

Protozoal cysts are poor survivors in any environment. A likely maximum for Entamoeba histolytica in sewage or polluted water is about 20 days. Helminth eggs vary from the very fragile to the very persistent. The most persistent of all are Ascaris eggs, which may survive for a year or more.

In Soil

Survival times in soil are relevant in all situations where effluent, sludge, compost, or other fecal products are being applied to the land as fertilizers or soil conditioners. Several factors (see Table 2-7) affect the survival time of enteric bacteria in soil (Gerba, Wallis, and Melnick 1975). Fecal coliforms can survive for many months under optimal conditions. In warm, especially arid, climates, survival is limited to 2-3 months at most. Fecal streptococcal survival is likely to be longer if human enterococcal species are dominant. Salmonellae may survive up to 1 year if the soil is cool, moist, and rich in organics (for example, if it is fertilized), but this varies considerably with the strain, and 50 days would be a more typical maximum. Data on Shigella or Vibrio cholerae survival in soil are limited. The information available on viruses suggests that virus particles adsorb to soil particles and become protected from environmental factors. Viral survival is greater at low temperatures: survivals of up to about 3 months have been reported in warm weather, increasing to around 5 months in European winter conditions. Protozoal cysts in soil are unlikely to survive for more than 10 days. Helminth egg survival varies enormously, but Ascaris eggs can survive for several years. The situation is summarized in Table 2-8.

On Crops

Excreted viruses and bacteria cannot penetrate undamaged vegetable skins. However, there are many reports in the literature on the isolation of all kinds of excreted pathogens from the surfaces of vegetables that have been irrigated or fertilized with fecal products. Root vegetables are more prone to contamination than others. Weather conditions have an important influence on the survival of pathogens on plants; warmth, sunshine, and low air humidity greatly promote pathogen death. The survival characteristics of various excreted organisms on crops are summarized in Table 2-9. As indicated in the table, pathogen survival times on vegetables are short compared to survivals in other environments. Protozoal cysts are rapidly killed. Viruses, bacteria, and worm eggs survive longer, but few species can be expected to survive after 2 months. However, in most field situations, survival of pathogens on crops is sufficient to enable pathogens to survive harvesting and marketing and to reach consumers. A graphic presentation of the survival times of selected enteric pathogens, developed for this study, is shown in Figure 2-1. It is based on data presented in Feachem et al. (1983).

OVERALL PATHOGEN REMOVAL EFFICIENCY OF WASTEWATER PROCESSES

The removal efficiency for various pathogens by wastewater treatment processes has been summarized in general terms in Table 2-10 (see Chapter 5 for details).
TABLE 2-7 Factors affecting survival time of enteric bacteria in soil

Soil factor | Effect on bacterial survival
Antagonism from soil microflora | Increased survival time in sterile soil
Moisture content | Greater survival time in moist soils and during times of high rainfall
Moisture-holding capacity | Survival time is less in sandy soils than in soils with greater water-holding capacity
Organic matter | Increased survival and possible regrowth when sufficient amounts of organic matter are present
pH | Shorter survival time in acid soils (pH 3-5) than in alkaline soils
Sunlight | Shorter survival time at soil surface
Temperature | Longer survival at low temperatures; longer survival in winter than in summer

Source: Adapted from Gerba, Wallis, and Melnick (1975).

TABLE 2-8 Survival times of excreted pathogens in soil at 20° C

Pathogen | Survival time (days)
Viruses
  Enteroviruses (a) | <100 but usually <20
Bacteria
  Fecal coliforms | <70 but usually <20
  Salmonella spp. | <70 but usually <20
  Vibrio cholerae | <20 but usually <10
Protozoa
  Entamoeba histolytica cysts | <20 but usually <10
Helminths
  Ascaris lumbricoides eggs | Many months

a. Includes polio-, echo-, and coxsackieviruses.

TABLE 2-9 Survival times of excreted pathogens on crops at 20-30° C

Pathogen | Survival time (days)
Viruses
  Enteroviruses (a) | <60 but usually <15
Bacteria
  Fecal coliforms | <30 but usually <15
  Salmonella spp. | <30 but usually <15
  Shigella spp. | <10 but usually <5
  Vibrio cholerae | <5 but usually <2
Protozoa
  Entamoeba histolytica cysts | <10 but usually <2
Helminths
  Ascaris lumbricoides eggs | Many months

a. Includes polio-, echo-, and coxsackieviruses.

TABLE 2-10 Enteric pathogen removal efficiencies of wastewater treatment processes (in log10 units; i.e., 4 = 10⁻⁴ = 99.99 percent removal)

Treatment process | Viruses | Bacteria | Protozoa | Helminths
Primary sedimentation | 0-1 | 0-1 | 0-1 | 0-1
Septic tanks | 0-1 | 1-2 | 1-2 | 1-2
Trickling filters | 0-1 | 0-2 | 0-1 | 0-1
Activated sludge | 1-2 | 2-3 | 1-2 | 1-2
Stabilization ponds (20 days--4 cells) | 2-4 | 4-6 | 4-6 | 4-6

Source: This table was developed for this study and is based on a review of numerous published laboratory and field studies.

[Figure 2-1 is a chart showing, for each of fourteen selected enteric pathogens--Campylobacter spp., Giardia lamblia, Entamoeba histolytica, Shigella spp., Vibrio cholerae, Salmonella typhi, Salmonella spp., pathogenic Escherichia coli, enteroviruses, hepatitis A virus, Ancylostoma duodenale, Trichuris trichiura, Taenia saginata, and Ascaris lumbricoides--the estimated average life of the infective stage at 20°-30° C (in months) and the typical average number of organisms excreted per gram of feces.]

Fig. 2-1. Persistence of selected enteric pathogens in water, wastewater, soil, and on crops. Source: Based on data from Feachem et al. 1983.

This discussion concentrates on pathogen survival rather than pathogen removal because health hazards are posed by the pathogens that survive a treatment process, not by those that are removed by treatment. Figures such as 99 percent or 99.9 percent removal appear highly impressive, but they represent 1 or 0.1 percent survival, respectively, and this degree of survival may be highly significant wherever incoming concentrations are great. If an influent to a sewage works contains, say, 10⁵ pathogenic bacteria per liter, then 99 percent removal will produce an effluent with 10³ pathogenic bacteria per liter.
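The arithmetic behind these statements is summarized in the short sketch below. It is a minimal illustration: the influent concentration of 10⁵ bacteria per liter is the kind of figure used in the example above, not a measured value, and the log-unit removals assigned to each process are single values chosen from within the ranges of Table 2-10.

    def percent_removed(log10_units):
        """Convert a removal expressed in log10 units into a percentage."""
        return 100.0 * (1.0 - 10.0 ** (-log10_units))

    def effluent_concentration(influent_per_liter, log10_units):
        """Concentration surviving a process with the given log10 removal."""
        return influent_per_liter * 10.0 ** (-log10_units)

    influent = 1e5  # pathogenic bacteria per liter (illustrative figure)

    for process, logs in [("primary sedimentation", 1),
                          ("activated sludge", 2),
                          ("stabilization ponds (20 days, 4 cells)", 5)]:
        print(f"{process}: {percent_removed(logs):.3f}% removal, "
              f"{effluent_concentration(influent, logs):.0f} bacteria per liter remain")

The point of expressing removal in log units is visible in the output: moving from 99 percent to 99.999 percent removal looks like a trivial change when quoted as a percentage, but it is the difference between 1,000 and 1 surviving bacteria per liter in this example.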
In areas where the effluent is to be reused, or where it is to be discharged to a stream that populations downstream use as a source of drinking water, such effluent quality may be inadequate. The emphasis in the literature on the exact proportions of pathogens removed by various treatment processes is thus misleading. For instance, most conventional treatment plants remove 90 to 99 percent of enteric bacteria. This is a very poor level of removal; whether trickling filters remove a little less (say, 95 percent) than activated sludge plants (say, 99 percent) matters little, since both are technologies with poor pathogen removal characteristics (but they were never designed to have them, as the next section explains). A removal capability of less than 99 percent always means more than 1 percent survival--that is, less than a 2 log-unit reduction. In developing countries, where incoming wastes have high concentrations of pathogens (especially viruses, bacteria, and protozoal cysts--see Table 2-11), a survival of more than 1 percent is usually inadequate.

TABLE 2-11 Possible output of selected pathogens in the feces and sewage of a tropical community of 50,000 in a developing country

Pathogen | Prevalence of infection in country (percent) (a) | Average number of organisms per gram of feces | Total excreted daily per infected person (c) | Total excreted daily by town | Concentration per liter in town sewage (b)
Viruses
  Enteroviruses (d) | 5 | 10⁶ | 10⁸ | 2.5 x 10¹¹ | 5,000
Bacteria
  Pathogenic E. coli (e) | ? | 10⁸ | 10¹⁰ | ? | ?
  Salmonella spp. | 7 | 10⁶ | 10⁸ | 3.5 x 10¹¹ | 7,000
  Shigella spp. | 7 | 10⁶ | 10⁸ | 3.5 x 10¹¹ | 7,000
  Vibrio cholerae | 1 | 10⁶ | 10⁸ | 5 x 10¹⁰ | 1,000
Protozoa
  Entamoeba histolytica | 30 | 15 x 10⁴ | 15 x 10⁶ | 2.25 x 10¹¹ | 4,500
Helminths
  Ascaris lumbricoides | 60 | 10⁴ (f) | 10⁶ | 3 x 10¹⁰ | 600
  Hookworms (g) | 40 | 800 (f) | 8 x 10⁴ | 1.6 x 10⁹ | 32
  Schistosoma mansoni | 25 | 40 | 4 x 10³ | 5 x 10⁷ | 1
  Taenia saginata | 1 | 10⁴ | 10⁶ | 5 x 10⁸ | 10
  Trichuris trichiura | 60 | 2 x 10³ (f) | 2 x 10⁵ | 6 x 10⁹ | 120

? Uncertain.
Note: This table is hypothetical, and the data are not taken from any actual, single town. For each pathogen, however, the figures are reasonable and congruous with those found in the literature. The concentrations derived for each pathogen in sewage are in line with higher figures in the literature, but it is unlikely that all these infections at such relatively high prevalences would occur in any one community.
a. The prevalences given in this column refer to infection and not to morbidity.
b. It must be recognized that the pathogens listed have different abilities to survive outside the host and that the concentrations of some of them will rapidly decline after the feces have been passed. The concentrations of pathogens per liter in the sewage of the town were calculated by assuming that 100 liters of sewage are produced daily per capita and that 90 percent of the pathogens do not enter the sewers or are inactivated in the first few minutes after excretion.
c. To calculate this figure, it is necessary to estimate a mean fecal weight for those people infected. This must necessarily be the roughest of estimates because of the age-specific fecal weights and the age distribution of infected people in the community. It was assumed that people more than 15 years of age excrete 150 grams daily and that people under 15 excrete, on the average, 75 grams daily. It was also assumed that two-thirds of all infected people are younger than 15. This gives a mean fecal weight for infected individuals of 100 grams.
d. Includes polio-, echo-, and coxsackieviruses.
e. Includes enterotoxigenic, enteroinvasive, and enteropathogenic E. coli.
f. The distribution of egg output from people infected by these helminths is extremely skewed; a few people excrete very high egg concentrations.
g. Ancylostoma duodenale and Necator americanus.

In comparing the capability of treatment technologies to remove pathogens, one should not dwell on trivial differences (for instance, 92.3 percent versus 97.8 percent removal), but should translate removal efficiencies into orders of magnitude. Conventional treatment works remove between 1 and 2 log units of enteric bacteria and should be contrasted with technologies, such as waste stabilization ponds, that remove 5 log units. In considering stabilization ponds or thermophilic digesters, which have high removal performances, one should avoid a comparison that is formulated in terms of percentage removal (use of this convention disguises, for instance, the important difference between 99.99 and 99.999 percent removal). The removal characteristics of treatment technologies should be related to the incoming concentrations of particular pathogens, to the intended reuse or disposal arrangements, and to the associated health risks. Different pathogens occur in varying concentrations and are affected in different ways by a given treatment technology. For instance, protozoal cysts will be found in raw sludge in relatively low numbers and will not survive sludge treatment. In contrast, Ascaris eggs may be found in sludge in high concentrations and will survive most sludge treatment processes.

DISPERSION OF AEROSOLIZED ENTERIC PATHOGENS

Aerosols are defined as particles from 0.01 to 50 µm in size that are suspended in the air. In various wastewater treatment and disposal processes, aerosol droplets containing pathogens from the wastewater stream can be formed as a result of aeration processes, trickling, and sprinkler spraying of wastewater and sludge. Studies in wastewater sprinkler irrigation have shown that between 0.1 and 1 percent of the liquid can be aerosolized, depending on sprinkler nozzle design, water pressure, and wind speed.

The initial density of the microorganisms in the aerosols is a function of their concentration in the wastewater stream undergoing aerosolization. However, this initial concentration is reduced rapidly as the aerosols disperse with the wind toward adjacent populated areas, which may thus become exposed to the airborne pathogens. The factors affecting the concentration of airborne aerosolized pathogens at downwind sites include the effect of initial aerosol shock or impact; biological decay, or die-away with time and distance downwind; and, of course, physical dispersion as a result of dilution of the aerosol particles within the air stream. Studies have indicated that although conventional indicator bacteria have been shown to have a 0.5-1.0 log10 loss as a result of initial aerosol shock, some pathogenic bacteria are more resistant; enteroviruses have been found to be least affected.

Several environmental parameters influence biological die-away or decay of airborne pathogens, including relative humidity, ultraviolet radiation, and temperature. Aerosolized indicator bacteria have been found to die away rapidly at the rate of about 1.0 log10 reduction per 100 m of travel in a moderate wind. Bacteriophages have been shown to be more resistant, and significant decay for enteric viruses is considered unlikely at any reasonable distance from the spray source (Sorber and Sagik 1980).
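A rough numerical sketch of these die-away figures is given below. It is illustrative only: the 0.5 log10 initial aerosol shock and the die-away of about 1.0 log10 per 100 m are the figures quoted above for indicator bacteria, while the simple multiplicative model--which ignores physical dilution by the wind--is an assumption made for the example.

    def surviving_fraction(initial_shock_log10, decay_log10_per_100m, distance_m):
        """Fraction of aerosolized organisms still viable after the initial aerosol
        shock and biological die-away over the travel distance.
        (Physical dilution within the air stream is not included.)"""
        return 10.0 ** (-(initial_shock_log10 + decay_log10_per_100m * distance_m / 100.0))

    # Figures quoted in the text for conventional indicator bacteria:
    # 0.5-1.0 log10 initial shock; about 1.0 log10 die-away per 100 m in a moderate wind.
    for distance in (0, 100, 400, 730, 1000):
        frac = surviving_fraction(0.5, 1.0, distance)
        print(f"{distance:>4} m downwind: {frac:.1e} of aerosolized indicator bacteria still viable")

On this simple reckoning only a minute fraction of the bacteria aerosolized at the sprinkler would remain viable after several hundred meters of travel, which is consistent with the field observations reported below, in which distant air samples were often virus-positive while negative for one or more bacterial indicators.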
Various field studies have been able to detect enteric bacterial indicators as much as 1,000 m downwind from wastewater processes generating aerosols. Typical concentrations of airborne indicator organisms at downwind sites have ranged from 1 to 100/m³. Thus a person breathing normally and inhaling about 10 to 20 m³ of air per day might respire from 10 to 1,000 airborne microorganisms of wastewater origin per day. A portion of these may be retained in the lung and could cause infection in a nonimmune person.

A recent study in Israel detected total coliforms, fecal coliforms, and fecal streptococci at air sampling stations 730 m downwind of sprinkler irrigation sites adjacent to residential areas in small agricultural communities (kibbutzim). One hundred percent of the night-time air samples at that distance were above background levels, and some 80 percent of daytime samples were above background. About 40 percent of the viable airborne bacteria of wastewater origin were entrapped within aerosols in the 1-5 µm size range. This is the size range considered respirable, that is, able to be retained in the lung. In that same study, enteric viruses were detected in air samples at downwind sites up to 730 m away. Eighty-one percent of the virus-positive air samples were negative for one or more of the conventional bacterial indicators, and 31 percent were negative for all three conventional indicator organisms (Applebaum et al. 1984). These findings support the hypothesis that aerosolized enteric viruses are resistant to hostile environmental factors associated with airborne dispersion. In sprinkler irrigation with wastewater, such viable aerosolized enteric viruses can be carried considerable distances downwind to potentially susceptible populations working or living in the vicinity. The potential for disease transmission by such airborne enteric viruses exists, at least in theory, since the minimal infectious dose is in some cases very low and some types are known to cause infections by the respiratory route.

GENERAL CONCLUSIONS

It is possible to draw some general conclusions from the foregoing discussion concerning the concentrations of enteric pathogens in wastewater and their survival in soil, on crops, and in the air.

The full spectrum of enteric pathogens--including viruses, bacteria, protozoa, and helminths--endemic within a community, whether causing clinical disease or subclinical infections, is excreted in very high concentrations and appears in the raw wastewater stream in concentrations that, depending on the pathogen, range from a few organisms to many thousands of organisms per liter (see Table 2-11). Conventional treatment processes may reduce the concentration of pathogens by a few orders of magnitude; nevertheless, numerous pathogens are left in the effluent at concentrations of 10-100 per liter. If applied to the soil by the agricultural irrigation methods normally used in developing countries (such as flood or ridge-and-furrow irrigation), many of these pathogens can survive for weeks or months, especially in moist shaded areas. The helminths are particularly resistant to the environmental conditions in the soil, and, in the extreme case, the eggs of Ascaris can survive for as much as a year in fecally contaminated soil. Although survival times on crops are generally shorter, many pathogens can survive for days or weeks, particularly in moist protected areas of vegetable and salad crops (leafy vegetables such as lettuce or cabbage, and root crops such as carrots or radishes provide such protective conditions).
Thus, both field and laboratory studies have shown that pathogens applied to the land in raw wastewater can survive long enough in the soil, and on the crops, for some of them to persist through harvesting and marketing and reach the public consuming such crops. The agricultural workers who are in direct contact with the wastewater and the wastewater-irrigated soil and crops are even more likely to be exposed to these enteric pathogens. Enteric pathogens, particularly viruses, can be transported over considerable distances in the form of aerosolized droplets resulting from sprinkler irrigation with wastewater. The droplets can then be inhaled by workers or nearby population groups.

Whether or not people actually become infected or ill after working in wastewater-irrigated fields, after consuming wastewater-irrigated vegetables, or after breathing aerosolized pathogens from wastewater sprinkler irrigation depends on a number of additional factors to be discussed later. These go beyond the question of the mere presence of pathogens in the soil, on the crops, or in the air, to include minimal infectious dose, state of immunity, and concurrent contamination through other routes.

DEVELOPMENT OF A CONCEPTUAL EPIDEMIOLOGICAL APPROACH

In order to provide a conceptual framework that will lead to a better understanding of the nature of the transmission of excreta-related infections and the efficacy of various environmental control strategies, we present a number of approaches. Feachem et al. (1983) have proposed an "Environmental Classification of Excreta-related Infections" that is particularly germane to the subject of this report and therefore is presented here in an abbreviated form. The purpose of the environmental classification, according to Feachem, is to group infections in such a way that the efficacy of different preventive measures is made clear.

The object here is to propose an environmental classification for the infections associated with excreta. Understanding these infections depends on some basic facts of transmission--especially latency, persistence of pathogens in the environment, and the infective dose for humans. These and other key concepts are discussed before the environmental classification is set forth. If an excreted infection is to spread, an infective dose of the disease agent has to pass from the excreta of a patient, carrier, or reservoir of the infection to the mouth or some other entryway of a susceptible person. Spread will depend upon the numbers of pathogens excreted, how these numbers change during the particular transmission route or life cycle, and the dose required to infect a new individual. Infective dose is in turn related to the susceptibility of the new host. Three key factors intervene to govern the probability that, for a given transmission route, the numbers of excreted pathogens (excreted load) from one host will form an infective dose for another: latency, persistence, and multiplication. These concepts will be discussed later.

Excreted Load

The concentration of pathogens passed by an infected person, or excreted load, varies widely. A person infected by a small number of nematode worms, for instance, may pass only a few eggs per gram of feces, whereas a cholera carrier may excrete 10⁶ vibrios per gram, and a patient with an acute attack of cholera may pass 10¹³ vibrios in a day. In areas where large numbers of pathogenic organisms are being passed in the feces, high pathogen concentrations in sewage are common.
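The kind of calculation that lies behind the figures in Table 2-11 can be sketched as follows. It is a minimal illustration built on the assumptions stated in the notes to that table: 100 liters of sewage produced per person per day, a mean fecal weight of 100 grams for infected individuals, and loss or inactivation of 90 percent of excreted pathogens before or shortly after they reach the sewer. The enterovirus inputs shown are those of Table 2-11; any other pathogen would be handled in the same way.

    def sewage_concentration(population, prevalence, organisms_per_gram,
                             fecal_grams_per_day=100.0,
                             sewage_liters_per_capita=100.0,
                             survival_fraction=0.10):
        """Estimated pathogens per liter of town sewage, following the method
        described in the notes to Table 2-11."""
        infected = population * prevalence
        excreted_daily = infected * organisms_per_gram * fecal_grams_per_day
        reaching_sewer = excreted_daily * survival_fraction
        sewage_volume_liters = population * sewage_liters_per_capita
        return reaching_sewer / sewage_volume_liters

    # Enteroviruses in the hypothetical town of 50,000 (Table 2-11):
    # 5 percent prevalence, 1e6 organisms per gram of feces.
    print(sewage_concentration(50_000, 0.05, 1e6))   # -> 5000.0 per liter

The result, 5,000 enteroviruses per liter of raw sewage, reproduces the corresponding entry in Table 2-11.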
Even in a developed, temperate country such as England, where water use is relatively high and salmonellosis relatively rare, raw sewage may contain 10³ salmonellae per liter. At these concentrations a removal efficiency of 99 percent in sewage works will still leave 10 pathogenic organisms per liter of effluent. The health implications of these pathogens will depend upon the effluent disposal method, the pathogens' ability to survive or multiply, and the infective dose required. The magnitude of the potential health hazard from excreta can be appreciated by considering a typical load of pathogens excreted by a hypothetical poor tropical community in a single day, as presented in Table 2-11.

Latency

Latency refers to the interval between the time that a pathogen is excreted and the time that it can infect a new host. Some organisms--including all excreted viruses, bacteria, and protozoa--have no latent period and are immediately infectious in raw excreta. The requirements for the safe disposal of excreta containing these agents are different from those for helminthic infections that have prolonged latent periods. Latency can affect the choice of disposal systems: that is, infections that have a considerable latent period are largely risk-free when present in carted night soil, whereas others constitute a major health hazard in fresh night soil. Thus the environmental classification of Feachem et al. separates the first two categories, in which no latency is observed, from the categories in which a definite latent period occurs.

Among the helminthic infections (see Table 2-4), only three have eggs or larvae that may be immediately infectious to man after being passed in the feces. These are the pinworms (Enterobius vermicularis), a dwarf tapeworm (Hymenolepis nana), and occasionally a minute nematode (Strongyloides stercoralis). All the other excreted helminths require a distinct latent period, either because their eggs must develop into an infectious stage in the environment outside the body, or because these parasites have one or more intermediate hosts through which they must pass in order to complete their life cycle.

Persistence

Viability of the pathogen in the environment, or persistence, is a measure of how quickly it dies after leaving the human body. This single property is the most indicative of the fecal hazard. A highly persistent pathogen will create a risk throughout most treatment processes and during the reuse of excreta. A pathogen with short persistence outside the body, however, must rapidly find a new, susceptible host. Since transmission in this case cannot follow a long route through sewage works and the final effluent disposal site back to man, it will involve the family or other close contacts, with the infection passing from one person to another where personal cleanliness is lax. More persistent organisms, in contrast, can readily generate new cases of disease much farther afield. As persistence increases, so too must concern for the ultimate means of excreta disposal. Similarly, pathogens that tend to persist in the general environment will require more elaborate processes to inactivate them in a sewage works. Methods of sequestering these pathogens, such as sedimentation into a sludge for special treatment, are often needed. Measurement of pathogen persistence in a laboratory is easy. Laboratory results, however, need to be confirmed by field studies, which are more difficult to carry out.
Interpreting field results on persistence requires knowledge of how many pathogens are being shed in a community's excreta (relatively easy to determine) and of the infective doses for man (extremely difficult to assess). Persistence of enteric pathogens in the environment was discussed earlier in the chapter (also see Fig. 2-1).

Multiplication

Under favorable conditions, certain pathogens will multiply in the environment. Originally low numbers can thus produce a potentially infective dose (see the next section). Bacteria may multiply on a favored substrate (for instance, salmonella on food), and trematode worms reproduce in their molluscan intermediate hosts.

TABLE 2-12 Environmental classification of excreted infections

Category I. Nonlatent; low infective dose (a)
  Infections: Amoebiasis; balantidiasis; enterobiasis; enteroviral infections (b); giardiasis; hymenolepiasis; infectious hepatitis; rotavirus infection
  Environmental transmission focus: Personal; domestic
  Major control measures: Domestic water supply; health education; improved housing; provision of toilets

Category II. Nonlatent; medium or high infective dose; moderately persistent; able to multiply
  Infections: Campylobacter infection; cholera; pathogenic Escherichia coli infections (c); salmonellosis; shigellosis; typhoid; yersiniosis
  Environmental transmission focus: Personal; domestic; water; crop
  Major control measures: Domestic water supply; health education; improved housing; provision of toilets; treatment of excreta prior to discharge or reuse

Category III. Latent and persistent; no intermediate host
  Infections: Ascariasis; hookworm infection (d); strongyloidiasis; trichuriasis
  Environmental transmission focus: Yard; field; crop
  Major control measures: Provision of toilets; treatment of excreta prior to land application

Category IV. Latent and persistent; cow or pig as intermediate host
  Infections: Taeniasis
  Environmental transmission focus: Yard; field; fodder
  Major control measures: Provision of toilets; treatment of excreta prior to land application; cooking, meat inspection

Category V. Latent and persistent; aquatic intermediate host(s)
  Infections: Clonorchiasis; diphyllobothriasis; fascioliasis; fasciolopsiasis; gastrodiscoidiasis; heterophyiasis; metagonimiasis; opisthorchiasis; paragonimiasis; schistosomiasis
  Environmental transmission focus: Water
  Major control measures: Provision of toilets; treatment of excreta prior to discharge; control of animal reservoirs; control of intermediate hosts; cooking of water plants and fish; reducing water contact

Category VI. Spread by excreta-related insects
  Infections: Bancroftian filariasis (transmitted by Culex pipiens); all the infections in Categories I-V able to be transmitted mechanically by flies and cockroaches
  Environmental transmission focus: Various fecally contaminated sites in which insects breed
  Major control measures: Identification and elimination of suitable insect breeding sites

a. See Table 2-13 for data on additional epidemiological features by pathogen.
b. Includes polio-, echo-, and coxsackievirus infections.
c. Includes enterotoxigenic, enteroinvasive, and enteropathogenic E. coli infections.
d. Ancylostoma duodenale and Necator americanus.
Source: Feachem et al. (1983).

In the former case, light fecal contamination may increase bacterial numbers to the high minimal infective doses required in many excreted bacterial infections. This may be the usual mode of infection, since multiplication in water is limited compared with the massive increases possible in food. Excreted viruses and protozoa do not multiply outside their animal hosts. Among the helminths transmitted by excreta, all the trematodes that infect man reproduce in aquatic snails.
This aquatic stage in their life cycles introduces a prolonged latent period of a month or more while the trematodes develop in the snail, followed by an output to the environment of up to several thousand larvae for each egg reaching the water. (Category V of the environmental classification below contains infections of this type; see Table 2-12.)

Infective Dose

In a predictable world, the assessment of health risk could simply be calculated from the output of pathogens in the excreta of those infected, the median infective dose (ID50) of particular organisms, and the efficiency of excreta treatment processes in inactivating pathogens. Because of the variable infective dose of most pathogens and the uneven distribution of infection in the environment, the real world is much less calculable. Although the minimal infective dose for some diseases may be a single organism, or very few organisms, the doses required in most bacterial infections are much higher.

Data on infective doses are very hard to acquire, since they involve administering a known dose of a pathogen to a human volunteer. Information is scanty and concerned with doses required to infect half those exposed (ID50), rather than a small proportion, at a single exposure. The volunteers generally have been well-nourished adults, usually from nonendemic areas. Results of this kind must therefore be applied with great caution to malnourished peasant children continually exposed to an infection. It has been found that changes in the manner of administering experimental doses, such as preceding a dose of cholera vibrios with an alkaline substance to reduce free gastric acid temporarily, may lower the ID50 of such organisms by a factor of 10³. And, although ID50 may be the most reliable gauge of infectivity in human experimental studies, in natural transmission the infective dose for 5 percent or less of the population may be of greater epidemiological significance.

Uncertainty over the size of the minimal infective dose in nature makes it a difficult criterion to use in devising a classification; nevertheless, it is too important to be left out. The difficulties are greatest with the major excreted bacterial infections and with protozoa. For excreted viruses, there is evidence of low ID50 in the laboratory and in human populations. In helminthic infections, a single egg or larva can infect if ingested, even though a high proportion of worms can fail to mature (especially in locations where immunity is present). A graphic summary of minimal infective doses for selected enteric pathogens developed for this study is presented in Figure 2-2.

[Figure 2-2 is a chart showing, for each of seventeen organisms--Ascaris lumbricoides, Ancylostoma duodenale, Trichuris trichiura, enterovirus, Norwalk agent, hepatitis A virus, Entamoeba coli, Giardia lamblia, Shigella dysenteriae, Shigella flexneri, Vibrio cholerae, Salmonella typhi, Salmonella newport, Escherichia coli (pathogenic), Salmonella derby, Clostridium perfringens, and Salmonella pullorum--the challenge dose (in log10 units, grouped as low, medium, or high) and the percentage of volunteers developing infection and/or illness.]

Fig. 2-2. Minimal infective dose of selected enteric pathogens. Sources: Clinical response of adult humans to varying challenge doses based on data from Bryan (1977), Feachem et al. (1983), and other sources.

Host Response

Host response is important in determining the effect once an individual has received a given dose of an infectious agent.
Acquired immunity and the relation of age to pathology are particularly important in predicting the effects of sanitation. At one extreme would be infection with a short-lived parasite to which little immunity develops and for which the relation between infection and disease is not age dependent. A close, almost linear relationship between exposure and disease might be expected in this case, with appropriate improvements in sanitation yielding proportional health benefits. Ascaris closely approximates this model. At the other extreme would be infection with viruses or bacteria to which long-lasting immunity develops and for which the chances of overt, symptomatic disease in those infected increase with age. A case in point is infection with poliovirus (see Table 2-1). Under poor sanitary conditions, all persons are infected at a young age, older children and adults are immune, and the disease is limited to a few of the youngest children, who may suffer chronic paralysis. If sanitation is improved, infection is deferred to a later period in life, but the pathological consequences will be more serious. Thus, although poliovirus transmission may be reduced by improving sanitation, such improvements will not necessarily curtail the disease, a result achieved in practice by immunization. This pattern may also apply to other excreted infections, such as infectious hepatitis, and it has been proposed for typhoid.

There are several other excreted infections, however, in which human immunity is important in regulating the amount of disease. Immunity tends to diminish the health significance of moderate sanitary improvements, and may in part explain the disappointing effects of some sanitary programs. In other words, the balance between exposure to infection and the host's response to it will determine the pattern of the excreta-related disease. If transmission--that is, exposure to a particular infection--is limited, then most people will not have encountered the infection and will be susceptible. If a sudden increase in transmission of the disease occurs, it will affect all age groups in the form of an epidemic. Under these circumstances, improvements in sanitation that strike at pathogen transmission will have a considerable effect in reducing an epidemic's likelihood, and its magnitude if one occurs. By contrast, if transmission is vigorous, most people will be repeatedly exposed to an infection, having first acquired it in childhood. Subsequent exposures may be without effect if immunity is developed after the first attack, or immunity may develop cumulatively from a series of attacks. The infection will nevertheless always be present, and can be described as endemic. Under these conditions, much of the transmission is ineffective because of human-acquired immunity, and reduced transmission through improved sanitation will only delay the occurrence of infection somewhat, so that older, instead of younger, children will exhibit symptoms. Extensive sanitary improvements will either render the infection rare or, if the disease was originally highly transmitted, make it an adult disease. Diseases exemplifying this scenario are typhoid, which can be completely prevented in a community by adequate management of excreta and of water supplies, and poliomyelitis, which can be prevented only by immunization.
The consequences of a disease that is prevalent among juveniles--when children are not only the chief sufferers, but also the main sources of infection--present a further challenge to sanitation. The acute need for better community excreta disposal must focus on young children, the group perhaps least inclined to use any facilities that are made available.

Nonhuman Hosts

Some excreted infections (for example, shigellosis) are confined strictly to humans, and the control of human excreta alone is required for their prevention. Many others (such as salmonellosis), which involve wild or domestic vertebrate animals as well as man, are called zoonoses. There are two groups of zoonoses, and each has quite different implications for sanitation (Fig. 2-3). In the first group, animals act as hosts alternative to man: even if human excreta are under complete control, the excreta of other animals can continue to transmit the infection. In effect, the animal involved is "in parallel" with man, and it is necessary to control both human and animal excreta. In the second group, the animal is an essential step in the transmission of the disease from one human to another (Fig. 2-3, "in series"). In this case, control of either human excreta alone or the animal infection alone will suffice to end transmission. In the environmental classification below, this second group, which contains the human tapeworms of the genus Taenia, is therefore separated from the other categories.

[Figure 2-3 is a schematic diagram contrasting the two patterns of zoonotic transmission: the animal "in parallel" with man, in which either human or animal excreta can carry the infection to a new human host, and the animal "in series," in which the infection must pass through the animal on its way from one human host to the next.]

Fig. 2-3. Involvement of other vertebrates in the transmission of human excreted infections. Examples of zoonoses in parallel are salmonellosis and balantidiasis; examples of zoonoses in series are beef and pork tapeworm infections. Source: After Feachem et al. 1983.

Some excreted helminthic infections have invertebrate intermediate hosts (see Table 2-4); they will be controlled if excreta are prevented from reaching the intermediate hosts, if the intermediate hosts are controlled, or if people do not eat the intermediate host uncooked or do not have contact with the water in which the intermediate host lives (depending on the particular organism's life cycle).

CATEGORIES OF EXCRETA-RELATED INFECTIONS

The excreted infections can be grouped in several ways according to the epidemiological features discussed above. Feachem et al. (1983) have proposed a classification that considers the effects of excreta disposal and changes in disposal facilities and technologies (see Table 2-12). Six categories of infection have been distinguished in Table 2-12, and the relevant environmental or epidemiological features broadly considered are latency, infective dose, persistence, multiplication, and transmission. Further data on specific excreted pathogens--arranged by category and epidemiological feature--are provided in Table 2-13. Control measures appropriate to each environmental category of pathogen are indicated in Table 2-12, and data on immunity and pathogen concentrations in excreta, which vary with each organism, are presented in Table 2-13. The first five categories of excreted pathogens clearly differ from the last, which contains excreta-breeding insect vectors of disease. The insects themselves are not pathogens, and a variety of sanitation methods and additional specific measures can be directed against these vectors. For these reasons, category VI is not included in Table 2-13.
The excreted infections are divided on the basis of the presence (categories III to V) or absence (categories I and II) of a latent period (health problems associated with fresh feces or night soil occur primarily in these first two categories). The distinction between categories I and II and categories III to V is fundamental and clear-cut, corresponding closely to the biology of the pathogens (in that all infections in categories III to V are helminthic). The subdivisions of the infections having latency are also clear, with category III containing the soil-transmitted worms; IV the tapeworms, which depend on the access of cattle and pigs to human feces; and V the trematodes and other worms requiring aquatic intermediate hosts. The most useful division of categories I and II has proved to be one based on ID50, even though knowledge of the ID50 for infections affecting malnourished peasant children in the tropics is nonexistent. With ID50 as the criterion, categories I and II break in a way that makes theoretical sense and also correlates in some degree with the likely effects of improved excreta disposal facilities.

Each category in Table 2-12 implies some minimum sanitary requirements for control of the diseases within it, and often control measures ancillary to excreta disposal facilities further contribute to success. In conclusion, Feachem et al. (1983) state that the environmental features of the categories defined above can be correlated with the length and spread of transmission routes, and on that basis complementary controls can be identified for most diseases. If excreta disposal alone is improved, however--or in this case wastewater treatment to reduce exposure to pathogens in irrigation--the control likely to be achieved for each category is as follows:

Category I (includes enteric viruses and protozoans) -- Negligible
Category II (includes bacterial diseases such as cholera, typhoid, and shigellosis) -- Slight to moderate
Category III (includes the geohelminths ascaris, trichuris, and hookworm) -- Moderate to great
Category IV (taeniasis) -- Moderate to great

The outstanding difference is between categories I and II, which depend strongly on personal and domestic cleanliness, and the other categories, which do not. The central changes necessary to control infections in categories III and IV are relatively simple--namely, the provision of toilets that people of all ages can use and keep clean, and the treatment of fecal products prior to recycling on the land. The reason that reports on the effects of latrine programs often do not show a marked decrease in the prevalence of the infections in categories III and IV is that, although latrines have been built, they have typically neither been kept clean nor used by children or adults who worked in the fields.

TABLE 2-13 Basic epidemiological features of excreted pathogens by environmental category

Pathogen | Excreted load a/ | Latency b/ | Persistence c/ | Multiplication outside human host | Median infective dose (ID50) | Significant immunity? | Major nonhuman reservoir? | Intermediate host

CATEGORY I
Enteroviruses d/ | 10^7 | 0 | 3 months | No | L | Yes | No | None
Hepatitis A virus | 10^6 (?) | 0 | ? | No | L (?) | Yes | No | None
Rotavirus | 10^6 (?) | 0 | ? | No | L (?) | Yes | No (?) | None
Balantidium coli | ? | 0 | ? | No | L (?) | No (?) | Yes | None
Entamoeba histolytica | 10^5 | 0 | 25 days | No | L | No (?) | No | None
Giardia lamblia | 10^5 | 0 | 25 days | No | L | No (?) | Yes | None
Enterobius vermicularis | Not usually found in feces | 0 | 7 days | No | L | No | No | None
Hymenolepis nana | ? | 0 | 1 month | No | L | Yes (?) | No (?) | None

CATEGORY II
Campylobacter fetus ssp. jejuni | 10^7 | 0 | 7 days | Yes e/ | H (?) | ? | Yes | None
Pathogenic Escherichia coli f/ | 10^8 | 0 | 3 months | Yes | H | Yes (?) | No (?) | None
Salmonella typhi | 10^8 | 0 | 2 months | Yes e/ | H | Yes | No | None
Other salmonellae | 10^8 | 0 | 3 months | Yes e/ | H | No | Yes | None
Shigella spp. | 10^7 | 0 | 1 month | Yes e/ | M | No | No | None
Vibrio cholerae | 10^7 | 0 | 1 month (?) | Yes | H | Yes (?) | No | None
Yersinia enterocolitica | 10^5 | 0 | 3 months (?) | Yes | H (?) | No | Yes | None

CATEGORY III
Ascaris lumbricoides | 10^4 | 10 days | 1 year | No | L | No | No | None
Hookworms g/ | 10^2 | 7 days | 3 months | No | L | No | No | None
Strongyloides stercoralis | 10 | 3 days | 3 weeks (free-living stage much longer) | Yes | L | Yes | No | None
Trichuris trichiura | 10^3 | 20 days | 9 months | No | L | No | No | None

CATEGORY IV
Taenia saginata and T. solium h/ | 10^4 | 2 months | 9 months | No | L | No | No | Cow (T. saginata) or pig (T. solium)

CATEGORY V
Clonorchis sinensis i/ | 10^2 | 6 weeks | Life of fish | Yes j/ | L | No | Yes | Snail and fish
Diphyllobothrium latum i/ | 10^4 | 2 months | Life of fish | No | L | No | Yes | Copepod and fish
Fasciola hepatica i/ | ? | 2 months | 4 months | Yes j/ | L | No | Yes | Snail and aquatic plant
Fasciolopsis buski i/ | 10^3 | 2 months | ? | Yes j/ | L | No | Yes | Snail and aquatic plant
Gastrodiscoides hominis i/ | ? | 2 months (?) | ? | Yes j/ | L | No | Yes | Snail and aquatic plant
Heterophyes heterophyes i/ | ? | 6 weeks | Life of fish | Yes j/ | L | No | Yes | Snail and fish
Metagonimus yokogawai i/ | ? | 6 weeks (?) | Life of fish | Yes j/ | L | No | Yes | Snail and fish
Paragonimus westermani i/ | ? | 4 months | Life of crab | Yes j/ | L | No | Yes | Snail and crab or crayfish
Schistosoma haematobium h/ | 4 per milliliter of urine | 5 weeks | 2 days | Yes j/ | L | Yes | No | Snail
S. japonicum h/ | 40 | 7 weeks | 2 days | Yes j/ | L | Yes | Yes | Snail
S. mansoni h/ | 40 | 4 weeks | 2 days | Yes j/ | L | ? | No | Snail
Leptospira spp. k/ | In urine (?) | 0 | 7 days | No | L | Yes (?) | Yes | None

Note: L = low (less than 10^2); M = medium (about 10^4); H = high (more than 10^6); ? = uncertain.
a. Typical average number of organisms per gram of feces (except for Schistosoma haematobium and Leptospira, which occur in urine).
b. Typical minimum time from excretion to infectivity.
c. Estimated maximum life of infective stage at 20°-30°C.
d. Includes polio-, echo-, and coxsackieviruses.
e. Multiplication takes place predominantly on food.
f. Includes enterotoxigenic, enteroinvasive, and enteropathogenic E. coli.
g. Ancylostoma duodenale and Necator americanus.
h. Latency is minimum time from excretion by man to potential reinfection of man. Persistence here refers to maximum survival time of the final infective stage. Life cycle involves one intermediate host.
i. Latency and persistence as for Taenia. Life cycle involves two intermediate hosts.
j. Multiplication takes place in the intermediate snail host.
k. For the reasons given in Chapter 1, Leptospira spp. do not fit any of the categories defined in Table 2-2.

PROPOSED MODEL TO PREDICT THE RELATIVE EFFECTIVENESS OF PATHOGENS IN CAUSING INFECTIONS THROUGH WASTEWATER IRRIGATION

The preceding analysis of Feachem et al. (1983) and our own evaluation of the theoretical epidemiological considerations lead us to suggest that a number of factors function as intervening variables that influence the relative effectiveness of the various groups of pathogens in causing infections in humans through wastewater irrigation. The following are the main intervening variables that contribute to effective transmission by wastewater irrigation as compared with other routes of transmission:

1. Long persistence in the environment
2. Low minimal infective dose
3. Short or no immunity
4. Minimal concurrent transmission through other routes such as food, water, and poor personal or domestic hygiene
5. Long latent period and/or soil development stage required

Table 2-14 presents in condensed form a summary of the epidemiological characteristics of the main groups of enteric pathogens as they relate to the above five variables.

TABLE 2-14 Epidemiological characteristics of enteric pathogens vis-a-vis their effectiveness in causing infections through wastewater irrigation

Pathogen | Persistence in environment | Minimum infective dose | Immunity | Concurrent routes of infection | Latency/soil development stage
Viruses | Medium | Low | Long | Mainly home contact and food and water | No
Bacteria | Short/medium | Medium/high | Short/medium | Mainly home contact and food and water | No
Protozoa | Short | Low/medium | None/little | Mainly home contact and food and water | No
Helminths | Long | Low | None/little | Mainly soil contact outside home and food | Yes

Although the above condensed summary is an oversimplification, it does provide a theoretical basis in general terms for ranking the groups of pathogens as to their potential effectiveness in transmitting disease through wastewater irrigation. On theoretical grounds alone, it appears that the helminth diseases, in those areas of the world where they are endemic, will be the ones most effectively transmitted by irrigation with raw wastewater owing to their high persistence in the environment; their small minimum infective dose; and the fact that there is little or no immunity, that concurrent infection in the home is often limited, and that latency is long, with a soil development stage required for transmission. The enteric virus diseases, however, should be the ones least effectively transmitted by irrigation with raw wastewater, despite the fact that they often have very small minimal infective doses and survive for reasonably long periods in the environment. Concurrent routes of infection in the home due to poor hygiene are so intense that most infants are exposed in the first years of life. Since immunity for most enteric virus diseases is for life, or at least for very long periods, there is little likelihood of additional infection as a result of environmental exposure such as might result from wastewater irrigation. We estimate that the bacterial and protozoan diseases rank between these two extremes.

Overall, then, the pathogens can be ranked in the following descending order of risk:

1. High - Helminths (the intestinal nematodes ascaris, trichuris, and hookworm, and taeniasis)
2. Lower - Bacterial infections (cholera, typhoid, and shigellosis) and protozoan infections (amebiasis, giardiasis)
3. Least - Viral infections (viral gastroenteritis and infectious hepatitis)

This ranking is consistent with the theoretical considerations noted by Feachem et al. (1983) in their "Environmental Classification of Excreted Infection," although their comments do not relate specifically to the special case of wastewater reuse. Our proposed model and ranking of pathogens, along with the rationale behind them, were reviewed in the World Bank/WHO-sponsored Engelberg Report (1985) and obtained the endorsement of an international group of environmental experts and epidemiologists. In Chapters 3 and 4 we turn to the available epidemiological evidence to determine whether the above concepts and theoretical model fit the empirical evidence.
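The qualitative ranking above can be made more tangible with a small scoring exercise. The sketch below is purely illustrative and is not part of the model itself: the 0-2 encodings of the Table 2-14 entries, the weights, and the names used are assumptions introduced only for this example, with immunity and the availability of concurrent (home) routes deliberately weighted most heavily, in line with the argument in the text.

    # Illustrative sketch only: encodings and weights are assumptions made for
    # this example; they are not taken from the report.

    # Higher value = more favorable to transmission through wastewater irrigation.
    TABLE_2_14 = {
        "Helminths": {"persistence": 2, "low_dose": 2, "weak_immunity": 2,
                      "few_other_routes": 2, "latency_soil_stage": 2},
        "Bacteria":  {"persistence": 1, "low_dose": 0, "weak_immunity": 1,
                      "few_other_routes": 0, "latency_soil_stage": 0},
        "Protozoa":  {"persistence": 0, "low_dose": 1, "weak_immunity": 2,
                      "few_other_routes": 0, "latency_soil_stage": 0},
        "Viruses":   {"persistence": 1, "low_dose": 2, "weak_immunity": 0,
                      "few_other_routes": 0, "latency_soil_stage": 0},
    }

    # Assumed weights: immunity and concurrent (home) routes dominate, reflecting
    # the argument that lifelong immunity acquired early through home transmission
    # largely nullifies the additional exposure contributed by irrigation.
    WEIGHTS = {"persistence": 1, "low_dose": 1, "weak_immunity": 3,
               "few_other_routes": 3, "latency_soil_stage": 1}

    def relative_risk_score(profile):
        """Weighted sum of the five intervening variables for one pathogen group."""
        return sum(WEIGHTS[factor] * value for factor, value in profile.items())

    if __name__ == "__main__":
        for group, profile in sorted(TABLE_2_14.items(),
                                     key=lambda item: relative_risk_score(item[1]),
                                     reverse=True):
            print(group, relative_risk_score(profile))
        # Prints helminths first and viruses last, with bacteria and protozoa
        # in between -- the same order as the ranking given in the text.

Under these particular encodings the computed order matches the ranking above; with equal weights it would not, which simply restates the argument that immunity and concurrent routes of infection, rather than dose or survival, are what most set the virus group apart.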
CHAPTER 3

HEALTH EFFECTS ASSOCIATED WITH WASTEWATER IRRIGATION: EARLY REPORTS, OPINIONS, AND POLICIES

THE NINETEENTH CENTURY

In the middle of the nineteenth century, before it was known that certain diseases were caused by microorganisms, many believed--as some still do today--that the odors and bad smells associated with decomposing organic matter and human excreta were the causes of outbreaks of diseases such as cholera, typhoid, dysentery, and the plague. It might have been natural to assume that the sewage farms of those days were breeding places for these diseases. As will be seen below, the early medical reports did not bear out these widely held beliefs. On the contrary, they almost consistently reported no negative health effects from sewage farms.

Medical opinions as to the health implications of sewage farming and broad irrigation with wastewater first began to appear in the scientific literature about the second half of the nineteenth century. Some of this early literature has been reviewed by Wilson (1944) and is quoted here from that source. In the First Report of the River Pollution Commission in 1868, Christison wrote of the sewage farms at Edinburgh, which began operating as early as 1650: "I am satisfied that neither Typhus nor Enteric Fever nor Dysentery, nor Cholera is to be encountered in or around them, whether in epidemic or non-epidemic seasons, more than in any other agricultural district of the neighbourhood." Similarly, Littlejohn, medical officer of health (MOH) for Edinburgh, observed no such association: "During the time that cholera was epidemic at Leith and Edinburgh in 1865-1866, not a single case occurred at the Piershill Barracks [sewage farm]." The same finding was reported at the Barking sewage farm: "For when cholera was epidemic in London, not a single case occurred at or near the sewage farm which received the sewage from North London." And on September 3, 1870, Buchanan, MOH to the Privy Council, reported in the British Medical Journal, "In fact the evidence goes to show that sewage fields when properly managed are certainly not injurious to health and may be even advantageous to it."

Chase (1964) has noted that in 1871 H. L. Bowditch, chairman of the Massachusetts State Board of Health, gave a favorable report on the use of sewage irrigation at that time in the cities of Lenox and Worcester, Massachusetts. The 5th Report of the Royal Commission on Treating and Disposing of Sewage, 1908, answered the question, "Is a sewage farm dangerous to health?" by stating that "no proof has yet been furnished of direct or widespread injury to health in the case of well managed sewage farms." This statement implies, however, that injury to health may have been reported in connection with "poorly" managed sewage farms, although no details are provided by Wilson on that point.

In 1865, Cobbold published one of the first monographs on bilharzia (schistosomiasis), and warned of the possibility of "spreading Bilharzia far and wide" over England as a result of plans to establish large sewage farms fed with town sewage. By 1871, however, Cobbold himself had concluded in an article in the Medical Times and Gazette (February 25, 1871) that--at least as far as England was concerned--where Bilharzia was not endemic, "little harm can result from sewage distribution on farms."
Cobbold's early insightful remarks about wastewater reuse, particularly in aquaculture, may have been quite correct with respect to areas of the world where the disease is endemic and where the intermediate snail hosts can become established in reservoirs, ponds, or canals of wastewater diluted with freshwater.

After John Snow's famous report of 1854 on the waterborne cholera epidemic associated with the Broad Street pump in London, there was some concern about possible cholera transmission at sewage farms. Once Koch isolated and identified the cholera vibrio in 1884, it became possible to study experimentally the possible transmission of the organism by wastewater. In 1898, in the first such study using cultures of Koch's cholera vibrio, Houston demonstrated that it was possible to recover a culture of living vibrios inoculated into a sample of London sewage as much as fifteen days later. Thus, this early experiment suggested the possibility that cholera organisms could survive long enough to infect sewage farms and irrigated crops.

The first intimation that pathogens could be dispersed as aerosols from an aerated wastewater source was also reported by scientists in the last century. An outbreak of Asiatic cholera in Southampton in 1866 was attributed by Parks to the dispersion of infected droplets of sewage into the air. He noted that the discharge from a sewage pump led into an open channel and the result was a "frothing surge." During the two-week period after this pumping was initiated, a cholera epidemic broke out and there were 107 fatal cases in the houses adjacent to the open canal. In 1877 Frankland described experiments he had carried out on the possible formation of such airborne droplets. He stated that moderate agitation of a liquid in air "does not cause suspension of liquid droplets capable of being airborne but that the breaking of gas bubbles at the surface of a liquid did cause the emission of tiny droplets capable of being carried by a current of air." These little-known pioneering studies of over 100 years ago raised questions that are still being studied and debated by public health authorities today.

The outstanding feature of most of the medical reports from the period before the turn of the century, crude and unregulated as they were, was the uniform lack of evidence of disease transmission as a result of sewage farming. Although detailed evidence is not presented, one must ask whether these early reports reached the conclusions they did because of inadequate methods of investigation, which prevented them from detecting whether or not a problem existed, or because there were indeed no detectable health problems at the sewage farms of those days, despite the epidemics of enteric disease raging throughout England and the rest of Europe at the time.

One hypothesis that might explain these reports is based on the assumption that, during that period, concurrent routes of enteric disease transmission--particularly highly contaminated drinking water and low levels of personal and domestic hygiene--were so dominant that any additional disease transmission associated with contamination of workers at sewage farms or the sale of contaminated vegetables to the general public was effectively masked by immunity associated with the massive disease transmission by the other routes.
On the basis of this hypothesis, one might predict that, with the rapid improvements that were taking place at the turn of the century in the treatment of drinking water and with the generally improved level of hygiene and standard of living, negative health effects from sewage farming would begin to show up only at a later period, when infection through the other concurrent routes was greatly reduced. In any event, the medical reports and opinions from before 1900 provided direct support for the widely accepted policy and practice of land disposal and sewage farming that had spread rapidly in Europe and the United States, and was rarely regulated as to the type of crops to be grown.

FIRST HALF OF THE TWENTIETH CENTURY

With the rapid growth of the urban centers where sewage farming was practiced, housing areas began to encroach on the farms. Complaints of odor, combined with increased land values and the development of alternative, less land-intensive, engineered biological wastewater treatment systems, all led to the abandonment of many of the famous European sewage farms. As the scientific basis for communicable disease transmission became more widely understood, public health officials began to show concern as to the potential public health problems associated with sewage farming and land application of wastewater, which had been minimized by their medical colleagues of the previous century.

A detailed review of "broad irrigation," "sewage farming," and "land application" was presented in George W. Fuller's book on sewage disposal in 1912. After reviewing the engineering and economic aspects, as well as the position of public health authorities, he concluded that "objections to the method have increased rather than decreased in recent years. These relate to objectionable odors, prejudices against the use of sewage in growing vegetables and to the transmission of disease germs by flies and other insects." Concerning the health risks of growing vegetables eaten in the raw condition, the author states, "While the available evidence does not show any specific instances of such trouble . . . in the interests of public health, vegetables for human consumption which are eaten even occasionally in the raw condition . . . should not be grown on sewage farms."

A few years later, Abel Wolman (1924) wrote a pioneering review of the hygienic aspects of the use of sewage sludge for fertilizer. He stated that in 1922, although only limited epidemiological evidence was available, the Maryland State Board of Health decided it was necessary to promulgate regulations to prevent the direct application of wastewater sludge on growing vegetables. The health authorities believed that this practice might be detrimental to public health. These negative opinions were based in part on negative health reports from areas in Asia where night soil fertilization of vegetables was widely practiced. In addition, Kligler reported in 1923 that the high rate of Ascaris infections among the population of Jerusalem was caused by the consumption of salad crops irrigated by raw wastewater from the city. Furthermore, Walker (1927) concluded that the unusually high incidence of Ascaris infections among a battalion of the Singapore garrison was due to their eating vegetables grown by Chinese gardeners who used feces for fertilizer.
Nills (1927) attributed the uniform distribution of Ascaris among all classes in Korea to transmission via pickled vegetables fertilized with human excreta, which were eaten in large quantities by all population groups. Winfield (1937), however, concluded that, on epidemiological grounds, vegetables were not the significant factor in determining the number of worms harbored by the Shantung villagers of North China; he suggested that this was largely due to the way vegetables are fertilized in that area. Since the original paper is not available, it is difficult to evaluate the evidence on which this conclusion is based. Winfield and Yao (1937), in another paper in the same series, recorded that the careful examination of the washings of 275 kg of vegetables bought in the Tsinan market between the years 1933 and 1934 failed to demonstrate a single Ascaris egg. Five samples of vegetables from Lungshan were also free of eggs. The authors concluded that vegetables seemed to be a negligible source of Ascaris infection in North China because of the dry climate and the method of applying human feces fertilizers in the form of dried cakes. They concluded that the overall quantitative importance of vegetables as a source of Ascaris infection is relatively minor, even where wet methods of applying human feces as fertilizer are used. The authors provide no data to support this conclusion.

However, the findings of other early authors do not agree with those of Winfield and Yao (1937). In Manchuria, Ishikawa (1929) studied conditions where dry fertilizer was used. This fertilizer is produced by mixing manure, feces, and soil together to form bricks that are dried and kept until needed. Ishikawa found that Ascaris eggs could survive in these bricks during both the winter and summer months. When he carefully washed fresh vegetables, he found Ascaris eggs on 100 percent of the Chinese cabbage examined, and reported that eggs recovered from the cabbage washings developed in culture. When Yosesato and Sumi (1932) examined vegetables bought in the public market on the streets of Mukden, they found helminth eggs, mainly Ascaris and Trichuris, in the following percentages: lettuce, 92 percent; spinach, 54 percent; radishes, 43 percent; onions, 33 percent; cabbage, 17 percent; potatoes, 8 percent; and no positives on cucumbers, tomatoes, and eggplant. On one occasion, they found Ascaris eggs on pickled radishes. This finding provides some laboratory support for the findings of Nills (1927) that even pickled vegetables can disseminate the highly resistant ova of Ascaris. It should be pointed out here that the methods used to detect and identify helminth ova varied considerably during these early periods of research on the subject, and that fact alone could explain some of the discrepancies.

Two earlier studies of S. typhi on contaminated radishes and lettuce showed survival up to twenty-one days in sunny sites and thirty-seven days in the shade (Creel 1912; Melick 1917). The latter study also showed that S. typhi becomes attached to leaves lying on contaminated soil and is not readily removed by washing, thus indicating that the pathogens on such contaminated salad crops could reach the homes of consumers.
Although few, if any, of these early reports of the first half of the twentieth century include data that strongly point to the negative health effects of land application and wastewater reuse in agriculture, public health officials generally tended to adopt a more cautious position than their colleagues of the previous century. Regulations promulgated in some areas banned land application completely, and in other areas strictly limited the practice, particularly by forbidding wastewater irrigation of vegetables eaten raw, or by requiring high levels of effluent quality. A statement that typifies this cautious attitude was made by J. W. Scharff (1940), former chief medical officer of Singapore, in reference to night soil fertilization of vegetables: "Though the vegetables thrive, the practice of putting human waste directly on the soil is dangerous to health. The heavy toll of sickness and death from various enteric diseases in China is well known."

MORE RECENT OPINIONS AND STATEMENTS OF POLICY

As the scientific methods for the detection and identification of microbial pathogens in environmental samples improved, numerous investigators in various areas of the world were able to provide sound scientific evidence that most enteric pathogens excreted by a community could be detected at high concentrations in the wastewater stream, and in wastewater-irrigated soil and on crops. With the discovery of techniques to isolate enteric viruses in the environment, a number of investigators were able to demonstrate that--like the bacteria, protozoa, and helminths previously investigated--these newly discovered enteric viruses could be detected in wastewater-irrigated soil and crops as well as in the air near sprinkler irrigation sites using wastewater. This raised a new round of public health concerns about wastewater reuse in agriculture (see Chap. 2).

Primarily on the basis of these laboratory findings of enteric pathogens in the environment, including their ability to survive on crops, most public health authorities developed strict regulations concerning the use of wastewater in irrigation. Typical of the strictest of these regulations were those of the California State Department of Public Health mentioned previously, which required that effluent used to spray-irrigate crops for human consumption in the raw state be disinfected and filtered, with a turbidity of no more than 10 units and no more than 2.2 coliforms per 100 ml. These requirements for wastewater effluent come close to the standards required for drinking water. (See Table 1-1 for a summary of the main features of these regulations.)

A World Health Organization (1973) report by an international group of experts on the reuse of effluents pointed out the potential risks to health from the use of improperly treated effluent to irrigate vegetables eaten raw; it also discussed potential health problems of agricultural workers practicing irrigation with wastewater. The possibility of transmitting human tapeworms to cattle that graze on pastures irrigated with partly treated wastewater was reviewed, as was the possible transmission of enteric pathogens by fish grown in ponds enriched with wastewater. The risk of fish pond workers being exposed to infections of schistosomiasis in those areas where the disease is endemic was also discussed. The WHO group recommended certain treatment processes to meet the given health criteria for wastewater reuse.
These recommendations have become widely accepted by the public health profession around the world and serve as the leading public health policy guidelines on wastewater reuse (see Table 3-1). These guidelines recommend that wastewater be given primary and secondary treatment and at times be disinfected prior to the irrigation of crops eaten raw. It was also recommended that the effluent contain no more than 100 coliform organisms/100 ml in 80 percent of the samples. These standards, although less rigorous than those of California, are still quite restrictive.

TABLE 3-1 Suggested treatment processes to meet the given health criteria for wastewater reuse

Reuse category and applicable health criteria (see key below):
Irrigation of crops not for direct human consumption -- A + F
Irrigation of crops eaten cooked, and fish culture -- B + F or D + F
Irrigation of crops eaten raw -- D + F
Recreation, no contact -- B
Recreation, contact -- D + G
Municipal reuse, industrial -- C or D
Municipal reuse, nonpotable -- C
Municipal reuse, potable -- E

Treatment processes considered: primary treatment; secondary treatment; sand filtration or equivalent polishing methods; nitrification; denitrification; chemical clarification; carbon adsorption; ion exchange or other means of removing ions; disinfection (free chlorine after 1 hour).

Health criteria:
A - Freedom from gross solids; significant removal of parasite eggs.
B - As A, plus significant removal of bacteria.
C - As A, plus more effective removal of bacteria, plus some removal of viruses.
D - Not more than 100 coliform organisms per 100 ml in 80 percent of samples.
E - No fecal coliform organisms in 100 ml, plus no virus particles in 1,000 ml, plus no toxic effects on man, and other drinking water criteria.
F - No chemicals that lead to undesirable residues in crops or fish.
G - No chemicals that lead to irritation of mucous membranes and skin.

In order to meet the given health criteria, the processes marked as essential for each reuse category in the original table must be provided; one or more further processes will also be essential, and additional processes may sometimes be required.

Source: Reproduced with the permission of the WHO from: Reuse of Effluents: Methods of Wastewater Treatment and Health Safeguards, WHO Technical Report Series no. 517 (Geneva, 1973).
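Criterion D above (no more than 100 coliform organisms per 100 ml in 80 percent of samples) is a percentile-type standard, so compliance is judged over a series of monitoring results rather than sample by sample. The short sketch below merely illustrates that arithmetic; the function name, the sample values, and the idea of coding the check at all are assumptions made for this illustration and are not part of the WHO guidelines.

    def meets_criterion_d(coliforms_per_100ml, limit=100, required_fraction=0.80):
        """Illustrative check of WHO (1973) criterion D: at least 80 percent of
        the effluent samples must contain no more than `limit` coliform
        organisms per 100 ml.  Assumed helper, not part of the guidelines."""
        if not coliforms_per_100ml:
            raise ValueError("at least one sample result is required")
        compliant = sum(1 for count in coliforms_per_100ml if count <= limit)
        return compliant / len(coliforms_per_100ml) >= required_fraction

    # Hypothetical example: ten effluent samples, two of which exceed 100 per
    # 100 ml; 8 of 10 (80 percent) comply, so the series just meets criterion D.
    samples = [20, 45, 80, 350, 60, 95, 30, 70, 1200, 15]
    print(meets_criterion_d(samples))  # True

The same check could be applied to any percentile-type effluent standard by changing the limit and the required fraction of compliant samples.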
In general, it can be said that current conventional thinking in the public health community of today's industrialized nations sees numerous potential risks in the practice of wastewater irrigation, which, nevertheless, can be controlled by strict regulations as to the type of crops grown and the degree of pretreatment provided to the wastewater.

The foregoing review of early opinions on the health aspects of wastewater reuse in agriculture--which shifted from initial enthusiasm to a highly cautious and generally negative approach--may help us to understand some present-day opinions on the subject, some of which may be based on earlier fears of disease transmission by fecal odors or on more recent negative medical opinions developed in the first half of this century. Some of the more restrictive public health and environmental engineering opinions of today also may derive from these traditionally accepted ideas, particularly those associated with the dramatic public health benefits that resulted from improved water supplies during the early part of this century. The drastic drop in typhoid fever and cholera rates as a result of the introduction of sand filtration and chlorination of municipal water supplies in Europe and the United States provided conclusive evidence as to the dangers of waterborne enteric pathogens and of the benefits that could be derived by their control through environmental intervention. Figure 3-1 illustrates the rapid drop in typhoid fever death rates in a typical large city in the United States that, over the years after the turn of the century, progressed from drinking untreated contaminated surface water to purified water treated by sand filtration and disinfection by chlorination. Few public health measures involving environmental interventions have ever had such telling results. It may have been intuitively assumed by many that the strict regulation of wastewater irrigation would achieve similar results. Although the role of improved drinking water quality in eliminating typhoid is generally accepted, other factors such as improved levels of hygiene and raised standards of living undoubtedly contributed to the decline of the disease.

Fig. 3-1. Effect of water purification on death rate from typhoid fever in Detroit, Michigan, 1900-1933. Source: Adapted from Fair and Geyer (1956).

In concluding this review of early reports and opinions of the post-1900 period, we should note that firm epidemiological evidence demonstrating severe negative health effects of wastewater reuse in agriculture is relatively rare, and is seldom mentioned in defense of the widely accepted conservative public health standards and policies quoted above. In the following chapters, we analyze the epidemiological evidence available on quantifiable health effects associated with wastewater reuse in agriculture in an effort to develop a rational approach to controlling these effects. We also consider whether special conditions--environmental, immunological, or others--in the developing countries should be taken into account in policy considerations and in the required remedial measures for wastewater irrigation.

CHAPTER 4

EVALUATION OF EPIDEMIOLOGICAL EVIDENCE OF HUMAN HEALTH EFFECTS ASSOCIATED WITH WASTEWATER IRRIGATION

As stated earlier, one of the primary goals of this report is to assess the epidemiological evidence of quantifiable human health effects associated with wastewater irrigation. The following evaluation is based on available scientific papers published in recognized journals and on a number of unpublished reports and theses that have been obtained after an intensive worldwide search through the good offices of numerous international agencies, as well as through direct contact with health and other agencies in a number of developing countries. For the purposes of this analysis, we have considered only epidemiological studies dealing with the following possible routes of infection:

o Wastewater applied to edible crops or pastures, causing disease and/or infection in humans consuming edible crops, dairy products, or meat from grazing animals exposed to such contaminated environments.

o Wastewater used in agriculture, causing disease and/or infection in directly exposed agricultural workers in contact with irrigation systems.

o Wastewater used in agriculture, causing disease and/or infection in nearby nonagricultural populations, including family contacts of wastewater-exposed agricultural workers and noncontacts exposed to wastewater aerosols and/or other routes.
This study has not directly examined the health effects associated with the use of night soil or wastewater sludge for the direct fertilization of agricultural crops, nor the use of wastewater, night soil, or wastewater sludge in aquaculture. However, some of the findings of this review may have a bearing on these subjects. They are discussed in some detail in the definitive World Bank study, Sanitation and Disease--Health Aspects of Excreta and Wastewater Management (Feachem et al. 1983), as well as in two special studies on the subject sponsored by the WHO and UNEP through the International Reference Center on Wastes Disposal (Blum and Feachem 1985; Cross and Strauss 1985).

INTERVENING FACTORS THAT INFLUENCE THE LEVEL OF ENVIRONMENTALLY TRANSMITTED DISEASE ASSOCIATED WITH WASTEWATER REUSE

Numerous papers and reports have measured the presence, concentration, and survival times of pathogenic microorganisms in wastewater streams and on land and crops irrigated or fertilized with wastewater. From our current state of knowledge on this subject (see Chap. 2) it is clear that a broad spectrum of pathogenic microorganisms--including bacteria, viruses, helminths, and protozoa--are present in wastewater in high concentrations and that they survive for days, weeks, and, at times, months in the soil and on crops that come in contact with wastewater.

However, the mere detection of pathogenic microorganisms in the soil, on food crops, or in the air is not in itself sufficient proof that human beings are becoming infected or ill as a result of contact or exposure to such pathogens. A number of additional factors intervene, including the minimal infectious dose of microorganisms that must be ingested before a person becomes infected or sick; the state of immunity of the exposed persons to the pathogens under study; the extent of concurrent infection through other routes that may have a more dominant effect and that may mask any disease transmission associated with reuse; and the degree to which infection results only in subclinical disease, without any obvious or detectable clinical symptoms, including discomfort or pain. Usually only when an individual feels sick enough to visit a clinic for treatment is the disease episode recorded or detected in such a manner that it could be included in the statistics of disease incidence within the framework of an epidemiological study based on morbidity data.

In studies where differences in health effects between populations exposed to environmental transmission of enteric disease are measured by pathogen antibody levels in blood serum samples, other confounding factors exist. For diseases that cause only temporary immunity (such as cholera or typhoid), antibody levels are indicative of recent infections only. In diseases that impart permanent immunity, antibody prevalence in young children may reach such high levels, as is the case with most enteroviruses, that there is no point in studying differences in levels of environmental transmission among adults such as wastewater farm workers. All these factors must be taken into account in interpreting the results of the epidemiological studies reviewed in this section. These studies have attempted to measure quantifiable differences in levels of environmental transmission of enteric disease by wastewater reuse in agriculture.
CRITERIA AND GUIDELINES FOR EVALUATING EPIDEMIOLOGICAL STUDIES

In evaluating past epidemiological studies, we have used a number of criteria and guidelines to test the proof of the causal relationship between the disease and exposure to wastewater use, in whatever form. Providing proof of such a relationship is considerably more difficult than merely demonstrating an association. Bradford-Hill (1965) suggested nine criteria for evaluating the proof of an epidemiological study. These include strength of association; frequency of reporting; specificity--that is, whether the disease occurs in other environments or whether the environment under study is associated with other diseases; demonstration that disease follows exposure after an expected incubation period and does not occur before exposure; the existence of a positive dose-response relationship; close consideration of the known biology and natural history of the disease; experiments that support a causal relationship; and a fair analogy with some parallel context.

Our evaluation is also based on the "Documentation Guidelines for Epidemiological Studies" prepared in 1979 by the Guidelines Committee of the Epidemiology Working Group, which is an interagency effort of the U.S. Public Health Service, the U.S. Environmental Protection Agency, and the National Institutes of Health. These draft guidelines were established to help regulatory agencies in the United States conduct objective and scientific evaluations of epidemiological studies as they bear on public health policymaking decisions. Blum and Feachem's (1983) review of methodological problems in the evaluation of the health impact of water supply and sanitation investments also provided valuable insight and criteria. In addition, we have relied on some other general principles of public health and epidemiology.

EPIDEMIOLOGICAL STUDIES ON THE HEALTH EFFECTS ON THE GENERAL POPULATION CONSUMING EDIBLE CROPS, DAIRY PRODUCTS, OR MEAT EXPOSED TO WASTEWATER APPLICATIONS

Ascariasis and Trichuriasis among Inmates in Tara Prison, Egypt

Possibly the earliest systematic epidemiological study providing strong suggestive evidence that wastewater irrigation of vegetables leads to the transmission of helminthic infections to persons consuming such contaminated vegetables is that described by Khalil (1931), professor of parasitology, Faculty of Medicine, Cairo, Egypt. Tara prison lies about 13 km south of Cairo on the banks of the Nile River. The prisoners were employed in cutting limestone from the hills. They were not involved in agricultural occupations at the time, although most of them were originally farm laborers from all provinces in Egypt. Sullage water, together with the contents of the bucket latrines, was run through open cemented channels and pumped onto a small sewage farm 8 ha (20 acres) in area. This farm cultivated vegetables that were consumed by the prisoners and wardens, generally after cooking. Khalil states that prison policy prohibited the cultivation of any vegetables that are eaten raw. Whether that policy was effectively enforced, however, is not stated.

The first survey in 1924 consisted of microscopic examinations of the urine and stools of all inmates of the prison--that is, 2,146 males. There were two major findings.
First, the incidence of schistosomiasis and ancylostomiasis (hookworm) was much lower among the prisoners than among the general peasant population from which most of them came; furthermore, the infection rates for those diseases dropped drastically as the length of time in prison increased. Khalil believed that this decline was related to the absence of reinfection. In general, the prisoners were not exposed to infected bodies of water or to soil polluted with the excreted organisms, since the use of bucket latrines was carefully enforced; thus, reinfection was prevented and the parasites were eliminated with time.

The second finding was that the incidence of ascariasis and trichuriasis (called trichocephalus in Khalil's report) was not reduced with time in prison, and reflected the high rates among the general peasant population (see Fig. 4-1). Khalil attributed this to the presence of the small sewage farm, which used prison sewage to irrigate and fertilize the soil and to grow vegetables. He stated that the vegetables were contaminated with ripe helminth ova, which had developed favorably in the warm, shady, and moist soil of the farm. Although these vegetables were generally eaten after cooking, Khalil believed that the utensils and contaminated hands of the kitchen staff afforded ample opportunity for infection. He demonstrated the presence of fully developed ova of Ascaris in the earth and on roots of vegetables found in the kitchen by direct microscopic examination and by the soil flotation method. He concluded that the hands of all prisoners working in the kitchen were contaminated and that, in turn, they contaminated the food being distributed after cooking. The possibility that raw vegetables may have been consumed as well was not mentioned, although in our opinion it was highly probable.

Further evidence is presented in Khalil's comparison of the high incidence (73 percent) of Ascaris infection among the 3,181 residents of Port Said and the low incidence in the village of Kom Ombo in Aswan province. At the same time, however, the incidence of Schistosoma in Kom Ombo was 82 percent as a result of the high degree of exposure to fecally contaminated bodies of water, and that of Ancylostoma was 24 percent as a result of the high degree of soil pollution (see Fig. 4-2). Sanitary conditions in the Aswan villages were so poor that almost no house in the village had a latrine or water supply. Khalil states that the reason for the low levels of Ascaris infection remained obscure until it was revealed that human manure was never used as fertilizer in any of the fields in the Kom Ombo area. Port Said, where few endemic cases of schistosomiasis or ancylostomiasis were found, had a central sewage system, practiced sewage farming with municipal wastewater, and grew large quantities of vegetables that were eaten raw despite official prohibition of these practices.

Fig. 4-1. Prevalence of parasitic infections in Tara Prison, Egypt, 1925. Source: Adapted from Khalil (1931).

Fig. 4-2. Prevalence of parasitic infections in two communities in Egypt, 1925. In Port Said, vegetables and salad crops were irrigated with raw wastewater. In Kom Ombo Village, no wastewater irrigation or night soil fertilization of vegetables took place. Source: Adapted from Khalil (1931).
Khalil concluded that the primary factor in Ascaris infection in Egypt at that time was contamination of food, particularly vegetables fertilized with human waste or wastewater. Thus he argued that although helminthic diseases associated with soil pollution and water pollution had disappeared among the prisoners because they had not been reinfected, ascariasis and trichuriasis infections flourished owing to wastewater irrigation of the vegetables. He applied the same logic in comparing Port Said with the villages in the Kom Ombo region in Aswan Province. Port Said, with its generally improved sanitary conditions, still had a very high incidence of Ascaris infection, which the author associated with the sewage farming of vegetables, whereas Aswan Province showed little Ascaris infection, since fertilization of vegetables with human manure was not practiced there.

It is important to note that Khalil found that the incidence of Ascaris and Trichuris infection among the prisoners was similar to that found among the wardens, who also became infected by the sewage-contaminated vegetables prepared in the kitchen, which served both groups. In the village of Maasara, lying about 3 km from the prison, 80 percent of the inhabitants were infected with Ancylostoma. The prison was equipped with sanitary bucket latrines, which were also available to the prisoners at the quarry where they worked. The fact that ancylostomiasis infection dropped from 67.3 percent among the newly admitted prisoners to less than 10 percent among those who had remained in prison 10 years or more is evidence of the value of the bucket latrines in preventing soil pollution and thus in reducing the possibility of reinfection. Khalil concluded from observations at Tara prison and other localities in Egypt that fecally contaminated vegetables were without doubt the principal vehicle for Ascaris infection.

Although this early study does not meet all of the rigorous criteria of a modern epidemiological investigation, the weight of the author's evidence is strong and it is difficult to assail the logic of his argument. Moreover, he appears to provide the first substantive evidence along modern lines that Ascaris and Trichuris infections are effectively transmitted by vegetables grown in soil irrigated with wastewater. The finding of a clear and measurable quantitative excess of Ascaris and Trichuris infections as a result of consuming wastewater-irrigated vegetables, even in a poor country where the diseases are highly endemic and where levels of personal and domestic hygiene are not high, is of particular interest since this may be of importance to other developing countries with low socioeconomic levels.

Ascariasis in Darmstadt, Germany

By 1892 the city of Darmstadt, Germany, had a major central sewage system that included as its only wastewater treatment and disposal element a large farm (874 ha) irrigated with raw sewage. Vegetables, potatoes, and salad crops were grown in addition to other crops. Untreated sewage was applied to the soil. Baumhogger (1949) reports that a number of major epidemics of ascariasis took place in Darmstadt between 1908 and 1909, during 1921, and between 1945 and 1948. In each case, the epidemic was associated with severe economic dislocations and food shortages comparable to those that occurred during and immediately after the two world wars.
The vegetables and salad crops grown on the Darmstadt sewage farms obviously made up an important part of the food supply of the population of the city, particularly during those periods. In Baumhogger's opinion, the population was weakened by malnutrition, which increased people's susceptibility to infection. According to the figures of Krey (1949) and Schlieper and Kalies (1944), 40-50 percent of the Darmstadt population was positive for Ascaris infections in stool examinations. In the Griesheim section of the city--where the local farmers irrigated vegetables with wastewater--the Ascaris rate was 89 percent (see Fig. 4-3). Children under 15 generally had the highest infestation rates. Krey also mentions a comparable situation described by Gromatschewskii (1930) in Odessa, which was also associated with irrigation of vegetables and salad crops with raw wastewater.

Baumhogger states that in a comparative study carried out at Berlin during the same period, only 2.2 percent of the population were shown to be positive for Ascaris eggs. He also points out that Berlin practiced wastewater irrigation on sewage farms from an early period as well; however, the Berlin wastewater underwent treatment, including sedimentation and biological oxidation, prior to irrigation.

Fig. 4-3. Wastewater irrigation of vegetables and Ascaris prevalence in Darmstadt, Berlin, and other cities in Germany in 1949. Darmstadt used raw wastewater for irrigation. In Berlin, wastewater received biological treatment and sedimentation. Sources: Baumhogger (1949), Krey (1949), and Schlieper and Kalies (1949).

Ascaris rates in cities not practicing wastewater irrigation were as follows: Marburg, 9.8 percent; Wiesbaden, 5 percent; Bad Homburg, 8 percent; Dillenburg, 5 percent; and Giessen, 2.7 percent. The author contends that the epidemics of ascariasis in the Darmstadt area were a direct result of irrigation with raw, unsettled sewage that carried heavy loads of Ascaris eggs to the fields and to the crops, which were later disseminated to the general population on vegetables and salad crops. In the author's opinion, the pretreatment of wastewater in Berlin, including sedimentation, provided adequate protection in preventing such an epidemic from breaking out in Berlin. In conclusion, the author recommended that the Darmstadt sewage be treated in such a manner as to effectively remove the Ascaris eggs prior to using the effluent for the sewage farm.

It is difficult to evaluate the findings of these studies since very little information is given on the methodology of the population surveys or on other factors that may have contributed to the very high Ascaris infestation among the population in Darmstadt compared with the population in Berlin, both of which practiced sewage farming at that time. However, the strong implication that the widespread epidemic of ascariasis in Darmstadt resulted from the use of untreated wastewater effluent for the irrigation of vegetable and salad crops--such disease transmission did not occur in Berlin, which used well-treated sewage--is difficult to ignore. The fact that the highest rates were in the Griesheim section, where sewage farming was actually carried out, suggests that occupational exposure of workers and contact infection of their families may also be one of the modes of transmission other than the vegetables.
Despite their limitations, these studies provide strong circumstantial evidence that, even in a country of generally high personal and domestic hygiene, irrigation of vegetables and salad crops with raw wastewater can lead to quantifiable, massive transmission of Ascaris infections to the general population consuming such crops and/or to those occupationally exposed to wastewater irrigation.

Epidemiological Evidence for Helminth Transmission by Vegetables Irrigated with Wastewater in Jerusalem

In reviewing earlier research, we reanalyzed the data in some published and unpublished papers on the possible association between irrigation of salad and vegetable crops with raw wastewater in Jerusalem and helminth infections with Ascaris and Trichuris during the period 1936-1982. In 1923, for example, Kligler examined the use of Jerusalem's raw wastewater for growing vegetables in the adjacent, almost completely arid, Kidron (Silwan) Valley, and suggested that the high incidence of ascariasis in Jerusalem was associated with the consumption of fecally contaminated salad crops and other vegetables grown and marketed by enterprising villagers who had little or no other sources of water for agriculture. No firm epidemiological evidence was available at that time to support this hypothesis.

This situation changed, however, after the State of Israel was founded in 1948 and the city of Jerusalem was partitioned. The western section of Jerusalem remained in Israel, whereas the Old City and the areas to the east, including the Kidron and Refaim valleys, fell under the administration of the Hashemite Kingdom of Jordan (see Fig. 4-4, which shows the location of the fields irrigated with raw wastewater). There were no commercial contacts between the two parts of the city, and the supply of wastewater-irrigated vegetables was suddenly and completely cut off from western Jerusalem. In 1952, the Israel Ministry of Health introduced regulations prohibiting the use of wastewater for the irrigation of salad crops or other vegetables eaten raw, and this prohibition was strictly enforced. Thus, the conditions of a natural experiment were created whereby the impact of totally cutting off the supply of wastewater-irrigated vegetables might be measured.

Fig. 4-4. Municipal drainage areas of Jerusalem and plots irrigated with raw wastewater up to 1970.

In 1962, Ben-Ari published the results of 126,000 microscopic examinations of stools carried out by the Department of Clinical Microbiology of the Hadassah-Hebrew University Hospital between 1934 and 1960. Two-thirds of these specimens came from patients who were hospitalized because of various ailments, and one-third came from outpatients complaining of diarrhea or gastrointestinal disturbances. Of the 50,000 fecal specimens examined between 1935 and 1947, 35 percent were positive for Ascaris lumbricoides and 13 percent were positive for Trichuris trichiura (see Fig. 4-5). These rates were considerably higher than those found in similar groups residing in other cities of the country where wastewater irrigation of crops was not practiced.
Between 1949 and 1960, the same laboratory examined 75,000 stool specimens and noted a sharp drop in the findings of Ascaris and Trichuris. The mean for this period was only 1 percent positive for Ascaris and 4.7 percent positive for Trichuris.

Fig. 4-5. Relationship between Ascaris-positive stool samples in the population of western Jerusalem and the supply of vegetables and salad crops irrigated with raw wastewater in Jerusalem, 1935-1982. Sources: Ben-Ari (1962), Jjumba-Mukabu and Gunders (1971), and Shuval, Yekutiel, and Fattal (1984).

The authors explained these findings by the historical events associated with the partitioning of the city and the total cessation of the supply of wastewater-irrigated vegetables that had previously served as one of the main sources of fresh salad crops. However, other developments in western Jerusalem must also be considered in explaining the almost total disappearance of these two helminthic diseases. One such development was the great improvement in the general socioeconomic level of the city between 1949 and 1960. This included a reduction in housing density, an increased quantity of safe domestic water supply, increased domestic hot water supplies for cleansing and bathing, and improved food hygiene in public eating establishments. The general level of nutrition also increased, as did the per capita income. It is important to note that although slower but similar socioeconomic improvements also occurred in the eastern sections of Jerusalem, there was no significant reduction in the levels of Ascaris and Trichuris infections among the residents of those sections, who still continued to be supplied with the wastewater-irrigated vegetables from nearby villages located in their areas (Alicata and Dajani 1955).

When Jerusalem was reunified after the 1967 war, commerce between the two sections of the city was fully resumed, including the supply of wastewater-irrigated vegetables. As the population of Jerusalem expanded, the water supply increased and the western portions of the city became almost totally sewered. At the same time, the flow of wastewater had greatly increased and new outfall sewers extended the areas of wastewater-irrigated crops to the Battir village in the Refaim Valley. This village had also previously been under the Jordan-administered areas near the city. Since the new status of these areas was that of administered territories, the Israel regulations forbidding wastewater irrigation of vegetables were not in force.

A study by Jjumba-Mukabu and Gunders (1971) provided valuable insight into the effects of the reintroduction of wastewater-irrigated vegetables into the western portion of the city, where Ascaris infections were almost unknown and Trichuris infections were at a very low level. Routine laboratory stool examinations of residents of western Jerusalem at the Central Laboratory of the Kupat Holim Sick Fund revealed a steep rise in Ascaris infections, from essentially none in May 1967, prior to the unification of the city, to a level of 12 percent positive a year later (see Fig. 4-5). There was also a similar steep rise in Trichuris infections.
Among the residents of eastern Jerusalem, who had been continuously exposed to the wastewater-irrigated crops, 60 percent of the stools examined were positive for Ascaris. Investigators subsequently found samples of parsley and other salad crops grown in the wastewater-irrigated areas to be heavily contaminated with the helminths. The rapid increase in Ascaris and Trichuris infections between 1967 and 1969 among the residents of the western sections was thought to be partly related to the reintroduction of wastewater-irrigated vegetables to the western part of the city. Another possible explanation, however, is that with the unification of the city, many residents of western Jerusalem began eating in the numerous small restaurants and food stalls in the Old City, where levels of food hygiene were generally lower than in the western part of the city. Although only a portion of the population frequented such establishments, this factor cannot entirely be ruled out.

A cholera outbreak in Jerusalem in 1970 clarified this issue. Following the cholera outbreak, the authorities put an end to the raw wastewater irrigation of vegetables that had been practiced in the villages around Jerusalem since the beginning of the century. Within a few years, the rate of Ascaris-positive stools dropped to its previous level of 1 percent or lower (see Fig. 4-5). Trichuris infections also declined. Despite the fact that the population continued to frequent the restaurants and food stalls of the Old City, these helminth infections have practically disappeared. Thus, this episode provided conclusive evidence of the major role played by wastewater-irrigated vegetables in the transmission of Ascaris and Trichuris infections in the population consuming these contaminated crops.

It should be pointed out that, in contrast to the population studied by Khalil (1931) in Egypt, the population affected in western Jerusalem lived at very high levels of personal and domestic hygiene. The detection of a clear and quantifiable excess transmission of helminths by vegetables irrigated with raw wastewater might therefore be expected in Jerusalem more than in Egypt, with its much lower levels of general hygiene.

Cholera Outbreak in Jerusalem, 1970--The Case for Transmission by Wastewater-Irrigated Vegetables

Between 1971 and 1973 a number of reports on the 1970 cholera epidemic in Jerusalem appeared in the literature (Cohen et al. 1971; Gerichter et al. 1971; Gerichter et al. 1973), but some of the most pertinent information on the role of wastewater-irrigated vegetables in the spread of that epidemic was published in local journals not widely known outside the country. We have made a special effort to integrate and reanalyze all available data on the 1970 cholera epidemic, including some unpublished data from our own investigation of the environmental aspects of that epidemic during the outbreak itself (Fattal et al. 1985). The main findings and conclusions of that investigation are presented below.

In the summer of 1970, numerous cases of cholera were reported in Middle Eastern countries contiguous to Israel. Some 100,000 visitors from those countries entered Israel and Israel-held territories during that summer. The first cases of cholera were detected in Jerusalem on August 20, 1970, and as the Jerusalem outbreak spread, several features became clear. The disease appeared simultaneously in all sections of eastern and western
Jerusalem and reached 59 cases by the week of September 13-20, one month after the start of the epidemic. Cholera also broke out in villages adjacent to Jerusalem. The total number of clinical cases reported in the city of Jerusalem reached 176 (see Fig. 4-6), and an additional 82 cases were confirmed in nearby villages. All acute cases were investigated in detail. Little evidence of secondary contact infections was found in family groups or among co-workers. There was also no spread to other cities in Israel, although the city of Jerusalem was open to normal commerce and tourism. Thus, it appeared that a common-source epidemic was occurring.

The water supply came from deep sanitary wells outside the city and was chlorinated prior to supply. Routine bacteriological examinations of the water indicated that it met the strict Ministry of Health standards, and thus it was not considered a possible route of infection. All milk and dairy products were pasteurized and under strict laboratory quality control and thus were also ruled out as a possible common source. Insect vectors, too, were ruled out, since the level of general sanitation in the city was high; in addition, the city had few or no sources of exposed excreta and a low housefly population. In the early stages of the outbreak, we suggested that a possible route of transmission might be the raw wastewater-irrigated vegetables grown in the Kidron and Refaim valleys (see Fig. 4-4) and marketed by peddlers and in shops in both the eastern and western parts of the city since reunification in 1967, as noted earlier. During the epidemiological case investigations, individuals often mentioned the purchase of salad crops and vegetables from village peddlers. In some cases the only family member to fall ill with cholera was the one who had consumed these products. No other common source was identified.

Fig. 4-6. Weekly distribution of cholera cases in Jerusalem, August-October 1970 (n = 176). Irrigation of vegetables and salad crops with raw Jerusalem wastewater was stopped by the authorities on about September 15-20. Sources: Gerichter et al. (1971) and Fattal, Yekutiel, and Shuval (1984).

An intensive program of sampling and testing for Vibrio cholerae in the wastewater from the city's main outfall sewers, from soil, from vegetable samples from the wastewater-irrigated plots, and directly from the markets was initiated. A total of 143 wastewater samples were taken by the immersed gauze pad method and by grab samples and were tested for cholera vibrios. A total of 168 samples were examined for cholera phages during the epidemic and in the two months immediately after (Gerichter et al. 1971). During the outbreak, positives for Vibrio cholerae were detected at all wastewater outfall sewers leading from the city, whereas no positives were detected after the outbreak ended. During the epidemic, 18 percent of the wastewater samples were found to be positive for V. cholerae (El Tor, serotype Inaba, phage type 6), the same serotype isolated from the vast majority of the clinical cases. Cholera phages were detected in 67 percent of the wastewater samples taken during the epidemic, and some positives were detected up to three weeks after the last reported clinical case. Cholera vibrios were also isolated from one sample of wastewater-irrigated soil and from a specimen of vegetables grown in a wastewater-irrigated plot.
Six samples of vegetables were found positive for cholera phages, including four from wastewater-irrigated plots, one from a sample of parsley taken in the market, and one from a tomato stored in the refrigerator in the home of one of the confirmed cases of acute cholera. All samples of milk and other food products were negative. Subsequent laboratory studies showed that V. cholerae (El Tor) could survive long enough in sewage-contaminated soil and on vegetables to make vegetable-borne cholera possible (Gerichter et al. 1971).

The early positive findings of V. cholerae in samples of wastewater used for irrigation of vegetables in the Refaim and Kidron valleys led the Ministry of Health and the health authorities responsible for the villages to order an immediate cessation of the marketing and growing of all wastewater-irrigated crops. Harvested crops were confiscated, and most of those still growing were destroyed on or about September 15-20, 1970. Following that drastic action, the epidemic rapidly subsided (see Fig. 4-6), and the last clinical case was detected some 12 days later. Obviously not all wastewater-irrigated vegetables had been immediately confiscated or destroyed, and some still found their way to the market for a short period after the official action started.

In the first report describing the outbreak of cholera, the authors concluded, "It seems likely that after the Jerusalem sewage had become contaminated with vibrios from the early mild cases and carriers (that entered the city from neighboring countries), vegetables irrigated with sewage in the surrounding villages constituted a major secondary vehicle for the spread of the infection" (Cohen et al. 1971). That report did not present the findings of the environmental monitoring, however, or other details concerning the association with wastewater irrigation and the subsidence of the epidemic. As Feachem (1982) has pointed out, "No solid evidence is presented to support this hypothesis [transmission by sewage-irrigated vegetables] but it is plausible. It is unfortunate that studies were not carried out during the outbreak to prove or disprove the vegetable hypothesis."

In view of the full history of this outbreak--including the extensive positive findings of V. cholerae and phages in wastewater, in the soil, and on the vegetables, and the subsidence of the epidemic upon the cessation of marketing of the contaminated crops--we now believe that there is indeed strong epidemiological evidence, although part of it circumstantial, that the main pathway for the secondary spread of the Jerusalem cholera outbreak of 1970 was through wastewater-irrigated vegetables. Obviously the cases of cholera that initiated the epidemic cycle resulted from imported clinical or subclinical cases that had entered the city from neighboring countries where cholera epidemics were already in progress. Figure 4-7 illustrates the assumed cycle of transmission that occurred in this case.

Fig. 4-7. Hypothesized cycle of transmission of Vibrio cholerae from the first cholera carriers introduced from outside the city, through wastewater-irrigated vegetables, back to residents in the city: first cases of cholera introduced from outside; infected persons' excreta enter the wastewater; wastewater used to irrigate salad crops; contaminated vegetables marketed; contaminated vegetables ingested.

During November and December 1970, a second outbreak of cholera occurred in and around the city of Gaza, which reported a total of 260 acute cases during a five-week period. Here, too, V. cholerae and cholera phages were isolated from samples of wastewater and wastewater-irrigated vegetables. In the Gaza area, however, the drinking water was found to be contaminated with coliforms as well as with cholera phages (Gerichter et al. 1971).
Thus, it was not possible to determine the role played by wastewater-contaminated vegetables, compared with the drinking water, in transmitting cholera in this second outbreak. The low levels of personal and domestic hygiene that exist in some villages in the Gaza area may also have played a greater role in the transmission of that outbreak than in the case of Jerusalem. Sewage-irrigated salad crops were also thought to play a role in the cholera epidemic in Jordan in 1981 (Jalal 1983). Although there was some highly suggestive circumstantial evidence supporting this hypothesis, no firm conclusions could be drawn.

The Jerusalem outbreak provides the first strong evidence that cholera can be effectively transmitted by vegetables and salad crops irrigated with raw wastewater. Contaminated vegetables were clearly the main secondary route of transmission in this case because all the other usual routes of environmental transmission in Jerusalem, where cholera had never been endemic, were blocked as a result of a safe, protected, chlorinated water supply and very high levels of personal and domestic hygiene.

In areas where cholera is endemic and where all the usual routes of transmission are open, it is not clear whether wastewater irrigation of vegetables can be associated with a clearly quantifiable excess of cholera in the population consuming wastewater-contaminated vegetables. Under such conditions, the effects of transmission by vegetables might be masked by the high levels of transmission by numerous other concurrent routes. No reports of cholera transmission by wastewater-irrigated vegetables from areas where cholera is endemic have come to our attention. Furthermore, the sewage farms of the United Kingdom were not implicated in cholera transmission in the last century, even during periods of cholera epidemics.

Typhoid Fever and Sewage Irrigation in Santiago, Chile 1/

The city of Santiago, Chile, with a population of about 4 million in 1984, produces some 100 million m3 of sewage per year. During the dry summer months of December through May, most of the wastewater is utilized for irrigation, either as raw undiluted sewage or in mixtures with the waters of the Rio Mapocho. It has been estimated that some 4,500 ha are irrigated with sewage. Some 20,000 tons per year of vegetables, including lettuce, cabbage, celery, cauliflower, and other salad crops normally eaten raw, are produced on the sewage-irrigated farms and marketed in all areas of Santiago.

Typhoid fever is a major health problem in Santiago. The annual incidence since 1977 has ranged from 150 to 200 cases per 100,000 (Ministry of Health, Chile, 1980). The total number of officially reported cases in 1981 was 6,936 (159/100,000).

Santiago has the climatological, cultural, and socioeconomic characteristics of many socially and economically advanced cities of the world where typhoid fever has been effectively controlled. Most important is the fact that 96 percent of the homes are connected to the well-managed central water supply system.
This system provides filtered, chlorinated drinking water, which is under good microbial surveillance. Most milk and dairy products are pasteurized in modern plants. Seventy-five percent of the homes have modern indoor sanitation, including flush toilets connected to a central sewage system. A good indicator of the generally high level of health and sanitation in Santiago is the low infant mortality rate, which is comparable to that found in some countries in southern Europe. Typhoid fever would not normally be expected to remain a problem under such advanced conditions. When similar cities in the United States and Europe introduced safe, treated drinking water supplies at the beginning of the twentieth century, typhoid fever practically disappeared within 10-20 years, even though other socioeconomic and sanitation factors were not as advanced then as they are now in Santiago.

1/ The following analysis is based on a field investigation carried out for the World Bank (Shuval 1984) and other published reports.

Many public health authorities and scientists have suspected for years that the massive consumption of salad crops irrigated with raw sewage is the unique external environmental factor that has caused Santiago to have such an otherwise inexplicably high incidence of typhoid fever. There appears to be considerable circumstantial evidence to support that hypothesis.

Much of the more recent evidence is the fruit of intensive investigations by a team of scientists from the University of Maryland's School of Medicine, headed by Dr. Myron Levine, working in cooperation with Dr. Agustin Schuster, who heads the Chilean Typhoid Committee of the Ministry of Health. The epidemiologic investigations of this group are contained in a series of publications, mostly in international journals: Black et al., in press; Lanata et al. 1983; Levine et al. 1982; Ferreccio et al. 1984; Morris et al. 1984; Levine et al., in press. The main points linking the high typhoid incidence in Santiago with sewage irrigation of salad crops are summarized below.

a. In Santiago the incidence of typhoid fever rises after the initiation of large-scale sewage irrigation of salad crops during the dry months of summer and early fall (see Fig. 4-8).

b. The incidence of typhoid in Santiago is much higher than it is in comparable cities in Chile that do not practice sewage irrigation (see Fig. 4-9). Although the prevalence of Ascaris and Trichuris infections is lower around Santiago than in the rural regions of the south, where no sewage irrigation is practiced, amoebiasis is more prevalent around Santiago than in all other regions of the country (Schenone et al. 1981).

c. During the summer irrigation months, the typhoid fever rates in Santiago are significantly higher than in all other areas of the country. During the winter months (July-October) the rates in Santiago and the rest of the country are the same (Ferreccio 1983). This suggests a unique additional external exposure to the risk of typhoid in the summer in Santiago, in contrast to the situation in the rest of the country (see Fig. 4-8). It also indicates that the high summer typhoid rates are not carried over to the winter by person-to-person contact, as might be expected if this were the primary route of disease transmission, as suggested by some investigators.

d. Reported cases of typhoid are distributed in more or less all socioeconomic neighborhoods in Santiago.
This suggests considerable exposure to risk regardless of levels of personal hygiene and domestic or neighborhood sanitation. High risk in all socioeconomic classes suggests external exposure and a common source.

e. The incidence of typhoid fever in Santiago may be significantly higher than officially reported. Studies in other countries indicate that important communicable diseases such as typhoid often are underreported. Underreporting is particularly probable in Santiago in the medium to high socioeconomic groups that tend to use private doctors rather than public, official clinics, where reporting may be better. One study in Santiago (Levine et al. 1984) provides important serological evidence that the number of typhoid infections, as determined by persons with antibodies to typhoid fever, may be seven times greater than the number of reported cases (a rough calculation based on this factor is sketched after this list). In addition to the health and economic implications of this finding, one can also deduce that the true incidence of typhoid in Santiago may be somewhat higher still relative to other cities, since a high percentage of the population in Santiago uses the services of private doctors and their cases may go unreported.

f. Typhoid fever in very young infants (0-2 years), normally typified by the short-cycle type of person-to-person contact, is not a dominant form of transmission in Santiago. It is reasonable to assume that such young infants are not generally exposed to raw salad crops and thus may be exposed to lower levels of external sources of contaminated foods than older children and adults.

g. Typhoid fever rates do not increase during the summer months in the Los Lagos (the lakes) region, where numerous residents of Santiago spend their summer vacations. This suggests that they do not bring important person-to-person typhoid transmission factors with them and are not exposed to external sources, since sewage irrigation is not practiced in that area.

h. In the summer of 1983, the Ministry of Health implemented new regulations to help control certain types of salad crops (lettuce) grown with raw sewage. These regulations were strictly enforced and thus reduced the production of sewage-irrigated lettuce. The Ministry of Health reported that from January to March of 1984 (i.e., the summer peak period for typhoid) there were about 30 percent fewer new cases of typhoid fever compared with the same period the previous year. This might be interpreted as specific evidence of the importance of sewage-irrigated salad crops in typhoid fever transmission.

i. Finally, an important additional link that had been missing until now is the recent finding of S. typhi (the causative agent of typhoid fever) in wastewater canals used to irrigate land growing salad crops (Sears et al. 1984).

Fig. 4-8. Seasonal variation in typhoid fever cases in Santiago and the rest of Chile (average rates, 1977-1981). Source: Based on a field investigation carried out for the World Bank (Shuval 1984) and other published reports.

Fig. 4-9. Typhoid fever in Santiago and the rest of the country, 1973-1984. Source: Based on a field investigation carried out for the World Bank (Shuval 1984) and other published reports.
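The underreporting argument in item e lends itself to a simple back-of-the-envelope check. The sketch below is illustrative only: the population figure is implied by the officially reported cases and rate for 1981 quoted earlier, and the factor of seven is the seroepidemiological estimate cited above (Levine et al. 1984), not a validated correction.

    # Rough adjustment of Santiago's reported typhoid incidence for
    # underreporting, using the serology-based estimate that infections
    # may be about seven times the reported cases. Illustrative only.

    reported_cases_1981 = 6_936        # officially reported cases, 1981
    reported_rate_per_100k = 159       # official incidence per 100,000, 1981
    underreporting_factor = 7          # seroepidemiological estimate (Levine et al. 1984)

    # Population implied by the official figures (roughly 4.4 million).
    implied_population = reported_cases_1981 / reported_rate_per_100k * 100_000

    estimated_infections = reported_cases_1981 * underreporting_factor
    estimated_rate_per_100k = estimated_infections / implied_population * 100_000

    print(f"Implied population:       {implied_population:,.0f}")
    print(f"Estimated infections:     {estimated_infections:,}")
    print(f"Estimated infection rate: {estimated_rate_per_100k:,.0f} per 100,000")

On these assumptions, the reported rate of 159 per 100,000 would correspond to an infection rate of roughly 1,100 per 100,000, which underlines the health and economic implications noted in item e.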
The many different pieces of evidence, although mainly circumstantial, all point in the same direction: to sewage-irrigated salad crops as a prime vehicle for the otherwise inexplicably high rate of typhoid fever in Santiago. Other gastrointestinal diseases, including infectious hepatitis and amoebiasis, are most likely also transmitted by this route and, in total, may be of equal or even greater health and economic importance than typhoid fever. The fact that the prevalence of Ascaris and Trichuris infections is relatively low in Santiago may be due to the fact that those diseases are not highly endemic in the area and thus transmission by sewage-irrigated vegetables is limited. This situation is quite different from that in other areas, such as Darmstadt, Germany, in 1945, and requires further study and elucidation.

The situation in Santiago--a high level of typhoid fever transmission, probably closely associated with sewage irrigation of raw vegetable crops, in a city that otherwise has a reasonably high socioeconomic and hygiene level--is unusual. Nonetheless, it suggests that in situations where the normally dominant routes of typhoid fever transmission are blocked (i.e., safe water supply, good personal hygiene, safe milk supplies, etc.), the disease can be effectively transmitted by wastewater irrigation of raw vegetables and can lead to a significant, detectable excess of typhoid fever.

Transmission of Disease to Humans by Meat or Dairy Products from Cattle or Sheep Grazing on Wastewater-Irrigated Fields

The primary route of infection of humans with Taenia saginata, the beef tapeworm, and Taenia solium, the pork tapeworm, is through the consumption of raw or undercooked meat infected with the larval stage of these worms. Although there are no confirmed reports of human infections related to the consumption of beef or pork from animals that grazed on wastewater-irrigated pastures, a number of authorities consider that route of infection to be a potential risk (Silverman and Griffiths 1955; Greenberg and Dean 1958). Numerous reports have shown that cattle become infected with the worms by grazing on pasture irrigated with raw or partly treated wastewater.

As early as 1937, Penfold and Phillips (1937) and Penfold, Penfold, and Phillips (1937) reported accumulations of the eggs of Taenia saginata, the beef tapeworm, in the soil near the raw wastewater outlets at the Werribee Farm in Melbourne, Australia. Forty-six percent of the cattle that had been on the farm for six months were infested with the cysts of Cysticercus bovis, caused by infection of the animal muscle by the T. saginata worm. These researchers also reported that 100 cases of human tapeworm infections had been treated in Melbourne during the six-month period of their study. They found that the oxen recovered from the infestation after long-term exposure of two-and-a-half to three years and acquired an immunity to further infection, but the beef muscles were left with the distinctive cysts of scar tissue. Subsequently, the Australian government banned the sale for human consumption of cattle raised on the Melbourne sewage farm (Wilson 1944). However, it is not possible to draw any conclusions from these reports as to whether the human cases of the disease in Melbourne were a direct result of consuming meat from cattle grazing on wastewater-irrigated pasture at the Werribee Farm. There was undoubtedly a sufficient concentration of T.
saginata eggs excreted by the population into the sewage system to cause the massive infections detected among the cattle exposed to pastures irrigated with raw wastewater.

Since that time all wastewater has been treated in ponds or in sedimentation basins prior to irrigation of the pastures. This treatment should settle out the eggs of T. saginata. A recent report from the manager of Werribee Farm, Mr. James B. McPherson (1980), indicates that some 22,000 head of cattle and 20,000-50,000 sheep are now grazed on the sewage-irrigated pastures. The sale of cattle is strictly regulated by law, however. All cattle must now be slaughtered at a registered abattoir controlled by the Department of Agriculture, and all carcasses are subject to rigid inspection. Since all effluent is treated in ponds prior to pasture irrigation, condemnation of carcasses of Werribee Farm animals from all causes has run at about 0.02 percent in recent years, or half the average for inspected cattle from the rest of the state of Victoria. This statistic provides important indirect evidence of the effectiveness of pond treatment in removing the settleable eggs of helminths such as T. saginata.

Other reports (Jepson and Roth 1949; Silverman and Griffiths 1955; Greenberg and Dean 1958) have confirmed that cattle might become infected with the disease by grazing on raw wastewater-irrigated fields or by drinking wastewater from open canals. The eggs of T. saginata, for example, are known to survive in the soil for six months under cool, moist conditions, although under hot, dry conditions survival is unlikely to exceed two months (Feachem et al. 1983). Thus, the potential for transmission by wastewater irrigation cannot be overlooked. However, such transmission may be primarily a veterinary problem owing to the serious economic losses resulting from the infection of animals. This aspect of the effects of wastewater irrigation is not considered in this study.

The possibility that salmonellosis may be transmitted to cattle grazing on wastewater-irrigated pasture has also been studied. Salmonella organisms are normally present in wastewater, often in high concentrations, and can survive for long periods in moist soil. Although some experimental epidemiological evidence from studies in Switzerland, Germany, and the United Kingdom strongly suggests that cattle can become infected with Salmonellae after grazing on pastures fertilized with wastewater sludge (Pike 1980), there is no specific indication of further transmission to humans as a result of the infection of the cattle. Salmonellosis is generally endemic as a zoonotic disease among cattle and other domestic animals, and the disease is transmitted between them by a number of different routes. Feachem (1982) has concluded that there is no clear evidence that cattle grazed on pastures fertilized with sludge, night soil, or wastewater are more at risk from salmonellosis than other cattle. Pike (1980) has likewise concluded that although transmission of salmonellosis to cattle by the use of sludge on grazing land cannot be ruled out, it would appear to be a minor cause except in cases of highly dense animal populations on pastures heavily fertilized with untreated sludge containing high concentrations of Salmonellae. These findings suggest that Salmonella transmission to cattle, and subsequently to humans, as a result of wastewater irrigation of pastureland can indeed be considered a marginal problem, although it cannot be completely ignored.
Thus, pretreatment of effluent that provides some measure of inactivation or removal of Salmonellae organisms would be prudent.

The third potential public health problem associated with wastewater irrigation of pastures is that effluent may contain Mycobacterium tuberculosis, which can cause bovine tuberculosis. In turn, humans can become infected by consuming contaminated milk or meat. The pathogen has been found in wastewater, and although much has been written about the possibilities of such transmission, there is no concrete evidence supporting it (Greenberg and Kupka 1957). Feachem (1982) has concluded that it remains doubtful whether transmission of either human or bovine tuberculosis is significantly affected by wastewater irrigation of pasturelands. However, we believe that in areas where tuberculosis is a major public health problem, this issue should be carefully evaluated in any plans to irrigate pastures with wastewater.

There are no published reports on disease transmission due to dairy products derived from milk animals grazed on wastewater-irrigated pastures. Physical contamination can occur when milk is taken from cows or sheep whose udders have become contaminated with human pathogens from the wastewater-irrigated pastures. Enteric bacterial pathogens from wastewater, such as salmonellae, can multiply rapidly in uncooled milk and create a possible source of foodborne disease. If the milk is boiled or pasteurized before humans consume it and before cheese and other dairy products are prepared, this problem can be overcome. Regardless of whether or not pastures are irrigated with wastewater, pasteurization of all milk for direct consumption or for the production of dairy products is an essential public health measure.

EVALUATION OF EPIDEMIOLOGICAL STUDIES ON THE HEALTH EFFECTS ON AGRICULTURAL WORKERS DIRECTLY EXPOSED TO CONTACT WITH WASTEWATER IN IRRIGATION

Since the number of epidemiological studies evaluating the health effects of wastewater irrigation on the workers directly exposed is very limited, this section also includes reports on wastewater treatment plant personnel in direct or indirect contact with wastewater or wastewater aerosols, whose exposure in many respects resembles that experienced by wastewater-irrigation workers.

Intestinal Parasitic Infections Associated with Sewage Farm Workers--India

The Central Public Health Engineering Research Institute in Nagpur, India (Krishnamoorthi, Abdulappa, and Anwikar 1973), has completed one of the most extensive and detailed studies of intestinal parasitic infections to be carried out on sewage farm workers. The study is based on fecal specimens from 466 sewage farm workers at five large sewage farms in various parts of the country. Irrigation was carried out by flood irrigation methods. Stool samples were examined for Ancylostoma duodenale (hookworm), Ascaris lumbricoides (roundworm), Trichuris trichiura (whipworm), and a number of other parasites; 432 stool samples were examined from a control population from areas in the vicinity of each sewage farm. Details on the control populations are not provided. Table 4-1, which is based on our re-analysis of the results of this study, indicates the prevalence of dominant parasitic infections in the sewage farm workers and the controls for each of the five areas.
TABLE 4-1. Prevalence of Intestinal Parasites in Sewage Farm Workers and Control Groups, India

                                          Sewage farm workers        Controls
Farm                      Parasite        Number     Percent     Number     Percent
                                          positive   positive    positive   positive
I    Jaipur               n                  92         --          107        --
                          Hookworms          45        48.9          34       31.8
                          Ascaris            46        50            15       14
                          All parasites      72        80.4          59       55.1
II   Kodung               n                 114         --           89        --
                          Hookworms          92        80.7          55       61.8
                          Ascaris            44        38.6          17       19.1
                          All parasites     110        96.5          65       73
III  Amberpeth,           n                 154         --          110        --
     Hyderabad            Hookworms          91        59.1          33       30
                          Ascaris            44        28.6           1        0.9
                          All parasites     119        77.3          55       50
IV   Trivandrum           n                  66         --           70        --
                          Hookworms          61        92.4          10       14.3
                          Ascaris            61        92.4          21       30
                          All parasites      66       100            23       32.9
V    Manjari and          n                  40         --           56        --
     Hodapsar, Poona      Hookworms          36        90             9       23.2
                          Ascaris            33        82.5           4        7.1
                          All parasites      37        92.5          13       23.2
All farms                 n                 466         --          432        --
                          Hookworms         325        69.7         141       32.6
                          Ascaris           218        46.8          58       13.4
                          All parasites     406        87.1         215       49.8

Source: Krishnamoorthi, Abdulappa, and Anwikar (1973).

Fig. 4-10. Prevalence of parasitic infections in sewage farm workers and controls from various regions of India. Source: Adapted from Krishnamoorthi, Abdulappa, and Anwikar (1973).

Figure 4-10 summarizes the findings for all sewage farm workers and controls combined. From these data, it can be seen that 87 percent of the sewage farm workers were positive for one or more parasitic infections (total positive), compared with 50 percent of the controls. Seventy percent of the sewage farm workers were positive for hookworm, compared with 33 percent of the controls; and 47 percent of the sewage farm workers were positive for Ascaris, compared with 13 percent of the controls. Despite considerable variation in prevalence levels between farms in different regions of India, there is a general pattern of excess infections among sewage farm workers compared with the control group in the same region. These results indicate that sewage farm workers carried a significant excess of hookworm and Ascaris infections compared with the controls (p < 0.01; the comparison for the combined data is sketched below). The percentages positive for the other parasitic infections, taken individually, did not show clear-cut and consistent differences between the sewage farm workers and the control population. When persons positive for one or more parasitic infections are combined (total positives--all parasites), there is also a significant excess among sewage farm workers. Most of this is due, however, to the dominant roles played by Ascaris and hookworm. Hookworm infections occur when hookworm larvae penetrate lesions in the skin, usually on the feet, of persons exposed to soil contaminated with hookworm eggs. The moist conditions of sewage-irrigated fields provide optimal conditions for the eggs to hatch and infect workers, who do not usually wear shoes or boots.

In addition to the significant levels of infection among sewage farm workers, the India study found that the intensity of infection (that is, the number of parasite eggs found per stool examination) was considerably higher among the sewage farm workers than among the controls. Figure 4-11 presents our re-analysis of the combined intensity of infection data. The number of sewage farm workers with medium and heavy infections is more than double that of the controls. The authors give equal weight to exposure to hyperendemic conditions and to repeated exposure to infective parasites as causes of this intensity of infection.
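For readers who wish to verify the strength of the combined comparison, the following sketch applies a standard two-proportion z-test (pooled-variance normal approximation) to the "all parasites" totals from Table 4-1. It is illustrative only and is not part of the original analysis by Krishnamoorthi, Abdulappa, and Anwikar.

    import math

    def two_proportion_z(pos_a, n_a, pos_b, n_b):
        """Two-proportion z-test using the pooled-variance normal approximation."""
        p_a, p_b = pos_a / n_a, pos_b / n_b
        p_pool = (pos_a + pos_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        return (p_a - p_b) / se

    # Combined "all parasites" counts from Table 4-1:
    # 406 of 466 sewage farm workers positive vs. 215 of 432 controls positive.
    z = two_proportion_z(406, 466, 215, 432)
    print(f"z = {z:.1f}")  # about 12, far beyond the two-sided 1 percent critical value of 2.58

At these sample sizes the normal approximation is adequate, and the excess of parasitic infections among the sewage farm workers is significant well beyond the 1 percent level.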
Raw sewage was used for irrigation in all cases, except at the Amberpeth Farm at Hyderabad, where septic tank treatment and partial dilution were practiced. Sewage farm workers there also had higher infection rates than did the controls. No details are provided on the detention period of the septic tank, however, so it is difficult to evaluate the effectiveness of this treatment process.

Although the above report appeared in symposium proceedings on environmental pollution that did not receive wide distribution outside of India, the basic findings have been disseminated worldwide by the Central Public Health Engineering Research Institute of Nagpur, India (see CPHERI 1971). The CPHERI document, which is widely quoted in the literature, did not include the final analysis of the data, since it covered only 360 sewage farm workers and 306 in the control group. However, the results were essentially the same. The document also cited two earlier studies, both unpublished. The first, by T. B. Patel (Report of Health Hazards from Sewage Farming in Ahmedabad, 1962-1964), found 75 percent of 152 sewage farm workers positive for parasites compared with 60 percent of 679 in the control group. The second study, by S. A. Kabir and T. T. Rajabhooshanam (The Final Report on the Health Hazards of Sewage Farming, Madurai, 1968-1969), found 78.4 percent of 663 sewage farm workers positive for parasites, and only 19.3 percent of 2,644 controls positive. Attempts to obtain these unpublished reports were unsuccessful.

Information on other diseases and clinical examinations not included in the paper by Krishnamoorthi, Abdulappa, and Anwikar is presented in the CPHERI publication (1971), which indicates that each worker was given a thorough examination, including a hemoglobin test. Anemia was reported in 50.3 percent of the sewage farm workers, whereas only 23.6 percent of the control group were anemic. Attempts to obtain the full report on the methods and criteria for establishing anemia levels were unsuccessful. However, if we assume equivalent socioeconomic and nutritional levels for both the sewage farm workers and the controls, the high prevalence of anemia among the sewage farm workers may be related to the heavy infestation with hookworm, which is known to be associated with anemia in cases of very intensive infestation. A breakdown of anemia by those positive and negative for hookworm would, however, have provided a basis for drawing more reliable conclusions on this point. At the time of the study, 45.6 percent of the sewage farm workers reported gastrointestinal symptoms (dysentery, enteritis, and so on) compared with 13 percent of the controls. No details are available on how these determinations were made.

Fig. 4-11. Intensity of parasitic infection in sewage farm workers and controls in various regions of India. Source: Adapted from Krishnamoorthi, Abdulappa, and Anwikar (1973).

The study by Krishnamoorthi, Abdulappa, and Anwikar (1973) appears to have been carefully carried out according to established laboratory examination procedures. Our main concern has been the lack of detail in the original article on the nature of the control group. We wondered, for example, whether the controls selected from other farms in the vicinity of the sewage farms may have been practicing dry farming except during the monsoon season.
If, indeed, the sewage farm workers were exposed to moist soil for 12 months of the year and the controls were exposed to dry soil except during the monsoons, those exposed to moist soil for the longer period could be expected to suffer from higher infection rates, even under conditions of equal soil pollution. This is particularly true for the two dominant helminths of this study, hookworm and Ascaris. Both require moist soil conditions for effective infection of humans. We obtained further information on this critical point from B. B. Sundaresan, director of the National Environmental Engineering Research Institute at Nagpur, India, where the original study was carried out. He informed us that "the control population examined here includes agricultural workers who actually practice irrigation with clean water throughout the year." This study therefore provides strong evidence that in areas where hookworm and Ascaris infections are endemic and where levels of personal hygiene of the sewage farmers are low, as would be the case in most developing countries, sewage farm workers are exposed to a significant degree of heavy infection from both hookworm and Ascaris. Because both of these are long-term cumulative infections, lengthy occupational exposure will cause the infestation to build up to high levels of intensity. This study indicates that a quantifiable excess of Ascaris and hookworm among sewage farm workers can occur even in a country where very low levels of hygiene are the rule and where both diseases are highly endemic in the general population.

The Epidemiological Significance of Urban Sewage in the Spread of Possible Zooparasitic Infections

The original paper on zooparasitic infections connected with sewage was published in German (Sinneker 1958) and was not available; thus, we had to rely on information obtained from brief abstracts in English, in which the author reports on investigations of intestinal parasitism in workers in both urban sewage systems and sewage-irrigated fields. The percentages of infection in these two groups were 14 and 9 percent for E. histolytica, 2 and 16 percent for Ascaris lumbricoides, and 5 and 11 percent for T. trichiura, respectively. Viable eggs of these worms were found on vegetables grown in the irrigated fields. One abstract fails to include the level of intestinal parasites in a control population not occupationally exposed to wastewater. Another source that reviewed this same article provides conflicting information as to the actual rates of Ascaris found. According to Feachem et al. (1983), the Ascaris rates were 3 percent among sewage maintenance workers, 16 percent among sewage treatment plant workers, 30 percent among sewage irrigation workers, and only 8 percent in a control group. Since the original paper could not be obtained, we could not reconcile these conflicts, nor could we establish the methods used or the nature of the controls. Thus, it is difficult to interpret these findings in a meaningful way as to whether sewage treatment plant workers and sewage farm workers in Germany have an excess of these parasitic infections over normal population levels.

Sewage Workers' Syndrome

A number of studies deal with the health of wastewater plant workers possibly affected by their contact with wastewater pathogens. Although wastewater plant workers do not, strictly speaking, fall into the category of wastewater irrigation farm workers, there are many similarities in the degree of exposure to pathogens carried by the wastewater stream.
Thus, any health effects studies carried out on this population can be considered relevant to the purpose of this report.

The starting point of one important study consisted of occasional acute attacks of chills, fever, and malaise in workers at a sewage treatment plant at Gothenburg, Sweden (Rylander et al. 1976; Rylander and Lundholm 1980). Wastewater sludge was heat-treated and then formed into a dry powder. The workers were exposed to dust particles in the air that produced very high concentrations of airborne bacteria (mostly gram-negative), ranging from 10^4 to 10^7 colonies per cubic meter of air. When these workers were compared with age-matched controls, it was found that 50 percent of the workers and none of the controls had at some time experienced the acute fever syndrome. Twenty-five percent of the sewage plant workers also had occasional inflammation of the eyes. Laboratory tests of blood serum revealed slightly higher levels of immunoglobulins in the plant workers. The authors ascribed the symptoms and biochemical findings to the effect of endotoxins. They concluded that the results of these studies cannot serve to judge the risk of developing chronic disease after several years of exposure, since they did not follow this population for an extended period.

In our opinion, the fever-malaise syndrome that appears a few hours after exposure to sludge dust is a typical acute hypersensitivity (allergic reaction) to foreign protein, and may not necessarily be due to endotoxins. Its occurrence at the Gothenburg plant may have been due to the high concentration of bacterial protein-containing dust with a large proportion of small respirable-size particles. These conditions were specific to the Gothenburg process of drying sludge to powdered form, and do not necessarily have anything to do with the presence or absence of bacterial or viral pathogens in the air that would result from aerosolization in such treatment processes as activated sludge, or from the aerosolization of pathogens in sewage sprinkler irrigation. The findings of Rylander et al. therefore do not seem relevant to the issue of health effects related to occupational exposure of sewage farm workers.

Disease Rates among Copenhagen's Sewer Workers

The study described in this section also pertains to the health of wastewater treatment plant workers rather than sewage irrigation workers (Dean 1980). The analysis summarizes the findings of five studies of Copenhagen sewer workers carried out between 1974 and 1977 that were documented partly in unpublished reports and partly in published monographs of the Copenhagen University Institute of Hygiene, the Public Health Office, and other public health institutes. The following parameters were investigated: (1) clinical morbidity; (2) sick leave records; (3) mortality statistics of sewage workers compared with national mortality statistics; (4) toxic substances in the atmosphere; and (5) chemical and immunological tests of the blood and urine of workers.

Clinical morbidity. The ascertainment of illness was apparently based on responses to a health questionnaire as well as on consultations with physicians. No details of the methodology were given in the overview. It was concluded from the questionnaires that sewer workers have a higher than normal incidence of acute gastrointestinal symptoms, and that the frequency of these symptoms is directly related to the intensity of exposure to sewer odor and droplet spray or, in the words of the author, "splash."
The physicians concluded that sewage workers "have a high level of chronic problems including fatigue, difficulty in concentrating, headaches, dizziness as well as psychic problems."

Sick leave. In one study, sick leave records of sewer workers were compared with those of male office workers of comparable ages. Among those older than 30, but most markedly those older than 50, sewer workers took more leave than office workers. In another study of sick leave records, sewer workers were compared with various types of other manual workers employed by the municipality. There was no evidence that sewage workers take more leave than the other groups.

Mortality. Death rates of sewage workers during 1957-1974 were compared with the death rates of all males of comparable ages in the Copenhagen area. There was no excess of observed versus expected deaths in sewage workers who had been employed for one to eight years. There was a statistically significant excess of deaths in sewage workers employed nine years or more. This observation was based on 24 deaths among sewer workers employed more than nine years. When the causes of death were examined, no connection to the possible special occupational risks could be found. A very high proportion of the deaths occurred within one year after employment terminated.

These studies seem to be plagued by severe methodological problems, which are due in part to the lack of uniformity in the methods used in gathering morbidity statistics and in part to the inherent constraints of the health questionnaire. The authors themselves point out that one of the serious biases is the preconceived notions of the sewage workers. They state, "Sewer workers consider their job to be unhealthy. A detailed analysis of the responses to the questionnaire is not included because the responses are difficult to reproduce and are liable to be influenced by local customs and recent news items." From the text, it can be gathered that, in one of the studies comparing sewage workers with matched controls of city gardeners and office workers, weight was given to the physicians' impression of the symptoms sewage workers complained about: "However not enough money was supplied for a medical examination of the controls." Thus it is difficult to give any weight to the physicians' impression that the sewage workers have a "high" level of illness problems.

With respect to the finding of excess deaths in sewer workers employed nine years or more, a very high proportion of these deaths occurred within one year of the termination of employment. A proper control group of Copenhagen residents of similar age immediately after retirement was not included. This is a serious methodological weakness, since it is well known that in the general population retirement per se increases the risk of death. No statistical analysis considering this factor was carried out. The author himself concludes that "the available evidence is insufficient to assign cause for the poor health and reduced life expectancy of sewage workers." In our opinion, there is no conclusive evidence of poor health or reduced life expectancy among sewage workers in Copenhagen.

Health Risks of Human Exposure to Wastewater in Three Cities of the United States

A prospective seroepidemiologic study of note was carried out by Clark et al. (1981a), whose primary objective was to determine the health effects of occupational exposure to biological agents in municipal wastewater.
They studied municipal wastewater workers and controls in three metropolitan areas: Cincinnati, Ohio; Chicago, Illinois; and Memphis, Tennessee. The primary study group consisted of more than 100 workers who were recruited at the time they began work at activated sludge plants and who remained in the study for a minimum of twelve months. A group of 30 experienced sewage treatment plant workers in Chicago was also included, and in Cincinnati two other groups exposed to wastewater were recruited, consisting of about 50 sewer maintenance workers and 50 primary wastewater treatment plant workers. The procedure involved quarterly collection of blood, throat, and rectal swabs; yearly medical examinations; collection of illness information; work observation; and environmental monitoring. The study period extended from April 1975 through the fall of 1978. Observations were made at work to evaluate the level of the workers' contact with wastewater and, in conjunction with the biological air monitoring, to assess the extent of contact with wastewater-generated aerosols. The environmental monitoring included viral and bacterial analysis of wastewater and the use of six-stage Andersen samplers to determine the respirable concentrations of bacteria in the work areas of the plant. A total of more than 500 volunteers, including both subjects and controls, participated in this study. The control groups in Cincinnati, Chicago, and Memphis were highway maintenance workers, water treatment workers, and gas and electric utility workers, respectively.

The authors reached the following conclusions:

1. Gastrointestinal illness rates were higher among the inexperienced workers exposed to wastewater than among the experienced workers and controls. Wastewater workers were not found to be subject to any detectable risks from the parasites present in the wastewater. There was only slight evidence, if any, to suggest that there were risks related to viruses and bacteria in wastewater.

2. Immunoglobulin levels were not found to be consistently higher in the wastewater-exposed workers than in controls in any of the cities studied.

3. Wastewater workers were not found to be a source of viral infections for their family members.

4. In a few instances, levels of antibodies to certain viruses appeared to be related to the level of exposure to wastewater aerosols.

5. Bacterial aerosol levels in buildings where wastewater sludge was being processed were generally higher than levels adjacent to outdoor aeration tanks at the same treatment plants.

6. Since the seroepidemiological approach did not detect any significant health effects of occupational exposure to wastewater, it is unlikely that this approach would detect potential health impacts in populations with lower levels of exposure to wastewater.

The study by Clark and his associates (1981a) is undoubtedly one of the most carefully designed and executed projects of this type. The extensive serological tests enabled them to examine the workers' blood sera for 28 different viruses, including polio, coxsackie, ECHO, reovirus, adenovirus, cytomegalovirus, herpes simplex, and hepatitis A and B. They tested for five different Salmonella bacteria, Leptospira, and Legionella pneumophila, as well as for four different immunological factors. Blood samples were taken quarterly. Illness information was obtained from monthly health diaries maintained by the workers, supplemented by telephone and on-the-job contacts.
Illness symptom information from all sources was combined in a manner designed to avoid double counting and was categorized as respiratory, gastrointestinal, and "other." The findings of this study must be considered methodologically sound and as well founded as can be expected under the normal constraints of a field study of this type. The major finding that immunoglobulin levels were not consistently higher in the wastewater-exposed workers than in the controls in the cities studied led the authors to the reasonable conclusion that, under normal circumstances, a seroepidemiological approach is not likely to be sensitive enough to detect potential health impacts in populations less exposed to wastewater, such as those who live in the proximity of sewage treatment plants. However, they did detect higher gastrointestinal illness rates in the inexperienced wastewater-exposed workers than in the experienced workers and controls. Only in a few instances did the levels of antibodies to certain viruses appear to be related to the level of exposure to wastewater aerosols. In general, this careful study demonstrates that there may indeed be some health effects of occupational exposure of wastewater treatment plant workers during the first year of exposure, with a slight increase in relatively benign gastrointestinal illness. The increase in certain viral antibody levels as a function of exposure to wastewater aerosols suggests that some infection by this route can occur.

Evaluation of the Health Risks Associated with the Treatment and Disposal of Municipal Wastewater and Sludge at Muskegon, Michigan

In a second study by the group at the University of Cincinnati, the authors looked at the occupational exposure of sewage workers to different agents, including organic chemicals, fungi, and endotoxins, as well as the health effects on workers exposed to spray irrigation of wastewater and the land application of sludge (Clark et al. 1981b; Linneman et al. 1984). Only the latter two aspects of this study are reviewed here. In order to assess the health risks to irrigation workers from exposure to viruses in aerosols, the researchers examined the ambient air environment at the Muskegon County Wastewater Management System. Spray irrigation with treated wastewater effluent is practiced at this plant. Air samples for animal virus and coliphage assay were collected in an Army prototype XM2 biological sampler/collector. Blood samples, throat and rectal swabs, and data on illness symptoms were collected on a monthly basis from June through October.

The authors reported the following findings:

1. No animal viruses were detected in air samples collected at the Muskegon County Wastewater Management System using the Army prototype XM2 biological sampler/collector.

2. Animal viruses were detected in the raw effluent samples but decreased in concentration as the wastewater was aerated and stored in the lagoons. No such viruses were detected in wastewater at the pump station just prior to distribution to the spray irrigation rings.

3. When the centrifugation-filtration method was used, all raw effluent samples of wastewater were found to contain viruses ranging from 50 to 400 plaque-forming units (pfu) per liter.

4. Coliphages were detected in air samples at the aeration basins in concentrations of 0 to 9 per m3 of air. E. coli 13706 coliphage was recovered more often than the other two tested, 15597 and 11303.
5. Illness and virus isolation rates were not significantly different in the study group of workers engaged in spray irrigation of wastewater compared with a control group of road commission workers.

6. Antibody titers to coxsackievirus B-5 were significantly higher for one subgroup of wastewater workers, the spray irrigation nozzle cleaners, when compared with other wastewater workers or the road commission workers. This suggests that there may be a risk of viral infection only in those with the greatest and most direct bodily exposure to wastewater.

7. The prevalence of hepatitis A antibody was correlated with age, as would be expected in normal populations, and there was no increase in the prevalence of hepatitis A antibody in those exposed to spray irrigation.

8. Antibody titers to poliovirus 1, 2, and 3, coxsackievirus B2, and echovirus 7 and 11 were not different in the wastewater and road commission worker groups.

9. The air downwind of treatment sources at the Muskegon wastewater site contained higher total numbers of bacteria, and higher percentages of gram-negative bacilli, fecal-indicator bacteria, and pathogenic bacteria, than did upwind air.

10. The mean respirable concentration of total airborne bacteria found 1 m downwind of the aeration basin at the Muskegon wastewater site was 2,800 colony-forming units (cfu) per cubic meter, which was significantly higher than that found 18 m downwind of the spray irrigation rings (that is, 700 cfu/m3). Concentrations 1 m upwind of the spray irrigation basin and 18 m upwind of the field rigs were 490 cfu/m3 and 660 cfu/m3, respectively.

11. Airborne levels of respirable Klebsiella spp. found downwind of the wastewater operation were relatively high compared with those in most other recent studies.

Like the previous study by Clark et al. (1981a), this investigation is well designed and well executed and has incorporated sound principles of epidemiological research. It must be pointed out, however, that the levels of animal viruses detected in the wastewater prior to distribution to the spray irrigation rigs were below the level of detection. Since the initial levels of animal viruses in the raw wastewater effluent samples ranged from 40 to 50 plaque-forming units per liter, it can be assumed that the levels of virus inactivation or removal achieved in the treatment processes were very high, around the 99.9 percent level (a rough calculation is sketched below). Thus, the level of exposure to aerosolized viruses was bound to be very low. However, the fact that fecal-indicator bacteria and pathogenic bacteria were found in the air at the Muskegon wastewater sprinkler irrigation site supports the assumption that some potential health risk exists among sewage irrigation workers exposed to such spray irrigation procedures. Even so, illness and virus isolation rates were not significantly different in the study group of workers engaged in the spray irrigation of wastewater and in the control group of road commission workers. Antibody titers to hepatitis A virus and to the other viruses studied were no different in either group. Normal high levels of immunity among adults may preclude the detection of differences between exposed and control groups, except in extreme cases. Only in the case of the antibody titers to coxsackievirus B5 were there significantly higher rates in the subgroup of spray irrigation nozzle cleaners as compared with the others.
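The 99.9 percent figure can be illustrated with a simple log-removal calculation. The sketch below assumes a raw-effluent virus concentration of about 50 pfu per liter, as reported, and a detection limit of roughly 0.05 pfu per liter for the irrigation-water samples; the report states only that the samples were below the level of detection, so the detection limit used here is an assumption made for illustration.

    import math

    raw_pfu_per_litre = 50.0           # approximate raw effluent virus level (reported)
    assumed_detection_limit = 0.05     # assumed detection limit, pfu per litre (illustrative)

    fraction_remaining = assumed_detection_limit / raw_pfu_per_litre
    log10_removal = -math.log10(fraction_remaining)
    percent_removal = (1 - fraction_remaining) * 100

    # With these assumptions the implied removal is at least about 3 log10,
    # i.e. roughly 99.9 percent, consistent with the estimate in the text.
    print(f"Implied removal: >= {percent_removal:.1f}% (~{log10_removal:.0f} log10)")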
This study suggested that the only detectable health effect, in the form of increased antibody levels, occurred in the occupationally exposed group with the greatest and most direct exposure to wastewater. There were no detectable differences in morbidity levels in this group, however.

The Health of Sewage Treatment Plant Workers in Canada

A one-year prospective epidemiological study was conducted on 77 workers at two secondary sewage treatment plants in Winnipeg, Manitoba, in Canada (Sekla et al. 1980). The work environment was examined by microbiological testing of air and sewage samples. Microbiological tests of air samples were conducted with the aid of settling plates and an impinger. There was only one isolation of a pathogenic bacterium from air samples, an atypical strain of Yersinia enterocolitica. Two of the 13 air samples assayed yielded enteric viruses, and both were collected from the secondary aeration area of the northern sewage treatment plant. A polio-type 2 virus and an echo-type 7 were identified. Fifty-four salmonella strains, representing 13 different serotypes, were isolated from 38 samples of sludge and 16 samples of effluent. Enteric viruses were isolated from raw sewage and effluents in amounts varying from 500 to 2,200 plaque-forming units (pfu) per liter of sewage and from 200 to 1,000 pfu per liter of effluent. These findings led the researchers to conclude that the work environment has the potential to transmit pathogens to the workers.

The health of the 77 employees working in both plants was assayed by means of a questionnaire inquiring whether they had experienced any of the following symptoms or problems during the preceding year: fatigue, headache, loss of weight, fever, gastrointestinal problems, and pneumonia or flu-like diseases. Although it was difficult to compare the various employment categories within the plant (operators, maintenance workers, laboratory, and clerical) because of their small numbers, operators appeared to be the only group reporting an excess of illness symptoms such as fatigue on most days and headaches every day or on most days. There was no difference between the categories with respect to the proportion that reported fever, gastrointestinal problems, and pneumonia or flu-like disease. However, the frequency of flu episodes was higher among operators than in other groups. The employees were also asked where they had worked in the plant before developing these symptoms. The most frequently reported location was the secondary aeration building.

A series of serological tests was carried out on the sewage treatment plant workers and a group of controls comprising municipal employees in occupations unrelated to wastewater treatment. Operators were found to have 27.7 percent positives to intestinal protozoa, whereas workers in the nonoperating categories ranged from 0 to 13 percent positive. In the serological tests, few differences were found between the employees in the sewage treatment plants and the control group other than a slight excess of Yersinia enterocolitica and reovirus. The authors concluded that, although the work environment contains potential pathogens, the sewage workers as a group did not differ significantly from other Manitobans tested as a control for this and other studies. In the authors' opinion, the relationship between pathogens in the work environment and illness in the employees was not demonstrated unequivocally.
Previous experience has demonstrated, however, that a health questionnaire based on a one-year recall period cannot be considered a reliable epidemiological instrument for determining morbidity in studies of population groups. Most researchers agree that recall periods of illness are reliable for only up to one week, possibly two. Blum and Feachem (1983), in their analysis of methodological problems frequently encountered in environmental epidemiological studies, observe that recall periods of 24 to 48 hours are preferable. Thus, any conclusions based on the health questionnaire must be discounted. Because both the number of employees in each work category and the size of the control group were small, none of the differences reported were statistically significant. The methodological limitations of the research make it difficult to draw any conclusions as to the health effects of occupational exposure of the sewage treatment plant workers involved in this study.

Cholera Outbreak in Jerusalem, 1970: The Effects on Wastewater-Irrigation Workers

The 1970 cholera outbreak in Jerusalem mentioned above in relation to the transmission of cholera by wastewater-irrigated vegetables also provided some indirect circumstantial evidence concerning the health effects on irrigation workers exposed to the wastewater stream during the cholera outbreak (Fattal, Yekutiel, and Shuval 1984). A serological study was carried out among a control sample of the Jerusalem population not known to have had direct contact with clinical cases and among the population of Battir village, where the growing of crops by wastewater irrigation was the main agricultural occupation (Gerichter et al. 1973). About 8 percent of the Jerusalem sample were positive for cholera antibodies, whereas 57 percent of the sample of Battir residents were positive. These villagers also had the highest incidence of clinical cholera cases in the area.

This study indicates that the very high rate of clinical and subclinical cases of cholera among the residents of the village of Battir was possibly associated with the fact that the primary agricultural occupation of the adults, both male and female, and of many children was growing vegetables by irrigation with Jerusalem wastewater. Inasmuch as all age groups and both males and females had similar rates, occupational exposure did not seem to be the exclusive mode of infection. Since clinical cases from that village were not the first reported in the epidemic, it is not unreasonable to assume that they did not initiate the epidemic cycle, but became infected only after the Jerusalem wastewater had already become heavily contaminated with a high concentration of V. cholerae excreted by the numerous clinical and subclinical cases in all sections of the city. The authors concluded that there is a high risk of cholera infection among nonimmune workers practicing wastewater irrigation with effluent from a city experiencing a cholera outbreak. The workers might have first become infected by contact with the wastewater and then infected their families by contact. A confounding factor, however, is that the villagers also undoubtedly consumed some of the locally grown wastewater-irrigated vegetables, so that occupational exposure was not the exclusive route of infection.
WASTEWATER USED IN AGRICULTURE THAT CAUSES DISEASE OR INFECTION IN NEARBY NONAGRICULTURAL POPULATION GROUPS

Under this category, we consider family contacts of wastewater irrigation workers, persons exposed to aerosols from sprinkler irrigation with wastewater, and other direct and indirect routes of exposure to pathogens of wastewater origin among population groups living in the immediate vicinity of wastewater irrigation. This section covers a number of studies of groups living in proximity to wastewater treatment plants, such as activated sludge plants that generate aerosols through aeration of the wastewater. In addition, we cite a number of specific studies of populations living in proximity to wastewater sprinkler irrigation sites.

The Use of Wastewater in Irrigation District 03, Tula, State of Hidalgo, Mexico

The following study was carried out on the general population, and covers irrigation workers and their families, as well as nonagricultural workers living in farm communities (Rivera Ramirez 1980). The study was prepared as a Master of Public Health thesis by four students at the Mexico City School of Public Health under the supervision of Prof. Luis Ramirez. The information from this study and another on the same subject was obtained from the files of the School of Public Health. The second study is reviewed below.

The working hypothesis of the first study was defined as follows: "The use of wastewater in the Tula cultivation area in the State of Hidalgo increases the incidence of gastrointestinal disease infections caused by bacteria, protozoa and helminths among the populations exposed." The researchers' dependent variable was the incidence of infections and gastrointestinal diseases due to protozoan or helminthic organisms and the number of people exposed. The independent variable was the use of wastewater. The study was carried out in two agricultural communities, one using uncontaminated water for irrigation and the other using "aguas negras," or untreated wastewater, from Mexico City for irrigation purposes.

According to the authors, the Tula community using wastewater for irrigation increased its use of wastewater during 1975-1979 as a result of a newly constructed wastewater drainage system that led to an expansion of the wastewater irrigation network. After reviewing the medical files of the Tula Health Center for these years, the authors concluded that cases of diseases of gastrointestinal origin, specifically amoebiasis, were on the increase, and that this coincided with the period during which irrigation with wastewater increased in the Tula Irrigation District 03, after wastewater was channeled into the Requena Reservoir at Tula and had become the principal source of supply in the region. It was thus decided to concentrate exclusively on amoebiasis. The level of water contamination for the area of Jilotepec, the control community, was low in comparison with the high level of contamination of the wastewater in the Tula Reservoir.

The authors concluded, "Comparing the two locations, based on the differences found, we were able to confirm that the risk of exposure to amoebiasis is greater in the area where wastewater is used for irrigation." They also point out, "The fact that the presence of fecal material was found by the laboratory in sources of drinking water supply (bore holes and wells) further complicated the situation. . . .
The specific isolation of [the causative agent of] amoebiasis in the sources of water supply provides an additional explanation for the increase in the frequency of this disease among the population." They conclude that other factors as well are involved in the increase of amoebiasis in Tula. These factors could not be analyzed independently, since they were not among the variables included in the study. However, before this area was irrigated with wastewater, "the level of amoebiasis was more or less stable with fewer cases than reported after wastewater began to be used for irrigation in agriculture."

The possibility that the increase in amoebiasis transmission was associated exclusively with contamination of the drinking water wells as a result of sewage irrigation cannot be overlooked. This raises some questions about the authors' own awareness of this confounding factor. We also questioned the reliability of the morbidity data of the local health clinics. To further investigate the validity of the findings, Dr. Jacobo Finkelman, director of the WHO's Centro Panamericano de Ecologia Humana y Salud in Mexico, met with Rivera of the School of Public Health of Mexico. The findings of this further investigation were as follows:

1. There was already widespread wastewater contamination of the whole region from wastewater irrigation prior to 1975. Although the use of wastewater for irrigation purposes had expanded during the period of the study, it is difficult to determine the full extent of the increased exposure to pathogens during that time.

2. The reliability of the information gained from the local health clinics where the morbidity data were gathered is indeed questionable. These health clinics were responsible mainly for the implementation of preventive programs, and apparently very few sick people are normally treated there. The main clinical services for the community are provided by the health services of PEMEX (the government oil monopoly), the Mexican Institute of Social Security, or private physicians. None of these sources were used in gathering morbidity information, and it is believed that they are the primary sources of more accurate morbidity information, whereas the sources used in the study can be considered only secondary sources.

3. Many of the diagnoses reported by the local clinics were not backed up by proper laboratory examinations, so that it is not certain whether all cases of so-called amoebiasis were properly authenticated.

4. To the extent that laboratory tests were carried out in the health clinics, it is believed that these laboratory procedures were not standardized and that they were not subject to any external control procedures. Thus, there is even some question as to the validity of the laboratory-confirmed cases.

For these reasons, both Finkelman and Ramirez agreed that the conclusions of the study cannot be considered reliable. According to the dean of the School of Public Health of Mexico City, "more serious research in this field is considered of the utmost importance."

Use of Wastewater for Irrigation in Districts 03 and 88 and Its Impact on Human Health, Mexico

The second epidemiological study on the health effects of wastewater irrigation near Mexico City (Sanchez Leyva, October 1976) is reviewed below. This study was carried out by four students at the School of Public Health of Mexico City under the supervision of Rafael Sanchez Leyva.
It proposed to investigate whether "the use and consumption of wastewater in irrigation districts 03 and 88 causes an increase in the prevalence of gastrointestinal infections and disease of protozoan and helminthic origin among the school age population." The exposed communities utilized wastewater from Mexico City for irrigation, whereas the control community used clean water for irrigation. The socioeconomic and environmental conditions (except for wastewater irrigation) in all three communities were found to be similar. Clinical examinations were made and fecal specimens taken from 405 students, 207 in the exposed communities and 98 in the control group. Children were asked to report on episodes of diarrheal disease during the three months prior to the clinical examination and interview. The communities were compared for reported episodes of diarrheal disease symptoms, nutritional status, and parasites in stool specimens.

No consistent and significant excess prevalence of gastrointestinal complaints or infections was found in the communities irrigating with wastewater as compared with the control community. The conclusion of the researchers was that "the direct or indirect use of wastewater does not increase the prevalence of gastrointestinal infections or disease due to protozoa or helminths in the population concerned." The authors state, however, that, since only school-age children were studied, the findings cannot be extrapolated to the general public, particularly to farmers working in direct contact with wastewater. During a site visit to the area, we observed that, since the farmers' houses usually abut on wastewater-irrigated fields and wastewater irrigation canals, there is ample opportunity for the children to come in direct contact with wastewater and/or wastewater-contaminated fields, and indeed they were observed to do so.

The information gained on past disease symptoms and episodes cannot be relied upon, since, as stated in previous sections, studies have shown that a three-month recall period is too long to be reliable. The stool examinations, however, which were carried out by acceptable procedures in the same central laboratory, provided no evidence of excess prevalence of protozoan or helminthic disease in children of the wastewater-irrigating communities as compared with the controls. This finding is thought to be the most reliable of any in the two studies.

It should be mentioned that the authorities in Mexico City consider that raw wastewater is utilized for irrigation in the areas studied. During our field observations, we found that most of the wastewater flow from Mexico City first enters large storage reservoirs, which also hold a certain amount of flood flows, before it moves through the complex system of open canals distributing wastewater to the farms for irrigation of the fields. These reservoirs have detention periods of as long as six months, and usually not less than one month, even at the end of the long dry irrigation season. Thus, the storage reservoirs serve as sedimentation basins that may remove a high proportion of the large, rapidly settleable protozoans and helminths (see Table 5-14). Some reduction in pathogenic bacteria and viruses may also be achieved in these storage reservoirs. Although they were designed for operational purposes in the large, complex irrigation system, they have, in effect, provided a reasonably high degree of wastewater treatment approximating that of stabilization ponds.
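A rough calculation suggests why storage of a month or more can act as an effective sedimentation stage for helminth eggs. The settling velocity and reservoir depth used below are illustrative assumptions only; they are not measurements from the Mexican reservoirs, and actual values vary with the organism and with the degree of mixing in the reservoir.

    settling time = depth / settling velocity ~ (3 m) / (0.5 m per hour) = 6 hours

This is more than a hundred times shorter than a one-month detention period (roughly 720 hours), so that under reasonably quiescent conditions essentially all readily settleable eggs would be expected to reach the bottom well before the water is withdrawn for irrigation.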
One might hypothesize that the findings of this study, which show no excess helminthic or protozoan infection among children in wastewater-irrigating communities, could be associated with the effective reduction of these pathogens in the effluent of the storage reservoirs. Despite its methodological problems, this study does provide some circumstantial epidemiological evidence as to the degree of health protection that can be gained by adequate wastewater treatment. Although the authors' conclusion--that there was no evidence in children of increased disease associated with wastewater irrigation--may be in part justified, we tend to accept the opinion of the dean of the School of Public Health of Mexico City concerning both studies: "More serious research in this field is considered of utmost importance," particularly in light of the fact that the wastewater irrigation project in Mexico City covers an area of 42,000 ha and is one of the largest in the world.

Health Effects of Aerosols Emitted from an Activated Sludge Plant, Skokie, Illinois

An eight-month environmental health study was carried out in a 1.6-km area surrounding an activated sludge plant (treating 200 mgd) located in Skokie, Illinois, adjacent to Chicago; 724 people (246 families) volunteered to record self-reported illnesses at biweekly intervals (Northrop et al. 1980). A total of 1,298 throat and stool specimens from a selected subsample of 161 persons were analyzed for pathogenic bacteria and viruses. At the beginning and end of the study period, 318 persons submitted paired blood samples that were used to determine the prevalence and incidence of infection with five coxsackievirus and four echovirus types. The population studied was found to be relatively homogeneous from a socioeconomic and ethnic point of view.

Environmental monitoring was carried out at 60 community monitoring sites. Regular measurements were taken of bacteria-containing particles (total viable particles, TVP) and total coliform particles (TCP). Environmental monitoring and measurements of wind direction were used to develop a personal exposure index for the residents of the study area, as well as maps showing bacterial concentrations as isopleths. Although the plant itself was identified as a source of viable particles and total coliforms, the concentration of viable aerosolized bacteria at the plant and in its nearby vicinity was much lower than that reported by other investigators at similar plants.

The investigators concluded that "no remarkable correlations were found between the exposure indices and the rate of self-reported illness or of bacterial or viral infection rates determined by laboratory analysis." In the authors' opinion, the overall negative conclusion of no obvious adverse health effect is partly due to the low number of people exposed to the highest levels of aerosolized pollutants emitted by the plant. This is a carefully designed and meticulously executed research study that has been expertly analyzed. However, because the size of the sample exposed to the highest aerosol concentrations is small, it is difficult to say whether the negative findings are the result of a methodological problem, or whether in fact the health of the sample group was not affected by its proximity to an activated sludge wastewater treatment plant producing aerosolized bacteria.
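The personal exposure index mentioned above was, in essence, a weighting of measured aerosol concentrations by how often the wind carried plant emissions toward each residence. The short sketch below illustrates that general idea only; the function, structure, and numbers are hypothetical and do not reproduce the index actually used by Northrop et al.

    def exposure_index(site_concentrations, downwind_fractions):
        # Weight each monitoring-site concentration (viable particles per m3)
        # by the fraction of the study period during which that site lay
        # downwind of the plant, relative to the residence of interest.
        return sum(c * f for c, f in zip(site_concentrations, downwind_fractions))

    # Hypothetical example for one residence and three monitoring sites:
    concentrations = [120.0, 45.0, 10.0]   # mean viable particles per m3
    fractions = [0.20, 0.10, 0.05]         # fraction of time downwind
    print(exposure_index(concentrations, fractions))   # prints 29.0

An index of this kind allows residents to be ranked by relative exposure even when no individual air sampling is possible, which is how the exposure indices entered the correlation analysis described above.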
Acute Illness Differences with Regard to Distance from the Wastewater Treatment Plant in Tecumseh, Michigan

This study encompassed a population of 4,889 people living within a radius of some 3,000 m of the Tecumseh, Michigan, wastewater treatment plant (Fannin et al. 1980). After recruitment, each family was contacted weekly by telephone or personal visit, and respondents were questioned regarding occurrences of short-term illnesses within the family during the past week. No environmental air monitoring for aerosols was carried out; the only exposure parameter used was the radial distance of dwellings from the wastewater treatment plant. Persons within 600 m of the treatment plant had a greater-than-expected rate of respiratory and gastrointestinal illness. In the opinion of the authors, the data suggest that the higher illness rates were related to higher densities of lower socioeconomic families rather than to distance from the wastewater treatment plant. However, a conflicting finding was that within the 2,400-m concentric ring, which had a higher income and education level than other groups, there was a greater-than-expected incidence of all illnesses. There is no ready explanation, according to the authors, for the higher-than-expected illness incidence within the higher socioeconomic group.

One of the main weaknesses of this study is that no actual environmental measurements were made of aerosol concentrations in the air at the various distances from the wastewater treatment plant, so that actual levels of exposure could not be determined. The lack of information on wind directions was also a drawback. The findings of this study are inconclusive as well as conflicting.

Health Effects from Wastewater Aerosols at a New Activated Sludge Plant (John Egan Plant), Schaumburg, Illinois

This study encompassed approximately 1,100 households living within a radius of 5 km of the plant (Johnson et al. 1980). About half of the households were located between 350 m and 3.5 km from the plant, and the other half were located in a ring 3.5-5 km from the plant. The health information was obtained through a questionnaire, filled in by the household head, on the occurrence of acute disease within the last twelve months and acute symptoms within the last three months for each person within the household; 226 individuals were monitored for clinical specimens, which included fecal specimens, throat swabs, and blood samples. Extensive air sampling for aerosolized microorganisms was also carried out. Elevated levels of total coliforms, fecal coliforms, coliphage, fecal streptococci, and pseudomonads were found above background levels at downwind sites in the proximity of the sewage treatment plant. However, these elevated levels returned to background levels at residential distances of 350 m and beyond. Data collected with Andersen samplers, which were used to determine the particle size distribution of microbial aerosols, revealed that about half of the viable particles were in the primary respirable range of 1-5 µm in diameter. The researchers concluded that the wastewater aerosols from the aeration basin at the activated sludge plant constituted a significant aerosol source of total and fecal coliforms. However, the environmental monitoring did not detect higher-than-background levels of indicator coliforms or standard plate count at the distances of the nearest residences, which were in the range of 350 to 600 m away.
The residents in the areas near the sewage treatment plant reported a higher incidence of skin disease and several gastrointestinal symptoms. After the treatment plant became operational, antibody tests were made for 31 human enteric viruses. An attempt to isolate pathogenic bacteria, parasites, and viruses yielded virtually no clinical evidence of infectious diseases associated with wastewater treatment plant aerosols. The increases in clinical symptoms of gastroenteritis occurred primarily in the predominantly downwind quadrants (north and south) and were not observed in the households more than 2-5 km from the plant. Thus, the authors concluded that, although the elevated reported incidence of gastrointestinal symptoms and skin diseases might have been associated with the wastewater aerosols from the Egan plant, "the current evidence is insufficient either to associate or to disassociate such effects with aerosol exposure."

In the authors' opinion, the primary difficulty in designing a definitive health effects study on this question is the lack of a sufficiently large, sensitive population group (that is, young children, whose immunologic defense against infectious diseases is still developing) that resides close enough to the source to receive a high dose of the aerosolized agent or agents. The authors believe that most of these problems can be mitigated in part by investigating the health effects of wastewater aerosols at new sites where wastewater is applied to land on a large scale by sprinkler irrigation. In that way, a previously unexposed and nonimmune population can be measured in a before-and-after situation.

We might add to the authors' own reservations that there are serious questions as to the validity of the morbidity data collected in this study. The only suggested positive findings were based on information gathered through the health questionnaire, which required heads of households to report on the occurrence of acute diseases within the last 12 months and to record symptoms within the last 3 months for each person in the household. There is ample epidemiological evidence that the maximum reliable recall period for clinical episodes is one or two weeks at the most. Thus, studies based on recall periods of 3-12 months cannot be considered reliable. Despite the excellent program of environmental monitoring that was incorporated into this study, we believe that, at best, the results can only be considered inconclusive, owing to the methodological weaknesses in the collection of the morbidity data.

Wastewater Aerosols and School-Attendance Monitoring at an Advanced Wastewater Treatment Facility, Durham Plant, Tigard, Oregon

The investigation at Tigard, Oregon, was designed as a before-and-after study to determine the effects of operations at an activated sludge plant on attendance at an elementary school next door (Camann 1980). Wastewater aerosols are generated by the aeration basin, which is located 400 m from the classrooms, and by an aerated surge basin within 50 m of the school playground. The Durham Elementary School, a small 6-room school with 123 students, was the study school, and the remaining 5 elementary schools in the Tigard school district were selected as control schools. Air monitoring determined that the activated sludge aeration basin was a much stronger source of aerosolized microorganisms than the surge basin.
The geometric mean concentrations in the air 30-50 m downwind of the aeration basin were 12 cfu/m3 (colony-forming units) of total coliforms, 4.2 cfu/m3 of fecal streptococci, and 1.5 pfu/m3 of coliphage. Enteroviruses were not detected in the air. An analysis of the attendance records of the exposed school and the controls showed that overall attendance at the exposed school actually improved after the sewage treatment plant was put into operation. Thus, the authors concluded that at this level of exposure the wastewater treatment plant aerosols had no adverse effect on the incidence of communicable disease, as discerned from total school absenteeism. School absenteeism, however, is too crude a measure among school children of this age, who are highly susceptible to respiratory diseases normally transmitted by direct contact, particularly during the winter period.

An Evaluation of Potential Infectious Health Effects from Sprinkler Application of Wastewater to Land, Lubbock, Texas

An epidemiological study of health effects in Lubbock, Texas, was sponsored by the U.S. Environmental Protection Agency (Camann et al. 1983). The final report was not available at the time of writing, and the information presented here is based on the unpublished interim report. This study was designed to overcome many shortcomings of earlier studies in the hope of arriving at valid conclusions as to the possible health effects resulting from the dispersion of aerosolized pathogens by wastewater sprinkler irrigation. The study investigated the effect on a population living in a rural community adjacent to a proposed wastewater land application site at Hancock Farm near Lubbock. The before-and-after design was adopted to settle questions that have arisen in many previous studies as to whether pathogens found in wastewater aerosols may have been shed as a result of infections spreading in the community by other routes, such as contact infections, and not vice versa.

The study population comprised 150 families, or approximately 450 persons. The "health watch" consisted of the following measures: (1) semiannual serological surveys and examination of monthly fecal specimens from selected donors; (2) illness reporting by means of household health diaries, accompanied by laboratory examination of illness specimens; and (3) activity diaries kept during the wastewater irrigation period. This type of procedure allows researchers to evaluate special occupational risks. An intensive air monitoring program was carried out prior to and after the initiation of land application of wastewater. The morbidity studies, as well as the air pollution studies, began well in advance of the start-up of the Hancock wastewater farming operations, and thus provide clear-cut before-and-after data on both parameters.

The wastewater used for land application by sprinkler irrigation was derived from the fully treated effluent of the city of Lubbock; thus, the rural population would be challenged by enteric agents derived from a different population. This unique feature of the investigation was based on the conclusions of a number of previous studies, which recognized the problems associated with a community exposed to its own wastewater. At this time, only the unofficial preliminary conclusions of the interim report for the period ending December 1982 are available.
On the basis of the air monitoring program, the researchers reported that "sprinkler irrigation of wastewater was found to be a substantial aerosol source of all monitored microorganisms." The density of these microorganisms in ambient air increased substantially for at least 400 m downwind. From the crude morbidity data collected in the first year of irrigation, the researchers concluded that "no obvious significant connection between health effects and wastewater exposure has been observed." However, final conclusions must await verification from the processed data, statistical analysis, and epidemiological interpretation.

The Lubbock, Texas, study appears to be a well-conceived and carefully designed research project. It takes advantage of the unique before-and-after situation in which the population is exposed to pathogens from a source outside its own community. This should provide optimal conditions for detecting health effects, if any, in association with land application of fully treated wastewater effluent in the neighborhood of residential areas. In addition, the health diary program is well designed. There are certain inherent weaknesses in health diaries, however, particularly when they are administered over long periods. Other studies have shown that participants in health-diary recording tend to become "tired" and less accurate in their reporting as time goes on, or tend to report selectively, according to their own personal bias of what they perceive to be important symptoms and diseases. Another limitation of this study is the small size of the exposed population. It should also be pointed out that fully treated wastewater effluent is being utilized in this case, so it will be difficult to draw conclusions from the results of this study as to the possible health effects of lower-quality effluents. We must, however, await the final analysis of this important project to see whether it provides further evidence on the possible health effects of pathogen dispersion through aerosolized wastewater among those living close to land application sites.

Risk of Communicable Disease Infection Associated with Wastewater Irrigation in Agricultural Settlements in Israel

This study was carried out in 207 kibbutzim (collective agricultural settlements) in Israel; 77 of these kibbutzim (population of some 36,500) practiced wastewater irrigation, primarily by sprinkler methods, whereas the remaining 130 kibbutzim (population of 46,000) did not utilize wastewater for any purpose (Katzenelson, Buium, and Shuval 1976). The morbidity data were based on the communicable disease cases reported to the Ministry of Health in accordance with the legal requirements for communicable disease reporting. The information on the irrigation status of the kibbutzim was obtained from the registry of the Ministry of Agriculture, which keeps files on the wastewater irrigation practices of the various settlements. In most cases, residential areas were within 1,000 m of the plots that were sprinkler-irrigated with wastewater; in some cases, however, the communities were at a greater distance. In general, the effluent used for irrigation was the kibbutz's own wastewater, although in some cases wastewater from neighboring towns was used. In almost all cases, the wastewater was treated in oxidation ponds with detention periods of 5-7 days. The effluent from such oxidation ponds was generally of very poor quality, with coliform concentrations ranging from 10^5 to 10^7 per 100 ml.
Effluent of this quality is similar to raw wastewater in many areas. One of the main findings of this study was that the incidence of salmonellosis, shigellosis, typhoid fever, and infectious hepatitis was 2-4 times higher during the summer irrigation periods in kibbutzim practicing wastewater irrigation than in communities not practicing wastewater irrigation. The authors cautiously suggest that this retrospective study provides some epidemiological evidence for an increased risk of enteric communicable diseases among the utilizers of wastewater. However, they also point out some of the methodological problems they encountered; for example, the actual concentrations of pathogens in the air of the residential areas were not measured directly, and pathogens from the wastewater irrigation areas may possibly reach the population by alternate pathways, such as on the bodies and clothes of irrigation workers who live in the community and who return from the fields at noontime and at the end of the day.

Further in-depth analysis and other field work experience in many of the same communities point to a number of serious methodological problems with the above study. For one thing, the morbidity data were drawn exclusively from the communicable disease reports in the files of the Ministry of Health. Numerous studies in Israel and elsewhere have demonstrated that official reports on communicable diseases are often biased and represent only a small portion of the actual incidence of communicable disease. Furthermore, the irrigation status of the kibbutzim was determined from the files of the Ministry of Agriculture rather than through field observations. Later field observations determined that in a large number of cases the Ministry of Agriculture's records of wastewater irrigation status were incorrect, either because the community had initiated sewage irrigation without informing the Ministry of Agriculture, or because it had ceased to irrigate although previously recorded as doing so. Because of these and other methodological drawbacks, no definitive conclusion can be drawn from this study.

Health Risks Associated with Wastewater Utilization in Agricultural Settlements in Israel: A Historical Epidemiological Study

A historical epidemiological study was sponsored by the U.S. Environmental Protection Agency as a follow-up to the work of Katzenelson, Buium, and Shuval (1976) reviewed above (Fattal et al. 1981; Shuval, Fattal, and Wax 1983). In view of the methodological problems of the previous study, it was decided to draw the morbidity data directly from the patient files at the clinic in each kibbutz, and to validate the environmental data on whether a kibbutz practiced wastewater irrigation or not by actual field observations. The study encompassed 78 kibbutzim and included morbidity data gathered over a four-year period. The kibbutzim were divided into the following four categories:

I. Kibbutzim practicing sprinkler irrigation with wastewater, in which residents are exposed to wastewater aerosols. There were 30 such kibbutzim, with a total population of 13,513.

II. Kibbutzim neither exposed to wastewater aerosols nor practicing any form of wastewater utilization. This category consisted of 28 kibbutzim with a total population of 11,096, and it served as a control group for categories I and III.

III. Kibbutzim not exposed to wastewater aerosols, but utilizing wastewater as feed water for fish ponds. There were 10 kibbutzim in this category, with a total population of 5,005.
IV. Kibbutzim that switched from nonwastewater use to wastewater use, or vice versa, serving as controls for themselves. There were 11 such kibbutzim, with a total population of 3,040.

Morbidity data were culled directly from the personal medical files of each kibbutz member at the community clinic. The environmental data were collected during visits to the settlements and interviews with the appropriate authorities. Meteorological data on wind direction and wind speed were gathered from the nearest meteorological service weather stations. Because of the large number of complex intra- and inter-kibbutz confounding factors and conflicting findings, it was decided that conclusions could not be drawn directly from the data on the sixty-eight kibbutzim in categories I to III. From a methodological point of view, however, the 11 "switch" kibbutzim (category IV), although fewer in number, provided a much superior design and overcame most of the pitfalls that could not be properly controlled in the 68 kibbutzim in categories I through III. Basically, the switch-category kibbutzim can be considered a before-and-after design.

The data from category IV indicated a significant (p < 0.01) increase in the risk of enteric disease in the summer irrigation months during the effluent-irrigating years, compared with the nonirrigating years, for the 0-4 age group, regardless of whether the switch was from noneffluent to effluent, or vice versa. The estimated excess risk was between 34 and 90 percent, depending on the method of statistical calculation. A consistent pattern of excess risk of enteric disease in the 0-4 age group was found in 9 of the 11 kibbutzim. No significant excess was found in other age groups, or for all ages.

However, on an annual basis the evidence as to an excess of enteric disease in effluent-irrigating kibbutzim was not consistent. Although by one method of analysis it appeared that there was a small but significant excess in the 0-4 age group as well as in some of the other age groups, this was not found when an alternative method of analysis was used. This conflicting evidence therefore makes it impossible to detect with certainty any excess of enteric disease in the morbidity rates in the effluent-irrigating kibbutzim compared with those in kibbutzim not utilizing effluent. Thus, the main finding of this study is a seasonal excess of enteric disease during the effluent-irrigating summer months, which levels out and disappears when annual rates are compared. This seasonal cycle suggests that, where wastewater irrigation is practiced, there may indeed be an increase in the transmission of clinical gastroenteritis, particularly in the highly susceptible 0-4 age group, on the initiation of wastewater irrigation in the spring. However, in the unique kibbutz situation, with its intimate communal living--including communal care of all infants and children and a common dining hall--the continuous multiple exposures to enteric pathogens can result in a certain annual saturation level of enteric disease regardless of wastewater utilization practices.

Unlike the earlier study, this study did not find any significant differences in the rates of salmonellosis, infectious hepatitis, or typhoid fever between effluent-irrigating and noneffluent-irrigating kibbutzim. An apparent, but not significant, excess of shigellosis was found by Fattal et al. (1981), compared with the 220 percent excess found by Katzenelson, Buium, and Shuval (1976).
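For readers unfamiliar with the term, the excess risk quoted above is simply the percentage increase in the incidence rate during the effluent-irrigating periods relative to the comparison periods. The figures below are hypothetical and serve only to illustrate the arithmetic; they are not data from the study.

    excess risk (percent) = (I_irrigating / I_comparison - 1) x 100

For example, summer incidence rates of 27 versus 20 episodes of enteric disease per 1,000 child-months would give (27/20 - 1) x 100 = 35 percent excess risk, near the lower end of the 34-90 percent range reported.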
The authors of the follow-up study are aware of the methodological problems that arise when retrospective methods are used. They indicate that further information on this question will be available on the completion of the prospective epidemiological study being carried out by the same group in 30 kibbutzim (see the following section). Despite their conflicting results, these two studies do provide further evidence of the seasonal transmission of mainly benign gastrointestinal disease, possibly of viral etiology, to populations (mainly children) living adjacent to sites irrigated by wastewater sprinklers. It should be noted, however, that the effluent used for irrigation in both studies was only partly treated in stabilization ponds of 5-7 days' detention, and the microbial quality of the effluent was very poor. Thus, the findings may not apply to situations in which the sprinkler irrigation effluent is of much higher quality.

A Prospective Epidemiological Study in Agricultural Communities Exposed to Aerosols from Sprinkler Irrigation in Israel

This prospective epidemiological study, also sponsored by the U.S. Environmental Protection Agency, was carried out between 1980 and 1982 as a follow-up to the two previous studies carried out by the same group at the School of Public Health of the Hebrew University of Jerusalem (Shuval et al. 1985; Fattal et al. 1985). This study investigated the association between enteric infection and/or disease and wastewater utilization in agriculture in 30 kibbutzim in Israel with a total population of 15,605. The 30 kibbutzim were divided into three major categories according to wastewater utilization:

Category A. Ten kibbutzim sprinkler-irrigating with wastewater effluent within 600 m of residential areas. The population here was assumed to have been exposed to aerosolized sewage.

Subcategory A1. Four kibbutzim using wastewater effluent from neighboring towns as well as their own effluent.

Subcategory A2. Six kibbutzim sprinkler-irrigating only with their own wastewater effluent.

Category B. Ten kibbutzim sprinkler-irrigating with wastewater effluent at a distance of 1,000 m or more from the residential areas, and/or using wastewater effluent for enrichment of fish ponds. This population was assumed not to have been exposed to aerosolized sewage, but could be exposed by contact with sewage irrigation workers.

Category C. Ten kibbutzim not using wastewater for any purpose, but using water from clean sources, and serving as a control group for A and B.

Medical data on disease morbidity were collected directly from the patients' files and from the daily logs of physicians and nurses at each kibbutz clinic. Limited air monitoring programs were carried out in six kibbutzim practicing wastewater irrigation to gather representative information on the extent of the dispersion of aerosolized bacteria of wastewater origin. Microbiological tests were performed on drinking water, as well as on the water and wastewater used for irrigation and in fish ponds. A seroepidemiological study of the prevalence of antibodies to viruses (coxsackievirus A9, B1, B3, B4; ECHO 4, 7, 9; polio 1, 2, 3; hepatitis A virus [HAV]; and varicella-zoster) and to Legionella pneumophila serotypes 1 and 8 and seven other Legionella species was carried out in order to determine whether population groups exposed to pathogens present in the wastewater showed higher antibody prevalence than nonexposed controls.
In this study, 1,810 paired blood samples were checked for viral antibodies, and 896 paired and 1,136 single blood samples were tested for Legionella antibodies. In the aerosol monitoring studies, coliforms, fecal coliforms, and fecal streptococci were detected frequently in air samples up to 730 m downwind of wastewater irrigation sites at concentrations in excess of background levels. On a number of occasions, enteroviruses were detected in air samples at the most distant downwind sampling sites, 730 m away. In the kibbutzim included in this study, the residential areas exposed to wastewater sprinkler irrigation were within the 730-m range of viable aerosol dispersion detected in the air monitoring program, which suggests the theoretical possibility of aerosol transmission of pathogens.

Except for ECHO 4 virus, there was no significant and consistent excess of viral antibodies in the groups in the kibbutzim exposed to wastewater aerosols as compared with the control kibbutzim population that was not exposed to wastewater of any kind. Furthermore, the consistent excess (of some 100 percent) of ECHO 4 virus antibodies occurred only in the communities of subcategory A1--that is, those exposed to wastewater aerosols originating from nearby towns--and particularly in the 0-5 age group (p < 0.001). The greater prevalence of ECHO 4 virus antibodies in that category could be explained by a major national ECHO 4 epidemic that lasted from January 1980 to September 1980. Since there had been no major epidemic of that virus in the country for several years prior to the 1980 outbreak, it is assumed that a high percentage of the infants and children were not immune. Wastewater irrigation began in April or May, and the initial blood samples were drawn during May-July 1980. Sprinkler irrigation with wastewater from neighboring communities thus appeared to play an important role in the initial introduction of the ECHO 4 virus into the kibbutz population.

The results from the tests for antibodies to Legionella species indicated a significant (p < 0.02) excess in the percentage of positive sera to L. pneumophila serotypes 1 to 8 among irrigation workers and fish pond workers in general, whether they were exposed to wastewater or to clean water in irrigation. No excess of antibodies to Legionella pneumophila was found among wastewater irrigation workers compared with clean water irrigation workers. When the data on the prevalence of ECHO 4 antibodies were analyzed by occupation group, no significant excess was found in irrigation workers exposed to wastewater over those not exposed, or over male controls not involved in agricultural occupations.

Since the countrywide viral meningitis epidemic was confined to the urban areas in its early stages, the researchers hypothesize that the urban wastewater in Israel was highly contaminated with an ECHO 4 virus variant as a result of the epidemic of viral meningitis associated with the ECHO 4 virus. It was assumed that aerosolized wastewater from sprinkler irrigation with wastewater from the neighboring towns introduced the ECHO 4 virus into the kibbutzim utilizing that form of wastewater. The most susceptible group consisted of nonimmune infants and young children, who may have become infected by respiring pathogens in the air. It was noted that ECHO 4 virus is one of several enteroviruses that can lead to infection through the respiratory route in addition to the fecal-oral route.
It was further determined that children were the main introducers of ECHO 4 virus into households and that they infected women more than men. Thus, this is a unique case in which a low level of immunity had developed after some years without an epidemic and provided an opportunity to demonstrate that, under such circumstances, sprinkler irrigation could lead to the dissemination of pathogens into a community by infecting the infants and children, who are highly susceptible and who later infected other members of the community. The morbidity data, however, gave no indication of an excess of clinical disease (meningitis or similar symptoms) in subcategory A1. It seems that seroepidemiological techniques are at times able to detect effects not revealed by morbidity data, but this may be due in part to the number of subclinical cases.

Although the evidence is circumstantial, this is the first demonstration that aerosolized wastewater from an infected community can transmit infection to residents in neighboring communities. Because the community has a high level of immunity to most other enteric viruses endemically present, however, aerosolized wastewater plays a small role in transmitting enteric virus disease in such communities. According to other reports from the Jerusalem group (Morag et al. 1984), the acquisition of antibodies to coxsackie and ECHO viruses normally occurs at a very early age. In the 2-4 age group, 40-69 percent had antibodies to those viruses. In the 5-17 age group, 85 percent had antibodies to five or more of the seven "classical" enteroviruses studied. In contrast, only 4 percent of the children in the 2-4 age group possessed antibodies to hepatitis A virus (HAV), and in the 5-17 age group only 10 percent had antibodies to HAV. HAV antibody prevalence "jumped" to 63 percent in the 18-24 age group and reached a peak of 95 percent for those aged 50 and older. This study suggests that even in communities with a very high level of personal hygiene and sanitation, infants and children are exposed to most of the "classical" enteroviruses at a very young age and acquire antibodies to those viruses either as a result of an infection leading to an acute disease episode or as a result of a subclinical infection. HAV apparently is not endemic in the kibbutzim studied and is acquired by young people when they leave the community to serve in the army at the age of 18. Other studies in Israel, in communities of lower socioeconomic status and poorer levels of personal and family hygiene, have shown that infants and children acquire very high levels of HAV antibodies, similar to the immunity levels acquired for the other classical enteroviruses.

Medical records of the kibbutz population studied show 104,268 clinic visits for 172 defined symptoms and diseases. But there was no significant excess of enteric disease in any age group (including the highly susceptible children under 5) or in any occupational group in kibbutzim exposed to wastewater aerosols within 600 m of the residential areas (Category A) as compared with Category B, which used wastewater but was not exposed to aerosols, or with the controls, which utilized only clean water (Category C). There was some suggestion that children 0-5 years old of workers who had come in contact with wastewater had an excess of enteric disease compared with those whose parents had had no such contact. However, inconsistencies in the data prevented the authors from drawing any firm conclusions here.
Thus, this carefully designed large-scale prospective epidemiological study, which concludes the series of studies in Israel, does not provide any concrete evidence of excess morbidity among sewage irrigation workers or among population groups residing close to aerosolized sewage sources resulting from sprinkler irrigation. This study, like the previous ones carried out in Israel and reviewed here, examined cases in which very poor quality effluent containing high concentrations of enteric microorganisms was used for irrigation. The high levels of immunity to the locally endemic diseases found in the kibbutz population complicate the extrapolation of these findings to situations with much lower levels of immunity, as might exist in countries with very high standards of living. Even so, such extrapolations would probably yield little indication of a detectable risk of disease transmission by aerosolized sewage from sprinkler irrigation in the vicinity of residential areas.

In conclusion, this study does provide circumstantial evidence indicating that aerosolized enteric viruses from wastewater sprinkler irrigation can be dispersed over distances of several hundred meters into adjacent residential areas, in concentrations high enough to lead to infection (not disease) of highly susceptible children by the respiratory route. Furthermore, although the aerosol route of infection resulting from sprinkler irrigation is theoretically open, it does not lead to measurable or detectable levels of excess infection in the case of most enteroviruses currently endemic in the community, since other routes of transmission, primarily direct contact, are dominant. The fairly uniform levels of antibodies to ECHO 4 virus among all irrigation workers and the control adult male population indicate that immunity among adult males prior to the 1980 ECHO 4 epidemic was sufficiently high to prevent significant excess infection among the wastewater irrigation workers, whom one would expect to be the most heavily exposed and at greatest risk of infection. This finding confirms the results of a number of the studies on workers at wastewater treatment plants reviewed in the preceding sections.

GENERAL CONCLUSIONS AS TO QUANTIFIABLE HEALTH EFFECTS ASSOCIATED WITH WASTEWATER IRRIGATION, WITH PARTICULAR REFERENCE TO THE DEVELOPING COUNTRIES

Past research on the health effects associated with wastewater irrigation clearly demonstrates that many types of pathogenic microorganisms--including bacteria, viruses, helminths, and protozoa--are present in high concentrations in raw or even partly treated wastewater and survive for days, weeks, and at times months in the soil and on crops that come in contact with wastewater. Some of these pathogens have also been detected in aerosolized wastewater at considerable distances downwind from sprinkler-irrigation sites. However, we repeat that the mere detection of pathogenic microorganisms in the soil, in food crops, or in the air is not in itself sufficient proof that human beings are in fact becoming infected or sick as a result of contact or exposure to such pathogens. In this section we look at studies that have attempted to measure the quantifiable impact on human health when wastewater has been used in agriculture.
We base our conclusions on what we consider the soundest epidemiological evidence, drawn from the studies that meet modern scientific criteria and that demonstrate a causal relationship between the disease and exposure to wastewater reuse in some form. In addition, however, we must give some weight to the massive evidence on pathogen dispersion and survival in the wastewater stream, in the soil, and on crops, since the repeatedly demonstrated presence of these pathogens, together with epidemiological evidence even when not fully validated, must at least be considered indicative of certain potential health hazards, particularly in cases where there are no epidemiological studies to provide a quantitative evaluation of the degree of hazard. When these factors are taken into account, the following general conclusions can be drawn concerning the health effects of wastewater reuse in agriculture.

1. In those areas of the world where the helminthic diseases caused by Ascaris and Trichuris are endemic in the population and where raw, untreated wastewater is used to irrigate salad crops and/or other vegetables generally eaten uncooked, one of the important routes of transmission of these infections to the general population can be the consumption of such wastewater-irrigated salad and vegetable crops. Khalil (1931) demonstrated this in his pioneering studies in Egypt. Similarly, the Jerusalem study (Shuval, Yekutiel, and Fattal 1984) provided strong evidence that both Ascaris and Trichuris infections are massively transmitted by irrigation of salad and vegetable crops with raw wastewater. The diseases almost totally disappeared from the community when wastewater irrigation was stopped. Despite their limitations, the studies from Darmstadt, Germany (Baumhogger 1949; Krey 1949; Schlieper and Kalies 1949), provided additional evidence on this point. Thus, it appears that, both in areas of relatively high levels of municipal sanitation and personal hygiene and in areas of much lower levels of municipal sanitation and personal hygiene, irrigation of vegetables with raw wastewater serves as a major pathway for the continuing and long-term exposure of the population to Ascaris and Trichuris infections. Both of these infections are of a cumulative, chronic nature, and repeated long-term reinfection increases the individual worm load as well as the possibility of negative health effects. The extent of the excess infection in the exposed population compared with the control group varied between 30 and 60 percent. (The nature of these health effects is discussed later.)

2. Cholera can be disseminated by vegetable and salad crops irrigated with raw wastewater carrying cholera vibrios to populations consuming such vegetable products. This phenomenon is of concern particularly in nonendemic areas with relatively high sanitation levels, where the common routes of cholera transmission, such as contaminated drinking water and poor personal hygiene, are closed. Under such conditions, the introduction into a community of a few cholera carriers, or subclinical cases, could lead to the massive infection of the wastewater stream, as occurred in Jerusalem in 1970, which helped to disseminate the disease among the consumers of the vegetable crops irrigated with the raw wastewater.
3. There is strong circumstantial evidence from Santiago, Chile, indicating that typhoid fever can be transmitted by vegetables and salad crops irrigated with raw sewage under conditions of fairly good community hygiene, where other normal routes of typhoid transmission, such as contaminated drinking water and poor personal hygiene, are blocked. It is interesting to note that typhoid transmission by sewage irrigation was not detected in any of the studies from developing countries. Apparently, in such situations, even if some typhoid transmission does occur by this route, it is masked and thus becomes undetectable owing to massive concurrent transmission by the other normal routes of transmission of the disease, which are apparently the dominant ones under conditions of very poor personal and community hygiene.

4. To our knowledge, no epidemiological studies conclusively demonstrate that tapeworms (Taenia saginata) have been transmitted to populations consuming the meat of cattle grazing on wastewater-irrigated fields, or fed crops from such fields. However, there is strong evidence from Melbourne, Australia, and from Denmark that cattle grazing on fields freshly irrigated with raw wastewater, or drinking from wastewater canals or ponds, can become heavily infected with the disease (cysticercosis). This, undoubtedly, can cause serious veterinary problems and economic loss to farmers. It is thus not unreasonable to assume that the meat of the infected animals can cause infection in the consumers of such meat, since the almost exclusive mode of transmission of these helminthic diseases to humans is through the consumption of raw or undercooked meat of infected cattle or pigs. The main route of infection of the cattle and pigs is through exposure to human excreta, or wastewater carrying Taenia eggs excreted by human cases. Thus, in areas where tapeworm disease is endemic and pastures are irrigated with raw wastewater from endemic communities, it is highly probable that this practice can provide a major pathway for the continuing cycle of transmission of the disease to animals and to humans. Therefore, despite the lack of published quantifiable epidemiological evidence, it would be a prudent policy in this case to assume that the potential for tapeworm transmission does exist in endemic areas when pastures are irrigated with raw wastewater.

5. Sewage farm workers exposed to raw wastewater in areas where Ancylostoma (hookworm) and Ascaris infections are endemic have been shown (Krishnamoorti, Abdulappa, and Anwikar 1973) to be exposed to significantly excess levels of infection with these two diseases, as compared with other agricultural workers in similar occupations. The risk of becoming infected with hookworm is particularly great in areas where farmers customarily work barefoot in the soil and are exposed to the penetration of hookworm larvae through the broken skin of the feet. Even more important, however, is the fact that the intensity of the parasitic infections (the number of worms infesting the intestinal tract of an individual) of the sewage farm workers was very much greater than that of the controls. In the case of hookworm, the severity of the health effects is a function of the worm load of individual carriers. Studies have shown that the worm load is a function of the degree and length of time of exposure, which enable reinfections to occur and worm loads to build up.
Another important finding in the Indian study was that sewage farm workers appeared to suffer more from anemia than the controls. Anemia is considered to be one of the typical debilitating symptoms of severe cases of hookworm infestation. There is also evidence of the debilitating health effects, with their direct economic impact on human productivity, that can result from continuing occupational exposure to irrigation with raw wastewater in areas where hookworm is endemic. The extent of the excess infection with hookworm and Ascaris among the exposed sewage farm workers compared with control populations is about 35-40 percent.

6. Sewage farm workers are liable to become infected with cholera if they practice irrigation with a raw wastewater stream derived from an urban area in which a cholera epidemic is under way. This situation can occur, as it did in the 1970 Jerusalem outbreak, in an area where cholera is not normally endemic and where the level of immunity to cholera among the rural sewage farm worker population is low or nonexistent (Fattal, Yekutiel, and Shuval 1984). In rural areas where cholera is endemic and transient immunity is high, it is difficult to predict whether irrigation with raw wastewater will result in a detectable increase in cholera levels among the sewage farmers exposed to the common multiple routes of transmission, such as contact infection under conditions of poor hygiene and contaminated drinking water. Although firm quantifiable evidence is not available on this point, it is prudent to assume that a certain degree of excess cholera would be transmitted to sewage farm workers in developing countries even under such conditions. Furthermore, it is not unreasonable to assume that, even under conditions of very poor hygiene in the rural farm area, a cholera outbreak starting in the urban area could spread to the rural area if the town's wastewater was used for irrigation.

7. There is only limited and often conflicting evidence as to the adverse health effects from bacterial and virus diseases among wastewater irrigation workers or wastewater treatment plant workers exposed directly to wastewater or to wastewater aerosols. Morbidity and serological studies in such population groups have not generally been able to demonstrate any clear patterns of excess prevalence of virus diseases. At Muskegon, Michigan, the most highly exposed group of wastewater irrigation workers was found to have excess levels of antibodies to one of the enteroviruses (Clark et al. 1981b; Linneman et al. 1984), but no excess of morbidity was found in this group. There is some indication that new sewage plant workers suffer more gastrointestinal and respiratory symptoms than nonexposed workers during their first year of employment. In conclusion, it can be said that sewage contact workers may suffer from some relatively benign gastrointestinal diseases and may develop antibodies to certain viral pathogens, but as a group they show few signs of serious health effects. It is hypothesized that, in general, occupational groups of this type have acquired relatively high levels of immunity to most of the common enteric viruses endemic in the community at a much younger age, and thus, by the time they are exposed occupationally, the number of potential susceptibles is small, and any excess infection resulting from occupational exposure is too small to provide statistically significant findings.
It can be assumed that this situation exists particularly in developing countries, where infants and children are exposed to most endemic enteric virus diseases at a very young age and thus have acquired lifelong immunity. This is not the case for most bacterial and protozoan diseases, however, and although there is no clear evidence from available epidemiological studies, it can be assumed that wastewater irrigation workers are exposed to bacterial pathogens. However, multiple routes of concurrent infection with these diseases may well mask any excess among wastewater irrigation workers.

8. There is little evidence of disease and/or infection in populations residing near wastewater treatment plants or wastewater irrigation sites associated with pathogens in aerosolized wastewater resulting from sprinkler irrigation or aeration processes. Most studies have not shown any demonstrable infection as a result of the dispersion of aerosolized pathogens by such processes. Majeti and Clark (1981) have concluded from their review on this subject that "data on health effects from existing epidemiological studies do not show any correlation between airborne pathogenic microorganism levels at wastewater treatment plants and incidence of disease in treatment plant workers or in nearby populations. The data on health effects from the existing epidemiological studies concludes that exposure to pathogenic microorganisms in wastewater aerosols is not a unique way of initiating enteric infections." However, most researchers agree that studies to date have been inadequate, and that it is necessary to study relatively large populations of highly susceptible infants and children exposed to pathogens not endemic in their own community in order to obtain statistically significant results. The only study that did include a large population of susceptible infants and children was carried out in kibbutzim in Israel (Fattal et al. 1984). The circumstantial evidence from the seroepidemiological study indicates that transmission of an enteric virus pathogen to population groups residing near wastewater (partly treated) sprinkler irrigation sites is possible. In this particular study, it was hypothesized that a nonendemic strain of ECHO 4 virus, which had recently entered the country and was causing a nationwide epidemic in urban areas, penetrated the rural communities under study through the pathway of dispersion by wastewater sprinkler irrigation. It was concluded that the aerosolized pathogens led to the infection of infants and children, who later infected their mothers, and thus the rest of the community. No excess of clinical disease was detected, however. This same study did not find excess prevalence of other common enteroviruses in the wastewater-irrigating communities compared with communities not irrigating with wastewater.

In earlier work in Israel (Fattal et al. 1981; Shuval, Fattal, and Wax 1983), circumstantial evidence indicated that clinical gastrointestinal disease, most probably of virus etiology, occurs at significantly higher levels among children residing in rural communities practicing wastewater sprinkler irrigation in fields within 600-1,000 m of residential areas than in nonexposed communities. However, there was essentially no annual excess of clinical gastrointestinal disease. At the same time, there was evidence that shigellosis (bacillary dysentery) rates were somewhat higher in communities practicing wastewater irrigation.
In neither case did it appear that these diseases were necessarily transmitted by aerosolized wastewater from sprinkler irrigation. Therefore, other possible routes of transmission, such as direct family contacts with farmers returning from wastewater-irrigated fields, must be considered. In the last of the three studies carried out in Israel, the extensive morbidity data provided no evidence of excess enteric disease among sewage contact workers or population groups exposed to wastewater aerosols generated by sprinkler irrigation. Since this was the only prospective epidemiological study of the series, some epidemiologists might suggest that its conclusions be given precedence over the conflicting findings of the previous two studies.

These findings provide support for the assumption that, in general, the relatively high levels of immunity against most viruses endemic in the community essentially block environmental transmission by wastewater irrigation, or keep it at such a low level that the additional health burden is not measurable. Thus, the primary route of transmission of such enteroviruses, even under good conditions of hygiene, is through contact infection at a relatively young age. As mentioned earlier, such contact infection is even more intensive in developing countries, and it can be assumed that wastewater irrigation practices would not normally be expected to result in any meaningful added health problems as far as viruses are concerned. A new virulent virus introduced into urban areas might, however, be transmitted to rural areas irrigating with the town's wastewater. In general, the transport of aerosolized pathogens through wastewater sprinkler irrigation is not a problem in most developing countries, where sprinkler irrigation is not normally practiced.

9. Credible epidemiological studies provide limited evidence as to the reduction in negative health effects resulting from effective pathogen removal by wastewater treatment. There is some suggestive evidence in this direction, however, from the studies carried out in Darmstadt and Berlin (Baumhogger 1949; Krey 1949; Schlieper and Kalies 1949). Although there was apparently massive dissemination of Ascaris infection among the residents of Darmstadt who consumed salad crops and vegetables irrigated with raw wastewater or who were occupationally exposed to wastewater irrigation, such dissemination did not occur in Berlin, where biological wastewater treatment and sedimentation were applied to the wastewater stream prior to irrigation of similar vegetables and salad crops. Similar suggestive evidence can be derived from the study of the effects of wastewater irrigation practiced near Mexico City on the health of school children (Sanchez Leyva 1976). In that study, no differences were found in the degree of helminth infection among children in the wastewater-irrigating villages and in the control village not practicing wastewater irrigation. The fact that the wastewater of Mexico City was stored in a large reservoir before use in irrigation, and thus underwent many weeks or even months of sedimentation and pathogen die-away, may provide some explanation for this finding. Such long-term storage and sedimentation can be particularly effective in removing the large, easily settleable protozoans and the helminths, which were the primary pathogens studied in this case. It might also be that the absence of negative health effects in the Lubbock (Camann et al. 1983) and Muskegon studies (Clark et al. 1981b; Linneman et al.
1984) was associated with the fact that reasonably well-treated effluents were used for irrigation in these areas. Although results from such limited field epidemiological studies are, in themselves, insufficient to provide conclusive evidence as to the health benefits to be gained by effective pathogen removal by wastewater treatment, we have every reason to believe that appropriate wastewater treatment resulting in effective removal of priority pathogens can indeed provide a high level of health protection. (This suggestion is discussed in detail in Chap. 7.)

POTENTIAL TRANSMISSION OF OTHER DISEASES BY WASTEWATER IRRIGATION

The foregoing sections have reviewed available evidence from a limited number of credible epidemiological studies of the quantifiable disease transmission associated with wastewater reuse. Does the fact that strong incriminating evidence has been obtained only with respect to a limited number of diseases and certain routes of transmission mean that other important pathogens are not transmitted by wastewater irrigation? For example, one of the most extensive summaries on diseases transmitted by food contaminated with wastewater (Bryan 1974) lists 35 outbreaks in which vegetables or fruit irrigated with wastewater, or fertilized with sludge or night soil, were the vehicles of enteric disease outbreaks. A good many of the reports date back to the early years of this century and were not available for review. Bryan states that the produce mentioned in those reports included watercress, celery, cabbage, rhubarb, endive, salad crops, and blackberries. There were 10 outbreaks associated with wastewater irrigation of watercress alone, a salad crop usually irrigated by flood methods. This listing includes 15 outbreaks of typhoid fever, 2 of amoebiasis, 1 of shigellosis, 6 of salmonellosis, 1 of viral hepatitis (HAV), 1 of hookworm, 2 of taeniasis (tapeworm), 4 of ascariasis, 6 of fascioliasis, and 1 of cholera. The review also includes numerous additional outbreaks in which the vehicle was fish, clams, oysters, and other shellfish harvested from areas heavily contaminated with wastewater. That subject is not reviewed in this report.

Bryan states that the agricultural produce implicated in these outbreaks had been found to be "grossly contaminated with night soil or raw sewage." No details are given in the review itself, but the author cautions that "the epidemiological evidence presented in many of the reports would not stand up to critical evaluation. On the other hand, a number of investigators proved their hypotheses with epidemiological evidence." Bryan, who is from the Center for Disease Control of the U.S. Public Health Service in Atlanta, Georgia, concluded that "these outbreaks will continue to occur sporadically if raw or partially treated wastewater . . . is used for irrigation or aquaculture . . . since the contaminating pathogens survive in the wastewater and soil and then contaminate foods in sufficient quantities."

Can we totally ignore such indirect evidence? The conventional public health approach would suggest that all excreted enteric pathogens with sufficient persistence in the environment could at some time be transmitted by irrigation with raw wastewater. This may indeed be so, and we cannot rule out the possibility that other enteric diseases with a known record for environmental transmission, and waterborne transmission in particular, may also, at times, be transmitted by uncontrolled wastewater irrigation.
Thus, one must include in the list of diseases that could be associated with irrigation with raw wastewater typhoid fever, salmonellosis, infectious hepatitis, bacillary dysentery, rotavirus infection, enteroviral infections, amoebiasis, giardiasis, and possibly other enteric diseases. Only in the case of typhoid fever and shigellosis have we been able to find supporting epidemiological studies. The lack of firm epidemiological evidence on other diseases may be due to several factors, including the expense and complexity of carrying out credible epidemiological field studies; the lack of research capabilities in some areas where wastewater irrigation is being carried out; the difficulty of obtaining adequate, well-matched control populations for comparative studies; and the difficulty of demonstrating statistically significant differences in studies of environmentally transmitted diseases being transmitted simultaneously by multiple routes in addition to the one under study. Another reason that there may be little or no epidemiological evidence on transmission of some of the above-mentioned diseases by wastewater irrigation could be that they are not in fact effectively transmitted by that route. It is more likely, however, that other routes of transmission, such as direct contact resulting from low levels of personal and domestic hygiene, and contamination of water and food, are so dominant for these diseases that any additional marginal transmission resulting from wastewater irrigation is usually not detectable, even in the most sophisticated and well-designed epidemiological study.

DISCUSSION

From the foregoing analysis of epidemiological studies on the health effects of wastewater reuse in agriculture that have been undertaken in both developed and developing countries, we can conclude that there is evidence on the transmission of the following diseases in association with the use of raw or only partly treated wastewater:

1. To the general public consuming salad or vegetable crops irrigated with raw wastewater: ascariasis, trichuriasis, and probably tapeworm from eating the meat of cattle grazing on wastewater-irrigated pasture. Some transmission of enteric bacterial diseases such as cholera and typhoid fever, as well as of protozoan diseases, may possibly occur under specific conditions.

2. To wastewater irrigation workers: ancylostomiasis (hookworm), ascariasis, cholera, and possibly, to a much lesser extent, infections caused by some other enteric bacteria and viruses.

3. To the general public residing next to wastewater irrigation projects, particularly those employing sprinkler irrigation with raw or poorly treated wastewater: some minor transmission of diseases, particularly to children, caused by enteric viruses, especially those not currently endemic in the area. Also, possibly limited transmission of bacterial disease by direct contact with wastewater irrigation farmers.

Thus, the evidence on possible disease transmission associated with wastewater irrigation points most strongly to the helminths as the number one problem, particularly in the developing countries, with some limited transmission of bacterial and virus disease. The findings and conclusions of this analysis generally fit the predictive assumptions of our theoretical model described in Chapter 2. According to that model, the soil-transmitted helminths and tapeworms are the ones most likely to be transmitted by irrigation with raw wastewater. A lower degree of transmission of the bacterial diseases would occur.
The model further predicts that there would be little or no transmission of the enteric virus diseases. The fact that the empirical evidence provides strong support for the theoretical model gives added weight to the findings and conclusions.

Let us now consider the implications of these findings for the developing countries. In applying the above conclusions to the developing countries, one must first consider that most of these countries are in areas of the world where helminthic and protozoan diseases such as hookworm, ascariasis, trichuriasis, and tapeworm are endemic. In many of these areas, cholera and typhoid are endemic as well. Figures 4-12 to 4-16 show the worldwide distribution of several of these diseases. It can be assumed that, in most developing countries having poor socioeconomic conditions and low levels of personal and domestic hygiene, the population is exposed to and has become immune to endemic enteric virus diseases at a very young age through typical contact infection in the home. Most infections with enteroviruses lead either to benign short episodes of acute illness or to numerous subclinical infections that place little or no overall economic burden on the society. Some virus infections, such as those caused by rotaviruses, play a major role in high infant mortality rates, however. For most of these diseases, lifelong immunity is acquired at a very young age, so that there can be no more than one episode of any particular enterovirus disease in a lifetime. Although the number of potential enteroviruses that can cause infection through the fecal-oral route or through contaminated wastewater is rather large, reaching approximately 100 discrete pathogenic viruses, it can be assumed that in most developing countries the excess burden of virus infection, if any, resulting from the utilization of wastewater is nil or insignificant. Although available information on the epidemiology of rotavirus and Norwalk virus infection is limited, we assume that they are essentially no different from the known enteroviruses.

As we have seen, man can develop immunity to enteric viruses. However, he has little or no immunity to most enteric protozoans or helminths. Each worm egg swallowed can add an additional parasitic worm to the intestinal tract. Thus, with continuing exposure, the infestation becomes long term and cumulative. Populations constantly exposed to Ascaris, Trichuris, or Taenia infections through the consumption of wastewater-irrigated vegetable crops or of meat from animals raised on wastewater-irrigated fields can be expected to carry ever-increasing loads of these helminths in their intestinal tracts for long periods of time, or for as long as they are exposed to the contaminated crops. Similarly, wastewater irrigation workers in developing countries who are continuously exposed to hookworms in the soil will build up heavy and debilitating loads of hookworm in their intestinal tracts. For all these reasons, the danger of transmission of helminth disease by wastewater use in agriculture appears to be the number one problem in this area for the developing countries.

The explanation of the transmission of bacterial enteric infections by wastewater in developing countries most likely lies somewhere between the case of enteric virus diseases and that of helminthic diseases.
For most developing countries suffering from relatively low levels of personal and domestic hygiene, it can be assumed that most bacterial enteric diseases--like the virus diseases--are transmitted through direct contact infection, or through contaminated water or food. Although a few enteric bacterial diseases impart long-term immunity, most result in relatively short periods of immunity, if they impart any immunity at all.

Fig. 4-12. Geographical distribution of Ancylostoma duodenale. Source: Adapted from Feachem et al. (1983). Taken from World Bank Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 15, p. 33.

Fig. 4-13. Geographical distribution of Necator americanus (hookworm). Source: Adapted from Feachem et al. (1983). Taken from World Bank Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 16, p. 34.

Fig. 4-14. Geographical distribution of Taenia saginata (beef tapeworm). Source: Adapted from Feachem et al. (1983). Taken from World Bank Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 17, p. 35.

Fig. 4-15. Geographical distribution of Taenia solium (pork tapeworm). Source: Adapted from Feachem et al. (1983). Taken from World Bank Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 18, p. 36.

Fig. 4-16. The spread of cholera in Africa, 1961 to 1975. Source: Adapted from Feachem et al. (1983). Taken from World Bank Appropriate Sanitation Alternatives, World Bank Studies in Water Supply and Sanitation (Washington, D.C.: 1982), no. 1, map 12, p. 30.

Nevertheless, under conditions of intensive endemic transmission, a significant proportion of the population may have transient immunity to specific bacterial diseases at any given moment, and thus the opportunity for further infection and disease from external environmental factors may be reduced. Even though there is good evidence from the developed countries that cholera, typhoid fever, bacillary dysentery, and possibly other bacterial enteric infections have been transmitted in association with wastewater reuse in irrigation, it is not unreasonable to assume that in most developing countries where such bacterial diseases are endemic, and where concurrent transmission by the usual multiple routes is actively transpiring, any additional transmission by wastewater irrigation would be marginal--and would present only a small, if any, additional health burden not normally detectable by epidemiological methods.

In conclusion, our interpretation of the importance of the epidemiological evidence of disease transmission associated with the use of raw wastewater in agriculture would rank pathogenic agents in the following declining order of priority for most of the developing countries:

1. High risk--Helminths (Ancylostoma, Ascaris, Trichuris, and Taenia)

2. Lower risk--Enteric bacteria (cholera, typhoid fever, Shigella, and possibly others); protozoans

3. Least risk--Enteric viruses

Extensive empirical evidence confirms that the helminthic diseases belong in the high-risk category. However, the relative ranking of the bacterial, protozoan, and virus diseases is based only in part on empirical evidence. Here, we have also had to rely on theoretical considerations and assumptions.
Blum and Feachem (1985) reached essentially the same conclusion in their evaluation of the health aspects of the use of night soil and sludge in agriculture and aquaculture. The Engelberg Report (1985) has endorsed these conclusions as well.

It must be pointed out that these negative health effects were all detected in association with the use of raw or poorly settled wastewater. Therefore, we can also conclude that wastewater treatment processes that effectively remove all or most of these pathogens, according to their priority in the above listing, could reduce or even eliminate the negative health effects known to be caused by the utilization of raw wastewater. Thus, the ideal treatment process should be one that is particularly effective in removing the top-priority agents--helminths--even if the treatment is somewhat less efficient in removing bacteria and viruses. (Wastewater treatment technologies that can be used to achieve this goal are covered in Chapter 5.)

HEALTH AND ECONOMIC IMPLICATIONS ASSOCIATED WITH THE DISEASES FOUND TO BE TRANSMITTED BY IRRIGATION WITH RAW WASTEWATER

This chapter has presented epidemiological evidence indicating that certain diseases may be transmitted by irrigation with raw wastewater. Among these, the helminthic diseases caused by Ascaris, Trichuris, hookworm, and tapeworm were shown to be of particular importance. Cholera, typhoid, shigellosis, and diseases caused by enteric viruses were shown to be somewhat less important as far as transmission in developing countries is concerned. What are the health burdens and economic implications that might result from a significant increase in transmission of these diseases in areas where unrestricted irrigation with raw wastewater is practiced? Would their control lead to significantly improved levels of health and well-being, with the expected resulting economic benefits? The following subsections review the known health burdens and economic implications of a number of these diseases.

Ascariasis

According to Benenson (1980), this disease has a worldwide distribution and is most common in moist tropical countries. When the worm becomes attached to the wall of the small intestine, the individual experiences "often vague, ordinarily mild or no symptoms. Ascaris pneumonitis, known as Loeffler's syndrome, is important in children. Heavy parasite burdens may cause digestive and nutritional disturbances, abdominal pain, vomiting, restlessness, and disturbed sleep. Serious complications among children, exposed to particularly heavy and continuing infections, include bowel obstruction and occasionally death due to migration of adult worms into the liver, gallbladder, peritoneal cavity or appendix and rarely from perforation of the intestine." Feachem et al. (1983) state that, "where prevalence and intensity of infection are high, the nutritional consequences of ascariasis in an undernourished population may be considerable. It has been estimated that a child who has 26 worms may lose 10 percent of his total daily body intake in protein. There is also evidence that ascariasis in children may contribute to vitamin A and C deficiencies." However, they conclude that "the effect on child growth of light Ascaris infections, and the benefits of regular deworming are the subject of current debate and research."
The World Bank study on the nutritional and economic implications of Ascaris infection in Kenya (Latham, Latham, and Basta 1977) concluded that, if translated into a food loss, heavy infections could lead to the nonutilization of up to 25 percent of ingested calories. The authors also concluded that such heavy infestations could have "important implications for nutritional programs." Although their results largely pertain to children, they argue that "the implications are equally valid to the well-being and productivity of adults."

However, Greenberg et al. (1981) did not detect any nutritional or growth improvement after a single-dose drug treatment for Ascaris among children 1-8 years old in Bangladesh. They hold that many of the previous studies indicating health benefits of deworming for Ascaris were "less than optimally controlled community studies." One of the members of that research team, Robert Gilman of the Johns Hopkins University Medical School, doubts the validity of the conclusions of Latham, Latham, and Basta (1977) because of the methodological problems in that study. He argues that, for most persons infected with Ascaris, the level of infection is mild, with essentially no measurable detrimental health effects. He has described the Ascaris worm as the "ideal" parasite, in that it thrives in the human body without generally causing harm or serious discomfort to the host. At the same time, he agrees that there can be serious medical complications in cases of heavy infestations among children, although such cases are, in his opinion, rare. Jenkins and Beach (1954), working in South Carolina, reported, however, that "serious complications of ascariasis are not rare in children," and that in this region "the most common complication is intestinal obstruction." The hygienic situation in South Carolina in 1954 certainly was not as bad as in many of the developing countries, and it is not unreasonable to assume that in developing countries a larger percentage of the children are exposed to high levels of Ascaris infection that lead to such serious complications. This would be particularly so in the case of children continuously consuming Ascaris-contaminated vegetables or exposed for long periods to helminth-contaminated soil from wastewater irrigation.

Thus, it appears that there are major differences of opinion as to the health burdens and economic implications of Ascaris infections, particularly of light infections, which constitute the majority of cases. It is beyond the scope of this report to attempt to resolve these differences, but there appears to be some general agreement that heavy Ascaris infections, particularly in children, can cause serious health damage and economic burden.

Trichuriasis

Trichuriasis, a widely endemic disease, especially in warm moist regions, is a nematode worm infection of the large intestine. According to Benenson (1980), the disease is often asymptomatic and detected only by examination of feces. Benenson states that "heavy infections result in intermittent abdominal pain, bloody stools, diarrhea, and loss of weight. Rectal prolapse may occur in very heavy infestations." Feachem et al. (1983) have pointed to research indicating that, although most infections in adults are symptomless, "in malnourished children heavy infection can cause anemia, bloody diarrhea and prolapse of the rectum." Field studies specifically on detectable negative health effects from Trichuris infections were not found.
However, in a study by Karyadi and Basta (1973) on nutrition and health among Indonesian adult construction workers, concurrent hookworm, Trichuris, and Ascaris infections were found to be very common. These authors concluded, however, that the iron deficiency anemia that was correlated with reduced physical endurance was caused by the hookworm infestations, and no specific negative health effects among construction workers were attributed to the ubiquitous Trichuris and Ascaris.

Ancylostomiasis (Hookworm Disease)

This nematode worm disease is widely endemic in tropical and subtropical countries with low levels of community hygiene and with soil moisture and temperature conditions favoring the development of infective larvae, which enter the body by penetrating the host's skin. This occurs most frequently in areas where people walk barefoot. (See Figs. 4-12 and 4-13, which show the geographical distribution of this disease.) According to Benenson (1980), it is "a chronic, debilitating disease with a variety of vague symptoms, varying greatly with the degree of anemia. The bloodletting activity of the nematode, along with malnutrition, leads to hypochromic microcytic anemia (iron deficiency), a major cause of disability. Children with heavy, long term infection may be retarded in mental and physical development . . . light hookworm infections generally produce few or no clinical effects." According to Craig and Faust (1970), the blood loss of an individual with a light infection of hookworm disease may be about 3 cc/day, and in heavy infections blood loss may reach 100 cc/day.

A study of Indonesian construction workers (Karyadi and Basta 1973) has shown a high correlation between iron deficiency anemia and the intensity of hookworm infestation (see Table 4-2). In this study, the productivity of the nonanemic, hookworm-free laborers was approximately 20 percent greater than that of anemic laborers. This research provides evidence on the economic implications of such levels of hookworm infestation.

TABLE 4-2 Relation of hookworm egg count to hemoglobin and hematocrit values in construction workers at Halim, Indonesia

                 Intensity grouping                Clinical          Hemoglobin      Hematocrit
       Score     (eggs per ml of feces)    No.     classification    (mean ± SD)     (mean ± SD)
(Low)    0              0 - 100            25      Normal            13.7 ± 1.06     40.9 ± 2.2
         1            100 - 699            18      Very light        13.9 ± 1.29     41.2 ± 2.8
         2            700 - 2,599          38      Light             13.6 ± 1.79     41.6 ± 4.7
         3          2,600 - 12,599         48      Moderate          13.0 ± 1.76     39.1 ± 3.7
(High)   4         12,600 - 25,099         12      Heavy             11.9 ± 2.08     38.7 ± 4.0

Source: Karyadi and Basta (1973).

According to another report (Basta and Churchill 1974), once infestation with hookworm or anemia occurs, the environmental, economic, and nutritional factors are likely to increase the debilitating effects of the disease, and thus set up a vicious circle. An anemic individual will tend to work less and thus earn less income if he is performing piecework, for example. This in turn predisposes him to poor nutritional status because he takes in less food, aggravates the anemia further, and increases his susceptibility to other infections. Increased absenteeism and lower productivity can therefore be expected to follow. He is trapped in a series of events in which he cannot improve his income, his nutrition, or his health. Furthermore, anemic school children are more apathetic and learn less than nonanemic school children. Loss of concentration in the classroom, combined with increased absenteeism due to illness, should therefore be included in the calculations of the economic costs associated with anemia and hookworm disease.

Karyadi and Basta also state that "anemia, unlike most diseases, is not dramatized by overt, clear-cut symptoms which make it glaringly obvious to
the untrained eye. Therefore, it has been relatively neglected in public health programs. Its symptoms are subtle, additive, and, for many people, lethal. It has been ignored because, like most nutritional diseases, it is masked by infections and other diseases which, in most cases, arise precisely because of the primary state of malnutrition." According to the World Bank, anemia is the most prevalent disease known to afflict man today. Nonetheless, Karyadi and Basta believe that hookworm eradication, even if feasible, would not in itself lead to a significant reduction in anemia, since the anemia is due fundamentally to a dietary deficiency of iron. Hookworm infestation can be present in many people without causing anemia; however, heavy hookworm burdens increase the severity of preexisting nutritional anemia, or in some cases contribute to its initiation. Briscoe (1979) is of the opinion that some estimates of caloric loss due to hookworm infections are unrealistically high. Nevertheless, most researchers agree that such infections can cause serious debilitating health effects and a serious economic burden among children and adults, particularly in the case of heavy hookworm loads resulting from high-level and continuing exposure.

Taeniasis and Cysticercosis (Beef and Pork Tapeworm Disease)

Taeniasis and cysticercosis are infections of worldwide distribution (see Figs. 4-14 and 4-15) caused by the adult stage of the beef tapeworm (Taenia saginata) or the adult or larval stage of the pork tapeworm (Taenia solium). According to Benenson (1975), "clinical manifestations of infection with the adult worm (taeniasis) are variable and may include nervousness, insomnia, anorexia, loss of weight, abdominal pain and digestive disturbances. Many infections are asymptomatic." Taeniasis is a nonfatal disease.

Larval infection with T. solium--the pork tapeworm (cysticercosis)--is a serious somatic disease that may involve many different organs and tissues. When eggs of the pork tapeworm are swallowed by man, they hatch in the small intestine, and larval forms (cysticerci) develop in various tissues and organs of the body. Consequences may be grave when larvae localize in the ear, eye, central nervous system, or heart. In the presence of somatic cysticercosis, psychic symptoms, including epileptiform seizures, strongly suggest cerebral involvement. Cysticercosis is a chronic disease that may cause disability and may have a relatively high mortality rate. The damage to cattle and pigs infected by eating food or crops contaminated by human feces, sludge, or wastewater results in serious economic loss.

Cholera

Cholera is probably the best-known and most feared of the diarrheal diseases. Although it is by no means the most important cause of diarrhea in terms of total morbidity or mortality, it has caused, and in some parts of the world continues to cause, dramatic outbreaks of acute disease accompanied by considerable loss of life. In other areas, cholera is part of the overall spectrum of endemic diarrhea, and it is in this situation that it often occurs with a regular seasonal periodicity (Feachem et al. 1983).
The disease has spread extensively from a focus in Indonesia through most of Asia and the Middle East into Eastern Europe and Africa. From North Africa it has spread into the Iberian Peninsula, and in 1973 it moved into Italy and other areas of Europe. The disease has had a tendency to persist in most of the affected countries. (See Fig. 4-16, which illustrates the pathway of the spread of cholera.) According to Benenson (1980), cholera is "a serious acute intestinal disease characterized by sudden onset, profuse watery stools, vomiting, rapid dehydration, acidosis, and circulatory collapse. Death may occur within a few hours of onset. Fatality rates in untreated cases may exceed 50 percent; with proper treatment they are below 1 percent. Mild cases with only diarrhea are common, especially in children. Inapparent and wholly asymptomatic infections are many times more frequent than clinically recognized cases, especially in El Tor cholera." Cholera outbreaks have severe economic implications, since they can cripple tourism, hurt the export of agricultural produce, and disrupt normal economic life.

Typhoid Fever

Typhoid fever is caused by Salmonella typhi and has widespread distribution throughout the world. It is endemic in many countries in Africa, Asia, the Middle East, parts of Europe, and Latin America. The pathogen is excreted in the feces of an ill or infected person. About 10 percent of the patients discharge typhoid bacteria for 3 months after onset, and 2 to 5 percent become permanent carriers (Benenson 1980). Typhoid is a serious, debilitating systemic infectious disease characterized by high fever and numerous medical complications. A usual fatality rate of 10 percent is reduced to 2 to 3 percent by antibiotic therapy (Benenson 1980). Susceptibility is general, although many adults acquire immunity through previous infections, some of a subclinical form.

Shigellosis (Bacillary Dysentery)

Shigellosis has a worldwide distribution and is found in all countries, the highest incidence being in countries and communities where hygiene is poor. Two-thirds of the cases, and most of the deaths, occur in children under 10 years of age. According to Benenson (1980), shigellosis is an acute bacterial disease involving primarily the large intestine, characterized by diarrhea and accompanied by fever and often vomiting, cramps, and tenesmus. In severe cases, the stools contain blood, mucus, and pus. In the usual outbreak, there are many mild and asymptomatic infections. The severity of the illness and the fatality rate are functions of the age of the patient, the preexisting state of nutrition, the level of sanitation or size of the infecting dose, and the serotype of the organism predominating in the outbreak. For serious cases, the fatality rate may exceed 20 percent in the absence of supportive therapy. Shigella dysenteriae 1 is much more likely to be associated with serious disease.

Enteric Viruses

The term enteric viruses covers more than 100 different ubiquitous viruses known to be fecally excreted by man. New viruses of this type are still being discovered, and several have yet to be fully characterized. Typically, they infect the alimentary canal, but a number of them can cause infection through the respiratory tract as well. A very high percentage of the infections are completely asymptomatic, but the range of clinical symptoms extends from nonspecific minor illness to severe gastroenteritis, meningitis, paralysis, and death.
(See Table 2-1 for a classification of some of the main excreted viruses and the range of disease and symptoms caused.) Infection by most, or possibly all, of the enteroviruses results in lifelong immunity. Although most older children and adults have antibodies to rotavirus, adults can apparently become reinfected, as shown by rising antibody titers and sometimes by clinical disease. However, the existence of at least two rotavirus serotypes that are not cross-protective may partly explain such apparent repeat attacks. Under conditions of poor personal and domestic hygiene in the developing countries, most children become infected and thus acquire lifelong immunity to most of the endemic enteric viruses at a relatively young age. The older children and adults are thus usually immune, and therefore not at risk of further environmental transmission.

Conclusions

This brief review indicates that heavy infections with hookworms give rise to a severe debilitating condition with quantifiable negative economic effects. Tapeworm disease also has serious health and economic effects. Ascaris and Trichuris infections are mainly light, with few known negative health effects among adults, but heavy infections among children can apparently be quite serious. The health and economic impact of typhoid fever and cholera at times of epidemics can be severe, but is less so under low-level endemic conditions. Shigellosis seldom has severe health implications. Diseases caused by enteric viruses range from mild and benign to quite severe. However, since most children become infected, and thus immune, in the home by direct contact at a very young age, little additional health or economic burden results from further environmental exposure of the immune older children and adults.

CHAPTER 5

WASTEWATER CHARACTERISTICS AND TREATMENT FOR IRRIGATION

This chapter presents different sewage and effluent characteristics and alternative municipal wastewater treatment systems suitable for effluent irrigation systems. Emphasis is on low-cost treatment processes and design parameters, particularly those dealing with pathogen removal.

WASTEWATER CHARACTERISTICS AND PROBLEMS IN IRRIGATION SYSTEMS ASSOCIATED WITH WATER QUALITY

General Characteristics of Sewage

The content of wastewater depends mainly on the composition of the water supply, the use of the water, and the method of collection. The concentration of the different constituents depends on the amount of water used per capita, which is usually smaller (and therefore sewage is more concentrated) in developing countries and in arid or semiarid climates. The principal parameters that characterize wastewater and that affect wastewater treatment processes and reuse systems are very similar. Briefly, these parameters are as follows:

1. Wastewater temperature affects, among other things, settling rates and filtration efficiencies in the removal of suspended solids that may clog appurtenances, the removal of pathogens by physical processes, and the biological growth of microorganisms. The effect of temperature on suspended solids (SS) and pathogen removal by settling and by filtration is mainly due to its effect on water viscosity: a rise in temperature results in higher settling rates and better filtration efficiency. The higher the temperature, the higher the rate of growth of microorganisms (greater algal growth, better biological treatment, and so on).

2. Turbidity is caused by colloidal particulates.
This parameter can be used effectively when the concentration of suspended solids is too low to be measured accurately by gravimetric methods, or for fast control of advanced treatment processes such as granular filtration or infiltration through soil.

3. Color of sewage as such is not of direct significance where irrigation is concerned; however, it can give a rough idea of the state of degradation (usually black when septic) and will indicate whether sewage contains high or low concentrations of algae (green) that may cause problems in some types of irrigation systems.

4. Odor is of some concern as an environmental nuisance in residential areas around the irrigation site. It is mainly related to the anaerobic processes taking place.

5. Solids--dissolved and suspended, organic and inorganic--play a very important role in wastewater reuse for irrigation. The general practice in wastewater treatment is to use total dissolved solids (TDS), suspended solids (SS), settleable solids, and different combinations of these factors as gross parameters for the design and operation of treatment systems. The ionic composition of the dissolved solids and the size distribution of the suspended solids are parameters that have great bearing on wastewater irrigation costs and benefits.

6. Oxygen demand parameters--dissolved oxygen (DO), biochemical oxygen demand (BOD), chemical oxygen demand (COD)--determine the degree of water pollution and local odors, the treatability of the wastewater, and the design and operation of treatment plants. They are not significant in irrigation system design and operation, however.

Typical values of quality parameters of sewage used or expected to be used for irrigation are presented in Tables 5-1 and 5-2. Table 5-1 presents typical characteristics of the sewage of some major cities in India. The concentration of suspended solids ranges from 200 to 600 mg/l (with the exception of 985 mg/l in Hyderabad), as does the BOD (196-480 mg/l). Typical domestic sewage characteristics in the United States are described in Table 5-2. According to this table, the Indian sewage falls within the category of medium-to-strong sewage. Typical municipal sewage in Israel, where about 25 percent of the wastewater is already being used for direct irrigation, contains about 400 mg/l of SS, and the BOD is around 400 mg/l (in Jerusalem, for example, it is around 600 mg/l or more owing to the low consumption of water per capita). Typical concentrations of phosphates, organic N, and NH3 are 30, 50, and 70 mg/l, respectively.

Irrigation System Problems Associated with Water Quality

Surface irrigation. Surface irrigation is not greatly affected by water quality problems, as is obvious from our descriptions of the different methods of this type of irrigation (see Chap. 6). Some problems may occasionally arise if the water to be used carries large quantities of sediments. These may settle out and clog, at least in part, the transporting channels, gates, and also the pipes and appurtenances where pressure transport systems are sometimes in use, as in basin irrigation. Otherwise, this type of irrigation does not present much of a challenge to the water treatment industry.
TABLE 5-1 Typical characteristics of sewage from Indian cities

Sample                                         Bombay
number  Characteristics        Ahmedabad      (Dadar)   Calcutta   Delhi   Hyderabad   Kanpur   Madras   Madurai   Nagpur
1.  pH                            7.7            6.9        7.1      7.4       7.3        7.0      7.3      7.5       7.2
2.  Total solids, mg/l           1732             -          -      1100      1708       1500     1700     1740      1200
3.  Suspended solids, mg/l        290            220        420      470       985        600      500      420       200
4.  Dissolved solids, mg/l       1442           1375*        -       630       723        900     1200     1320      1000
5.  BOD, mg/l                     196            320         -       223       339        250      350      480       350
6.  Total N, mg/l                  -             47.7       40.0     28.5      37.0       73.0      60        -        60
7.  Phosphate as PO4, mg/l         -              -          5.5     13.7      14.7       15.0     22.0       -       20.0
8.  Potassium, mg/l                -              -         15.9     15.0      26.0       40.0     55.0       -       41.6

Source: Shende et al. (1982).

TABLE 5-2 Typical domestic sewage characteristics in the United States (mg/l)

Parameter                     Weak    Medium    Strong
Total suspended solids         100      200       350
Volatile suspended solids       75      135       210
BOD                            100      200       400
COD                            175      300       600
TOC                            100      200       400
Ammonia-N                        5       10        20
Organic-N                        8       20        40
PO4-P                            7       10        20

Source: Steel, Terence, and McGhee (1979).

Sprinkler irrigation. Sprinkler irrigation systems are more affected by water quality than are surface irrigation systems. This is due primarily to clogging problems, which affect the orifice size of the sprinkler head and consequently the application rate. Sprinkler nozzle openings range from less than 3 to more than 20 mm in diameter; the smaller ones may be clogged by large particles or aggregates in the water. Since sprinkler systems are pressure systems, they are particularly susceptible to harm from sand and silt, and from rust from corroded pipes. Accumulation of sediments in pipes and appurtenances may also be expected when the water used contains large and heavy particles, unless they are removed by pretreatment.

Drip (trickle) irrigation. Filtration equipment for low-rate application, particularly for drip irrigation systems, has recently become an integral part of irrigation practice to control clogging of drippers, orifices, or lateral lines. The problem is becoming even more serious, mainly in arid and semiarid countries, as a result of the increasing exploitation of marginal waters of secondary quality (such as wastewater effluents following various treatment methods) and turbid and polluted surface waters.

Recent work and field observations concerned with the performance of drip irrigation systems that utilize either freshwater or wastewater effluents point out that clogging of low-rate applicators is caused by suspended matter, chemical precipitates, and algal or bacterial biomass, of which the suspended material is generally the most important.

1. Suspended matter: these particles range in size from colloidal (<1 μm) up to coarse (1 mm). Three different types of particles may be observed:

a. Silt and clay, which are released from surface water and sometimes from groundwater.

b. Algae, which originate in surface water and in aerobic pond effluents.

c. Organic suspended solids, which occur mainly in effluents.

The elasticity of these particles allows them to pass through openings smaller than their original sizes. They also tend to encourage bacterial growth in the systems by providing food for the microorganisms.

2. Dissolved matter: these are salts that precipitate in the pipes and fittings. The different factors causing precipitation are:

a. Scaling due to the application of hard water with a low saturation index (particularly at high temperatures).

b. Corrosion products from steel pipes.

c. Precipitated salts from dissolved fertilizers when applied through systems utilizing hard water.

3. Biological growth: this consists of bacterial colonies in effluents from activated sludge plants and trickling filters, or algal colonies from stabilization ponds.

Clogging is often due to a combination of factors, for example, clay and corrosion products entrapped within a biological mass cemented with CaCO3 precipitate.
The dominant cause of emitter clogging and flow reduction found in experiments with Colorado River water was suspended sediment; a combination of biological (microbial slimes) and chemical (carbonate) factors played only a minor role. The principal particles involved were sand grains and plastic particles retained in the pipes during installation (Gilbert et al. 1981).

Characteristics of Effluents from Conventional Wastewater Treatment Facilities

Dissolved solids added to municipal water during use are not removed by conventional treatment, and pathogen reduction is relatively ineffective. In contrast, suspended solids and chemical and biochemical oxygen demand are reduced, and in some cases nutrient nitrogen and phosphorus are transformed. Ranges of removal efficiencies by different unit operations are listed in Table 5-3. The efficiencies are given in terms of percentage removal of BOD, SS, coliforms, and COD for each type of treatment. Most of the suspended matter is removed by sedimentation. Series of stabilization ponds are a highly effective (possibly the best) method of treating effluents. To compare these process-efficiency relationships, we calculated the possible effluent quality for a typical "medium" sewage, using the removal efficiencies of Table 5-3. The results are presented in Table 5-4.

The combination of settling and anaerobic digestion is used in many small communities, either in anaerobic lagoons or in the better-controlled Imhoff tanks. Imhoff tank effluents are used for irrigation in Israel and other places. An idea of the characteristics of such effluents can be obtained by looking at Table 5-5, which presents the characteristics of Imhoff tank effluents at Fort Devens, Massachusetts (Satterwhite et al. 1976a and b; McKim et al. 1979). The BOD of this effluent ranges from 30 to 185 (mean 112) mg/l, and the total coliform count is (18-53) x 10⁶/100 ml, with a mean value of 32 x 10⁶/100 ml. The unchlorinated effluent from the tanks (average flow 5,061 m³/day) has been applied to treatment beds. Twenty-two treatment beds have been used for these purposes for more than 30 years. The current practice is to flood the beds for a period of 2 days and then to allow a drying period of 2 weeks. Under this cycle, each treatment bed receives effluent about 52 days a year. McKim et al. (1979) calculated that the effluent application rate averaged 28.3 m/yr over a 15-year period (this means an average infiltration rate of 54 cm/day). Each year, 30 cm of the topsoil of the infiltration beds is replaced with local sand and gravel in order to maintain the infiltration capacity of the soil. Analysis of underground water obtained from wells that were installed at 1.5-m depth intervals adjacent to the treatment site indicated BOD values of 0.8-2.5 mg/l and total coliform counts of only 120-620 per 100 ml. Typical values for bacteria removal efficiencies in sedimentation of wastewater are summarized in Table 5-6.

STABILIZATION PONDS, SUBSEQUENT TREATMENTS, AND EFFLUENT QUALITY

A recent report on stabilization ponds emphasized that "pond systems are ideally suited for use in small communities and for schools, hospitals and other institutions, since they are simple and economic to construct, operate and maintain" (Cillie quoted in Drews 1983). The National Institute of Water Research in Pretoria, South Africa, gives the effluent qualities of stabilization ponds and maturation ponds expected from recommended design criteria (Drews 1983). The maximum expected values are presented in Table 5-7.
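The entries in Table 5-4 appear to follow from simple arithmetic: each effluent concentration is the raw concentration multiplied by (1 - removal efficiency), with the removal ranges taken from Table 5-3. The short Python sketch below is added here only as an illustration of that arithmetic (it is not part of the original study); it reproduces two rows of Table 5-4 from the raw-sewage values of 300 mg/l BOD, 270 mg/l SS, and 450 mg/l COD, with small differences from the published figures due to rounding.

    # Illustrative back-calculation of part of Table 5-4 from Table 5-3 (a sketch,
    # not the authors' own program).
    # Effluent concentration = raw concentration x (1 - removal efficiency).
    raw = {"BOD": 300.0, "SS": 270.0, "COD": 450.0}  # mg/l, "Raw sewage" row of Table 5-4

    removal_ranges = {
        # process: {parameter: (lowest %, highest %) removal, taken from Table 5-3}
        "Primary sedimentation": {"BOD": (25, 40), "SS": (40, 70), "COD": (20, 35)},
        "Stabilization ponds":   {"BOD": (90, 95), "SS": (85, 95), "COD": (70, 80)},
    }

    for process, removals in removal_ranges.items():
        print(process)
        for param, (low, high) in removals.items():
            highest_residual = raw[param] * (1 - low / 100.0)   # poorest removal
            lowest_residual = raw[param] * (1 - high / 100.0)   # best removal
            print("  %-3s: %3.0f-%3.0f mg/l in the effluent" % (param, lowest_residual, highest_residual))

Running the sketch gives, for example, 180-225 mg/l BOD after primary sedimentation and 15-30 mg/l BOD after a pond series, in agreement with Table 5-4.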
- 144 - TABLE 5-3 Relative efficiencies of sewage treatment operations and processes 0 Removal of 5-day, 20 C suspended Treatment operation or process BOD solids (Z) Bacteria COD Fine screening 5-10 2-20 0 5-10 Chlorination of raw or settled sewage 15-30 - 90-95 - Plain sedimentation 25-40 40-70 50-90 al 20-35 Chemical precipitation 50-85 70-90 40-80 40-70 Trickling filtration preceded and followed by plain sedimentation 50-95 50-92 80-95 50-80 Activated-sludge treatment preceded and / followed by plain sedimentation 55-95 55-95 99 _ 50-80 Stabilization ponds 90-95 85-95 >99.9 '/ 70-80 Chlorination of biologically treated sewage - - 98-99 - a. 3-6 hrs' retention. May be down to 10 percent for shorter residence time. b. Can decrease to 60 percent for poorly aerated AS systems and can reach 99.9 for extended aeration with hydraulic retention time >24 hrs. c. For series of three or more ponds with total detention time of 15-20 days or more. Sources: Fair, Ceyer, and Okun (1968); bacteria reduction data are based on Feachem et al. (1983), excluding chlorination and chemical precipitation data. - 145 - TABLE 5-4 Calculated effluent characteristics from different operations and processes Treatment and BOD SS Coliforms COD operation process (mg/l (mg/1) (per 100 ml) (mg/1) Raw sewage 300 270 107 450 Fine screening 270-285 215-265 107 405-430 Chlorination of raw or settle sewage 210-255 - 5x105_i06 - Primary sedimentation 180-225 80-160 106-5x106 290-360 (> 3 hours) Chemical precipitation 45-150 30-80 2x106-6x106 135-270 Trickling filtration preceded and followed by secondary sedimentation 15-150 20-135 a/ 5x105-8x105 90-225 Activated-sludge treatment preceded and followed by secondary sedimentation 15-135 15-120 a/ 105 90-225 Stabilization ponds 15-30 15-40 b/ 103 90-135 (> 20 days, in series) Chlorination of biologically treated sewage - - 10-2x10 c/ _ a. Mostly bacterial floc. b. Mostly algal floc. c. In some cases, a 0 for total coliforms was reported; however, regrowth of bacteria was indicated as well (Shuval, Cohen, and Kolodney 1973; Shuval and Fattal 1974). - 146 - TABLE 5-5 Imhoff tank effluent: chemical and bacteriological characteristics at Fort Devens, Massachusetts, land treatment site Effluent- Parameter Range Mean pH (standard units) 6.2-8.0 Conductivity ( mhos) 402-700 511 Alkalinity (ppm CaCO3) 116-245 155 Hardness (ppm CaCO3) 22-60 41 BOD5 30-185 112 COD 110-450 192 Total nitrogen 19-78 47 Organic nitrogen 11.5-32.8 23.4 NH4-N 6.2-42 21.4 N03-N 0.4-2.8 1.3 N02-N 0.002-0.06 0.02 Total P04-P 6-16 11 Ortho P04-P 3-15 9 Chloride 75-210 150 Sulfate 27-72 42 Total coliform 18-53 32 bacteria x 10o/ioo ml -/ mg/l unless otherwise indicated. Source: Satterwhite et al. (1976a). TABLE 5-6 Bacterial removal during primary wastewater sedimentation (percent) Total coliforms 10 Fecal coliforms 35 Escherichia coli 15 Mycobacterium tuberculosis 50 Salmonella spp. 15 Shigella spp. 15 Viruses <10 Helminths <50 Sources: Crites and Uiga (1979), Sproul (1978), and Kowal, Pahren, and Akin (1981). - 147 - TABLE 5-7 Expected values of properly designed stabilization ponds in Southern Africa Effluent Composition Stabilization ponds: Maturation ponds: Parameter for raw and settled for well-nitrified (mg/l except where wastewater, septic secondary otherwise stated) tank, and aqua privy effluent effluent Color, taste, and odor Not objectionable Not objectionable pH (range) 7.0-10.5 7.0-10.5 Temperature, Oc maximum 30 30 Dissolved oxygen, Z sat. 
minimum 75 75 Fecal coliform bacteria maximum 100/100 ml (97.5% 1,000/100 ml (97.5% probability) probability) BOD5 (total) maximum 16 12 BODJ (filtrate) maximum 12 8 COD- (total) maximum 150 120 COD (soluble) maximum 120 100 OA_/ (total) maximum 20 15 OA-/ (soluble) maximum 15 10 Ammonia nitrogen maximum 10 10 Note: Aimed at small communities of up to 5,000 people, 800 m3/day flow. Detention times 18-25 days, depending on temperature and plant configura- tion. Oxygen adsorbed from N/80 KMNO4 in 4 hours. Source: Drews (1983). There is no doubt that ponds in series can provide an excellent treatment that results in exceptional effluent quality, as was illustrated by Mara, Pearson, and Silva (1983) and Silva (1982) in research findings in ponding experiments in northern Brazil. This research was carried out using mainly domestic sewage. The ponds were formed from a former sewage treatment works (conventional type) that had been abandoned because of maintenance problems. The research findings are given in Tables 5-8 and 5-9, where it can be seen that a series consisting of anaerobic, facultative, and maturation ponds can virtually eliminate pathogens and substantially reduce BOD5 and SS loadings. According to the investigators, the high-effluent SS concentration over 40 mg/l results from algal growth, but this should not present problems, except in certain specific instances; in fact, if the end use is for flood, TABLE 5-8 Experimental results of effluents from a series of five stabilization ponds in Brazil Experiment A Experiment B Experiment C 6/77 - 5/79 1/81 - 12/81 6/79 - 11/80 R.T. BOD SS FC R.T. BOD SS FC R.T. BOD SS FC Raw - 240 305 4.6 x 107 - 289 283 4.1 x 107 - 232 297 3.0 x 107 A 6.8 63 56 2.9 x 106 4.0 92 62 4.0 x 106 2.0 59 61 4.5 x 10 -6 F 5,5 45 74 3.2 x 105 3.2 78 69 1.8 x 106 1.6 53 53 2.7 x 106 1 Ml 5.5 25 61 2.4 x 104 3.2 49 78 5.6 x 10 1.6 41 50 1.5 x 106 M2 5.5 19 43 450 3.2 37 66 9.0 x 104 1.6 32 49 6.8 x 105 M3 5.8 17 45 30 3.4 35 72 1.4 x 104 1.7 26 51 3.4 x 105 Note: A, anaerobIc pond; F, facultative pond; Ml to 3, three maturation ponds in series. R.T., retention time (days); BOO5, mg/I; SS, mg/I; FC, per 100 ml. Mean daily mid-depth temperature was 260 C. Source: Silva (1982). TABLE 5-9 Experimental results of effluents from four facultative ponds in parallel in Brazil EYxperiment A Experiment B Experiment C 6/77 - 3/79 6/79 - 11/80 1/81 - 12/81 R.T. BODL BOD8 SS FC R.T. 8L BOD5 SS FC R.T. BOL BOD5 SS FC Raw - - 245 310 4.7 x 10 - - 232 297 3.0 x 10 - - 289 283 4.1 x 10 a 11.8 258 54 90 5.2 x 105 7.5 388 60 84 1.5 x 106 7.5 482 75 85 1.8 x 106 b 12.0 255 51 91 4.3 x 105 6.3 464 61 78 1.8 x 106 6.3 577 76 77 2.2 x 106 c 9.5 322 57 85 6.3 x 105 6,8 425 63 86 1.5 x 106 6.8 529 75 81 2.0 x 106 d 18.9 162 40 95 .3.0 x 105 7.5 387 61 83 1.2 x 106 7.5 482 71 80 1.9 x 106 Note: a, b, c, d, independent facultative ponds; R.T., retention time (days); BODL, BOO loading (kg/ha/d); B005, mg/I; SS, mg/l; FC, per 100 ml. Mean daily mid-depth temperature was 260 C. Source: Siiva (1982). - 150 - border checks, or ridge and furrow irrigation, or for fish farming, there are considerable benefits. The work of Mara et al. (1983) on anaerobic treatment alone (Table 5-10) shows that an anaerobic pond can reduce BOD5 loadings in excess of 80 percent with only 0.4 days retention and, in addition, can reduce fecal coliforms by an order of magnitude of I log or a factor of 10 (at a temperature of 260 C). 
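Because the counts in Table 5-8 span several orders of magnitude, the performance of the pond series is easiest to read as log10 reductions. The following sketch is an illustrative calculation only, using the Experiment A fecal coliform counts from Table 5-8; the helper function is hypothetical and not part of the original study.

```python
import math

def log10_removal(influent_per_100ml: float, effluent_per_100ml: float) -> float:
    """Log10 reduction between influent and effluent concentrations."""
    return math.log10(influent_per_100ml / effluent_per_100ml)

# Fecal coliform counts (per 100 ml) from Table 5-8, Experiment A (6/77 - 5/79):
# raw sewage and the effluent of each pond in the series.
raw = 4.6e7
series = {
    "anaerobic (A)":   2.9e6,
    "facultative (F)": 3.2e5,
    "maturation 1":    2.4e4,
    "maturation 2":    450,
    "maturation 3":    30,
}

for pond, fc in series.items():
    print(f"after {pond:15s}: {log10_removal(raw, fc):.1f} log removal from raw")
# The full five-pond series achieves roughly a 6-log (99.9999 percent)
# fecal coliform reduction.
```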
TABLE 5-10 Mean experimental results of effluents from anaerobic ponds in Brazil (June 1977 - March 1979) Retention BOD SS FC time (days) (mg/l) (mg/1) (/lOOml) Raw sewage - 245 310 4.7 x 107 1 0.8 59 82 8.2 x 106 2 0.4 46 64 5.0 x 1o6 3 1.9 49 57 4.7 x 106 Note: Mid-depth temperature: 260 C; ponds 1 and 2 in series! ponds 1 and 3 raw sewage; volumetric loading on pond 1, 311g BOD5/m /day; pond 3, 132g BOD5/m3/day. Source: Silva (1982). In the National Environmental Engineering Research Institute (NEERI) at Nagpur, India, investigators are experimenting with a treatment chain consisting of oxidation ponds followed by fish ponds, the effluent of which is used to irrigate different crops nearby. Here the fish pond serves as an intermediate step to provide an additional product. Its effect as an intermediate purification step as well as the effect of its design parameters is not yet fully established. NEERI investigators (Shende, Juwarkar, and Sundaresan 1983, and personal communication) noted that the fish consumed the algae from the oxidation pond effluent very "happily." This is something to bear in mind when irrigation with low-rate applicators with small apertures is being considered. - 151 - Large-scale indirect reuse of wastewater for irrigation in Israel has been described by Idelovitch (1979) and Tahal (1979). The Dan Region Reclamation Project, Stage I (15 million cubic meters per year) is based on the following relatively simple, low-cost treatment processes (see Fig. 5-1. 1. Biological treatment carried-out in recirculated oxidation ponds (120 ha). 2. Chemical treatment consisting of: high lime-magnesium treatment carried out in a lime reactor-clarifier; and detention of the high-lime effluent in polishing ponds (75 ha) for natural ammonia stripping, recarbonation, and additional purification. 3. Land treatment by intermittent groundwater recharge via spreading basins (27 ha) on sand dunes. 4. Reclamation of the water by recovery wells (located 800-1,800 m from the center of the recharge zone), which pump an admixture of recharged effluent and natural groundwater and supply it to the national rnetwork after suitable chlorination. Twenty observation wells are located 30- 820 m from the recharge basins. Field data of effluent quality after the various treatment steps in this project are given in Table 5-11. The concentration of colifgrm bacteria in the oxidation pond effluents is lO-107/b00 ml in summer and 10 -10 /100 ml in winter time. The recharge contained 0-3p0/lOO ml. Streptococcus faecalis in the oxidation pond effluent reached 10 -10/ 100 ml, and in the recharge effluent 200/100 ml. Feachem et al. (1978) have surveyed a large body of literature on bacterial survival in ponds and summarized the findings as follows: 1. Single anaerobic ponds have been shown to remove 46-85 percent of E. coli after 3.5-5 days at various temperatures. 2. Single facultative and aerobic ponds have been shown to remove 80->99 percent of E. coli after 10-37 days at various temperatures. 3. Fecal streptococci removals are similar to or greater than removals of E. coli in single facultative and aerobic ponds. 4. Fecal coliform removals of 99.99 percent or greater have been reported for series of three or more ponds. 5. One or two ponds will remove between 90 and 99 percent of Salmonella or other pathogenic bacteria. - 152 - RAW SEWAGE RECECLATION BIOLOGICAL OXIDATION | FONES TREATMENT PONDS | WIME CHLooE Tr-atmentCLARPIFSTR SLAKlNr a, | DOSING I OOSINGD SECONDARY DISSOLUTN . 
EFFLUENT MILK - OF-LI14E CG(Om) I SE MAL - 4c O tTIO f HiRh Lime-MognesiRn LIME REACTOR TreatMent CLARIFIER C:HEMICAL EFFLUEH LM_LES 7RREATMENT HELE WAE Ar-"Ionia St*P T POLISHYINETWOrK Na FRcroaa PONDS ,,,. gShaPUMP.ST. o R CAE aISPO Po TERTIARY . EFFUJENT TE GROUNWTER |SPREADING| RECHARGqE | BASINS RECLAMATION REOFR RECLAIMED W8ATER TO SUPPLY NETWORK Fig. 5-1. Schematic layout of Dan Region Reclaumtion Project in Israel -Stage 1. Source: Adapted from Idelovitch (1979). - 153 - TABLE 5-11 Effluent quality at various treatment steps in the Dan Region Wastewater Reclamation Project, in Israel Oxidation a/ Recharge bl Recovery water b,c/ Raw sewage a/ pond effluents effluents from wells pH 7.6-8.3 8.0-8.5 9.3 7.7-8.2 Turbidity (NTU) not measured not measured 19.7 0.2-0.4 Suspended solids mg/l 180-500 175-290 56 BOD 120-300 20-90 25 BOD filtrate 60-170 7-45 5.9 COD 350-800 210-400 114 8.8-24.6 COO filtrate 120-450 60-200 52 (") Chloride 140-240 130-230 203 187-239 Alkalinity as CaCO3 390-470 350-420 149 189-204 Hardness as CaCO3 190-340 160-330 149 230-300 Dissolved solids 600-1,200 650-950 566 646-739 Electrical cond. umho/cm 900-1,400 900-1,550 969 1,060-1,250 N(Kjeldahl) mg/l 55-85 40-70 16.8 0.45-0.85 N to filtrate 42-65 30-50 11.7 (") NH -N 35-60 15-40 7 <0.02-5.30 3 NO 3-N 0 0.1-1.0 0.10 <0.002-0.011 P total 7-14 8-12 2.2 0.01-0.05 Detergents 5-25 0.7-2.4 1.3 0.2-0.7 Boron 0.4-1.7 0.4-1.7 0.27 0.2-0.35 SAR 4.4 3.2-4.5 Total bacterial no/ml 1.2 x 10 approx. 100 Coliforms MPN/100 105-107 297 0 Entamoeba coli c54 0 Streptococus 104-105 240 0 faecalis Enteroviruses PFU/400 ml 1 0 a. Tahal (1979). Typical data. b. Idelovitch et al. (1983); 1982 average. c. Observation wells with travel time from recharge basins of 1-6 months in soil. Most of them have operated for more than 5 years; 1982 data. - 154 - 6. Complete elimination of pathogenic bacteria can be achieved with 30- to 40-day retention times, particularly at high temperatures (>250 C). 7. A series of five to seven ponds, each with a 5-day retention time, can produce an effluent with fewer than 100 fecal coliforms and fecal streptococci per 100 ml. Aerated lagoons--that is, ponds with mechanical aerators--have been reported to provide removal rates of 60-99.99. percent for total coliforms and 99 percent for fecal coliforms, total bacteria, Salmonella typhi, and Pseudomonas aeruginosa (Crites and Uiga 1979). Thus, wastewater stabilization ponds can be designed to achieve practically any degree of bacterial pathogen removal deemed necessary for the protection of public health, including complete wastewater treatment. Such a high degree of removal is not necessary for most land application systems (Kowal, Pahren, and Atkin 1981). WATER QUALITY IN WASTEWATER RESERVOIRS FOR AGRICULTURAL IRRIGATION (A CASE STUDY: NAAN RESERVOIR, KIBBUTZ NAAN, ISRAEL) Like runoff ponding, rainy season ponding of sewage provides irrigation water in the dry season. In addition, directing sewage to the ponds solves the problem of its disposal, prevents river pollution, and reduces the risk of groundwater pollution. In the course of ponding, the wastewater is subject to purification processes not requiring investments in mechanical energy; furthermore, the high content of inorganic nutrients in these waters helps to save on fertilizers. The wastewater reservoirs are built as technological devices, oriented toward sanitary, ecological, and economical purposes, and their success is judged accordingly. 
At the same time, the reservoir represents, in essence, an aquatic ecosystem especially marked by very intensive biological processes, which have a decisive influence on the quality of water in the reservoir and on the level of sanitary and environmental damage. The biological components, in turn, depend on the quality of the sewage, as well as on the weather, hydrological conditions, and the regime under which the pond is run. Actual detailed data on the performance of such reservoirs are still scarce. However, an in-depth study was recently carried out in Israel (Dor and Berend 1982) on a typical, r latively well-operated reservoir at Kibbutz Naan. The reservoir, 700,000 m in volume and about 10 m in operational depth, is fed by two adjacent oxidation polishing ponds (Fig. 5-2) discharging 0 to 100,000 m3/month (these ponds receive effluents from the overloaded oxidation ponds of Ramle, a town near Kibbutz Naan). The floor of the reservoir is lined with polyethylene sheets, and the walls are made of a clay - 155 - Pumping station t @7,y ~~Spampling, \ _ %X\ .. - ~~~~~~~~~the slope Limit of the plastic floor Oxidtion t\\Reservoir \ Oxidation pond Sewage inlet Oxidation Scale 1 2000 Fig. 5-2. Schematic layout of Naan Wastewater Reservoir, Israel. Source: Adapted from Dor and Berend (1982). - 156 - nucleus and stone lining. Some of the findings of the above investigation are described below. Two-year observations reveal that in the spring and summer (dry season) there is an appreciable temperature difference between the surface and bottom layer, whereas in the winter this difference practically vanishes. During the cold and rainy winter (November-February), a very low oxygen content is observed at any depth. In the dry and warm summer, a rapid transition is observed from the top water layer, oversaturated by oxygen, to the layers practically free of oxygen. The above transition occurs at the depth separating the top layer (where photosynthesis dominates) from the bottom layers (where respiration is dominant). Figure 5-3 illustrates the typical annual variations of BOD content in the water undergoing the two stages of purification. The organic content of the raw sewage is generally "strong." Average concentrations of organic material and removal efficiencies in the Naan system are presented in Table 5-12. TABLE 5-12 Organic material concentration and its removal in Naan purification-storing system Oxidation pond (O.P.) Reservoir (1.5 m) mg/l % removal vs. mg/I % removal % removal Parameter Raw mg/l raw vs. 0. P. vs. raw Warm season COOD total 935.0+182 354.8+108 62.1 182.8+73 48.5 80.4 COD filtrate 230.0+89 155.6+30 32.3 113.7+39 26.9 50.0 BOO filtrate 326.9+203 59.3+18 81.9 51.5+31 13.1 84.2 VSS 220.6+80 95.8+94 56.6 65.3+74 31.8 70.4 Cold season COO total 762.6+194 296.4+33 61.2 198.4+103 33.1 74.0 COD filtrate 275.8+91 208.3+64 24.5 112.1+37 46.2 59.4 SOD filtrate 513.0+138 151.5+14 70.5 56.7+16 62.6 88.9 VSS 288.0+108 76.2+9 33.4 38.5+24 49.5 83.1 Source: Adapted from Dor and Berend (1982). - 157 - *--o raw sewage .'A v v oxidation pond 700 I .… ~ v.---.vreservoir i \, _- 600 82 Removal in ' \ -055 sedimentation tank 97.7 , and oxidation ponds 200- (0/) ..~22 5 \1 1000C Removal 2.3.. /1 in reservoir 'I . 0 Total removal (1.) 89.4 951 895 829 81.2 971 54.5 57.9 * ~~~~~~~~~~~I I I I tI I Vll xl xn I II 11 IV V Months Fig. 5-3. Relative and total removal of BOD prior to and in the Naan wastewater reservoir, Israel. Source: Adapted from Dor and Berend (1982). 
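The "removal vs. raw" figures in Table 5-12 are obtained by chaining the stage efficiencies, since each stage acts only on what the previous one passed on. A minimal sketch of that bookkeeping follows, using the warm-season total COD values from Table 5-12; the function is illustrative and not from the report.

```python
def overall_removal(stage_removals):
    """Combine successive stage removal fractions into an overall removal fraction."""
    remaining = 1.0
    for r in stage_removals:
        remaining *= (1.0 - r)      # each stage acts on what is left over
    return 1.0 - remaining

# Warm-season total COD removals from Table 5-12 (Naan system):
# 62.1 percent in the oxidation pond, then a further 48.5 percent in the reservoir.
stages = [0.621, 0.485]
print(f"overall COD removal: {overall_removal(stages) * 100:.1f} %")
# -> about 80.5 %, in agreement with the 80.4 percent "vs. raw" value in Table 5-12.
```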
The nutrient removal in the treatment system is variable. From the data it can be seen that the system removes organic nitrogen more efficiently than phosphate. The process is appreciably faster in the winter; the difference may be due to the larger mineral content of the raw sewage in summer. Thus, the final nitrogen content in the pond water is about 60 percent lower than in the raw sewage in the summer, compared with 70 percent in the winter. The total removal rates for phosphate are 20 percent in the summer and 45 percent in the winter. The situation is somewhat reversed for ammonia, which is reduced by 40 percent in the summer compared with 5 percent in the winter.

In the case presented here, the mean concentration of suspended solids in the raw sewage is 329 mg/l in the warm season and 253 mg/l in the cold season. In the oxidation ponds, this amount is reduced by half, whereas in the reservoir itself it is reduced by another 40 percent in the summer and 45 percent in the winter, yielding an overall reduction of 70 percent in the summer compared with 74 percent in the winter.

The coliform count in the raw sewage ranges from 10^4 to 10^9/100 ml, with an average of 3.1 x 10^8/100 ml. The corresponding count within the oxidation ponds is 3.5 x 10^6/100 ml, or about a 98.9 percent reduction. In the succeeding reservoir, the corresponding average figure is 4.7 x 10^5/100 ml, which implies another 86.6 percent reduction compared with the oxidation ponds. The overall bacteria reduction in the system is thus about 99.8 percent. The relevant data are presented in Figure 5-4; in addition, Table 5-13 summarizes the total bacteria count reduction and the total removal of Salmonella (97.6 percent) in the system.

TABLE 5-13 Removal efficiency of bacteria in oxidation pond-reservoir system (percent)

Location          SPC a/    Coliforms    Salmonella
Oxidation pond    85.0      98.9         -
Reservoir         41.7      86.6         -
Total removal     91.9      99.8         97.6

a. Standard plate count (as in Standard Methods).
Source: Adapted from Dor and Berend (1982).

Fig. 5-4. Removal of coliform bacteria (monthly counts, log scale, per 100 ml) in the raw sewage from Ramla, the oxidation pond, and the reservoir of the Naan oxidation pond-reservoir system, Israel. Source: Adapted from Dor and Berend (1982).

Much is yet to be done to establish definite design criteria for wastewater reservoirs. Research in the near future will probably be able to recommend the water depth (or depths) from which effluent should be pumped in order to prevent algae from getting into the irrigation systems. It is clear, though, that deep wastewater reservoirs have great potential for maintaining a proper sanitary environment and for serving as a link in the treatment chain. The Naan Reservoir system, which receives almost raw sewage (av. 300-500 mg/l BOD, 250-300 mg/l TSS) transported many kilometers from its source, is a fine example of what can be done.

PARASITE REMOVAL THROUGH WASTEWATER TREATMENT PROCESSES

Concentration of Protozoans and Helminths in Wastewater

One fertilized female ascaris worm living in the human intestine produces 200,000 eggs a day; a female ancylostome (hookworm) deposits between 25,000 and 30,000 eggs a day; Trichuris trichiura female worms in the human intestine have been estimated to produce 6,000 eggs a day; and schistosome worms deposit a few hundred eggs daily (Craig and Faust 1970).
The concentrations of parasites in sewage are a function of the type of parasite, the number of infected persons in the community serving as the source of the sewage, and the amount of sewage flow per capita. Ascariasis has a worldwide distribution, and levels of 1 to 10 percent are common in many areas, but may exceed 50 percent in moist, tro- pical, highly endemic areas. Many other protozoan and helminthic diseases show similar patterns of prevalence: extremely high levels in moist, tropical, endemic areas and lower levels with broad distribution in other areas. To make a rough estimate of possible levels of helminth egg concentrations in sewage, we can assume that, in a medium-level endemic urban area, about 10 percent of the population is infected at any given time with ascariasis, trichuriasis, or ancylostomiasis. Let us also assume that sewage flow per person per day is 100 1. On the basis of reported egg production rates (Craig and Faust 1970), it can be calculated that one liter of fresh sewage should contain about 200 ascaris eggs, 25 ancylostome eggs, and 6 trichuris eggs. Rowan (1964) in Puerto Rico found ascaris egg concentrations of 14-38/1 in raw sewage and schistosome egg concentrations of 0.7-1.2/1. Wang and Dunlop (1954) found means of 30 ascaris eggs per liter and 52 Entamoeba coli cysts per liter in the raw sewage of Denver, Colorado. Liebman (1965) reports 62 helminth eggs per liter in the sewage of a Bavarian town, a major portion from slaughterhouse wastes. Kott and Kott (1967) found a mean of 4 Entamoeba histolytica cysts per liter of raw sewage in Haifa. In raw municipal sewage in a highly endemic area of India, Lakshminarayana and Abdullappa (1972) found about 200 ancylostome eggs, about 1,000 ascaris eggs, and about 1 trichuris egg per liter. From the available data and theoretical calculations, it is assumed that pathogenic protozoans and helminths often occur in raw sewage in concentrations of 10 to 1,000 per liter. To significantly reduce the health risk to the population, the sewage treatment should remove or inactivate 99 percent or more of such parasites. Let us examine the removal efficiencies of various sewage treatment processes to determine whether this objective can be met. Removal of Parasites by Sedimentation Preliminary treatment by screening or comminution will have no effect on the pathogen content of wastewater. In primary settling tanks with 2 or 3 hours of detention, parasites may be removed either by direct sedimentation or by being absorbed onto solids that are in the process of settling. Studies of laboratory and full-scale primary sedimentation tanks have been done, but laboratory models always give higher removal efficiencies than actual plants because of more idealized conditions. Entamoeba histolytica cysts are generally reduced by 50 percent or less, while 50-70 percent of helminth eggs (Feachem, et al. 1983) usually settle. - 161 - Bhaskaran et al. (1956) reported only 50 percent removal of helminth eggs in a number of primary sedimentation plants in India and about 70 percent removal of Ascaris and hookworm eggs by a septic tank. Phadke, Tacker, and Deshpande (1972) reported considerable reductions of ascaris and hookworm eggs in a septic tank effluent with 20 days' detention, but positive helminth cultures were obtained for the majority of effluent samples tested. 
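The egg concentrations estimated above follow directly from the assumed prevalence, the egg output per infected person, and the per capita sewage flow. The sketch below simply restates that arithmetic; the prevalence, egg outputs, and flow are the assumptions already made in the text, and the function name is illustrative.

```python
def eggs_per_litre(prevalence: float, eggs_per_person_day: float,
                   sewage_l_per_person_day: float) -> float:
    """Expected helminth egg concentration in fresh community sewage."""
    return prevalence * eggs_per_person_day / sewage_l_per_person_day

# Assumptions used in the text: 10 percent of the population infected,
# 100 litres of sewage per person per day, and the egg outputs reported
# by Craig and Faust (1970).
for worm, eggs in [("Ascaris", 200_000), ("Ancylostoma", 25_000), ("Trichuris", 6_000)]:
    c = eggs_per_litre(0.10, eggs, 100)
    print(f"{worm:12s}: ~{c:.0f} eggs per litre of raw sewage")
# -> about 200, 25, and 6 eggs per litre respectively, as stated above.
```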
Liebman (1965) contends that, although the larger helminths with a specific gravity greater than 1.1 should theoretically be removed effectively in primary sedimentation tanks, variations in detention time, the presence of detergents, and other circumstances leading to nonuniform sedimentation conditions may result in a relatively low removal efficiency, so that pathogenic helminths may still be disseminated by sewage effluent used to irrigate fields. In his opinion, biological treatment provides little additional removal capability. He recommends chemical coagulation for more effective removal.

A theoretical calculation of the sedimentation rates of parasite eggs has been made (Shuval 1977) using available information on the size, shape, and specific gravity of the eggs (Craig and Faust 1970). It is assumed that the parasite eggs behave as discrete, nondeformable particles settling in a dilute suspension according to Stokes's law. The short period of acceleration is ignored, and it is assumed that the terminal velocity (Vt) is reached immediately. The following equation was used:

Vt = [ 2g (ρp - ρl) Vp / (CD ρl Ap) ]^1/2

where
CD = 24/Re under laminar flow conditions, and Re is the Reynolds number = ρl Vt dp / μ, dimensionless
Vt is the terminal velocity, in cm/sec
g is the gravitational acceleration, 981 cm/sec^2
ρp is the particle density, in g/cm^3
ρl is the liquid density, in g/cm^3
CD is the drag coefficient = f(Reynolds number Re)
Vp is the particle volume, in cm^3
Ap is the projected area of the particle, in cm^2
dp is the particle diameter, in cm
μ is the dynamic viscosity, in g/cm sec.

The calculated velocity (Vt) for six typical parasites is presented in Table 5-14.

TABLE 5-14 Discrete gravitational settling of parasites in water

Parasite                     Size (µm)   Density (g/cm^3)   Assumed shape   Vt (cm/sec)   Settling velocity (m/hr)
Ascaris lumbricoides         55 x 40     1.11               sphere          0.0181        0.65
Hookworm                     60 x 40     1.055              sphere          0.0108        0.39
Trichuris trichiura          22 x 50     1.15               cylinder        0.0426        1.53
Schistosoma sp.              50 x 150    1.18               cylinder        0.3386        12.55
Taenia saginata              36          1.1                sphere          0.0074        0.26
Entamoeba histolytica (a)    5           1.1                sphere          0.0002        0.007
Entamoeba histolytica (b)    20          1.1                sphere          0.0033        0.11

Source: Shuval (1977).

From the calculated theoretical settling velocities in Table 5-14, it can be seen that only the eggs of Schistosoma, which settle at 12.55 m/hr, and those of Trichuris trichiura, which settle at 1.53 m/hr, can be assumed to achieve a high degree of removal in conventional sedimentation tanks having upward flow rates of about 1.2 m/hr. Only partial removal of Ascaris, hookworm, and Taenia eggs could be anticipated, given their respective settling velocities of 0.65, 0.39, and 0.26 m/hr, and little if any removal by sedimentation could be expected for Entamoeba histolytica, particularly the smaller cysts, reported to be about 5 µm in diameter.

As previously mentioned, ideal sedimentation conditions are hampered by short-circuiting, nonuniform flow rates, detergents, and interfering floatables, all of which can lower removal efficiency. The hatching of certain helminth eggs and the release of free-swimming infectious larvae during the sedimentation stage may also allow infective forms of the parasite to enter the effluent.

Removal of Parasites by Conventional and Polishing Treatment

Removal of parasites by primary treatment alone has been discussed above. This section focuses on secondary and tertiary treatment and on the overall removal in a conventional treatment plant. Trickling filters alone do not appear to be efficient in removing protozoal cysts and helminth eggs.
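For the spherical eggs in Table 5-14, the equation above reduces to the familiar Stokes form Vt = g dp^2 (ρp - ρl) / (18 μ). The sketch below is an approximate reconstruction, not the original computation: it treats each egg as a smooth sphere with a diameter equal to its larger dimension and assumes water at about 20° C, which is enough to reproduce the tabulated velocities for those eggs to within a few percent.

```python
# Minimal Stokes-law sketch reproducing the order of magnitude of Table 5-14.
# Assumptions (mine, for illustration): each egg is a smooth sphere whose
# diameter is the larger dimension given in the table, settling in still water
# of density 1.0 g/cm3 and dynamic viscosity 0.01 g/(cm.s) (about 20 C).

G = 981.0        # gravitational acceleration, cm/s2
RHO_WATER = 1.0  # liquid density, g/cm3
MU = 0.01        # dynamic viscosity of water, g/(cm.s)

def stokes_velocity(diameter_um: float, density: float) -> float:
    """Terminal settling velocity (cm/s) of a small sphere in laminar (Stokes) flow."""
    d_cm = diameter_um * 1e-4
    return G * d_cm ** 2 * (density - RHO_WATER) / (18.0 * MU)

for name, d_um, rho in [("Ascaris lumbricoides", 55, 1.11),
                        ("Hookworm",             60, 1.055),
                        ("Taenia saginata",      36, 1.10)]:
    vt = stokes_velocity(d_um, rho)
    print(f"{name:22s}: {vt:.4f} cm/s  (~{vt * 36:.2f} m/hr)")
# -> roughly 0.018, 0.011, and 0.007 cm/s, i.e. about 0.65, 0.39, and 0.25 m/hr,
#    close to the 0.65, 0.39, and 0.26 m/hr listed for these eggs in Table 5-14.
```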
Entamoeba hystolytica removal of .83-99 percent has been reported. Egg removal appears to be in the range of 20-90 percent, with higher reductions when the effect of secondary sedimentation is included. The activated sludge process itself has little effect on protozoal cysts and helminth eggs, but substantial proportions of eggs will be removed in the secondary settling tank (activated sludge plants have been reported to remove 80-100 percent of helminth eggs). Overall, the results that have been obtained with conventional treatment vary. Vassilkova (1936) reported that treatment in an Imhoff tank removed 97 percent of all helminth eggs present in the influent examined, and a trickling filter removed 87 percent. Roberts (1935) reported the Cysticecus bovis infection of 23 of 45 cattle grazed on a sewage farm irrigated with primary effluent. And Cram (1943) reported that after primary sedimentation, effluent from activated sludge and trickling filters contained significant numbers of hookworm and ascaris eggs, which were completely removed only after chemical coagulation with alum and sand filtration. In addition, a study of five plants in Johannesburg found tapeworm eggs in both raw and settled sewage and in the effluent of trickling filter and activated sludge plants (Hamlin 1946), and still others (Silverman and Griffiths 1955) have also concluded that conventional primary sedimentation, even when followed by secondary treatment, cannot be relied on to effectively remove tapeworm eggs from sewage. At the same time, Kott and Kott (1967) report a 50 percent reduction in E. histolytica cysts after primary sedimentation and trickling filter treatment and a 90 percent reduction in the final effluent. Rowan (1964) reports from a study of eight sewage treatment plants in Puerto Rico that primary treatment removed 35 to 74 percent of the ascaris and 83 percent of the schistosome eggs; trickling filters and activated sludge treatment removed 95 to 99.7 percent of the ascaris and schistosome eggs. In most cases, schistosome eggs hatched during treatment and allowed large numbers of infective miracidia to escape into the stream, the effluent serving as a potential source for dissemination of the disease. Postchlorination or other tertiary treatment was considered essential to prevent further schistosome - 164 - infections. Studies performed in India on the removal of parasites by different sewage treatment processes gave different results, as summarized in Table 5-15 (Paniker and Krishnamoorthi 1978). Originally, polishing processes were not designed primarily for pathogen removal, but some of them do have good pathogen-removal characteristics. Rapid sand filters partly retain protozoa cysts and helminth eggs because of their size. Slow sand filters can completely remove them, and are highly recommended where there is a lack of trained operators and where land is available. Land treatment by percolation may have similar results if properly designed and operated, and if groundwater contamination is not expected. Maturation lagoons that receive effluents from aerobic ponds will remove parasites on the principle of waste stabilization ponds. In general, if two or more maturation ponds are used, with perhaps 5 days of retention in each, total removal of protozoal cysts and helminth eggs will be achieved. Effluent chlorination is not efficient in eliminating protozoal cysts because they are more resistant than either excreted viruses or bacteria. 
Most helminth eggs are totally unharmed by effluent chlorination. Removal of Parasites by Stabilization Ponds The above findings, which are supported by the theoretical settling calculations, suggest that primary sedimentation cannot be relied on for effective removal of pathogenic protozoans and helminths from wastewater. Some additional removal may be obtained by conventional biological treatment, but the reported results are not uniform. oxidation ponds usually provide detention periods of 5 to 30 days and should provide better conditions for the sedimentation of protozoans and helminths than conventional primary sedimentation plants with detention times of 2 or 3 hours or secondary treatment systems with total detention of 8 to 12 hours. Wachs (1961) found that cysts of Entamoeba hystolytica could be effectively removed from sewage after some 20 days' detention in a stabiliza- tion pond. Arceivala (1970) reports that a municipal oxidation pond in India with a total detention period of 7 days produced an effluent free of protozoan cysts and helminths eggs despite the heavy load of parasitic cysts and eggs in the influent (from 100 to 1,000 per liter). He concluded that oxidation ponds are more effective in removing helminths and protozoans than conventional treatment plants and can provide greater protection to the health of farm workers in contact with sewage used in agriculture. TABLE 5-15 A comparison of the removal of cysts and eggs of entric parasites in various sewage treatment processes in India in percent Sample Trickling filter Biological-/ Aerated-/ Oxidation- Stabilization pond number Parasite Sedimentation Activated sludge 1 2 disc lagoon ditch 1 2 3 4 1 E. histolytica 63.7 82.9 91.0 73.5 69.6 84.0 91.3 86.5 94.1 100 100 o 2 G. lamblia 51.7 92.0 92.5 78.0 58,4 86.5 91.0 95.4 95.0 100 100 LIn 3 A. lumbricoides 96.2 97.9 94.8 95.7 79.2 92.0 94.3 100 100 100 100 4 Hookworm 80.0 85.0 81.8 50.0 50.0 70.0 81.3 93.4 88.0 100 100 5 H. nana 90.0 95.0 80.0 60.0 60.0 77.8 88.9 100 100 100 100 6 T. trichiura 90.0 100.0 92.5 60.0 60.0 100 100 100 100 100 100 7 Taenia sp. 75.0 -- -- -- -- 100 100 100 100 100 100 -- Not detected in raw sewage. - Pilot plant studies. Source: Panicker and Krishnamoorthi (1978). - 166 - In studies on an oxidation pond having a total detention time of 6 days and divided into three pond cells of about equal volume, Lakshminarayana and Abdulappa (1972) showed that the parasites Hymenolepis nana, Trichuris trichiura, Ascaris lumbricoides, and Enterobius vermicularis were totally absent from the pond effluent, despite very high concentrations in the raw sewage influent. Ancyclostoma duodenale (hookworm) rhabditiform larvae were present in significant numbers in the final effluent, however, while mainly hookworm eggs were found in cells one and two. It was calculated that there was more than a 90 pecent reduction in Ancyclostoma. In samples taken from the pond bottom, viable eggs could be detected mainly in pond one and at much lower concentrations at the beginning of pond two, dropping off to none near the outLet. No eggs were deposited in pond three. These findings indicate that protozoan and helminth eggs do settle out effectively in about 3 to 6 days in an oxidation pond but that free- swimming larvae of hookworm or schistosomes may hatch from the eggs during this period and may appear in the effluent. 
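The observed settling of eggs within a few days of pond detention can be compared with the idealized velocities of Table 5-14. The sketch below is a rough illustration under assumptions of my own (a quiescent 1.5-m-deep pond with no mixing, wind, or short-circuiting); real ponds require much longer detention precisely because those assumptions do not hold.

```python
# Idealized time for parasite eggs and cysts to settle through a quiet pond,
# using the discrete settling velocities of Table 5-14.
# Assumption (mine): pond depth 1.5 m, no mixing or short-circuiting.

POND_DEPTH_M = 1.5

settling_velocity_m_per_hr = {
    "Ascaris lumbricoides": 0.65,
    "Hookworm": 0.39,
    "Trichuris trichiura": 1.53,
    "Schistosoma sp.": 12.55,
    "Taenia saginata": 0.26,
    "Entamoeba histolytica (20 um)": 0.11,
}

for parasite, v in settling_velocity_m_per_hr.items():
    hours = POND_DEPTH_M / v
    print(f"{parasite:30s}: ~{hours:6.1f} h to settle {POND_DEPTH_M} m")
# Under these ideal conditions the listed forms would settle within a day;
# the smaller (5 um) amoebic cysts and the mixing present in real ponds
# explain why several days of detention are needed in practice.
```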
Free-swimming schistosome miracidia larvae can swim at a velocity of 700 cm/hr but cannot survive in the free-swimming state for more than 10 hours (Craig and Faust 1970). Laboratory studies by Craig and Faust indicate that anaerobic pond conditions can completely eliminate or destroy hookworm eggs. They suggest that the inclusion of primary anaerobic ponds before facultative and aerobic ponds would be effective in obtaining helminth-free effluent. Similar findings have been reported by Kazuvoshi and Kruse (1956). Mara et al. (1983) reported that after 28.1 days of total residence time in oxidation ponds, raw sewage containing hundreds of hookworms, Ascaris, and Entamoeba coli per liter ended up with only one hookworm egg and one Entamoeba coli cyst per liter, while a shorter residence time of 8.5-17 days resulted in 2-8 Ascaris and Entamoeba coli and 2-4 hookworms per liter. Feachem et al. (1983) have pointed out that "100 percent removal is indicated in all cases in which well-designed, multicelled ponds with a total retention time of more than 20 days were investigated. Hookworm larvae may survive for up to 16 days in aerobic ponds. Because of this fact, hookworm larvae have been reported in the effluent from ponds with an overall retention time of less than 10 days; they have not, however, been reported in the effluent ponds with retention times of more than 20 days. The majority of Schistosome eggs in an aerobic pond will settle; in a facultative pond they will either settle or hatch into miracidia. Miracidia will either die or infect an intermediate snail host if the correct snail species is colonizing the pond (as may be the case in badly maintained and vegetated ponds). Even if cercariae emerge, they should not find a human host to invade and die within 48 hours." Thus, if only aerobic ponds are used, retention time will be 20 days, which means a very large area. It appears that appropriate combinations of anaerobic and aerobic oxidation ponds may be an effective way to eliminate the health risks associated with protozoans and helminths. A minimum of 1-2 days' detention in anaerobic ponds followed by 7 days in facultative aerobic ponds - 167 - appears to be the minimum desirable treatment of sewage before agricultural use to achieve effective control of helminths or protozoans that might infect agricultural workers or crops, animals, or fish exposed to sewage. COSTS Table 5-16 shows calculated values of the ratio of construction cost for several conventional treatments to the cost for waste stabilization ponds (for this purpose, the "expected cost" values have been used [Widmer 1981, based on Clare and Weiner 19651). The ratios do not include the effect of land cost and are therefore unrealistic, but they do provide some indication of the economic advantage that the waste stabilization pond process can offer in areas of low land cost. In a complete economic evaluation, consideration must also be given to transmission costs from an alternative treatment site within or on the edge of the service (where land is expensive) to a site where land costs are reasonable. 
TABLE 5-16 Ratio of construction cost of conventional treatment plants to cost of a pond treatment plant of the same capacity ASP TFP PTP ITP Design population WSP WSP WSP WSP 100 3.7 4.5 3.2 4.0 1,000 4.1 4.6 3.4 2.8 10,000 4.8 4.8 3.8 2.0 100,000 5.8 5.0 4.2 1.4 Note: ASP, activated sludge plant; TFP, trickling filter plant with separate sludge digestion; PTP, primary treatment plant with separate sludge digestion; ITP, Imhoff tank plant; WSP, waste stabilization plant. Removal efficiencies are not necessarily equivalent. Source: Clare and Weiner (1965). Operating and maintenance costs are also important factors to be considered in selecting a treatment process (Widmer 1981). The simplicity of waste stabilization ponds, which have no moving parts, makes these most economical to operate. Interpond flow may usually be accomplished by gravity. Pumping may be needed to bring the wastewater from the community - 168 - source to the pond site, bukt this might also be required for alternative treatment schemes. Energy requirements for pond operation are thus negligible. Gloyna (1979) has suggested the following comparative energy ratios (based on unit requirements for activated sludge) for a system handling approximately 3,785 m3 per day (or 1.0 MGD) of wastewater having an influent BOD of 350 mg/l (this excludes pumping and pretreatment costs): Activated Sludge Plant 1.00 Aerated Lagoon Plant 0.85 Rotating Biological Filter 0.12 Waste Stabilization Pond 0.00 Table 5-17 presents cost data based on information obtained from various small existing installations for similar sized communities (adjusted to 1983 costs) in southern Africa (Drews 1983). It can be assumed that the BOD removal efficiencies are in the same range for the systems compared. These data show that pond systems are highly economical for providing waterborne sanitation for small and isolated communities that cannot afford to construct conventional treatment facilities. TABLE 5-17 Relative 1983 costs of wastewater treatment facilities in South Africa Capital cost Running costs (US$ per person) (US$ per person per year) Facultative pond system Conventional Without With Conventional purification sealing of sealing of purification Pond Population works bottoma bottomb works system 240 263 23 45 - 3.60 1,000 162 23 41 19 2.25 3,000 117 19 38 12 1.20 5,000 109 19 38 - 1.20 a. Cost of construction estimaied at US$27,800/ha. b. A rate of ca. US$2.70 per m has been assumed for sealing of pond bottoms, both primary and secondaries (1 rand = US$0.9). Source: Drews (1983). - 169 - Table 5-18 shows the approximate requirements for the total site, capital costs, and operating costs for pond systems serving a population of 30,000 to 100,000 (Arthur 1983). These figures should be treated as rough guidelines in view of the many factors for which assumptions have had to be made. A capital cost comparison in 1980 U.S. dollars of various systems installed in Israel produced the following results: US$ per capita Stabilization ponds 11-18 Ponds with partial aeration 27-32 Aeration ponds followed by finishing ponds 50-55 In Israel, operating and maintenance costs for 1980 ranged from US$0.42 to 0.81 million for stabilization pond systems serving 20,000-100,000 persons, respectively; the 1980 operation and maintenance costs for ponds without aeration equipment ranged from US$0.5 to US$0.6 per capita for a well- maintained system. The. comparative costs for aerated lagoon systems ranged from US$6 to US$13.0 per capita (Arthur 1983). 
The Central Public Health Engineering Research Institute (1970) gave the figures shown in Table 5-19 for sewage treatment under Indian conditions. Unfortunately, these costs did not take into account the cost of land, which can be an important cost component for industry in urban areas. A survey of 262 treatment plants in mid-America, of which 160 were waste stabilization lagoons and 81 conventional treatment plants (Clare and Weiner 1961), showed that "in many cases the price of land may be 50% of the cost of the complete waste stabilization lagoon, yet the total cost has been equal or less than the cost of a completed secondary treatment works. In numerous instances land costs could be double or triple the completed lagoon construction costs before equaling the conventional plant cost." Even when land costs are included, land-intensive systems seem to be competitive with higher rate systems, but there is a limit at which the cost of land becomes prohibitive. Even when an industry has spare land available, the land has an opportunity cost that should be included in cost comparisons of treatment systems because, even though the land will be available at the end of the treatment plant life, it will not be available for alternative uses during that life (Pescod 1981). WASTE STABILIZATION POND DESIGN Anaerobic Ponds The wastewater treatment technology of choice in developing countries, particularly for effluent irrigation, is stabilization ponds. - 170 - TABLE 5-18 Approximate per capita requirements for a waste stabilization pond system serving a total population of 30,000-100,000 With anaerobic pond Without anaerobic pond 25° C 120 C 25° C 12u C Effluent standard 100 FC/100 ml Land per capita (gross) m2 2.0 3.9 2.5 6.0 Capital cost US$/capita 20 30 25 40 Operational cost US$/capita/annum 0.5 0.7 0.5 0.7 Effluent standard 1,000 FC/100 ml Land per capita (gross) m2 1.7 3.4 2.2 5.5 Capital cost US$/capita 18 28 23 38 Operational cost US$/capita/annum 0.4 0.6 0.4 0.6 Effluent standard 10,000 FC/100 ml Land per capita (gross) m2 1.4 2.9 1.9 5.0 Capital cost US$/capita 15 25 20 35 operational cost US$/capita/annum 0.3 0.5 0.3 0.5 Note: Assumes no pond lining required. Land costs excluded. Source: Arthur (1983). Costs at 1981 levels based on recent field visits to Asia and Africa (Arthur 1983) and less recent literature (Gloyna 1971; Mohanrao 1971; Arceivala, 1970). These are superior to other options, according to the data in Table 5-20, which summarizes some of the advantages of the most widely used sewage treatment processes discussed above (Arthur 1983). It also shows the degree to which this system fares worse than the others with respect to SS removal (owing to the algae in their effluents) and the land requirements. The following sections present the basics of waste stabilization pond design. - 171 - TABLE 5-19 Annual costs of sewage treatment in India, 1970 Annual cost-/ Sewage treatment process (rupees/person) Waste stabilization ponds 0.9 - 2.3 Aerated lagoons 2.8 - 4.8 Oxidation ditches 3.8 - 6.0 Conventional secondary treatment processes 3.5 - 13.2 -/ Including capital amortization over 20 years at 6 percent. Source: Central Public Health Engineering Research Institute (1970). The kinetics of BOD removal in anaerobic ponds are similar to that in conventional anaerobic digesters. 
In practice, lack of reliable field data has led to inherently conservative empirical designs based on the daily quantity of BOD5 applied per unit volume: L.Q (1) sv V where x = volumetric BOD5 loading in g/m3/d Li = influent BOD5 concentration in mg/l Q = influent flow rate in m3/d V = volume of pond in m3 Provided that the volumetric BOD5 loading is below 400 g/m3fd and stable alkaline fermentation with methane evolution is established, odor is a minimal problem. If the wastewater is acidic, the pH should be adjusted with lime soda ash to a pH between 7 and 8. Anaerobic ponds should be desludged 3when they become half full of sludge. A sludge accumulation rate of 0.04 m per person yearly is generally observed at temperatures above 150 C. TABLE 5-20 Advantages and disadvantages of various sewage treatment systems Waste Waste stabilization stabilization Activated Extended Aerated pond system pond system Package sludge Trickling aeration Oxidation lagoon (including (excluding Criteria plant plant filter plant ditch system anaerobic units) anaerobic units) BOO5 removal ** ** ** ** *** *** ** FC removal * * * * SS removal ** *** *** *** *** ** ** Helminth removal * ** * * ** ** *** *** Virus removal * ** * ** ** *** *** *** Ancillary use possibilities * * * * * *** *** ** Effluent Reuse possibilities a * *a/ ** ** Simple and cheap construction * * * * ** ** f * Simple operations * * ** * ** * *** Land requirement *** *** *** *** *** ** ** * Maintenance costs * * ** * * * *** *** Energy demand * * ** * * * **** Minimization of sludge for removal * **b/ **b/ **_/ * ** *** *** a/ The effluents from activated sludge, trickling filter and package plants frequently have high ammonia levels (> 5mg/1) and fecal bacterial concentrations, and may not be suitable for irrigation or fish farming without ponds or other tertiary treatment. b/ Assumes provision of sludge digesters. Key: *** Good, ** Fair, * Poor. Source: Adapted from Arthur (1983). - 173 - Facultative Ponds Facultative ponds are generally 1-2 m deep. There are a number of design procedures for these ponds, but the one described here is based on the areal BOD5 applied to the pond per unit surface area: (2) X = 10L L s i A where X = areal BOD loading in kg/ha/d, A = pond area in m2, and Li and Q are as defined above. The maximum value of X that can be used for design is a function of temperature determined from performance data of facultative ponds obtained worldwide. It is recommended that design be based on the relationship: (3) X = 20T - 120 s where T = mean temperature of the coldest month, in degrees Celsius (this formula works well in areas having a temperature range of 150 C and up). Thus, the pond area is given by: L.Q (4) A = 2(T-6) BOD5 removal in facultative ponds is a function of the loading. McGarry and Pescod (1970) found the following relationship in equation (5), where X is the BOD5 removed in kg/ha/d: (5) X = 0.725) + 10.75. r s Generally about 70-85 percent of BOD5 is removed. An effluent BOD5 greater than 100 mg/l indicates a predominately anaerobic pond; 40-80 mg/l indicates a predominately aerobic one. Additional removals are achieved in maturation ponds. In facultative ponds that treat raw or screened sewage, a sludge layer forms on the pond bottom. 
Facultative ponds should be desludged when they are a quarter full of sludge; as with anaerobic ponds, a sludge accumulation rate of 0.04 m3 per person yearly may be predicted (assuming that suitable traps are provided to remove grit, sand, or ash residues that may be in the incoming sewage). Facultative ponds that receive the effluent from anaerobic ponds (or sewered PF toilets) do not normally require desludging.

Maturation Ponds

Maturation ponds are usually designed to remove fecal coliforms rather than BOD. The model most commonly used for the removal of fecal coliforms in waste stabilization ponds is first-order kinetics in a completely mixed reactor. The kinetic equation is:

(6) Ne = Ni / (1 + Kb(T) t*)

where
Ne = number of fecal coliforms per 100 ml of effluent
Ni = number of fecal coliforms per 100 ml of influent
Kb(T) = first-order rate constant for fecal coliform removal at T° C, per day
t* = mean hydraulic retention time in days.

The rate constant varies with temperature according to the equation:

(7) Kb(T) = 2.6 (1.19)^(T-20)

In a series of anaerobic, facultative, and maturation ponds, equation (6) is written as:

(8) Ne = Ni / [ (1 + Kb(T) t*an) (1 + Kb(T) t*fac) (1 + Kb(T) t*mat)^n ]

where t*an, t*fac, and t*mat are the retention times in the anaerobic, facultative, and maturation ponds, respectively; n is the number of maturation ponds (which have the same retention time and which ideally are all the same size--otherwise they should be multiplied separately in the denominator); and Ni and Ne refer to the fecal coliform concentrations in the raw sewage and the final effluent, respectively. Retention times in maturation ponds are usually in the range of 5-10 days, and the number of maturation ponds required depends on the desired value of Ne. A typical design value of Ni is 1 x 10^8 per 100 ml. Note that two maturation ponds, each with 5-10 days' retention, will normally reduce the BOD5 of facultative pond effluent from about 60-100 mg/l to below 30 mg/l.

Physical Design of Ponds

In general, rectangular ponds with length-to-breadth ratios of 2 or 3 to 1 and embankment slopes of 1 in 3 are used wherever possible. The embankment is protected from wave erosion by placing precast concrete slabs or stone riprap at surface water level. The pond base should be impermeable. In coarse permeable soils, the pond base should be sealed with plastic sheeting or clay. The inlet and outlet structures should be as simple as possible; a wide variety of low-cost designs is available. For all ponds, V-notch weirs, rectangular weirs, or, if necessary, Parshall flumes may be installed to measure influent and effluent flows, as required for performance evaluation.

Sample design calculations are given below; a short computational sketch of the same procedure follows the discussion of night-soil ponds at the end of this section. Assume a population (P) of 100,000, a BOD5 contribution of 40 gcd, and a wastewater flow of 80 lcd. The design temperature is 20° C. The design concentration of fecal coliforms in the final effluent is to be 100 per 100 ml. The sewage is to be treated by anaerobic, facultative, and maturation ponds operating in series.

1. Anaerobic ponds: Flow, Q = 80 x 10^-3 x 100,000 = 8,000 m3/d. Influent BOD5, Li = (40 x 10^3)/80 = 500 mg/l. Taking the volumetric BOD5 loading (equation 1) as 250 g/m3/d, the volume (V) is given by:

V = Li Q / 250 = 500 x 8,000 / 250 = 16,000 m3.

If the depth is 3 m, the area would be 0.53 ha. The hydraulic retention time (= V/Q) is two days, so the BOD5 removal would be around 60 percent. Desludging would be required every n years, where n is given by:

n = (V/2) / (P x 0.04) = (16,000/2) / (100,000 x 0.04) = 2 years.
This assumes that sludge accumulates at a rate of 0.04 m3 per person yearly and that the pond is desludged when it is half full of sludge.

2. Facultative ponds: From equations (2)-(4), with the anaerobic pond removing 60 percent of the BOD5, the area A is given by:

A = 10 Li Q / (20T - 120) = 10 x (500 x 0.4) x 8,000 / 280 = 57,000 m2, or 5.7 ha.

If the depth is 1.5 m, the volume will be 86,000 m3 and the retention time 11 days. Assuming a conservative BOD removal of 70 percent, the effluent BOD5 would be 60 mg/l.

3. Maturation ponds: For Ne approximating 100 per 100 ml, try three maturation ponds, each with a retention time of 5 days. From equation (8):

(9) Ne = Ni / [ (1 + Kb(T) t*an) (1 + Kb(T) t*fac) (1 + Kb(T) t*mat)^n ]
       = 10^8 / { [1 + (2.6 x 2)] [1 + (2.6 x 11)] [1 + (2.6 x 5)]^3 } = 200.

This value is too high. Repeating the calculation--assuming three ponds with 6 1/2 days of retention each--gives Ne a value of 95, which is satisfactory. The area (A) of each pond, assuming a depth of 1.5 m, is given by:

A = Q t* / D = 8,000 x 6.5 / 1.5 = 35,000 m2.

Thus, the total working area of the pond system is approximately 17 ha. The total retention time is 32 1/2 days; since this is greater than 20 days, the effluent will be completely free of helminth eggs, larvae, and protozoan cysts. If the anaerobic pond were not included in the design, the required area would be 25 ha (for one facultative pond of 27 days' retention and four maturation ponds of 5 days' retention each).

Night-Soil Ponds

The kinetics of BOD removal in night-soil ponds have not been studied, so it is difficult to estimate with any precision the BOD5 of the effluent. A conservative estimate, based on BOD removal in ponds treating domestic sewage, is that the effluent BOD5 would be in the range of 40-100 mg/l. Further treatment in a small maturation pond with a retention time of 10-20 days might therefore be required if the effluent is to be discharged into a small watercourse. Since the facultative pond effluent would be completely free of excreted pathogens, however, further treatment would not be required if the effluent were to be reused in aquaculture or agriculture.

Some caution must be exercised in the agricultural reuse of night-soil pond effluent because it may contain too high a concentration of dissolved salts, especially sodium. According to available evidence, chloride and sodium concentrations in night-soil pond effluents are in the range of 200-300 mg/l and 140-330 mg/l, respectively, which compares well with concentrations of 100-660 mg/l and 60-360 mg/l, respectively, in effluents from ponds treating domestic sewage. In areas where evaporation greatly exceeds precipitation, however, makeup water may be necessary to prevent salts from building up to concentrations that inhibit algal growth.

Night-soil treatment ponds have two additional requirements over ponds treating sewage. First, an adequate source of water must be locally available to replace evaporation losses. River water is normally suitable. Second, there must be unloading facilities for the night-soil tankers. The design should include a manually raked medium screen (for example, 10-mm bars with 20-mm spacings), a night-soil pond with a capacity twice that of the largest night-soil tanker used, and a macerating pump that should discharge below the pond surface water level and should be approximately 10-20 m away from the embankment. Provision should be made for the night soil to flow by gravity directly into the pond when the pump is under repair.
Phase 2: Addition of one unit of maturation pond (5 days, 1.5 m deep), in series.

Phase 3: Addition of two more units of maturation ponds (5 days each, 1.5 m deep), in series.

Results are shown in Tables 5-21 and 5-22. Comparison of the effluent-quality parameters for the different phases demonstrates that for polluted sewage of 5 x 10^7 FC/100 ml, Phase 1 (that is, anaerobic and facultative ponds) does not satisfy even the minimum recommended bacterial standard for unrestricted crop irrigation; with the addition of one 5-day maturation pond (Phase 2), however, partial use for irrigation becomes possible, and with Phase 3 (three maturation ponds altogether in series) unrestricted irrigation can almost be permitted. When the initial BOD is too concentrated, the BOD becomes the limiting factor. Helminth removal will be excellent even in Phase 1, however, and for most situations a bacterial standard is not thought to be a feasible requirement in the developing countries.

Cost estimates for the different schemes for the case of 100 lcd are presented in Table 5-23. The above area and cost estimates may be compared with an alternative system consisting of an aerated lagoon, followed by a facultative pond and two maturation ponds in series. Detention times of 4, 10, and 2 x 5 days, respectively, may result in BOD and coliform removal rates close to those achieved by the complete oxidation-pond system.

TABLE 5-21 Treatment pond example, Phase 1

                                            Anaerobic ponds                          Facultative pond        Effluent quality, Phase 1
             Discharge   BOD load   Volume    BOD load     Area                      Volume      Area
Population   (m3/d)      (kg/d)     (m3)      (kg/m3/d)    (m2)                      (m3)        (m2)        FC/100 ml      BOD (mg/l)

50 lcd
   5,000         250        200        500       0.4       125 (2 x 62.5)              2,500      1,430      2.5 x 10^5         96
  10,000         500        400      1,000      (OK)       250 (2 x 125)               5,000      2,860
  50,000       2,500      2,000      5,000               1,250 (2 x 625)              25,000     14,300
 100,000       5,000      4,000     10,000               2,500 (2 x 1,250)            50,000     28,600

100 lcd
   5,000         500        200      1,000       0.2       250 (2 x 125)               5,000      2,860      2.5 x 10^5         48
  10,000       1,000        400      2,000      (OK)       500 (2 x 250)              10,000      5,710
  50,000       5,000      2,000     10,000               2,500 (2 x 1,250)            50,000     28,600
 100,000      10,000      4,000     20,000               5,000 (2 x 2,500)           100,000     57,100

Note: The volumetric BOD loading on the anaerobic ponds (0.4 and 0.2 kg/m3/d, both acceptable) and the effluent quality depend only on the per capita flow and therefore apply to all population sizes within each group.

TABLE 5-22 Treatment pond example, Phases 2 and 3

             Maturation pond (Phase 2)     Effluent quality, Phase 2     Two additional MP in series (Phase 3)     Effluent quality, Phase 3
             Volume        Area                                          Volume          Area
Population   (m3)          (m2)            FC/100 ml     BOD (mg/l)      (m3)            (m2)                      FC/100 ml     BOD (mg/l)

50 lcd
   5,000       1,250          830          1.25 x 10^4       64          2 x 1,250       2 x 830                       200            40
  10,000       2,500        1,670                                        2 x 2,500       2 x 1,670
  50,000      12,500        8,300                                        2 x 12,500      2 x 8,300
 100,000      25,000       16,700                                        2 x 25,000      2 x 16,700

100 lcd
   5,000       2,500        1,670          1.25 x 10^4       32          2 x 2,500       2 x 1,670                     200            20
  10,000       5,000        3,330                                        2 x 5,000       2 x 3,330
  50,000      25,000       16,700                                        2 x 25,000      2 x 16,700
 100,000      50,000       33,300                                        2 x 50,000      2 x 33,300

TABLE 5-23 Treatment pond example, cost estimates (a)

                           Phase 1               Phases 1 + 2          Phases 1 + 2 + 3        Capital costs (b)          O&M costs (b)
                           Total      Total      Total      Total      Total       Total       (1,000 US$)                (1,000 US$ per year)
Population   Discharge     volume     area       volume     area       volume      area
             (m3/d)        (m3)       (m2)       (m3)       (m2)       (m3)        (m2)        1       1+2    1+2+3       1      1+2    1+2+3

   5,000          500        6,000     3,110       8,500     4,780      13,500       8,100       76      99     140      1.0     1.5     2.5
  10,000        1,000       12,000     6,210      17,000     9,540      27,000      16,200      152     188     280      2.0     3.0     5.0
  50,000        5,000       60,000    31,100      85,000    47,800     135,000      81,000      756     990   1,400       10      15      25
 100,000       10,000      120,000    62,100     170,000    95,400     270,000     162,000    1,520   1,880   2,800       20      30      50

Assumptions: capital cost excluding land, and without sealing of the pond bottoms, of US$12, 15, and 20 per capita for Phase 1, Phases 1+2, and Phases 1+2+3, respectively; operation and maintenance (O&M) cost of US$0.2, 0.3, and 0.5 per capita per year for Phase 1, Phases 1+2, and Phases 1+2+3, respectively.
a. Based on sources referred to in the text, updated to 1984 for warm climates.
b. Including land value of US$5/m2.
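Because Tables 5-21 and 5-23 follow from a handful of per capita figures, they can be regenerated with a short script. The sketch below (Python) sizes the Phase 1 ponds and estimates the Phase 1 capital cost for the 100 lcd case; the per capita and land costs are those quoted in the assumptions above, while the function and argument names are illustrative only.

def phase1_sizing(population, lcd=100, bod_gcd=40,
                  t_anaerobic=2.0, d_anaerobic=4.0,
                  t_facultative=10.0, d_facultative=1.75):
    """Phase 1 pond volumes (m3) and areas (m2) for the worked example."""
    q = population * lcd / 1000.0              # wastewater flow, m3/day
    bod_load = population * bod_gcd / 1000.0   # kg BOD per day
    vol_an = q * t_anaerobic                   # shared by two anaerobic ponds in parallel
    vol_fac = q * t_facultative
    return {
        "discharge_m3d": q,
        "bod_load_kgd": bod_load,
        "anaerobic_vol_m3": vol_an,
        "anaerobic_area_m2": vol_an / d_anaerobic,
        "anaerobic_loading_kg_m3d": bod_load / vol_an,
        "facultative_vol_m3": vol_fac,
        "facultative_area_m2": vol_fac / d_facultative,
    }

def capital_cost_1000usd(population, area_m2, usd_per_capita, land_usd_per_m2=5.0):
    """Capital cost in 1,000 US$, including land, as in Table 5-23."""
    return (population * usd_per_capita + area_m2 * land_usd_per_m2) / 1000.0

s = phase1_sizing(5000)
total_area = s["anaerobic_area_m2"] + s["facultative_area_m2"]   # about 3,110 m2
print(s)
print(capital_cost_1000usd(5000, total_area, usd_per_capita=12))  # about 76 (1,000 US$)

Run for a population of 5,000 at 100 lcd, the sketch reproduces the 250 m2 anaerobic and roughly 2,860 m2 facultative areas, the 0.2 kg BOD/m3/d anaerobic loading, and the Phase 1 capital cost of about US$76,000 shown in the tables.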
Table 5-24 presents net area requirements and cost estimates for the complete aerated lagoon works and compares them with the data on oxidation ponds in Table 5-23. Depths are assumed to be as follows: aerated lagoon, 3.5 m; facultative pond, 1.75 m (it serves as a settling tank as well); and maturation ponds, 1.5 m. Results show close to a 20 percent reduction in the area needed by the aerated lagoon system, and considerable savings in capital, operation, and maintenance costs when the oxidation pond system is applied. More detailed economic evaluations of wastewater treatment processes for developing countries are presented elsewhere (Arthur 1983).

TABLE 5-24 Aerated lagoon vs. oxidation ponds (complete systems): area and cost estimates

               Aerated lagoon system             Oxidation pond system             Difference (lagoon minus ponds)
               Area     Capital     O&M          Area     Capital     O&M          Area      Capital     O&M
Population     (ha)     cost (a)    cost (b)     (ha)     cost (a)    cost (b)     (ha)      cost (a)    cost (b)

   5,000        0.68        164        6          0.81        140      2.5         -0.13         24       3.5
  10,000        1.36        328       12          1.52        280      5.0         -0.26         48       7
  50,000        6.8       1,640       60          8.1       1,400     25           -1.3         240      35
 100,000       13.6       3,280      120         16.2       2,800     50           -2.6         480      70

Note: Based on sources referred to in the text, updated to 1984, for warm climates. For the oxidation ponds, see Table 5-23. Aerated lagoon system: capital cost excluding land, US$22.0 per capita; operation and maintenance (O&M) cost, US$1.2 per capita per year. Land value US$5/m2.
a. In US$1,000.
b. In US$1,000 per year.

EFFLUENT TREATMENT FOR DRIP IRRIGATION SYSTEMS

Turbid surface waters are often pretreated by settling in earth basins for up to a few days (if possible). Further treatment, which also serves for well water, involves the use of hydrocyclones to remove sand or large silt particles. When orifice clogging is expected to be caused by ferric oxides--a common problem when groundwater is being used--then aeration, settling, and granular filtration are performed, as is usually done for drinking water. Acid or chlorine treatment is sometimes used to control chemical precipitation or biological slimes.

The most popular treatment in drip irrigation (and sometimes in sprinkler irrigation) is filtration by strainers, either regular or with automatic rinse. Most of these strainers are manufactured and supplied by the manufacturers of the irrigation systems. They are often installed in two or three levels: that is, head strainers, then lateral strainers and control strainers. The smallest openings in the strainers used today are #200 (80 micrometers). Most of these filters consist of either screens or perforated plates.

Typical Strainers

Some typical strainers used in recent years in irrigation systems with low-rate applicators are as follows:

1. Type I. Manufactured by BAR-RAM, PLASTRO, and TAPUZ, Israel (see Fig. 5-7).
   a. Flow is eccentric, as in a hydrocyclone; the inlet is at the bottom and the outlet at the top.
   b. Two rubber rings:
      i. for creating the proper pressure difference to obtain the necessary throughput;
      ii. for sealing.
   c. The screen is attached to a plastic cylinder--with large perforations--from the inside.
      i. The pore size range is #20-200.
      ii. The screens are made of stainless steel, which provides extra strength.
   d. Principle of operation: the heavier particles settle out and the light ones enter the upper compartments, where they partly erode and disintegrate because of the energy exerted by the water flow.
   e. Cleaning
      i. Once in each irrigation cycle, either before or after irrigation, the upper valve is opened. Because of the pressure difference, all the particles are pumped out.
      ii. Another cleaning option is a bottom valve; it is also recommended that the bottom compartment be cleaned.
      iii. Once every 2 weeks the screens are checked, and once a month they are cleaned with a fine brush.

Fig. 5-7. Strainers (examples A and B) for irrigation systems, Type I. (Key: 1 shutter; 2 filter housing; 3 filter screen; 4 inlet; 5 outlet.)

2. Type II. Manufactured by NETAFIM, GILAT, PLASSIM, PLASTRO, AMIAD, and BERMAD, Israel (see Fig. 5-8).
   a. Principle
      i. The direction of flow is internal and follows the longitudinal axis of the cylinder. The rapid flow in the center of the stream does not come into contact with the cylinder wall, but tends to pull the particles toward the end of the cylinder. The particles therefore do not impinge with full force on the cylinder perforations and thus are not forced into the holes.
      ii. The flow is always from the inside out. While flowing along the filter cylinder, the water seeps through the perforations, and particles that are too large to pass through continue to the end of the filter cylinder, where they accumulate.
   b. Cleaning
      i. By opening the drain valve.
      ii. By attaching a bleeding device (a bleed) that acts like a large dripper. The flow out of the bleed is limited by a long path acting as a pressure reducer.
      iii. By an automatic, electrically operated cleaning mechanism:
         - for sprinklers only, or on the central line;
         - preset so that a certain pressure difference (e.g., 3 m = 4.2 psi) actuates the cleaning mechanism;
         - the cleaning operation: the drainage valve opens; simultaneously, revolving brushes dislodge the particles attached to the cylinder; the cleaning time can be adjusted (approximately 15 seconds); there is no interference with the water flow through the filter.
   c. Screen materials
      i. Housing:
         - up to 2": plastics (greater resistance and cost);
         - 2" to 6": iron and epoxy;
         - 7" to 8": iron with corrosion-resistant paints.
      ii. Screen of the filtration element:
         - 2/4" to 1/2": molded polyester weave (manufacturers are now switching to nylon, which is stronger);
         - 2" to 14": perforated stainless steel (#30).

Fig. 5-8. Strainer for irrigation systems, Type II (section A-A). (Key: 1 shutter; 2 cover and rubber rings; 3 outlet; 4 rinsing valve; 5 filter screen and support; 6 inlet, eccentric; 7 solids holder; 8 solids rinse outlet.)

3. Type III. SANOMAT-FILTOMAT (see Fig. 5-9). Diameters: 3" to 12". In this filter, a revolving arm sucks in the dirt. The principle of operation is as follows: the loss in water pressure, at a preset level, automatically opens a hydraulic valve and activates a backwash cleaning mechanism. The cleaning mechanism consists of a rotating hydraulic motor and suction rings, which wash off the dirt from the filter unit. During the cleaning cycle, water continues to pass through the filter unit.

Clogging of Strainers

Despite the variety of filters produced, both farmers and manufacturers still complain about frequent clogging of drippers and/or strainers. Recent investigations also reveal that organic particles can be driven through the screen and come out as larger aggregates than before, so that filtration may sometimes make the situation worse. In order to reduce the frequent recurrence of dripper clogging, several producers have developed filters with self-rinse devices (as already mentioned in one of the above examples).
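The self-rinse mechanisms described above all work on the same rule: clean the screen when the pressure drop across it exceeds a preset threshold (about 3 m of head, or 4.2 psi, in the Type II example). A minimal sketch of that control rule follows (Python); the class, names, and numbers are illustrative and are not taken from any manufacturer's controller.

class FlushController:
    """Triggers a screen flush when headloss across the filter exceeds a preset value."""

    def __init__(self, trigger_headloss_m=3.0, flush_seconds=15):
        self.trigger_headloss_m = trigger_headloss_m   # about 4.2 psi of water column
        self.flush_seconds = flush_seconds             # adjustable cleaning time

    def should_flush(self, upstream_pressure_m, downstream_pressure_m):
        """Compare the measured pressure difference with the trigger value."""
        return (upstream_pressure_m - downstream_pressure_m) >= self.trigger_headloss_m

    def flush(self):
        # In the real units the drain valve opens and revolving brushes or
        # suction rings clean the screen while water keeps flowing through
        # the filter; here we simply report the action.
        return f"open drain valve and clean screen for {self.flush_seconds} s"

controller = FlushController()
if controller.should_flush(upstream_pressure_m=32.0, downstream_pressure_m=28.5):
    print(controller.flush())    # a headloss of 3.5 m exceeds the 3.0 m trigger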
Filters with capacities of up to 600 m3/hr can currently be obtained; their sophistication depends largely on filter dimensions and flow rates. Filters designed for high flow rates consist of two screens. The external screen is coarse, usually made of stainless steel, and serves as a shield for the more delicate screen, which is the main filter medium. Such a screen is usually made of synthetic materials such as polyester or nylon. Aperture diameters of the shielding screens range from 500 to 3,500 micrometers, whereas the corresponding figures for the main media are only 80-300 micrometers. The supervising mechanism for the flushing process is based on headloss and is actuated when the headloss has reached a preset value. Flush mechanisms based on elapsed time are also in use. The filter cake is removed by the unfiltered water, which brings about a 5 percent loss of the total volume.

Most problems associated with such filters, as we have seen, stem from incomplete flushing. This causes clogging, which can only be corrected by individual filter servicing. Since irrigation waters contain considerable amounts of suspended matter (silt, algae, and so on), frequent problems tend to arise because of incomplete flushing.

Fig. 5-9. SANOMAT-FILTOMAT strainer, Type III. (Key: 1 filter housing; 2 cover gasket; 3 inlet; 4 outlet; 5 drain opening; 6 coarse strainer; 7 coarse grid; 8 fine screen; 9 dirt collector; 10 collector fins; 11 hydraulic motor; 12 hydraulic valve; 13 rinse control; 14 cover seal and cover; 15 upper bearing; 16 lower bearing; 17 screen handle; 18 collector plug; 19 center pin; 20 upper O-ring; 21 lower O-ring; 22 sealing gasket; 23 bushing; 24 bearing; 25 wing nuts.)

Granular Pressure Filters

There is a tendency today to use more and more granular pressure filters with automatic backwash. In Israel, basalt sand is used instead of regular sand since it is cheaper (the filters are there called "gravel filters"). These filters commonly consist of a 20-inch metal casing containing a layer of 1-2 mm crushed basalt sand, 60 cm deep, supported by a layer of 1.5-2.5 mm gravel, 30 cm deep. The freeboard above the sand is about 30 cm. The water discharge is about 15-20 m3/hr, and the filtration rate about 75-100 m/hr. A "control" strainer follows this filter, just before the irrigation network. No flocculants are used, since one of the design principles is minimum sophistication and complication. Although the generalization and standardization of pressure filters and strainers have meant fast progress in many cases, surface plugging can be a problem in times of algal blooms or floods.

Manufacturers, as well as farmers, today agree that the main problem with filtration for irrigation is the lack of know-how in providing the right filter--or water treatment system in general--for a specific quality of raw water, which changes with time and location. More research is needed, particularly on the relationship between the nature of the deposit and the quality of the water supplied, to resolve this problem. Meanwhile, it is recommended that pilot plant experiments be performed with the specific water, as suggested for surface water supplies (Adin, Baumann, and Cleasby 1979).

CHAPTER 6

WASTEWATER IRRIGATION PRACTICE

INTRODUCTION

Efficient water resource management is realized by matching capacity to demand; by integrating supply, storage, and disposal; and by establishing technological and cross-sectoral linkages of supply, storage, treatment, and use.
Land application of sewage effluent is a simple and economically attractive method of wastewater disposal with potential agricultural benefits that can improve the overall efficiency of water utilization. The manner of application depends on the secondary objective of the operation, where the primary objective is the economic disposal of municipal and/or industrial effluent in accordance with public health, environmental, and aesthetic considerations. If attainment of the primary objective is assured, there is a choice between simply getting rid of the wastewater and its constituents, or of utilizing it for the production of useful vegetation, be it agricultural crops, plant cover for soil conservation, land reclamation, recreation, or landscape enhancement. If the principal secondary objective is disposal, then this is best accomplished by overland flow or rapid infiltration, both of which allow the application of large amounts of effluent to a relatively small land area in a short period of time (Bouwer and Chaney 1974; Bouwer 1981). If plant production is the major secondary objective, then slow-rate application or simply irrigation is the proper method of applying the effluent (Culp and Hinrichs 1981). Although there may be some superficial resemblance among certain technical details of the three methods, there are really large differences between them in their principle of operation and water quality upgrading. They also differ in their land requirements, economic performance, and criteria for pretreatment of the effluent prior to land application. The main emphasis in this chapter is on slow-rate application in the form of agricultural irrigation. Application Rates The overland flow and rapid infiltration treatment methods are based on achieving the fastest application rate possible consistent with soil properties and standards of wastewater upgrading. Where soil freezing is not a factor, weather conditions are not a design factor. The soil surface must be vegetated in the overland flow system (to avoid erosion) and occasionally harvested, but the installation is not operated for crop production. Applica- tion rates are site-specific and vary between 5 and 35 cm per week, or 3 to 15 m per year (Bouwer and Chaney 1974; Culp and Hinrichs 1981). In the rapid infiltration system, intermittent application rates may range between 0.5 m and several meters per week (Bouwer and Chaney 1974), and drying-out periods may achieve recovery of the land infiltration rate. - 191 - In contrast to the above two disposal methods, slow-rate application, or irrigation with wastewater, is based on the idea of utilizing both the water and the nutrients of wastewater to grow beneficial vegetation, usually agricultural crops. The application rate must therefore be adapted to crop needs, as indeed must be other features of the system. Utilization of wastewater for irrigation is naturally most suitable in situations where wastewater can take the place of ordinary irrigation water from surface or groundwater sources. It- other words, wastewater utilization by slow-rate application will be optimal where there is a need for irrigation water in order to assure stable and high crop yields. This situation prevails in the warm arid and semiarid regions of the world, which include large belts of land on all continents around the Tropics of Cancer and Capricorn (see Figs. 1-2 to 1-8). 
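These application rates translate directly into very different land requirements for a given wastewater flow. The sketch below (Python) makes the comparison explicit; the example flow of 8,000 m3/day and the single rate chosen for each method are only illustrative mid-range values drawn from the ranges quoted in this chapter (including the roughly 250 cm per year typical of slow-rate irrigation discussed in the next subsection), not design recommendations.

def area_ha(discharge_m3_per_day, application_m_per_year):
    """Land area (ha) needed to dispose of a given flow at a given annual application depth."""
    annual_volume_m3 = discharge_m3_per_day * 365.0
    area_m2 = annual_volume_m3 / application_m_per_year   # m/year of depth = m3 per m2 per year
    return area_m2 / 10000.0

FLOW = 8000.0   # m3/day, an illustrative municipal flow

# Overland flow: roughly 3-15 m/year; take 9 m/year as a mid value.
# Rapid infiltration: 0.5 m to several meters per week; take 1 m/week = 52 m/year.
# Slow-rate (irrigation): about 250 cm/year = 2.5 m/year.
for method, rate in [("overland flow", 9.0),
                     ("rapid infiltration", 52.0),
                     ("slow-rate irrigation", 2.5)]:
    print(f"{method}: {area_ha(FLOW, rate):.0f} ha")

With these illustrative values, slow-rate irrigation needs on the order of ten times the land of the rapid-rate methods, which is the ratio noted in the following discussion.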
Some Basic Principles of Irrigation The goal of irrigated agriculture is to optimize the so-il moisture regime of the crop root zone, which for most crops consists essentially of the upper 100 cm of the soil mantle. A cropped field loses water to the atmosphere by evaporation by two possible pathways: direct evaporation of water from moist soil, and transpiration from plant leaves of water taken up from the soil by the roots and transported to the leaf tissue. The ratio between these two processes is difficult to determine and changes as a crop develops. Since their separate determination is also not very important for irrigation management, it is customary to lump them together under the term "evapotranspiration" (ET). Before going into greater detail, let it be said that the ET rate of irrigated crops during the period of peak demand ranges from about 0.5 to 1.0 cm per day. Since this rate varies with season and crop age, and since most cropland is not irrigated continuously the year round, the annual application rate under slow-rate systems of wastewater disposal is typically around 250 cm. This is considerably in excess of actual crop water requirements, except for extremely arid locations. Thus the most obvious and most important difference between waste- water irrigation and the rapid-rate disposal methods is that the application rate is approximately one-tenth that of the rapid-rate methods. The land area that can be irrigated--that is, the area required for disposal of a given discharge of wastewater--is therefore roughly ten times larger for the slow-rate method than for the rapid-rate methods. The actual area served is determined by a number of factors, including climate, which determines the length of the growing season and choice of crops; crop water requirements; the land utilization factor, which is the percentage of the total time that a field is actually under an actively growing crop; choice of irrigation method, which affects the irrigation application efficiency; soil hydraulic properties; and soil texture and water quality, which determine the need for leaching accumulated soluble salts out of the root zone. - 192 - In principle, irrigation water distribution systems, as well as irrigation schedules, are designed to replace the water lost from the soil reservoir (the root zone) by ET since the previous irrigation, with the provision of some additional amount to make up for unavoidable water losses during storage, conveyance, and field application. To this may be added a deliberate loss to deep percolation wherever it is necessary to leach salts out of the root zone. The intervals between water applications are determined by the rooting depth of a particular crop, which defines the thickness of the soil layer and thus the storage capacity of the soil water reservoir; by weather conditions, which determine the rate of water extraction by the crop; and by specific crop responses to changing soil moisture conditions. Depending on crop and soil characteristics, climate, and irrigation method, intervals between irrigations may range from two days to four weeks. Such a wide range of irrigation frequency, and thus also of single irrigation size, will naturally result in very different demands on the carrying capacity of various sections of the water distribution network, even though the total weekly or monthly discharge from the water source may be the same. 
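The scheduling logic just described can be written down in a few lines. The sketch below (Python) computes a single application depth and the interval between irrigations from the root-zone storage and the ET rate. The 100 cm root zone and the 0.5-1.0 cm/day peak ET are the figures quoted above; the available-water capacity of 1.5 mm of water per cm of soil, the 50 percent allowable depletion, and the 70 percent field application efficiency are illustrative assumptions, not values from this report.

def irrigation_schedule(root_depth_cm, awc_mm_per_cm, allowable_depletion, et_cm_per_day,
                        application_efficiency=1.0):
    """Net and gross application depths (cm) and the irrigation interval (days)."""
    # Water the root zone can supply between field capacity and the allowed depletion level.
    net_depth_cm = root_depth_cm * awc_mm_per_cm / 10.0 * allowable_depletion
    gross_depth_cm = net_depth_cm / application_efficiency   # extra water for field losses
    interval_days = net_depth_cm / et_cm_per_day             # time for ET to use the stored water
    return net_depth_cm, gross_depth_cm, interval_days

# Assumed: 1.5 mm available water per cm of soil, half of it usable before stress,
# a 100 cm root zone, peak ET of 0.75 cm/day, and 70 percent application efficiency.
net, gross, interval = irrigation_schedule(100, 1.5, 0.5, 0.75, application_efficiency=0.7)
print(f"apply about {gross:.1f} cm every {interval:.0f} days (net {net:.1f} cm)")

Under these assumptions the result, roughly 10 cm applied every 10 days, falls well within the two-day to four-week range of intervals mentioned above; heavier soils, deeper-rooted crops, or lower ET lengthen the interval and increase the single application.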
It is thus necessary to make any decisions that affect irrigation frequency and single application amount in the early stages of project planning, before the distribution system is designed. In regions with definite seasons, crop water requirements will fluc- tuate even if year-round cropping is possible. In addition, the rate of wastewater production will also be affected to some extent by the weather conditions. Some common examples are the cases where city storm sewers are integrated at some point with the domestic sewerage system; fluctuations in domestic demand due to weather; and seasonal fluctuations in water demand by industry (for example, in food processing where demand is geared to the harvesting periods). These fluctuations in supply and demand are often not synchronized. Since the aim of irrigation, including wastewater utilization, is to manage the soil moisture quantity and quality in a well-defined layer of soil for optimal crop production and minimal water losses by deep percolation or runoff, the supply of water to the cropped area must be adjusted to irrigation needs. Temporary water shortage during the peak crop demand period and "dumping" of excess effluent when it is not needed could affect both the technical and economic performance of a project. Small operational reservoirs to smooth out short-term fluctuations in supply and demand, as well as large reservoirs capable of storing several months' excess of effluent, may be essential components of a slow-rate application system (Noy and Feinmesser 1977). In some cases, peak demand may be supplemented from other water sources, such as pumped groundwater. Although each design problem has to be solved on the basis of the specific conditions, the balancing of supply and demand is much more critical in the design of wastewater irrigation projects than of rapid-rate application systems (see Chap. 5 on wastewater reservoir treatment systems). Similarities and Differences between Effluent and "Normal" Irrigation Water For our purposes, water from different sources should be compared on the basis of its physical, chemical, and biological properties in order to determine how these might affect the operation of an irrigation system, - 193 - including possible effects on soil properties and crop response. The criteria applied to evaluation of irrigation water differ from those for domestic water supply. Potability for domestic supply is determined by medical and aesthetic considerations of the presence of pathogens, toxic solutes, color, flavor, odor, and suspended solids. The "hardness" of domestic water--that is, the concentration of solutes, mainly calcium salts, which will form insoluble precipitates--is of economic importance mainly to industrial users, and represents a fairly minor inconvenience in the domestic kitchen and laundry. Chemical Properties of Irrigation Water--Salinity and Specific Ion Effects The principal criteria for evaluating the quality of irrigation water have to do with the total concentration and the composition of soluble salts in the water (Shainberg and Oster 1978; Stewart and Meek 1977; U.S. Salinity Laboratory 1954). Soluble salts in the soil solution may affect a cropped field in several ways: Increased concentration of any solute makes it more difficult for plant roots to take up water from the soil (the soil appears drier to the plant than is indicated by a standard soil moisture determina- tion). Furthermore, certain salts often found in irrigation water may be toxic to some plants. 
For example, at equal concentrations, chloride salts are much more damaging to citrus and subtropical fruit trees than are sulfate salts. Boron at very low concentrations is harmful to many crops (Shainberg and Oster 1978; U.S. Salinity Laboratory 1954). Soluble salts commonly found in irrigation water may also have an undesirable effect on the physical properties of the soil. Both crop growth and management of the soil water regime are enhanced by a soil structure characterized by certain proportions of mineral solids, water-filled pores, and air-filled pores, which farmers describe by the single concept of "soil tilth." With the exception of soils whose mineral particles consist mainly of relatively large sand grains, which do not adhere to each other easily, good tilth is associated with the presence of soil aggregates, or crumbs roughly 2-10 mm in diameter. These are formed by much smaller individual soil particles held together by cementing agents such as clay particles, organic matter, and some inorganic salts such as calcium carbonate. Clay particles are the most common and most important agent of soil aggregation, especially in dry regions. The effectiveness of clay in producing aggregation is strongly affected by the concentration and composition of salts in the soil. The critical condition is the relation between monovalent and bivalent cations adsorbed on the face of clay particles, which in most cases means the ratio between sodium and calcium ions. An excess of sodium will cause clay particles to disperse instead of aggregate, and in some cases also to swell. This results in a soil with low porosity, poor permeability, poor aeration when wet, large clods separated by deep cracks when dry; such soil is diffi- cult to work with tillage implements when either wet or dry, and often its pH will also be excessive. - 194 - Thus, water that may be chemically suitable for domestic supply may not be suitable at all as irrigation water. This point must be remembered because sewage effluent is composed mainly of municipal supply water chosen according to criteria for domestic use, and whose quality has been affected, generally for the worse, by having been used. Two rather common examples will serve to illustrate this point. "Hard" water may contain calcium, magnesium, iron, and aluminum compounds that form boiler scale, or insoluble deposits, both in kitchen utensils and in industrial installations, and that also interfere with laundry operations. Hard water is often treated, or "softened," either by chemical amendments or by an ion exchange process, but the result of either procedure is replacement of calcium ions by sodium ions. Although beneficial to domestic and industrial users, such water may require additional treatment with exactly the reverse effect in order to make it suitable for irrigation. The second example is the case of boron. An essential plant nutrient at very low concentrations, it is toxic at only slightly higher concentrations; an upper limit around 1 part per million has been suggested for irrigation water (Shainberg and Oster 1978; U.S. Salinity Laboratory 1954). Boron in the form of borax is used in many laundry powders and other cleaning materials for domestic use and in the food processing industry, and it may be present in excessive amounts in sewage effluent. Suggested limits of boron concentration in irrigation water for various crops are given by Shainberg and Oster (1978) (see Table 6-1), who also provide recommended limits for other trace elements. 
Typical effluent generally does not reach these limits (Thomas and Law 1977). The crops in each column of Table 6-1 are arranged in ascending order of tolerance. Figures quoted by Thomas and Law (1977) for secondary effluent, with data based on various sources, show boron concentrations around 1 ppm, which indicates that care has to be taken to determine the boron concentration of effluent intended for irrigation. As is the case with many other pollutants from industrial sources, such as heavy metals, there is no simple and economical chemical treatment available for boron removal, and the best solution is to isolate the pollutant at the source and prevent it from being added to the wastewater to be used for irrigation.

TABLE 6-1 Limits of boron in irrigation water for crops of various sensitivities, based on toxicity symptoms of plants grown in sand culture

Sensitive                  Moderately tolerant        Tolerant
(0.3-1 ppm boron)          (1-2 ppm boron)            (2-4 ppm boron)

Citrus                     Lima beans                 Carrots
Avocados                   Sweet potatoes             Lettuce
Apricots                   Peppers                    Cabbage
Peaches                    Oats                       Turnips
Cherries                   Milo                       Onions
Persimmons                 Corn                       Broad beans
Figs                       Wheat                      Alfalfa
Grapes                     Barley                     Table beets
Apples                     Olives                     Mangel beets
Pears                      Field peas                 Sugar beets
Plums                      Radishes                   Palm
Navy beans                 Tomatoes                   Asparagus
Jerusalem artichoke        Cotton
Walnuts                    Potatoes
                           Sunflowers

Source: Shainberg and Oster (1978).

It is beyond the scope of this chapter on irrigation methods to give a full discussion of the chemical interactions between irrigation water and soil, especially since a great deal of information has been published in the agricultural literature, both in numerous journal articles and in book form (see, for example, Cass and Summer 1982; Culp and Hinrichs 1981; Daniel and Bouma 1974; Frenkel et al. 1978; Hadas and Frenkel 1983; Hansen, Israelsen, and Stringham 1980; McNeal and Coleman 1966; Shainberg, Rhoades, and Prather 1980; Sopper and Kardos 1974). It is relevant, however, to discuss those chemical properties of irrigation water and of wastewater that may have a direct bearing on the design and operation of irrigation systems.

It was briefly mentioned above that excess sodium may cause dispersion of soil aggregates and result in low soil permeability. The relationship of soil properties, sodium concentration in the water, and the concentration of total soluble salts is not a simple one. Nonetheless, it is possible to make several qualitative statements or generalizations about this relation, keeping in mind that, as is common with generalizations, many exceptions can be found and that these statements are not yet a quantitative prediction tool.

1. The quantity of positively charged ions that can be adsorbed on the surface of a unit mass of negatively charged soil particles is defined as the Cation Exchange Capacity (CEC). The great majority of the CEC is accounted for by the clay fraction in the soil, since it is made up of the smallest particles and thus has the largest surface area per unit mass.

2. When the proportion of monovalent sodium (Na+) exceeds a certain threshold value of the CEC (about 15 percent), clay particles will tend not to adhere to one another or to larger particles, soil aggregates break up, and the soil becomes dispersed.

3. Clay is not a definable chemical compound. There are different kinds of clays, and they differ in chemical composition, physical properties, and thus also in their response to adsorbed sodium. The amount of swelling is especially sensitive to the structure of the particular clay mineral.
4. Dispersion and swelling of clay particles change the pore size distribution of a soil toward a smaller average pore diameter. This in itself increases the resistance to water flow through the soil. Furthermore, the smaller the average pore size, the more water (and thus less air) a soil will retain when subjected to a force tending to extract water from the soil. When appreciable clay swelling takes place, the pressure developed in the soil causes some pores to collapse entirely, thus decreasing the total porosity of the soil. This likewise results in lower air-filled pore volume and greater resistance to water flow through the soil.

5. The higher the clay content of a soil, the greater will be the effect of clay dispersion on the soil body as a whole. However, it also takes more sodium per unit mass of soil to cause all the clay to be dispersed. On the whole, very sandy soils are much less vulnerable to sodium damage than finer-textured soils.

6. The tendency of clays to disperse, or, conversely, to flocculate into aggregates, is not only a function of clay properties and of the proportion of sodium (the Exchangeable Sodium Percentage, ESP) in the CEC, but also of the concentration of electrolytes in the solution surrounding the soil particles. The higher the electrolyte concentration, the more stable are the soil aggregates. Thus a saline-sodic soil may have acceptable physical properties, but if the soluble salts are leached out of such a soil, it may become completely dispersed and impermeable, and thus unusable for crop growth unless extensive reclamation is carried out.

The U.S. Salinity Laboratory (1954) has formulated a set of guidelines for the classification of irrigation waters according to the salinity and sodium hazards involved. These guidelines, which are based on a very large number of field observations, have gained wide general acceptance (Fig. 6-1).

Fig. 6-1. Classification of irrigation water quality according to electrical conductivity and sodium hazard. The horizontal axis shows conductivity from 100 to 5,000 micromhos/cm (salinity hazard classes C1-C4, from low to very high); the vertical axis shows the sodium adsorption ratio (sodium hazard classes S1-S4). Source: U.S. Salinity Laboratory (1954).

More recent work has shown that developments in irrigation methods may require considerable adjustments in the guidelines (Goldberg et al. 1971; Goldberg 1979; Shainberg and Oster 1978; Stewart and Meek 1977). Shainberg, Rhoades, and Prather (1980) and Frenkel, Goertzen, and Rhoades (1978) have presented evidence that sodium damage may occur at lower levels than had previously been thought dangerous. Nevertheless, the information summarized in Figure 6-1 is a good starting point when no specific information is available in the planning stage of a project.

In Figure 6-1, water classes C1-C4 stand for increasing salinity effect as expressed by conductivity, and classes S1-S4 represent increasing sodium hazard. A conductivity of 200 micromhos/cm represents a salt concentration of approximately 130 ppm of total dissolved salts, and a conductivity of 1,000 micromhos/cm is roughly equivalent to a concentration of 650 ppm. While the exact relation depends on the chemical composition of the water, the average relation between conductivity and concentration is linear on a log-log scale.
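That rule of thumb can be turned into a small screening helper. The sketch below (Python) interpolates total dissolved salts from conductivity on the log-log line through the two anchor points quoted above (200 micromhos/cm, about 130 ppm; 1,000 micromhos/cm, about 650 ppm); the function name and example values are illustrative only, and a laboratory analysis of the actual water should always take precedence.

import math

def tds_ppm_from_conductivity(ec_micromhos_per_cm):
    """Approximate total dissolved salts (ppm) from conductivity.

    Log-log interpolation through the two anchor points given in the text:
    200 micromhos/cm ~ 130 ppm and 1,000 micromhos/cm ~ 650 ppm.
    """
    ec1, tds1 = 200.0, 130.0
    ec2, tds2 = 1000.0, 650.0
    slope = (math.log(tds2) - math.log(tds1)) / (math.log(ec2) - math.log(ec1))
    return tds1 * (ec_micromhos_per_cm / ec1) ** slope

for ec in (200, 750, 2250, 5000):   # the C1-C4 classes span roughly 100-5,000 micromhos/cm
    print(ec, round(tds_ppm_from_conductivity(ec)))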
The sodium adsorption ratio (SAR) is defined as

   SAR = Na+ / [(Ca++ + Mg++)/2]^1/2,

where Na+, and so on, stand for the concentrations of the various cations in milliequivalents per liter.

In planning the use of wastewater for irrigation, the following points should be considered:

1. Irrigation is practiced mainly in arid and semiarid zones. It is precisely under these climatic conditions that a salinity problem is likely to occur either in the soil or in the water sources. The lack of sufficient rainfall and the scarcity of irrigation water may lead to inadequate leaching and to the accumulation of excess salts in the crop root zone unless special steps are taken to avoid it.

2. Municipal water supply is likely to be drawn from the same sources as irrigation water, and thus its initial chemical quality may already be impaired.

3. The total soluble salt content of municipal effluent is always higher than that of the supply water, and thus the additional solutes must be added to the original concentration of the supply water. Noy and Feinmesser (1977) report a "pickup" of about 250 ppm of soluble salts between supply water and effluent. Thomas and Law (1977) quote "strong," "medium," and "weak" concentrations of raw effluent as being 850, 500, and 250 ppm, respectively, and Fuller and Tucker (1977) report that concentrations for effluent from the Tucson, Arizona, treatment plant (a very arid climate) range between 600 and 820 ppm during the winter months. It must be remembered, however, that these are general figures, and concentrations vary tremendously both in time and by location. Water in the above concentration ranges is classified as representing a medium-to-high salinity hazard by the U.S. Salinity Laboratory (1954).

4. Excess irrigation water salinity may impair crop growth, and excess sodium may cause soil sealing to the extent of making it impossible to irrigate the soil. Chemical amendments to the water or soil may be required to maintain its productivity.

Physical Properties of Irrigation Water or Wastewater

The most important physical property of irrigation water with respect to irrigation technology is the presence of suspended solids. These could cause clogging of narrow passages or orifices in some components of the irrigation system. Suspended solids delivered to the land could also affect the physical properties of the soil for better or for worse. The latter aspect is probably more closely related to the biological properties of the water.

The concentration of total suspended solids (TSS) in effluent varies greatly, depending on the source and degree of treatment. The susceptibility to clogging of irrigation equipment varies over an equally wide range. The more complex and vulnerable an irrigation system is to clogging, the more care must be taken to assure the physical quality of the irrigation water with respect to the amount and particle size distribution of suspended solids. Although it is not recommended that raw sewage be used for irrigation, mainly for health reasons, in actual practice it is widely used in many parts of the world (Lance, Rice, and Gilbert 1980; Marquez 1981; Menzies 1977; Moreno 1981; Rawitz, Kataeen, and Friedman 1978; Schalscha and Vergara 1978). However, even then gross trash is removed either by bar screens or by sedimentation, the latter taking place even in the irrigation stream within a short distance from the primary outlet.
Where raw sewage is used for irrigation and clogging is not a problem, the application method is most likely to be a simple, improvised system of gravity flooding with an open-channel conveyance system. Data on the composition of raw effluent or effluent after only primary treatment indicate maximum concentrations of TSS between 250 and 350 mg/I, or 350 ppm, which is really a very dilute suspension (Schalscha and Vergara 1978; Thomas and Law 1977). The TSS concentration of secondary effluent apparently depends on the type of treatment and operational practices. Thomas and Law (1977) mention some controversy regarding the inclusion of effluent from certain lagoon systems in the category of secondary effluent, since it has been found to have higher TSS and BOD levels than the accepted standard for secondary effluent. The range of TSS of secondary effluent is typically between 10 and about 100 mg/l, with extreme values reported to be around 200 mg/l (Thomas and Law 1977). In terms of TSS concentration, even raw sewage more closely resembles ordinary water in its physical properties than it does a slurry, and concentration of TSS in itself would not appear to be a factor in the clogging of mechanical components. However, where effluent is applied by either the sprinkler or spray method or by drip irrigation, the presence of even a few large particles of organic aggregates could lead to clogging of equipment. For sprinkler application, the use of nozzles with a minimum diameter of 5 mm is recommended as a safety measure (Noy and Feinmesser 1977), and it would certainly be good practice to remove trash and grit by bar screens and sedimentation at the intake end of the distribution system. The minimum recommended nozzle diameter does not constitute a constraint, since this is a very commonly used nozzle in ordinary irrigation systems (Water Workers Association 1983). Drip irrigation systems do represent special problems because of the required small orifice diameter (0.4-2.5 mm) (Water Workers Association 1983), and the occurrence of laminar flow in some systems where accumulation of settled sediments could be a danger. Since emitter clogging is one of the problems of the drip method, even with ordinary irrigation water it would appear logical to avoid this method for wastewater irrigation unless good operational and maintenance practices can be assured (see Chap. 5). - 200 - However, the method has some desirable features for effluent application, and ongoing research has already shown that this method may be used successfully, especially if the equipment industry develops improved filtering equipment (Oron, Shelef, and Turzynski 1979; Oron, Shelef, and Zur 1980; Oron, Ben-Asher and De Malach 1982; Oron, Rawitz and Kataeen 1982; Rawitz, Kataeen, and Friedman 1978; Zur and Tal 1977). However, it is clear even at first glance that effluent application by the drip method requires relatively high-quality effluent, sophisticated equipment, and a high level of quality control and maintenance by project personnel (see Chap. 5). Like sewage effluent, ordinary irrigation water from surface sources may contain plant parts, animal remains, and algal colonies, and the equipment used for removing this material is the same as that used with effluent. Water pumped from groundwater wells often contains considerable amounts of sand grains. Being relatively large and heavy particles, they can be expected to settle out at various points in a domestic water supply and sewage system. 
This is not the case with pressurized irrigation systems, and the sand has to be removed by centrifugal separators. Biological Properties of Irrigation and Wastewater From the point of view of irrigation technology, these properties have a bearing mainly on the question of maintaining soil infiltrability. The amount of scientific information in the context of slow-rate application is very limited, and even reference to high-rate infiltration systems does not lead to clear-cut conclusions. On the one hand, the clogging of soil pores or the formation of surface crusts is envisioned as being the result of adding algal and bacterial biomass (Daniel and Bouma 1974; Noy and Feinmesser 1977; Thomas, Schwartz, and Bendixen 1966), clogging by bacterial slime, and penetration of particles in the colloidal size range into the soil pores (Hunt and Peele 1968; Noy and Feinmesser 1977; Rinot 1963). Evidence of soil clogging under high-rate application certainly exists, although solutions to the problem have been found (such as tillage and intermittent drying) that are highly significant when one applies the findings to slow-rate application. On the other hand, agricultural experience over the centuries as well as extensive research has conclusively demonstrated that the addition of organic matter (manuring, green-manure crops, reincorporation of crop residues, addition of raw, digested, or composted human or animal wastes) improves the physical structure of the soil by enhancing aggregation and the formation of macropores, and thus improves both infiltration and the aeration status of the soil (Allison 1973; Greenland 1965; Hunt and Peele 1968; Noy and Feinmesser 1977). The net result of wastewater application on infiltration properties of the soil is site-specific, depending on the interaction among soil properties, a number of water quality parameters, and agricultural management practices (Burns and Rawitz 1981). Turning first to the evidence from rapid-infiltration installations, Bouwer and Chaney (1974) mention both surface and internal soil clogging, but report that in the system reviewed, surface clogging was the cause of lowered - 201 - intake rates with secondary effluent. Drying periods between infiltration cycles effectively restored the infiltration rate provided TSS were less than 10 mg/liter. At higher loading rates, accumulated sludge had to be mechani- cally removed. The authors quote Thomas, Schwartz, and Bendixen (1966), who noted that anaerobic conditions accelerate clogging and that drying of the soil leads to complete infiltration rate recovery. Lance, Rice, and Gilbert (1980) report on column experiments with primary and secondary effluent under a rapid-infiltration regime, showing that in both cases a high and constant infiltration rate was maintained for about eight months. They found evidence of internal soil clogging rather than surface clogging when effluent with TSS greater than 10 mg/liter was applied. The decrease in intake rate was not very large (15-30 percent, depending on treatment), and the authors point out that the soil they used could maintain its infiltrability for extended periods provided it was exposed to periodic drying in the sun and if the soil surface was tilled to break up the organic crust. 
Bouwer (1981) reports on a large pilot project in Phoenix, Arizona, where holding secondary effluent in a lagoon for three days caused algal bloom, which added 50-100 mg/liter of TSS to the effluent and caused a "filter cake" to form on the bottom of the infiltration basins that drastically reduced intake rate. Bypassing the lagoon reduced the TSS markedly and greatly diminished clogging. Bouwer (1981) also reports that primary effluent can be used successfully, although lower infiltration rates are to be expected. The increased land requirement is offset, however, by a 50-75 percent saving in treatment costs, and the choice is site-specific and dependent on current economic considerations. Firm data from actual irrigation use of effluent are harder to come by, primarily because systems that have been in operation for a long time were constructed to solve a practical problem and were not scientifically monitored from their inception. The continued successful operation of these systems is in itself evidence that soil clogging is not a serious problem for effluent irrigation of agricultural land where application is intermittent and soil tillage is a normal part of farm operations. Noy and Feinmesser (1977) report on a 25-year old installation in Israel where effluent containing 150 mg/liter of TSS was used, and compare it with an adjacent unirrigated plot. The comparison is not rigorous, since it was carried out at the end of the 25-year period, and no information is given on the initial properties of both plots. However, at sampling time the sand content of the irrigated plot was 89 percent vs. 97 percent in the control plot, silt was 10 percent vs. zero, and clay content was 11 percent vs. 3 percent in the 0-30 cm soil layer. These differences appear to have been caused by the use of sewage effluent for 25 years. The authors also cite the case of a citrus orchard irrigated an unspecified number of years with effluent, and show that tillage resulted in higher infiltrability than no-till with effluent irrigation, but that this was still lower than under no-till practice with ordinary irrigation water. Marquez (1981) mentions that wastewater from Mexico City has been used for irrigation since 1886; the fact that he does not mention infiltration rates indicates that soil clogging is apparently not a serious problem there. Similarly, Schalscha and Vergara (1978) report commercial irrigation with raw sewage for at least 10 years, without any mention of soil clogging problems. - 202 - To sum up, soil clogging due to the import of suspended solids can be a problem in systems with a high loading rate. The problem can be alleviated by measures that create conditions of rapid aerobic decomposition. Intermit- tent water application, longer drying periods, and mechanical mixing of the surface soil favor such conditions. All these procedures are normal farming practice with irrigated crops, for purposes not necessarily connected with the maintenance of infiltrability. The extraction of soil water by the crop intensifies the wetting-and-drying cycle, and the lower loading rate also helps to avoid the clogging problem. Under climatic conditions where irrigation is required for crop growth, limiting infiltrability of the soil is more likely to be caused by excess sodium in the irrigation water than by overloading with suspended organic particles. CROP SELECTION CONSIDERATIONS AND CRITERIA Crop selection should be based on three kinds of criteria: 1. 
Suitability of the crop to the general agronomic conditions of the site, considering climate, soils, markets, and so on. In general, any of the crops grown by local farmers under irrigation would be considered suitable according to this criterion.

2. Constraints on crop production due to water quality changes, that is, salinity and toxic effects of specific ions.

3. Constraints on crop utilization or marketing imposed by public health considerations or regulations, considering both pathogens and toxic chemical compounds.

Suitability of Crop to General Conditions

This is a problem for the general agronomist, and cannot be covered in detail here. Crop factors to be considered are growth habit and plant spacing, rooting depth, sensitivity to climatic conditions (for example, frost, hot dry spells), and storage and marketing conditions. Of specific relevance is the adaptability of various crops to the irrigation method to be used. Thus drilled crops such as pasture, forage, and small grain crops are not well adapted to furrow or drip irrigation, and orchard crops are not irrigable by some of the sprinkler systems. This subject is treated in greater detail in Chapter 3, where the irrigation methods are described.

Constraints on Crop Growth

These constraints are due principally to the salinity of effluent as a limitation to crop growth or as a cause of soil property deterioration, and to the toxic effect of specific ions on certain crops (see Fig. 6-1 and Table 6-1). The U.S. Salinity Laboratory (1954) and Shainberg and Oster (1978) have published detailed lists of various crops and their salt tolerance. Table 6-2 is based on the latter source.

TABLE 6-2 Yield decreases of various crops to be expected due to salinity of irrigation water

                              Yield decrement
                      0 percent    10 percent    25 percent
Crop                    EC*           EC            EC

Fruit crops
  Dates                 2.7           4.5           7.3
  Grapefruit            1.2           1.6           2.2
  Oranges               1.1           1.6           2.2
  Apricots              1.1           1.3           1.8
  Peaches               1.1           1.5           1.9
  Almonds               1.0           1.4           1.9
  Grapes                1.0           1.7           2.7
  Plums                 1.0           1.4           1.9
  Strawberries          0.7           0.9           1.2

Vegetable crops
  Beets                 2.7           3.4           4.5
  Broccoli              1.9           2.6           3.7
  Cucumbers             1.7           2.2           2.9
  Tomatoes              1.7           2.3           3.4
  Spinach               1.3           2.2           3.5
  Cabbage               1.2           1.9           2.9
  Potatoes              1.1           1.7           2.5
  Sweet corn            1.1           1.7           2.5
  Pepper                1.0           1.5           2.2
  Lettuce               0.9           1.4           2.1
  Onions                0.8           1.2           1.8
  Carrots               0.7           1.1           1.9

Forage crops
  Tall wheat grass      5.0           6.6           9.0
  Bermuda grass         4.6           5.6           7.2
  Barley                4.0           4.9           6.3
  Perennial rye grass   3.7           4.6           5.9
  Birdsfoot trefoil     3.3           4.0           5.0
  Tall fescue           2.6           3.9           5.7
  Vetch                 2.0           2.6           3.5
  Sudan grass           1.9           3.4           5.7
  Alfalfa               1.3           2.2           3.6
  Orchard grass         1.0           2.1           3.7
  Berseem clover        1.0           2.2           3.9

Field crops
  Barley                5.3           6.7           8.7
  Cotton                5.1           6.4           8.3
  Sugarbeet             4.7           5.8           7.5
  Wheat                 4.0           4.9           6.3
  Peanuts               2.1           2.4           2.7
  Maize                 1.1           1.7           2.5
  Cowpea                0.9           2.0           2.1
  Beans                 0.7           1.0           1.5

* EC = electrical conductivity at 25°C, in dS/m.
Source: Adapted from Shainberg and Oster (1978).

Public Health Constraints

Pathogens

The health aspects are treated in detail in Chapters 2, 3, and 4 of this report; only those aspects directly connected with irrigation technology will be discussed here. To the extent that pathogens require consideration in crop selection, the most obvious problem is the contamination by wastewater of plant parts destined for human consumption without further processing, within a short period after contamination. A second path of infection could be via the meat of livestock contaminated by pathogens.
Although proof of contamination does not prove there is a clear danger of infection, the most drastic precautions would make it possible to utilize effluent for irrigation of crops not grown for food, for example, wood and fiber crops, among which cotton is by far the most important worldwide. The choice of food crops depends on the interaction with the contemplated irrigation method, the basic requirement being minimal contact between the effluent and the edible plant part. Thus tall-growing fruit and nut trees irrigated by surface or drip systems, and even by low-trajectory sprinklers, would be a favored combination. Since desiccation and ultraviolet radiation as well as heat are lethal to pathogens, at the next level of approval we would find grain crops and fruit crops for which irrigation could be terminated several weeks before harvest. Preference would be given to fruits that are normally washed or treated before consumption, such as citrus fruit (washed and waxed), olives, avocados, and the like. Low-growing but erect plants would be the next category, including table grapes and tomatoes grown on trellises, sweet corn, peppers, eggplant, particularly if grown in the ridge-and-furrow system and not irrigated by sprinkling. Crops least eligible for effluent irrigation are those with a supine growth habit, such as squash, cucumbers, some tomato varieties, strawberries, peas, beans, root crops such as carrots, radishes (which are eaten raw), as well as asparagus, potatoes, beets, and onions. The degree of treatment to which the effluent was subjected would affect the severity of limitations in addition to the irrigation method. These issues are dealt with in detail elsewhere in this report. Toxic Chemicals These are generally a group called heavy metals, and in the agricul- tural literature they are often referred to as minor elements, trace elements, or microelements. Although generally immobilized in the soil, they can be taken up by plants, and are often toxic to plants. This can affect crop selection, but is also a certain safeguard against the elements entering the food chain and eventually having a toxic effect on the human consumer. Knowledge of the chemistry of these elements is still limited (Shainberg and Oster 1978), but their possible introduction into irrigation water, via - 205 - industrial effluent and as impurities in phosphorus fertilizer, must be carefully monitored. Shainberg and Oster (1978) give recommended maximum concentrations for irrigation water. Allaway (1977) gives a detailed survey of the heavy metals in sewage sludge and evaluates their effects on soil, plants, and the food chain. Marquez (1981) gives interesting data on a few trace elements, making possible a comparison between U.S. EPA standards, irrigation water quality, composition of alfalfa irrigated with this water, and analysis of milk from cows fed with fodder including this alfalfa (see Table 6-3). Although not all the elements are comparable, the data do demonstrate how some elements can enter the food chain (the health implications of this are not discussed here). Drinking water standards are either the same or more stringent than those recommended for irrigation water. No similar data have been found for vegetables or fruit. 
TABLE 6-3 Comparison of boron and heavy metal maximum recommended concentration standards with effluent irrigation water, alfalfa fodder, and milk (mg/l)

                                    Effluent
              U.S. EPA              irrigation       Alfalfa          Milk
Element       standards (a)         water            sample           sample

Boron            0.75                  2.8            < 5.0           not given
Cadmium          0.01                  0.03           < 1.5 - 5.25    9
Copper           0.2                   0.35           25              not given
Chromium         0.1                   0.21           < 5 - 22        3
Iron             5.0                  12.9            3,000           10
Manganese        0.2                   0.32           100             3
Molybdenum       0.01                  1.44           < 50            9
Lead (b)         0.05                  0.22           < 25            9

a. EPA-recommended maximum concentration in irrigation water for continuous application. U.S. EPA Water Quality Criteria, EPA R3/73-033 (1973).
b. EPA-recommended maximum for drinking water. National Interim Drinking Water Regulations, EPA 570/9-76-003 (1976).
Source: Marquez (1981).

Nonetheless, it can be seen that the alfalfa greatly concentrated the heavy elements and that the cows did not pass all of this on to the milk; at the same time, heavy metal concentrations in the milk exceeded both those of the irrigation water and the allowable standards for irrigation water. Such data certainly suggest that careful consideration must be given to the allowable heavy metal levels in the human body and that caution must be exercised in irrigating crops meant for human consumption with water of the above quality.

CHARACTERISTICS OF IRRIGATION SYSTEMS RELEVANT TO EFFLUENT IRRIGATION

All the commonly used irrigation methods can be employed to apply sewage effluent under appropriate conditions. Constraints may be imposed on the choice and design of the irrigation system, beyond those encountered with ordinary irrigation water, by several kinds of factors. One group covers the possible effects of effluent properties on the technical operation of the irrigation system, for example the clogging of small orifices by aggregates of suspended solids. A second group of factors concerns the characteristics of an irrigation method that may be acceptable or at least tolerable with ordinary irrigation water but not with sewage effluent. An example of this may be nonuniform water distribution over the irrigated area, which ordinarily lowers the irrigation efficiency, but which in the case of effluent may, in addition, be a cause of surface or groundwater pollution. The third group includes environmental health hazards specific to the combination of effluent properties and those of a particular irrigation method, which may not be associated with the method when applying ordinary water, or may not be a hazard with effluent if a different application technique is used. Examples of this are spray drift into residential areas, operator contact with the irrigation water, or the wetting of crop parts destined for human consumption without disinfection or other processing.

Although the general technical features of the various irrigation methods are discussed here, emphasis is on the above considerations. For
About 80 percent of the irrigated area in the United States is irrigated by some version of this method (Culp and Hinrichs 1981), and probably more than 95 percent of irrigated land worldwide is served by surface irrigation. Notable exceptions include Israel, southern Cyprus, and certain parts of southern California, where surface irrigation has been replaced by sprinkler and drip systems. This development was due to a special combination of hydrologic, sociological, and technical circumstances (Wiener 1972), and cannot be transplanted to other situations without a thorough analysis of local conditions.

In surface irrigation, the water is delivered to the plot boundary by the conveyance system either at a point or along a line; it then spills onto the ground at atmospheric pressure and advances over the land surface to the far end of the plot. The only available source of energy for moving the water forward is the force of gravity, and thus the surface of the flowing water must have a downward slope. This is generally provided by having the water flow over a sloping land surface, but in some cases completely level land is irrigated by having the water build up its own slope; that is, the flow is deepest at the point of delivery and slopes to zero depth at the point of farthest advance. During the advance of the water over the land surface, part of it infiltrates into the soil while the remainder continues downslope. An important consequence of the advance process is that the time during which a unit area of land has been covered by water, and thus the amount of water that could infiltrate into the soil, decreases as the distance from the delivery point increases. Thus, by the time the advancing stream reaches the end of the plot, the amount of water that has entered the soil along the flow path cannot be uniform. Since it is desirable to achieve uniform wetting over the land surface, particularly so with effluent irrigation, the main task of the designer is to devise a system that will yield the best possible uniformity of water distribution and will be economical in terms of initial and operating costs (Bishop, Jensen, and Hall 1967; Rawitz 1973; Goldberg 1979).

Methods of Distributing Water on the Land

The several methods available differ in the manner in which the flowing water is controlled and guided across the land surface. They also differ in their adaptability to different cropping systems and in their initial as well as operating costs.

Wild flooding. This is the least efficient and least common system, and because of its inability to control the water to a reasonable degree it cannot be recommended for use with effluent. Depending on the topography, temporary earth ditches are made either along a contour line or perpendicular to the contours. Water is made to overflow from the ditch either by breaching its downslope berm (in the case of contour ditches) at a number of points, or by placing earth dams in the ditch that cause water to overflow the ditch berm, spread over the land surface some distance to either side, and flow downslope (Fig. 6-2). Since no preparatory land leveling is used, only naturally smooth slopes are suitable for this method, and even then the measure of control over the flowing water is minimal. This method can be used for irrigating pastures, hay or forage crops, and small grains.
However, if crop production is a principal aim of the project, adequate amounts of water must be supplied to all parts of the area, and, if this is assured, it is very likely that some runoff water, called "tail water" by irrigators, will be produced at the lower end of the field. With the use of effluent, more so than with ordinary water, it would be important to control, collect, and dispose of this tail water, and this could add to the cost of the installation and its operation. Since the water is usually supplied by temporary earth ditches, the periodic breaching of the ditch berm or placement of earth dams requires a considerable amount of manual labor, as does the handling of the tail water. This would require the laborer to stand in water or mud and to move wet earth with a shovel or hoe. Even if he wore rubber boots, upper body parts, including the face, could be splashed or could otherwise come in contact with the effluent. Thus the worker would be exposed to whatever health hazards are present in the effluent on a more or less continual basis.

Fig. 6-2. Wild flooding from field laterals: (a) ditch along contour on a steep slope; (b) ditches running down steepest slope of gently sloping land.

Border checks. Border checks are elongated plots of land bounded by earth dikes or borders. The land surface should be perfectly level along all cross sections (perpendicular to the direction of flow), and have a slope between zero and about 0.5 percent along the direction of flow, depending on the natural slope of the land, soil properties, and the stream flow available. This method is therefore best suited to land that naturally has large planar surfaces and very gentle slopes. Some land leveling or grading is usually required in the preparation of a well-functioning border check system. At the very least, the borders have to be constructed in a way that does not leave small ditches or furrows at the base of the border, followed by land planing or smoothing of the area within the borders. The border check method is suitable for a wide variety of crops, including row crops grown on ridges or beds, in which case furrow irrigation is actually practiced within the border checks.

Border checks are by definition elongated, with the long dimension typically being 10 to 30 times the width, but they are not necessarily rectangular. A variation of the method is the contour check, with the borders being laid out along a "falling contour," which is a line of constant but not zero slope running roughly parallel to the natural contours. Although the resulting checks are not straight and may not have a constant width, this method minimizes land leveling costs and the removal of valuable topsoil. The irregular shape of contour checks creates some difficulties during tillage and harvesting operations, and the minimum width of the check must be at least as large as the width of the widest agricultural implement (for example, a combine) that will have to be used.
Contour checks are successfully being used for orchard irrigation in the gently undulating landscape of northern California, and on a large scale for rice irrigation in the Sacramento River delta in California, where slopes are near zero and where the diversion of large discharges of river water makes it possible to have large distances between the borders. The terraces used for paddy rice cultivation in parts of southeast Asia are essentially contour checks. The two types of border checks are illustrated in Figure 6-3.

Fig. 6-3. Graded rectangular border checks for small grains, forage, and row crops (a), and contour checks for orchards (b).

Economically sized border checks require a relatively large discharge of water in order to achieve reasonably good water distribution. The actual discharge required depends on the width and length of the check, the land slope, the infiltration capacity of the soil, and the hydraulic roughness of the land surface. Border checks are typically 3-30 m wide and 100-400 m long. Typical discharges required are between 10 and 50 m3/h per meter of check width, and commonly used total discharges range between 50 and about 300 m3/h. The irrigation water can be supplied to border checks by any of the commonly used conveyance and distribution systems, which include lined or unlined canals and ditches, and surface or buried pipelines made of concrete, steel, aluminum, or plastic. Water may be delivered to the check from a ditch by means of concrete or wood turnout gates, permanent pipes with valves leading through the canal wall, or one or more large siphons, as illustrated in Figure 6-4. If water is delivered by a buried, low-pressure concrete pipeline, concrete risers of 20-30 cm diameter equipped with simple screw-down covers (called "alfalfa valves" in the United States) are placed to irrigate one, or alternatively two, adjacent checks (Fig. 6-5). If high-pressure pipelines of concrete, steel, or aluminum are used to deliver the water, a short section of aluminum or plastic gated pipe attached to a riser with valve is the best way to discharge water into a check without causing excessive erosion, although a large gate valve discharging into a stilling basin, or a section of earth ditch, is also an acceptable solution.

Fig. 6-4. Various methods of delivering water from a farm ditch to border checks or irrigation furrows.

Fig. 6-5. Alfalfa valve mounted on concrete riser supplied by underground pipeline.

A well-designed and well-constructed border check system is probably the most efficient surface irrigation method available, giving reasonably good uniformity of water distribution and good control over the flowing water, with low maintenance and operating costs. Initial investment is relatively high, however, and is a function of the amount of land leveling required and of the quality of the conveyance system. Water control at the lower end of a check is achieved either by collecting the tail water in a ditch, which returns it to a lower portion of the distribution system, or by having the lower end of the check closed to impound the water. Thus no labor should be required for water control. The turnout structure can be opened and closed manually without operator contact with the water, and the system is also adaptable to remote-controlled or automatically operated turnout devices.
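The supply figures quoted above (10 to 50 m3/h per meter of check width, with total discharges of 50 to about 300 m3/h) translate directly into how many checks a given conveyance system can serve at one time. The following is a minimal sketch of that calculation, in Python; the function name, the unit discharge, and the conveyance capacity chosen are illustrative assumptions, not values from this report.

```python
# Minimal sketch of border check supply sizing, using the range of unit
# discharges given in the text. All input values below are assumed examples.

def checks_served_simultaneously(conveyance_capacity_m3_per_hr,
                                 check_width_m, unit_discharge_m3_per_hr_per_m):
    """Return the required discharge per check and the number of checks
    that the conveyance system can supply at the same time."""
    per_check = check_width_m * unit_discharge_m3_per_hr_per_m
    return per_check, int(conveyance_capacity_m3_per_hr // per_check)

per_check, n_checks = checks_served_simultaneously(300.0, 8.0, 25.0)
print(f"discharge per check: {per_check:.0f} m3/h, checks irrigated at once: {n_checks}")
```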
If the borders are made sufficiently strong and high to prevent breaching and overtopping, so as to eliminate the danger of uncontrolled tail water, the border check method is probably the "cleanest" of all methods of effluent irrigation, both from the point of view of personnel exposure and of environmental pollution. For reasons that will be discussed in greater detail later in the chapter, it is impossible to achieve uniform distribution in border checks with small water application amounts, at least when compared with applications commonly given by sprinkler or drip irrigation. The smallest practical water application is about 100 mm. However, it should be noted that this is a very small amount compared with the application rates of many effluent disposal systems. Given a crop with a rooting depth of at least 100 cm and a medium- to fine-textured soil, an irrigation application of 100-150 mm could be given every two to three weeks. Except for the application rate, border check irrigation of a cropped area, particularly of densely sown crops such as small grains or forage crops, closely resembles overland flow effluent treatment, without causing any appreciable downward leaching of effluent beyond the root-zone depth. From the technical point of view, sewage effluent has no properties that could interfere with border check irrigation, and there is no reason why even raw sewage cannot be successfully applied in this way after removal of coarse trash.

Basin irrigation. Although similar in construction to border checks, basins are generally smaller in area, are square or sometimes round in shape, and are completely level. Their smaller size makes it comparatively easier to cover their entire surface with water in a short time in spite of the land surface having zero slope. Well adapted to orchard irrigation, a basin may contain 1, 4, 9, 16, or 25 trees, and typical basin dimensions thus range from about 3 to 30 m. In contrast to border checks, water distribution in basins is little affected by a furrow at the base of the borders, and thus border construction is simpler and can be done with such implements as A-frame, disk, or moldboard furrowers. Some hand labor may be required to complete or repair the borders left by mechanized implements at the corners of basins, where perpendicular travel paths intersect. In many parts of the world, farmers construct small basins by hand labor to grow irrigated vegetables, small grains, and legumes, as well as fruit trees. Water is generally delivered to basins from small earth ditches, with the elevated ditch pad also serving as a border for every second row of basins. Being smaller in area than border checks, basins also require a smaller discharge per basin, and therefore it is often possible to irrigate a number of basins simultaneously. The turnout structures can be simpler, and often the irrigator simply breaches the ditch bank with a shovel or hoe. This makes the method more labor-intensive than border checks, and may involve more operator contact with the water, a possible drawback with effluent irrigation. The distribution of water in large basins can be improved by supplying the water not from a point source but from what is essentially a line source, namely, a large number of small-diameter siphons (2.5-5 cm diameter) along the entire length of one side of the basin. The need for the irrigator to walk in wet soil and to wet his hands when priming the siphons is an argument against this practice when using effluent.
Basins can easily be supplied with water from a pipeline either on the surface or underground, essentially as in the border check method. Risers from an underground concrete pipeline are installed at the corners of basins, and one alfalfa valve or orchard hydrant can serve four basins, either consecutively or simultaneously. If a pressure supply is available, gated pipe laid on the border pads is a good line source instead of siphons, and one gated pipe could serve two rows of basins consecutively.

The grading operations for border checks or large basins require heavy earth-moving machinery and some specialized equipment for final smoothing, as well as engineers and contractors skilled in the design and construction of these irrigation systems. The construction and installation of irrigation control structures and equipment also require skilled personnel and a local industry able to fabricate the equipment. Thus the utilization of wastewater for irrigation by these methods, where they are in common use with ordinary water, should present no problems of system construction. However, in situations where the above infrastructure is not available, problems may be encountered in achieving the required quality of design and construction, and the result could be poor system performance. It should be emphasized that the availability of good civil engineering capabilities, as would be needed for road building, is not sufficient to guarantee that a well-functioning agricultural system can be achieved.

Furrow irrigation. Many row crops are grown on raised ridges or beds for a variety of agrotechnical reasons unrelated to irrigation. However, what is merely a low area between ridges to the grower of dryland crops, or at most a means for the disposal of excess rainwater, is to the irrigator the primary device for controlling and distributing the irrigation water in the field. Besides being ideally suited for the irrigation of ridge-grown row crops such as potatoes, cotton, sorghum, maize, and various vegetables, this method is also well suited for the irrigation of orchards and vineyards. In a young orchard, one furrow may be opened near the tree row to supply the still limited root system, and, when the trees develop, several furrows may be placed between two rows of trees so as to wet the entire volume of soil between the trees. The fact that the ridge tops are 10-25 cm above the water surface in the furrow is an advantage of this method with effluent irrigation, since even crops with a low growth habit are further removed from contact with the water, provided they have erect stems (as do peppers, tomatoes, and eggplant).

The furrow method is probably the most flexible of all the surface methods in terms of length, slope limitations, and size of stream required. Furrows can be used on graded land, even inside border checks if the crop rotation calls for it, and also on ungraded land, in the form of contour furrows laid out along a "falling contour" similar to contour checks. Furrow length may range from about 50 m to 500 m, depending on conditions. Since furrows are rather easily overtopped, which results in loss of control over the water, erosion damage, and water waste, the main planning constraint is the avoidance of overtopping. Because furrows are quite narrow (typically about 30-50 cm) and are frequently rebuilt in the course of cultivation for weed control, they are hydraulically rougher than the surface of border checks or basins.
These factors combine to limit furrow streams to fairly low discharges, ranging from 2 to 15 m3/hour per furrow, and to make completely level furrows inadvisable. Slopes along the furrow may range from 0.2 percent to about 2.0 percent. This does not mean that the latter is the steepest land slope suitable for furrow irrigation, since furrow direction definitely does not have to follow the steepest slope--quite the contrary. Contour furrows run almost parallel to contour lines, more or less perpendicularly to the steepest slope. The maximum tolerable cross slope in a furrow-irrigated field is about 10 percent.

Furrow irrigation differs in principle from border check or basin irrigation in that, in the latter two methods, the entire land surface, with the exception of the small portion of the area taken up by border levees, is inundated by irrigation water, and it is a fair assumption that all the water entering the soil by infiltration moves vertically downward. In furrow irrigation, however, only about half the total land surface is actually under water, with the earth in the ridge being wetted by the lateral and upward flow of water (Fig. 6-6). The importance of this difference lies in the fact that even good quality irrigation water almost invariably contains some salts, which tend to move with the water in the soil. When plants take up water, most of the soluble salts are excluded by the biological membranes in the roots, and when water evaporates from the leaf tissues or from the surface of wet soil, it does so as chemically pure water--evaporation is synonymous with distillation--and the water vapor is essentially distilled water. The salts that were dissolved in the water stay behind in the soil, and to some extent in the plant tissues. When water moves vertically downward through the soil, it carries soluble salts with it, and it is indeed common to find a zone of salt accumulation in the deeper soil layers. If this takes place within the root zone to the extent that crop plants are adversely affected, the salts can as a rule be fairly easily leached to a greater depth by deliberate overirrigation. As Figure 6-6 shows, water movement in the ridges is never downward, and thus there is a tendency for salts to accumulate in the ridges, particularly in the highest parts. However, there is no way to leach down these undesirable salts during the growing season. Depending on the crop, soil, and water properties, salt accumulation may impair seed germination, seedling establishment, and thus the plant population of the field, and may also affect plant growth during later stages of development, all of which can result in lower yields. When planning a furrow irrigation system, the designer must by all means consider whether there is a reliable rainy season that will leach the root zone of accumulated salts, whether the crop rotation is such that the next crop will be grown in border checks that facilitate leaching, and whether the salt accumulation over a single growing season will not cause an intolerable yield depression. Since sewage effluent always contains more soluble salts than the supply water, the question of water quality must be even more carefully considered if effluent is to be applied in a furrow system.

Fig. 6-6. Cross section of furrows showing flow path of water into ridges.
Another characteristic of furrow irrigation is that the rate and distance of lateral and upward flow of water into the ridges are not equal for all soils or even all soil conditions, but depend on soil properties and on tillage practices. Lateral and upward spread of water is favored by a small average pore size, and is therefore associated with soils of high clay content and a fine-grained and dense soil structure. Since many farmers feel that they must accomplish complete wetting of the ridges, including the soil surface (which is not always really necessary), they often waste large amounts of water by overirrigating fields where the soil "wets up" slowly and where the ridges are too high. Besides being a waste of water, in the case of effluent irrigation this may result in pollution, both by creating uncontrolled tail water at the lower end of the field and by deep percolation to groundwater. The problem can sometimes be alleviated by compacting the ridges after construction to lower them and to decrease their porosity, and by spacing the rows closer together and making wider furrows. However, soils that are too permeable to allow adequate wetting of the ridges are simply unsuitable for furrow irrigation.

Water may be supplied to furrows from either permanent or temporary head ditches or from low- or high-pressure pipelines. If the main supply comes from a ditch, water may be turned into the field with siphons or "spiles," short tubes leading through the ditch wall, directly from the head ditch. The water level is raised in the ditch by temporary, portable dams called "flags," made of wood lath and canvas, rubber, or plastic sheets (see Fig. 6-4). Alternatively, water may be discharged through a gate into a secondary head ditch or stilling basin with breached walls opposite each furrow (Fig. 6-4), or a short length of gated pipe may be led out of the main ditch. If the water supply is a pipeline, the breached stilling basin and the gated pipe are common solutions, while in orchards, special orchard hydrants supply water to individual furrows (Fig. 6-7). Because the land area around the entrance to the furrows is very susceptible to erosion, it often becomes necessary for a laborer to make repairs in order to guide the proper amount of water into each furrow. To do so, the worker has to use a shovel or hoe while walking in the flooded area, and thus comes in contact with the water and is exposed to health hazards.

Since it is very difficult to supply equal discharges to all furrows, and since furrow shape, roughness, and soil properties are likewise not uniform in a field, the rate of advance of water in the furrows is never uniform. As a consequence, the irrigation streams of individual furrows will reach the downstream end of the field at different times. When the water reaches the far end of the field, the depth of wetting is not uniform along the slope, with the downstream portions of the plot having received less water than the upstream ones. Irrigation must therefore be continued, but at a reduced rate, since it is only necessary to replace water infiltrating into the soil. It is therefore customary to decrease, or "cut back," the input stream when the water reaches the lower field boundary. It is not practical, however, to do so for each individual furrow, nor is it possible to cut back the discharge so as to exactly balance the infiltration loss of each furrow.
Thus some excess water will unavoidably reach the end of some furrows at least for some time; as a result, some tail water usually runs off the field. If this runoff is not controlled, not only is water lost, but erosion may occur owing to the uncontrolled flow, which may also create a nuisance by making field roads impassable. Whenever possible, tail water should be collected in a drainage ditch at the lower field boundary and led to a location where it can be reused. In the case of sewage effluent irrigation, tail water presents additional problems: possible contamination of crops in a neighboring field; pollution of surface water or groundwater; and contact with people, who may not be aware that they are dealing with sewage effluent and thus may not take appropriate precautions. More care must therefore be taken to control tail water with effluent than with ordinary irrigation water systems.

Fig. 6-7. Orchard hydrant mounted on underground concrete pipeline.

Design of Surface Irrigation Systems

The basic objective of irrigation system design is to apply the required depth of water uniformly over the irrigated area by means of an installation that is economical to construct and operate. It was indicated above that this ideal is not attainable in practice, and actual systems represent a compromise between water application efficiency and uniformity, initial cost, and operating costs of the system. The real design task is therefore to find the optimal solution for a particular situation, based on an evaluation of the relative importance of technical performance and initial and operating costs. A large number of factors are involved in determining this relative importance, and some of them are either difficult to determine quantitatively or may change with time in an unpredictable manner. To illustrate the point, publications from some regions state that, in general, surface irrigation systems have a higher initial cost than sprinkler systems but lower operating costs, while other publications state exactly the opposite. The truth is evidently site-specific, depending on current local costs of earth-moving operations, pumps, conveyance equipment, power for pumping, labor, and water. Each project must therefore be designed as a unique case, taking into consideration local conditions. Wastewater does not present any additional technical factors to the design problem. However, the specific properties of wastewater related to the exposure of personnel, water resource pollution, public health hazards, and salinity are likely to impose additional constraints whose effect will be to change the relative importance of, for example, the initial cost of more complicated distribution structures as against labor costs, where the use of such structures would decrease the risk that personnel will be exposed to the effluent water. In the following discussion, we restrict ourselves to the technical aspects of the design problem.

Some basic principles. The physical principles of surface irrigation are the same for border checks, furrows, and basins. A furrow can be viewed as the narrowest possible border check, and a basin as a relatively short and wide border check. The irrigation process involves the advance of the water over the land surface downslope from the source until it reaches the far end of the run, with water being continuously abstracted from the stream by infiltration into the soil.
This part of the process is analogous to the filling up of an open channel with a leaky bottom. After the water has reached the far end of the run, further input is required to balance infiltration. Since the infiltration rate of the soil is not constant with time during an irrigation, the rate of water loss from the irrigation stream varies both along the length of the wetted soil strip at any given moment in time, and with time at a given location. A physical description of an irrigation therefore involves both the hydraulics of open-channel flow and knowledge of the infiltration process. Both processes have been the subject of a great deal of research, and each taken by itself is today fairly well understood. However, when both processes take place simultaneously, the situation is very complex and still not well understood, and no simple, direct, and accurate solution applicable to practical design problems is available (Hall 1956; Hansen 1960; Hansen, Israelsen, and Stringham 1980; Rawitz 1973).

Let us consider an elementary volume of water at some point along the advancing irrigation stream, as shown in Figure 6-8, and perform a qualitative analysis of its mass balance. The input flux or discharge, qi, is determined by the input discharge at the upper end of the plot, the hydraulic properties of the land upstream of the elementary volume under consideration, and the infiltration rate of the same area. As was explained above, these conditions may not be constant, and thus qi is a variable with time. The rate of outflow from the elementary volume, qo, will be a function of qi, the infiltration rate of the soil, i, and any possible change of water stored in the elementary volume, which is controlled by the depth, h. This depth is in turn a function of the hydraulic roughness of the element, the average flow across it, and the slope of the water and of the ground surfaces:

(1)     qo = qi - i - h

where all the parameters of equation 1 are time-dependent rates. To describe the flow in a strip of unit width at any given time, it is necessary to integrate equation 1 over the length of the strip, which is the sum of all unit distance increments xi; and to describe the entire irrigation, it is also necessary to integrate over time, t.

Fig. 6-8. Schematic view of the mass balance of an elementary water volume at any point along an advancing irrigation stream.

This approach has been described in detail by Bishop, Jensen, and Hall (1967) and by Hall (1956) in a review of several hydraulic approaches. The authors point out, however, the difficulty of predicting the hydraulic roughness of the land, as well as changes in the infiltration function during different irrigations. As a consequence, doubt is thrown on the correctness of the results of rather complex mathematical treatments of water flow over the land, owing to the need to guess the values of some of the required parameters (Bishop, Jensen, and Hall 1967; Shull 1960; Collis-George and Freebairn 1979). A fairly simple example of the problem is the use of the well-known Manning formula for open channel flow:

(2)     V = (R^0.67 S^0.5) / n

where

    V = flow velocity
    R = hydraulic radius
    S = slope of the water surface
    n = roughness coefficient

The Manning formula is only one of a number of more or less equally good empirical equations for open channel flow. The values of the exponents are also empirical, and one could argue about their exact values.
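Before turning to the difficulty of choosing the roughness coefficient, the following minimal sketch evaluates equation 2 for a few values of n. The function name and the sample values of R, S, and n are illustrative assumptions, not data from this report; the point is only to show how strongly the computed velocity depends on the roughness value chosen.

```python
# Minimal sketch of equation 2 (the Manning formula), illustrating the
# sensitivity of the computed velocity to the roughness coefficient n.

def manning_velocity(hydraulic_radius_m, water_surface_slope, roughness_n):
    """Mean flow velocity V (m/s) from equation 2: V = R^0.67 * S^0.5 / n."""
    return (hydraulic_radius_m ** 0.67) * (water_surface_slope ** 0.5) / roughness_n

if __name__ == "__main__":
    R = 0.05   # hydraulic radius of a shallow irrigation stream, m (assumed)
    S = 0.002  # water surface slope of 0.2 percent (assumed)
    for n in (0.02, 0.04, 0.10):  # a plausible spread of field roughness values
        print(f"n = {n:5.2f}  ->  V = {manning_velocity(R, S, n):.3f} m/s")
```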
However, the main problem is the evaluation of the roughness coefficient to be used in a particular situation. The value of n is usually chosen by the designer, on the basis of judgment and experience, from the published results of empirical tests on common engineering materials. One such commonly used table (Davis and Sorensen 1969) lists a range of values between .009 and .040 for artificial channels, and up to .150 for natural channels, none of which is directly applicable to agricultural fields. The roughness coefficient of a border check sown to alfalfa or wheat, for example, certainly changes as the crop grows, and also with the depth and velocity of flow, which cause plant stems to bend and submerge to varying degrees. Similarly, soil clods in a fresh furrow offer a high resistance to flow, but as the season progresses these clods slake, gradually forming a smooth crust on the soil surface and causing roughness to decrease appreciably. To date, no reliable method has been found for determining the value of n to be used for surface irrigation design, and the only alternatives are to make the best possible estimate on the basis of judgment or to carry out field tests on the site.

The situation is no better with regard to the infiltration process. This has been extensively investigated by soil physicists and by hydrologists, and there is a wide choice of formulas available, all of which describe the infiltration process equally well once their constants have been evaluated on site. However, without prior empirical determination of these constants, none of them is applicable to predicting the infiltration function of a given soil. To complicate matters, some of the "constants" vary with soil conditions in the field, sometimes to a great extent (Rawitz et al. 1964). In view of the uncertainties involved in predicting both the hydraulic properties and the infiltration behavior of field soil, there is little point at present in carrying out complex mathematical manipulations to predict the rate of advance of an irrigation stream in the field. The best alternative is to carry out field tests to determine the integrated effect of the various factors on the performance of an irrigation system under different conditions (Bishop, Jensen, and Hall 1967; Hansen, Israelsen, and Stringham 1980).

Field tests to evaluate surface irrigation performance. The best practical procedure for field testing was described by Criddle et al. (1956), and it will serve as the basis of the following discussion. These tests require at least an improvised irrigation system in the field, which is a major drawback when a new project is to be planned in a completely undeveloped area. The field test can be carried out on border checks or furrows, as the situation demands. Soil surface condition and soil moisture content should be as close as possible to those expected prior to irrigation of a cropped field. In an undeveloped area, furrows of different slopes can be tested by grouping their upper ends around the water source and laying out groups of furrows along falling contours of different slopes with a surveying instrument (Fig. 6-9). The furrows need not follow a straight line. Several parallel furrows should be provided for each test treatment, so that the actual test furrow is surrounded by at least one border furrow on each side.
This is needed because water from a furrow penetrates laterally as well as downward, and lateral flow will be different if the test furrow borders on dry land rather than on another irrigated furrow. The effect of the input stream on water advance can be determined if several sets of furrows or several border checks with the same slope are provided. If the land is a plane surface of suitable slope, the layout of a border check test is similar to that of a furrow test. Otherwise, some land grading may be necessary in preparation for the test. In areas where irrigation is already practiced, tests are often carried out on existing systems on a soil type similar to the one to be developed. This should be ascertained by an appropriate soil survey of both the developed area and the new area. In most cases, tests will be carried out with ordinary irrigation water, even if sewage effluent is to be used in the newly constructed installation. If it is assumed that effluent will tend to decrease soil infiltrability, then the results of field tests with ordinary water will err toward the conservative side; that is, longer plots or smaller input streams than are indicated by the test results will be permissible with effluent.

Fig. 6-9. Sample map of a field-test layout. Dashed lines represent the axes of groups of furrows or of border checks with different slopes, ranging from almost level (upper group) to the maximum land slope (perpendicular to contour lines).

In preparation for the test, the slope of each plot must be determined by a level survey. During the survey, marker stakes can be placed at regular intervals along the plot, 10-20 m apart. The means for adjusting and measuring the input discharge must be provided, and in the case of furrow irrigation, it is desirable to measure any tail water as well by means of small portable flumes. Since observers will be working a considerable distance apart, a few simple hand signals should be arranged for announcing whether the stream has reached a certain point in the field (for example, the end) and for giving orders to increase, decrease, or terminate the flow. "Walkie-talkie" radios are ideal aids in these tests.

Water is turned into the test plot and the flow is adjusted to the desired rate. An observer walks along with the stream and records the time at which it reaches each stake. In border checks, the advancing front may be irregular owing to imperfect leveling of the surface across the check, and some judgment must be used to decide when the "average" of the advance front has passed a point. When the water reaches the lower end of the plot, it is desirable to continue irrigating with a smaller stream, just as will be done when the field is cropped. Since there is no way of knowing how much to decrease the input, it is good practice to decrease it only moderately, say by 30 percent, and to measure the remaining tail water. The results will serve as a guide in adjusting the "infiltration stream" in future irrigations. It is advisable to continue the irrigation until steady-state infiltration has been reached. In preparation for ending the irrigation, it is best to place a number of observers along the plot, since the rate of water disappearance from the soil surface, which is to be recorded now, is often much faster than its advance, and a single observer might not be able to record this at every station. When the observers are in place, the water is turned off, and the time is recorded when the soil surface becomes exposed at each stake. A schematic representation of water profiles on the land surface during the advance process is given in Figure 6-10.
If it is assumed that infiltration properties along the plot are uniform, the amount of infiltration, or alternatively the depth of wetting, is also indicated for the position of the advance front after equal time increments. It should be noted that the increment of advance becomes smaller with each successive time increment, that the amount of infiltration is obviously not uniform, and that it is zero at the last station at the moment that the advance front arrives there.

Fig. 6-10. Profiles of water layer on land surface (solid lines) and depth of penetration into soil (dashed lines) at equal time intervals during the advance stage. Vertical scale greatly exaggerated. Source: Adapted from Hall (1956).

To facilitate irrigation analysis, the data are plotted as an advance curve, showing the time required for the advance front to reach a given distance (Fig. 6-11). Depth of wetting, or the amount of water infiltrated, is not directly proportional to the time that water was in contact with the soil at each point, as this would require a constant infiltration rate. In fact, infiltrability is related to time by a typical decay function, an example of which is shown in Figure 6-12. As mentioned above, such curves can be described by one of several infiltration equations. What is important to stress here is that the shape of the infiltration curve has certain implications for the uniformity of depth of wetting in surface irrigation.

Fig. 6-11. Example of a water advance curve in a furrow or border check.

If irrigation is stopped just as the stream reaches the far end of the field, the amount infiltrated at the upper end is the integral under the infiltration curve (Fig. 6-12) up to the time of cutoff, while infiltration at the far end will be zero, and thus the difference in depth of wetting (the nonuniformity) will be maximal. However, if the irrigation is continued after the entire length of the field has been wetted, then infiltration at the upper end, where contact time with the water is greatest, will proceed at the slowest rate, since the rate will already have dropped some considerable distance along the curve. In contrast, at the lower end, infiltration is proceeding at a near maximal rate, thus tending to "catch up" with the upper end, thereby decreasing the nonuniformity. Up to the time that all locations along the run have reached or almost reached the more-or-less constant final infiltration rate, this process of compensation continues, tending to even out the depth of wetting, as is illustrated in Figure 6-13. The uniformity of water distribution is therefore enhanced by relatively large water applications, and this is one of the reasons why surface irrigation methods are better adapted to deep-rooted crops requiring relatively large and infrequent irrigations. The "rule of thumb" irrigators often use is that the advance time should be about one-quarter of the total irrigation time. In other words, it is customary to continue the irrigation for a considerable time after the water reaches the end of the field.
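The decay of infiltrability with time can be illustrated with a minimal sketch. The Kostiakov form used below is only one common choice; the report does not prescribe a particular infiltration equation, and the constants k and a are invented for illustration, since in practice they would be fitted to field-test data.

```python
# Minimal sketch of an infiltration decay function of the kind shown in
# Figure 6-12. Kostiakov form z = k * t**a; k and a are illustrative only.

def cumulative_intake_mm(time_min, k=8.0, a=0.5):
    """Cumulative infiltration depth z (mm) after t minutes of contact."""
    return k * (time_min ** a)

def intake_rate_mm_per_min(time_min, k=8.0, a=0.5):
    """Instantaneous infiltrability dz/dt, which decays as contact time increases."""
    return a * k * (time_min ** (a - 1.0))

if __name__ == "__main__":
    for t in (10, 60, 180, 480):  # contact times, minutes
        print(f"t = {t:4d} min  z = {cumulative_intake_mm(t):6.1f} mm  "
              f"rate = {intake_rate_mm_per_min(t):5.2f} mm/min")
```

The declining rate is what allows the lower end of the run to "catch up" with the upper end when irrigation is continued beyond the advance stage, as described above.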
As mentioned before, runoff of tail water in this stage should be avoided, or at least controlled, especially in the case of irrigation with sewage effluent. If the continued irrigation is plotted as in Figure 6-14, the time of irrigation will obviously increase, while distance covered will not, and thus the "infiltration stage" (in contrast to the "advance stage") will be plotted as a vertical line at the distance corresponding to field length. As explained above, during this stage the advance of the wetting front in the soil is not parallel to the front at the time maximum distance was reached, penetration being faster in the downstream parts of the field. When the required amount of water has been applied, the irrigation is terminated, and a new stage of the process, called the recession stage, begins. Just exactly what happens to the water on the soil surface during the recession stage depends on a number of factors, and is usually different in border checks and in furrows. In any case, water will continue running downslope and infiltrating simultaneously, until the higher parts of the land are no longer covered by water, thus terminating infiltration. In the case of basins or border checks with a small slope, large flow depth, high roughness coefficient, low infiltration rate, and a closed lower end, the water surface will become horizontal, further overland flow will cease, and the soil surface will gradually become exposed along the slope as the water level drops. This is shown in Figure 6-15, which is a continuation of Figures 6-11 and 6-14. At the opposite extreme is the situation where the flow is shallow, the slope steep, the hydraulic roughness low, the infiltration rate high, and the lower end of the check or furrow open. This is the case most common in furrow irrigation. The water disappears from the soil surface very quickly, and sometimes a small puddle forms at the lower end of closed furrows. Some puddles may also remain in small local depressions along the slope if the land is not perfectly graded.

Fig. 6-12. Typical relation between soil infiltrability and time for initially dry soil.

Fig. 6-13. Variation of infiltrability along a plot (inserts show infiltrability vs. time) due to differences in intake opportunity time, just after the advance stream has reached the end of the plot.

Fig. 6-14. Plot of the infiltration stage after the advance stream has reached the downstream end of the field.

Fig. 6-15. Complete advance-recession diagram, showing intake opportunity time (IOT) at various distances.

The complete irrigation event, as described in Figure 6-15, can now be analyzed qualitatively, and recommendations can be made for improvement where indicated. If it is assumed that the infiltration properties of the soil are uniform over the length of the irrigation run (and there is really no other choice) and that two locations along the slope were in contact with water for equal periods of time, then equal amounts of water infiltrated during this period. This is true regardless of the shape of the infiltration function with time. In order to achieve a uniform irrigation, it is therefore desirable that this time, called the intake opportunity time (IOT), be equal at all points along the slope.
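With the advance and recession times recorded at each marker stake during the field test, the IOT at each station is simply the difference between the two, and its variation along the run is a direct measure of nonuniformity. The following is a minimal sketch of that bookkeeping; the station data are invented for illustration and would in practice come from the field test described earlier.

```python
# Minimal sketch of the IOT calculation: at each stake, IOT is the interval
# between arrival of the advance front and disappearance of water at recession.
# The sample times below are illustrative, not field data from the report.

# (distance from upstream end in m, advance time in min, recession time in min)
stations = [
    (0,   0,   300),
    (50,  40,  305),
    (100, 95,  312),
    (150, 160, 318),
    (200, 240, 325),
]

iot = {distance: recession - advance for distance, advance, recession in stations}
largest, smallest = max(iot.values()), min(iot.values())
print("IOT by distance (min):", iot)
print(f"Ratio of largest to smallest IOT: {largest / smallest:.2f}")
```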
In Figure 6-15, any vertical line drawn between the advance and recession curves represents the IOT at that distance. If the IOT is the same at all points, then the advance and recession curves will be parallel, and this is indeed the desired result. If the IOT is not uniform along the run, changes will have to be made to correct the situation. The shape and slope of the recession curve are determined by the slope of the land, the surface roughness, the infiltration rate, and whether the check or furrow is blocked at the lower end and can store water. Whether the lower end of the plot can be blocked is determined in part by the slope of the land. In an existing installation, none of these factors, except possibly the last one, can be controlled or changed. In the planning stage one does have a choice of slopes within certain limits, but since a change of slope would have a similar effect on both the advance and the recession curves, adjustment of the slope can at best be a partial solution. Adjustment of the IOT to improve uniformity of water distribution is generally accomplished by controlling the rate of advance of water over the land surface. The slope and shape of the advance curve are affected by the same factors as the recession curve, but additionally also by the size of the input stream and by the length of run. The stream size can be adjusted at any time, even during an irrigation. Although the length of run can be changed in an existing installation, this is liable to be both cumbersome and expensive, and the appropriate length of field is best chosen, in the light of diagnostic tests, in the planning stage. Changing the length of run does not, in fact, change the advance curve, but it does enable the planner to choose the segment of the curve along which the irrigation will operate. If the field test is carried out on an essentially infinitely long plot, using different input streams, and is continued long enough for the intake rate to reach steady state, the results would look like those shown in Figure 6-16. Each stream would eventually wet an area large enough so that the steady-state infiltration would exactly balance the input, and the advance curve would become asymptotic to a vertical line. Since the recession line will never become asymptotic to the vertical, uniform distribution is unattainable under such conditions. In reality, it would take an excessively large application to reach this final condition, and this would certainly not be chosen. However, if the run is too long and the advance curve includes part of the steeper final portion, uniformity can be greatly improved, without increasing the input stream, by shortening the length of run.

Fig. 6-16. Effect of input stream on advance curves on an infinitely long land surface (q1 > q2 > q3).

Adjusting the input stream is the most convenient way of matching the advance rate to the recession rate. In the rarer case of the advance rate being greater than the recession rate, the discharge is decreased; such a solution does not present any difficulties. Usually, however, it is necessary to increase the advance rate by using a larger input stream. There are certain limits to this possibility, some of which can be overcome in the planning stage. First, the water source and conveyance system must be large enough to supply the required discharge. Second, the flow velocity of water over the land must be kept low enough so as not to cause erosion damage.
Finally, overtopping of furrow ridges or border levees must be avoided. In a given situation, an increased stream size will lead to a greater depth of flow, and this is limited by furrow depth or levee height in the case of border checks. Discharge can be increased without increasing depth of flow if the slope is increased as well. This entails increased flow velocity, however, which is ultimately limited by the erosion hazard. A combination of increased input stream and increased slope can be a good solution in the planning stage.

The infiltration behavior of soils changes with time, both during a particular growing season and between crops, owing to factors such as changes in the hydraulic resistance of the soil surface and crop stand, tillage operations, wetting and drying cycles of the soil, and chemical changes due to irrigation water quality. It is therefore wise to plan an irrigation system with some flexibility, which in this case means not basing the design on the longest possible run or the largest possible input stream.

Analysis of a worked example. Let us assume that border checks 12 m wide and 200 m long are to be given an irrigation of 12-cm water depth. The checks are closed at the lower end, and it is possible to apply the entire irrigation without changing the input discharge. To keep the example simple, it will be assumed that both advance and recession curves are essentially linear.

Case 1:

    Area of check = 12 m x 200 m = 2,400 m2 = 0.24 ha
    Volume of irrigation = 0.12 m x 2,400 m2 = 288 m3

Observations taken during irrigation:

    Input discharge = 60 m3/hr
    Time of advance = 3.0 hr
    Time of recession = 0.5 hr
    Recession started immediately upon termination of irrigation.
    Irrigation time was: (288 m3)/(60 m3/hr) = 4.8 hr

Analysis: Since the total irrigation time was 4.8 hours and it took the advancing stream 3 hours to reach the far end, application was continued an additional 1.8 hours after the advance was completed. The intake opportunity time (IOT) at the upstream end was thus 4.8 hours. The downstream end was in contact with water only during the 1.8 hours that the infiltration stream was applied, plus the half hour of water recession in the check, for a total of 2.3 hours. The ratio of IOTs between the upper and lower ends of the check is 4.8/2.3 = 2.09. The soil at the upper end was thus infiltrating more than twice as long as at the lower end, and the advance rate was obviously too slow. Since linear advance and recession curves were assumed, the average, and correct, amount of water infiltrated halfway down the check. If we round off the above ratio to exactly 2, the upper and lower ends of the check would have received 16 cm and 8 cm of water, respectively. The entire lower half of the check was underirrigated, and, if this was not corrected, yield would be unfavorably affected in this area. The upper half of the plot was overirrigated, with the excess water percolating below the crop root zone. If no corrective measures are taken, this percolate could eventually reach the groundwater table, and if it contains undesirable dissolved chemicals, these would affect groundwater quality.

From the data in the above example, it is possible to calculate the water application efficiency. In this case, it would be deceptively high, because the example assumed that the lower half of the check would be underirrigated.
In actual practice, it is likely that the farmer would increase his irrigation application to assure that the entire check was wetted to the desired depth, which would result in overirrigation of the entire length of the check except for its downstream end, a lower application efficiency, greater water loss, and greater chances of groundwater pollution.

Water application efficiency is defined as the ratio between the amount of water actually added to the root zone and the amount of water applied, expressed as a percentage. This index does not give any expression to those parts of the area that received less water than required. In the above case, if we assume linear advance and recession, the average excess in the upper half of the check was 2 cm of water (4 cm excess at the upper end, zero halfway down the run), representing a water volume of 24 m3. The water application efficiency is therefore:

    E = [(288 - 24)/288] x 100 = 92%

This is actually a very high efficiency for any irrigation system, certainly much higher than is generally achieved in surface irrigation. As pointed out above, this example is not quite realistic because of the simplifying assumptions that were made, particularly that underirrigation would be tolerated. In actual practice, the application efficiency on surface-irrigated farms ranges from 25 percent to 75 percent, with the worldwide average probably around 50 percent.

Case 2: Assume that it was decided to correct the above performance by using a larger advance stream, this time 120 m3/hr. Time of advance was now 1 hour, and recession 0.75 hour (owing to the larger depth of water in the check at cutoff time). The irrigation time required to apply the same application is now: (288 m3)/(120 m3/hr) = 2.4 hours. The IOT at the upper end is now 2.4 hours, and at the downstream end, 2.15 hours (1.4 hours between arrival of the stream and cutoff time, plus 0.75 hours of recession time). The ratio of IOT at the upper end to that at the lower end is in this case 1.12:1, indicating much improved uniformity.

In most real situations the advance curve would probably not be linear. If a quantitative analysis is required, more detailed data would have to be obtained and more complicated calculations carried out. An index of the uniformity of distribution can be calculated using the method described by Rawitz (1963). In most cases, it would be sufficient to make a qualitative judgment of the irrigation performance and of the corrections required on the basis of an inspection of the advance-recession diagram. Since the soil hydraulic properties and intake rate are affected by tillage practices, crop characteristics and development, and the number of irrigations applied, some adjustments in the irrigation stream will probably be necessary in any case, and the qualitative method of estimating the proper slope-discharge combination is sufficient for planning purposes.
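The arithmetic of the worked example can be collected into a minimal sketch for checking alternative discharges. Variable and function names are mine, and linear advance and recession are assumed, exactly as in the example above.

```python
# Minimal sketch reproducing the arithmetic of the worked example (Cases 1 and 2).

def irrigation_summary(width_m, length_m, depth_m, discharge_m3_per_hr,
                       advance_hr, recession_hr):
    """Return irrigation volume (m3), irrigation time (hr), and the ratio of
    intake opportunity time at the upper end to that at the lower end."""
    volume_m3 = depth_m * width_m * length_m
    irrigation_hr = volume_m3 / discharge_m3_per_hr
    iot_upper = irrigation_hr                                   # wetted for the whole irrigation
    iot_lower = (irrigation_hr - advance_hr) + recession_hr     # infiltration stream plus recession
    return volume_m3, irrigation_hr, iot_upper / iot_lower

print(irrigation_summary(12, 200, 0.12, 60, 3.0, 0.5))    # Case 1 -> (288.0, 4.8, ~2.09)
print(irrigation_summary(12, 200, 0.12, 120, 1.0, 0.75))  # Case 2 -> (288.0, 2.4, ~1.12)

# Application efficiency for Case 1, with an average excess of 2 cm over the
# upper half of the check (24 m3 of deep percolation):
excess_m3 = 0.02 * (12 * 100)                  # 2 cm over a 12 m x 100 m half-check
print(round((288 - excess_m3) / 288 * 100))    # -> 92 (percent)
```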
Operational Considerations

Water supply. In order to achieve adequately rapid advance of water over the land, large discharges are often required. The largest streams are required for level basins, followed by border checks. Discharges per meter width of land range from 10 to 50 m3/hr (Rawitz 1973), and total discharge to a field ranges from 200 to 2,000 m3/hr (Reed, Meyer, and Aljibury 1976). Stream size per individual furrow is generally 1-15 m3/hr, with the total discharge per field as for border checks. The concentration of such large discharges requires a relatively costly distribution system, a large part of which will not be utilized at any particular time. Special control and turnout structures are required and must be maintained in proper working order. If conditions are not ideal, control of the large discharges may require a considerable input of labor. If the actual water source cannot supply the required discharge continuously, or if irrigation is carried out only during part of the day, operational reservoirs are required to balance supply and demand. These can be incorporated into a sewage treatment system, as the last of a series of aerobic lagoons. Surface irrigation does not require delivery of water under pressure, so that power costs for pumping are nil.

Water use efficiency. "Irrigation efficiency" can be defined in many different ways: for example, the project efficiency of delivery, consisting of the efficiency (or losses) of storage, conveyance, and actual field application. This type of efficiency is an engineering concept, and is defined as the ratio between useful water delivered in the field and the amount of water withdrawn from the source. Technically, it is possible to store and convey water for surface irrigation in unlined or lined earth reservoirs and ditches; conveyance can also be by pipelines. Unlined earth structures are subject to the greatest seepage losses. In addition to being a loss, such seepage can cause serious problems of waterlogging, salinization, and contamination of the land adjacent to the leaky structures. Conveyance losses from pipelines can be zero in a properly constructed and maintained system. The field water application efficiency is independent of the conveyance and storage efficiencies. In actual farm practice, the field application efficiency is generally much lower with surface irrigation than with the other available delivery methods, the main losses being due to deep percolation and to runoff.

Another important expression of irrigation efficiency is the crop water use efficiency, which is an agronomic concept. It is defined as either the total or the marketable yield per unit volume of water either used by the crop or applied to the field. The former is an index of the botanical efficiency of the crop, while the yield per unit of water applied is an indicator of both crop efficiency and operational efficiency. The agronomic efficiency, however defined, is of course affected by the technical efficiency of water application, but it also depends on a number of other factors. Among these are the suitability of the irrigation frequency or schedule, water quality, soil fertility, plant protection, weed control, and in fact any other factor affecting crop productivity. The combined effect of these two influences is generally a lower water use efficiency under surface irrigation than under the other methods.

The use of effluent in surface irrigation systems tends to accentuate the above problems and therefore requires additional attention during farm operations. The higher salinity of effluent increases the possibility of soil salinization, either in the field or in areas receiving excess water from runoff or seepage from the storage and conveyance system. In addition, public health and aesthetic considerations arise if excess irrigation water reaching the groundwater supply is withdrawn in the immediate area or flows uncontrolled over the land.
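Returning to the efficiency definitions given above, the engineering and agronomic concepts can be kept distinct with a minimal numerical sketch. All of the figures below are invented for illustration; only the definitions follow the text.

```python
# Minimal sketch of the efficiency definitions under "Water use efficiency".
# All numbers are illustrative assumptions, not data from the report.

water_withdrawn_m3   = 10000.0   # taken from the source
water_delivered_m3   = 7000.0    # reaching the field after storage and conveyance losses
water_to_rootzone_m3 = 3500.0    # actually stored in the crop root zone
marketable_yield_kg  = 4000.0

delivery_efficiency    = water_delivered_m3 / water_withdrawn_m3 * 100    # engineering concept, %
application_efficiency = water_to_rootzone_m3 / water_delivered_m3 * 100  # engineering concept, %
crop_water_use_eff     = marketable_yield_kg / water_delivered_m3         # agronomic concept, kg/m3

print(f"delivery efficiency:       {delivery_efficiency:.0f}%")
print(f"application efficiency:    {application_efficiency:.0f}%")
print(f"crop water use efficiency: {crop_water_use_eff:.2f} kg of yield per m3 applied")
```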
Labor considerations. Compared with other irrigation methods, the amount of labor per unit area is among the lowest for border check irrigation, and among the highest for furrow irrigation (Reed, Meyer, and Aljibury 1976). The actual labor requirement varies tremendously, depending on plot size, size of irrigation stream, quality of design and land preparation, and labor skill. A properly operating modern system requires a minimum of attention during irrigation, especially if it has some automated controls. On the other hand, if the water is difficult to control, attempts to do so are a backbreaking and frustrating task. The operation of a canal system requires technical knowledge and experience, and the field labor of water control requires skill acquired by practical experience. Good irrigators are found where surface irrigation is part of the farming tradition, since it is difficult to impart these skills in a formal program to people with no previous experience. Irrigation with effluent introduces the consideration of personnel contact with the effluent and possible infection by pathogens. Since health aspects are not within the scope of this chapter, it will be assumed that personnel contact with effluent should be avoided as much as possible, or that effluent should be treated to reduce health hazards to workers. Well-constructed border checks, with levees closing the downstream end and gates or valves serving to divert water from the supply line onto the land, should prevent workers from coming into contact with the irrigation water. Only in the case of mishaps--such as overtopping or breaching of a levee--would there be an increased likelihood of contact with the effluent, since it might become necessary for a worker to make emergency repairs with a shovel or hoe and to stand in water or mud. In furrow irrigation, much more guiding of the water may be needed than in border irrigation, both at the supply end and at the downstream end of the field. Water is commonly taken from the supply ditch to the land by siphons, which must be primed by hand, and the water must be guided to individual furrows by temporary earth ridges; thus some direct contact with the effluent is almost unavoidable.

Crop considerations. Border checks or level basins are essential for continuously flooded rice culture. Moreover, they are the only alternative to sprinkler irrigation for forage, pasture and hay crops, and small grains. They are also often used for irrigating orchards and vineyards and some row crops. However, crops that suffer from stem rot (such as trees) and row crops that appear to be affected by poor aeration due to flooding are better irrigated by furrows. With furrow irrigation, there is a tendency for soluble salts to accumulate at the highest points on the ridge, and such accumulations may affect some crops. Thus the plant row may have to be placed on the shoulder of the ridge rather than at the peak by means of a special planting technique. Besides the various agrotechnical advantages of ridge-and-furrow tillage for many vegetable crops, furrow irrigation decreases contact between the marketable yield and the irrigation water.
Sprinkler Irrigation

Distinguishing Characteristics of Sprinkler Irrigation

The most important characteristic of this irrigation method is that it relies upon a mechanical device to distribute water droplets over the land surface in an imitation of rain; the land surface is not prepared and shaped to control water flowing over it. The reasons for not shaping the land are mainly financial, but the choice is sometimes also dictated by specific site conditions. The first requirement of a sprinkler system is therefore to avoid the formation of surface runoff. This means that the water application rate must not exceed the lowest anticipated infiltration rate of the soil during an irrigation. To the extent that effluent application may have a specific effect on soil infiltrability (for example, high sodium concentration, soil clogging by colloids or microorganisms), this possibility must be taken into account in the planning stage. The application rate of a sprinkler system is determined by the combination of sprinkler spacing and sprinkler discharge, the latter being a function of nozzle diameter, hydraulic characteristics, and operating pressure. The sprinkler manufacturing industry offers a wide variety of equipment, so that the designer has great flexibility in choosing suitable equipment for a particular situation.

The second distinguishing feature of sprinkler irrigation is that the water must be delivered to the sprinkler under some pressure. Pressure is required not only to drive the water jet through the sprinkler nozzle; with virtually all sprinklers used in agriculture, it is also used to drive the mechanical moving parts of the sprinkler. Except in the rare case where the water source is close to the point of delivery but at a considerably higher elevation, pressure must be provided by a pump. Pressure therefore costs money, both for the equipment and for the energy to drive it. In order to deliver water to the sprinkler device under pressure, it must be conveyed from the pump to the sprinkler by pipelines. This adds a new element to the design problem: the energy loss (pressure loss) rate of water flowing at a given discharge through a pipe decreases as the pipe diameter increases, thus leading to a saving in pumping costs. However, the price of pipe increases with diameter, and it is up to the designer to find the optimal combination.

The third distinguishing feature of sprinklers is that virtually all sprinklers on the market wet a circular area of land. If one wants to wet the entire land surface (which is generally the case except for some orchard crops), then the areas wetted by adjacent sprinklers must overlap. If uniform irrigation is desired, one must find a way of mounting sprinklers with circular individual wetting patterns in a square, rectangular, or triangular grid.

Types of Sprinkler Systems

The entire land area served by a given water source or main pipeline is never irrigated at the same time. On the contrary, it is most economical to operate the system continuously and to irrigate only enough land at any one time to allow the entire area to be covered within the period allowed between irrigations. A group of sprinklers operating next to each other is called a "set," and diverting the water from one area to the next position is called "changing the set." One convenient way of classifying irrigation systems is by the method for changing sets, in other words, by the degree and manner of equipment portability.
Another way of classifying sprinkler systems is by the technical features of the delivery device.

Classification by portability criteria. Irrigation systems may be permanent, semipermanent (semiportable), or completely portable. For any given delivery device, it is obvious that a permanent system will have the highest initial cost and the lowest labor and depreciation cost, while a completely portable system will have the lowest initial cost but the highest operating cost. Choice of system therefore requires optimization of present and predicted local economic parameters. In a truly permanent system, all components remain in place. The set is changed by closing valves leading to one group of sprinklers and opening the valves of the next group to receive water. This type of system is most suitable for perennial crops, especially orchards, where labor costs of moving sprinkler lines are especially high. Permanent systems can be adapted for automated control of irrigation. The distribution pipelines are generally buried in the ground to protect them against damage and to eliminate a traffic obstacle. A variation of the permanent system is the so-called "solid-set" system. Used generally with annual crops, portable sprinkler lines are placed in the field so as to cover the entire area without having to be moved and are left in place throughout the growing season. The supply pipe may be either portable or permanent. All the portable equipment is removed from the field at the end of the irrigation season.

In semipermanent systems, the supply pipes (mains and submains) are permanently in place, usually underground, and the pipes on which the sprinklers are mounted (sprinkler laterals) are portable. Changing a set consists of moving the laterals from one position to the next. This may be done in several ways, in ascending order of complexity: (1) hand move, in which individual pipe sections, 6-12 m in length, are carried to the next position; (2) tractor-tow move, where each pipe section has a set of small wheels and the entire line is pulled along its axis for a distance equal to the lateral length (typically 200 m) to the next set position--in this method it is customary to irrigate a group of 5-10 adjacent laterals simultaneously; (3) roll move, where wheels up to 1.2 m in diameter are mounted at intervals along the lateral, and the set is changed by rolling the lateral in a direction perpendicular to its axis for a distance equal to the lateral spacing (typically 12-24 m), the same spacing as with hand move or a solid-set system; and (4) center-pivot systems, where a lateral up to 400 m long is mounted on wheeled towers and connected to a water source at one end. A motor-generator supplies electric power to motors mounted on the tower wheels, and the entire lateral slowly turns around the water source. The system can be left in place for an entire season, but in some instances can also be towed from field to field during the irrigation season. In a variation of this system, the lateral moves in a straight line, perpendicular to its axis, just as in the roll-move system, using the same towers as a center-pivot system. The advantage is that while the center-pivot irrigates a circular area, the frontal-move system irrigates a strip. Provision must be made to keep the continuously moving lateral connected to a water source. Various ways of changing set are illustrated in Figures 6-17 to 6-20.

Completely portable systems have the lowest initial cost and the highest operating costs of the three levels of portability.
The entire conveyance network is portable; it consists of aluminum or plastic pipes laid temporarily on the soil surface. Some completely portable systems are essentially temporary extensions of a permanent conveyance system and draw their water from the distal point of such a system. Others even have a portable water source, for example, a trailer-mounted or tractor-mounted pump drawing water from a surface source such as a lake or a river at a convenient point. In Israel, completely portable systems are used mainly for supplementary irrigation of winter crops such as wheat, when water is available. Such an uncertain arrangement does not justify more permanent installations. To keep labor costs down, more and more trailer-mounted sprinkler guns and center-pivot systems are being used as the delivery device in portable systems. Completely portable systems should prove useful with wastewater irrigation: they could help balance the seasonal discrepancy between supply and demand, thus decreasing the required reservoir storage capacity; they are well adapted for large-discharge sprinklers, with which clogging problems are minimal; and the equipment used involves minimal operator exposure to wastewater. The crops suitable for supplementary irrigation either are not grown for human consumption or are subjected to atmospheric drying and processing before consumption. This type of system also facilitates the conjunctive use of rainwater and wastewater, since it is sufficiently flexible to allow adjustment of irrigated area versus size of application according to the rainfall pattern of a particular season.

Classification according to delivery device. Many different types of devices are available to distribute water under pressure over the land; not all of them are called sprinklers, since they use various mechanical principles to distribute water over the target area. Only those devices usable in agricultural fields with sewage effluent will be discussed here.

Fig. 6-17. "Moving the set" of a lateral in a hand-move system.

The great majority of delivery devices sold are classified as impact sprinklers (Fig. 6-21). Depending on the area to be covered and the discharge required, such a sprinkler has one to three nozzles and an impact arm that rotates the sprinkler about its vertical axis. The water jet from one of the nozzles strikes the impact arm, deflecting it either to the side or downwards. The arm is returned either by a spring or a counterweight, hitting a rubber bumper on the side of the sprinkler and turning it by a few degrees. The force of the returning arm may be augmented by the water jet impacting on a special target on the arm. A properly adjusted impact sprinkler will make a full revolution about once every 60-120 seconds. Impact sprinklers are subdivided roughly according to size (low, medium, and high discharge), operating pressure, and radius of throw. The smallest models are not suitable for effluent application without the use of filters. The medium-size sprinklers operate at pressures between about 1.5 and 4.0 atm., are mounted on 3/4-inch or 1-inch risers, have a radius of throw of 10-30 m, and a discharge of 1.5-6 m3/hr. Special models for part-circle wetting and low-trajectory models for under-tree sprinkling in orchards are available. High-pressure sprinklers, so-called sprinkler guns, operate in the pressure range of 4-8 atm., are mounted on 2-inch or 3-inch risers, have a throw radius of 35-75 m, and have a discharge of 20-150 m3/hr. All of the sprinkler models are available with a choice of nozzle diameters, and each will operate satisfactorily over a certain pressure range. This makes it possible to choose different sprinkler spacings both along and between laterals. The many possible combinations offer a choice of application rates between about 0.5 and 3 cm/hr. In order to avoid any possible clogging problems with effluent irrigation, it has been recommended that nozzle orifice diameter be at least 0.5 cm (Noy and Feinmesser 1977), which is why low-discharge impact sprinklers have not been discussed here.
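The application rate follows directly from the sprinkler discharge and the spacing chosen along and between laterals, and it must be checked against the lowest expected soil infiltration rate, as noted earlier. The sketch below illustrates the calculation; the discharge, spacing, and infiltration figures are assumed example values, not recommendations from this report.

    # Gross application rate for a rectangular sprinkler grid, with a check
    # against soil infiltrability. All numerical values are assumed examples.

    def application_rate_cm_hr(discharge_m3_hr, spacing_along_m, spacing_between_m):
        """Depth of water applied per hour (cm/hr) over the area served by one sprinkler."""
        area_m2 = spacing_along_m * spacing_between_m
        return discharge_m3_hr / area_m2 * 100.0   # m/hr converted to cm/hr

    rate = application_rate_cm_hr(2.0, 12.0, 18.0)  # medium sprinkler on a 12 x 18 m grid
    print(f"Application rate: {rate:.2f} cm/hr")    # about 0.9 cm/hr

    # The rate must not exceed the lowest anticipated infiltration rate of the
    # soil during an irrigation (here assumed to be 1.2 cm/hr).
    assert rate <= 1.2, "Runoff hazard: reduce sprinkler discharge or widen the spacing"

The result, about 0.9 cm/hr, falls within the 0.5-3 cm/hr range quoted above for medium-pressure impact sprinklers.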
Center-pivot and frontal-move systems can deliver water in several ways. Originally designed for use with ordinary medium-pressure impact sprinklers, the machines have lately been adapted for use with low-pressure and ultra-low-pressure emitters, mainly to save pumping energy, but this has a bearing on effluent irrigation as well.

Fig. 6-18. "Moving the set" of a lateral in a tractor-tow system.

Fig. 6-19. "Moving the set" of a lateral in a roll-move system.

Fig. 6-20. A center-pivot irrigation system.

Fig. 6-21. A typical medium-pressure impact sprinkler.

Low-pressure emitters consist of a special spray nozzle facing downward from a drop tube extending down from the lateral pipe to about 30 cm above crop canopy height. The jet from the nozzle hits a horizontal metal plate mounted on the drop tube with a stirrup and breaks up into droplets, which are thrown horizontally in a circular pattern having a radius of 2-4 m. This emitter forms relatively large drops at a low height, thus decreasing the hazard of aerosol formation and pollution by wind drift.

Uniformity of Water Distribution

One of the basic requirements of a good sprinkler system is that the water be distributed over the land surface as uniformly as possible. Orchard crops are a special case; whether irrigation is applied by over-tree or by under-tree sprinklers, the tree geometry interferes with uniform distribution no matter how well the equipment itself performs. It has been shown that a certain degree of controlled nonuniformity is not harmful in orchard irrigation. The factor that determines the importance of nonuniformity is the size of the area receiving either more or less water than the design application in relation to the size of the root system of an individual plant. Variations within the root system of a single plant are much less important than variations between plants. It is evident that in the case of close-growing field and truck crops, detectable areas of nonuniformity will include a large number of individual plants, and thus uniform distribution is important. Excess water percolates below the crop root zone, representing a loss of water and of soluble nutrients, and it may ultimately affect groundwater quality. Areas receiving insufficient water will give a lower yield. When considering sprinkler uniformity it is well to distinguish between large-scale effects, which act on an entire lateral or on groups of laterals, and small-scale effects related to individual sprinklers, which act within the range of influence of adjacent sprinklers whose patterns overlap.
Large-scale nonuniformity--discharge variation due to friction losses. The transport of water through a conveyance system requires the expenditure of energy, which is used up as friction between the water stream and the ditch or pipe wall, as well as friction between the water molecules themselves. A detailed treatment of the hydraulics of sprinkler systems is beyond the scope of this chapter and may be found in easily accessible sources (Christiansen 1942; Christiansen and Davis 1967). Examples used here will be simple enough so that it will not be necessary to refer to these sources. If a constant discharge flows through a pipe of constant diameter at a constant elevation, the energy loss due to friction will express itself as a linear loss of pressure with distance. The case of a sprinkler lateral is more complicated because it has multiple outlets, uniformly spaced along its entire length. Each of these removes about an equal amount of discharge from the pipe. Furthermore, various small losses of energy occur along the flow path owing to sudden changes of direction and pipe diameter, lumped together as "local losses." Nevertheless, as water flows through a sprinkler lateral, the pressure will gradually decrease, though not in a linear manner. The flow through a nozzle, in this case a sprinkler nozzle, is described by the equation:

(3)    q = cA √(2gh)

where

    q = discharge
    c = coefficient of discharge of the orifice
    A = cross-sectional area of the nozzle opening
    g = acceleration due to gravity
    h = energy head of water

For our purposes, the energy head can be thought of as pressure, expressed in units of equivalent water depth, or head, with one atmosphere of pressure being virtually identical to a head of 10 m of water. The meaning of the above equation is that discharge through a nozzle varies as the square root of the pressure. This law describes the differences in discharge to be expected along a sprinkler lateral as the pressure decreases with distance from the source. There is no law stating what variation in discharge between sprinklers is tolerable along a lateral or among a group of sprinklers operating in a set of several laterals. The irrigation profession has more or less arbitrarily accepted an upper limit of 10 percent between the sprinkler with the highest discharge and the one with the lowest discharge among a group of sprinklers operating simultaneously. Equation 3 can be solved simultaneously for the case of the two extreme sprinklers in a set in order to determine what difference in pressure will produce no more than a 10 percent difference in discharge:

(4)    q1/q2 = √(h1/h2)

Designating q1 as the higher discharge and q2 as the lower one, their ratio will be 1.1 according to the 10 percent difference criterion, which is almost exactly the square root of 1.2. It thus turns out that a pressure difference of 20 percent between the extreme sprinklers will result in a discharge difference of about 10 percent. Since pressure is easier to measure in the field than discharge, the so-called "20 percent law" has been adopted as the design criterion. Two things must be emphasized about the use of this "law." First, the magnitude of the acceptable systematic large-scale variation in discharge was chosen more or less arbitrarily. For example, the same physical laws are often used in drip irrigation design, but with a stricter criterion, allowing only a 10 percent pressure variation.
Second, while the 20 percent law was originally developed for a single sprinkler lateral, the hydraulics applies equally to a submain with a number of valves, each of which is attached to a lateral, with all the laterals operating simultaneously. In this case, the submain is treated as a lateral, and each lateral as if it were a single sprinkler. As a consequence, if a number of laterals are working simultaneously and for the same length of time, and if it is necessary to remain within the 10 percent discharge difference in the area irrigated, the 20 percent law must be applied to the entire area. It is then no longer permissible to allow a 20 percent pressure difference along any single lateral; rather, the criterion must be applied to the difference in pressure between the first sprinkler on the furthest upstream lateral and the last sprinkler on the furthest downstream lateral (Fig. 6-22). If the land is not level or nearly level, then differences of elevation must be added algebraically to differences in pressure head. The standard procedure for maintaining acceptable water distribution uniformity is to choose pipe diameters that will keep the pressure drop between extreme sprinklers below the above-mentioned 20 percent, and this is the procedure recommended when effluent is to be used. Graphic aids such as special nomograms (Christiansen 1942; Rawitz 1973) and hydraulic tables are available both in the professional literature and in equipment manufacturers' catalogs. Some manufacturers distribute special slide rules that are very useful in irrigation network design, and others are sold commercially. The appropriate equations (Christiansen 1942) can also be incorporated in fairly simple programs for use with programmable pocket calculators.

Fig. 6-22. [Diagram of a submain with valves feeding several laterals; panels (a) and (b).]
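The square-root relation in equation 3 is easy to verify numerically. The sketch below is a minimal illustration rather than a design tool: it computes the nozzle discharge of the two extreme sprinklers in a set under an assumed 20 percent head difference; the discharge coefficient, nozzle diameter, and heads are illustrative assumptions.

    import math

    # Nozzle discharge (equation 3) and the "20 percent law" illustrated
    # numerically. The coefficient, nozzle size, and heads are assumed values.

    G = 9.81  # acceleration due to gravity, m/s2

    def nozzle_discharge_m3_hr(c, nozzle_diameter_m, head_m):
        """q = c * A * sqrt(2 g h), converted from m3/s to m3/hr."""
        area_m2 = math.pi * nozzle_diameter_m ** 2 / 4.0
        return c * area_m2 * math.sqrt(2.0 * G * head_m) * 3600.0

    # Extreme sprinklers in a set: 24 m of head at the first, 20 m at the last
    # (a 20 percent difference relative to the lower head, elevation included).
    q_high = nozzle_discharge_m3_hr(0.95, 0.005, 24.0)
    q_low = nozzle_discharge_m3_hr(0.95, 0.005, 20.0)

    print(f"Discharges: {q_high:.2f} and {q_low:.2f} m3/hr")
    print(f"Ratio: {q_high / q_low:.3f}")   # sqrt(24/20) = 1.095, roughly a 10 percent spread

Whatever nozzle size or coefficient is assumed, the ratio depends only on the square root of the head ratio, which is the content of the 20 percent law.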