4 Issues of Model Application
Pages 77-112



From page 77...
... analytical methods, especially the use of either the waste extraction test (WET) or the toxicity characteristic leaching procedure (TCLP)
From page 78...
... used does not correspond to the physical processes occurring in the scenario · Structural errors: Errors or ambiguities in the structure of the models that lead to errors in calculations. · Documentation errors: The description of the model differs from the model that was intended to be adopted.
From page 79...
... The specific dietary intake parameter values need to be realistic. Those used by DTSC do not appear to have been selected for real-life conditions and draw upon data that are 10 to 30 years old (an example of the mistaken identity error)
From page 80...
... Some of the difficulty in the exposure assessments for the adjacent resident scenario can be traced to estimates of dietary intake, most specifically for home-grown foods. These estimates are presented in the CalTOX parameter values section of the DTSC report (1995a; pp.
From page 81...
... Neither EFH nor the DTSC report provides specific data on the number of gardens in California. It is reasonable to assume that some state-specific data for consumption of home-grown fruits and vegetables are available for California, a major agricultural state, yet only data from the EFH are used.
From page 82...
... However, the committee suspects that corn grown in home gardens is used as a vegetable, not a grain (a mistaken identity error)
From page 83...
... , on a survey of 900 rural farm households published in 1966, and they apply only to farm households. It is highly unlikely that such numbers can apply to suburban and urban settings, where keeping livestock is usually against local ordinances (a mistaken identity error)
From page 84...
... If DTSC is attempting a worst-case analysis, then the worst-case would have to be applied. This type of error could be either a mistaken identity error if ambient water quality criteria were assumed to be relevant to groundwater, or an extrapolation error if it was assumed that groundwater concentrations correspond to surface-water concentrations.
From page 85...
... Thus, the liner protection factor is calculated from an incorrect base value. This could be classified as a mistaken identity error for all the parameter values for the unlined landfill.
From page 86...
... Calculations for Upper SERTs The difference between the calculation of the lower and upper SERTs is the liner protection factor. The upper SERTs are calculated by multiplying the lowest of the health-based level, the maximum contaminant level
From page 87...
... The spreadsheet calculations for upper SERT results appear to correspond to the values for liner protection factors present in the Crystal Ball custom distribution (which do not correspond to any documented set of values)
From page 88...
... These apply when the upper SERT is based on maximum contaminant level-ambient water quality criteria, oral reference dose, and cancer potency, respectively. The value obtained when the upper SERT is based on maximum contaminant level-ambient water quality criteria is necessarily the lowest of the six point estimates for the liner protection factor, because the liner-protection-factor distribution is the only distribution involved in this case.
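In code form, the arithmetic these excerpts describe looks roughly like the sketch below; the function name, candidate levels, and factor value are hypothetical illustrations, not DTSC's figures:

    # Sketch of the upper-SERT arithmetic as the excerpts describe it:
    # the lowest applicable health-based level is multiplied by a
    # liner protection factor. All names and numbers are illustrative.
    def upper_sert(candidate_levels_mg_per_L, liner_protection_factor):
        return min(candidate_levels_mg_per_L) * liner_protection_factor

    # e.g., levels based on an MCL-AWQC, an oral reference dose, and a
    # cancer potency factor (values made up for illustration):
    print(upper_sert([0.05, 0.2, 0.1], liner_protection_factor=10.0))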
From page 89...
... This is a structural error, but DTSC's goals have not been sufficiently well defined to determine whether the structural error is potential or actual. Exposure Pathway Factor for Inhalation The definition of "Pathway Exposure Factor for Inhalation" (DTSC 1995a, p.
From page 90...
... At the second public meeting, DTSC indicated that a concentration of 50 µg/m³ of dust in air (based on the National Ambient Air Quality Standards) was used as a default value and that dust emission and dispersion had not been modeled because there were no models for dust emission (DTSC, personal commun., November 20, 1998).
From page 92...
... 560 and 578), and thus correspond to a very different size distribution from wind-blown dust, so that using this parameter is an error of mistaken identity (or possibly an extrapolation error)
From page 93...
... One equation is for wet bulk density, the other is for dry bulk density, but that distinction is not made. It must be specified whether bulk_density is wet or dry bulk density (this appears to be a documentation error; the propagation of this error in the implementation has not been checked)
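The two quantities differ by the water held in the soil, which is why the ambiguity matters; a minimal sketch, assuming the standard soil-physics relation (general knowledge, not taken from the DTSC report):

    # Wet bulk density = dry bulk density + volumetric water content
    # times the density of water; the gap between the two can reach
    # 15-25% for moist soils, so leaving the choice unspecified matters.
    WATER_DENSITY = 1000.0  # kg/m^3

    def wet_bulk_density(dry_bulk_density_kg_m3, volumetric_water_content):
        return dry_bulk_density_kg_m3 + volumetric_water_content * WATER_DENSITY

    print(wet_bulk_density(1300.0, 0.25))  # 1550.0 kg/m^3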
From page 94...
... 817), mixing height is given as 2 m for a 4.54 × 10⁵ m² landfill, with a reference to an earlier 1994 DTSC report.
From page 95...
... In the context of its chosen model, therefore, DTSC's averaging method is a structural error. However, as pointed out in Chapter 3 and above, the emission model is incorrect for the scenario evaluated.
From page 96...
... The PEA spreadsheets have been constructed to estimate a concentration C given a fixed target value for R and fixed values of the parameters a, b, c, .... When the Monte Carlo procedure is run in one of these spreadsheets, what is obtained is a set of values for C at the fixed target value for R
From page 97...
... Given fixed initial compartment concentrations (or similar inputs proportional to such concentrations, like rates of application), a distribution of risk estimates is calculated, and the required percentile of the risk distribution obtained.
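These two excerpts describe opposite directions of Monte Carlo calculation. A minimal sketch, assuming a toy linear model risk = C × a × b with placeholder parameter distributions (none of this is DTSC's actual model):

    import random

    def sample_params():
        # Placeholder parameter distributions, for illustration only.
        return random.lognormvariate(0.0, 0.5), random.lognormvariate(-1.0, 0.3)

    # "Backward" direction (as in the PEA spreadsheets): fix a target
    # risk R and solve for the concentration C in each iteration,
    # producing a distribution of concentrations at that fixed R.
    target_risk = 1e-6
    concentrations = [target_risk / (a * b)
                      for a, b in (sample_params() for _ in range(10000))]

    # "Forward" direction (as described for CalTOX): fix the
    # concentration, accumulate a distribution of risk estimates, and
    # read off the required percentile.
    fixed_conc = 1e-3
    risks = sorted(fixed_conc * a * b
                   for a, b in (sample_params() for _ in range(10000)))
    print(risks[int(0.95 * len(risks))])  # 95th-percentile risk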
From page 99...
... However, as with the PEA model, many of the component models within CalTOX appear to be oversimplifications for the scenarios that DTSC is considering. In addition, the committee found numerous errors, in the context of the DTSC scenarios, in the parameter values used in CalTOX.
From page 100...
... (Chapter 3 also has a short discussion of some pervasive extrapolation errors in the estimation techniques used by DTSC to estimate some of the parameter values used in CalTOX, for example, partition coefficients for plant tissue and biotransfer factors to meat, milk, and eggs.) Dust-Deposition Velocity The reported dust-deposition velocity (DTSC 1995a, p.
From page 101...
... The deposition velocity at any one location will vary substantially from time to time, but that is irrelevant for the modeling required by the DTSC scenario, an error of mistaken derivation. Moreover, there is also a transcription error: 500 m/day (CV 0.30)
From page 102...
... for determining the extractable constituents of hazardous wastes not classified under the Resource Conservation and Recovery Act (RCRA), relying rather on the use of EPA's toxicity characteristic leaching procedure (TCLP)
From page 104...
... , the committee supports the development of a single test protocol to classify California's hazardous waste, and to do so in harmony with the classification test of EPA. Such a test should provide results that can be related to field-realistic exposures, including the uncertainties associated with leaching pathways in the field.
From page 105...
... The committee recognizes that the TCLP has nationwide status, use, and acceptability. Harmonizing California's extraction test with that required by EPA would minimize the testing burden on waste disposers in California, who would need to conduct only the TCLP.
From page 106...
... Biological sensitivity is fixed by the inherent toxicity of the analyte and the response of the organism being exposed to the analyte under specified conditions. Comparative testing to determine if the use of detection limits as proxy values is protective under reasonable exposure scenarios is lacking.
From page 107...
... The chronic toxicity risk assessments are based on reference doses or concentrations, or cancer potency factors that were designed to protect the general population, including sensitive subpopulations. Thus, the thresholds developed based on chronic low-dose exposures of the general population would be expected to be much lower than the thresholds that might be developed for acute toxicity based on almost any acute exposure scenario.
From page 108...
... 723. DTSC presents the use of the acute oral and dermal toxicity thresholds as though they are based on various acute exposure scenarios (DTSC 1995a, pp.
From page 110...
... DTSC should also consider the inclusion of respiratory, ocular, and dermal irritation testing, as well as allergic sensitization testing, in its battery of acute toxicity tests. The nuisance factor of odors may also have to be taken into account to meet some goals.
From page 111...
... The current threshold for classifying waste as hazardous is based on a 96-hour LC50 value of 500 mg/L (22 California Code of Regulations § 66261.24(a)). DTSC proposes to retain this regulation but to use only a fish acute lethality bioassay to bring wastes into the lower tier of hazardous waste.
From page 112...
... For the approach to be risk-based, DTSC must consider exposure and dose or concentration simultaneously when establishing a risk threshold. The risk presented by a waste is a function of exposure concentration and a threshold for acute effects; thus, setting a single value for a threshold is inappropriate.
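A toy calculation makes the point; the intakes, body weights, and threshold below are invented for illustration, not taken from the report:

    # The dose delivered by the same waste concentration differs by
    # orders of magnitude across exposure scenarios, so a single
    # concentration cutoff cannot represent acute risk by itself.
    def acute_hazard_quotient(conc_mg_per_L, intake_L, body_weight_kg,
                              acute_threshold_mg_per_kg):
        dose_mg_per_kg = conc_mg_per_L * intake_L / body_weight_kg
        return dose_mg_per_kg / acute_threshold_mg_per_kg

    # Same 500 mg/L concentration, two hypothetical exposure scenarios:
    print(acute_hazard_quotient(500.0, 0.01, 70.0, 5.0))  # ~0.014
    print(acute_hazard_quotient(500.0, 1.0, 10.0, 5.0))   # 10.0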

