Complexity in Catchment Modelling Systems and its Impact on Predictive Reliability

Publisher: Conference Design Pty Ltd
Publication Type: Conference Proceeding
Citation: 30th Hydrology and Water Resources Symposium, 2006, pp. 247-252
Issue Date: 2006-01
Abstract: In general, the calibration process involves minimisation of the deviation between recorded information and the simulation predictions through repeated adjustment of control parameters. Implementation of this process requires temporal and spatial information at an adequate resolution to achieve robust predictions from a catchment modelling system. Unfortunately, the available information is usually not adequate for this purpose, and it becomes necessary either to use catchment-average values or to use other techniques to infer the necessary information. Developments in information technology and the availability of digital information have facilitated the latter approach (see, for example, Choi and Ball, 1999). Using the Centennial Park Catchment in Sydney, Australia as a test catchment, inference models for estimation of the control parameters necessary to implement the Stormwater Management Model (SWMM) were developed. A number of alternative inference models were developed to assess the influence of inference model complexity and structure on the calibration of the catchment modelling system. These inference models ranged from the assumption of a spatially invariant value (a catchment average) to a spatially variable approach in which each sub-catchment has its own unique values. Furthermore, the influence of different measures of deviation between the monitored information and the simulation predictions was considered. The results of these investigations into the complexity and structure of models used in the calibration process are presented herein.
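The calibration loop described in the abstract can be sketched in miniature. The toy linear-reservoir runoff model, the parameter values, and the candidate search below are illustrative assumptions for exposition only, not the paper's actual SWMM setup; the sketch shows how the choice of deviation measure (here RMSE versus mean absolute error) plugs into the same parameter-adjustment loop:

```python
import math

def simulate_runoff(rainfall, k):
    """Toy linear-reservoir model (assumed for illustration):
    storage fills with rainfall and drains at fraction k per step."""
    storage, flows = 0.0, []
    for r in rainfall:
        storage += r
        q = k * storage   # outflow proportional to current storage
        storage -= q
        flows.append(q)
    return flows

def rmse(obs, sim):
    """Root-mean-square deviation between observed and simulated flows."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mae(obs, sim):
    """Mean absolute deviation between observed and simulated flows."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def calibrate(rainfall, observed, objective, candidates):
    """Repeatedly adjust the control parameter k, keeping the value
    that minimises the chosen measure of deviation."""
    return min(candidates,
               key=lambda k: objective(observed, simulate_runoff(rainfall, k)))

# Synthetic "recorded" flows generated with a known k = 0.3,
# so a correct calibration should recover that value.
rainfall = [5.0, 0.0, 10.0, 2.0, 0.0, 0.0]
observed = simulate_runoff(rainfall, 0.3)

candidates = [i / 100 for i in range(1, 100)]
best_rmse = calibrate(rainfall, observed, rmse, candidates)
best_mae = calibrate(rainfall, observed, mae, candidates)
```

With noise-free synthetic observations both deviation measures recover the same parameter; with real monitored data the two objectives generally weight large and small errors differently and can calibrate to different parameter sets, which is the sensitivity the abstract's investigation examines.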