Thursday, June 28, 2012

Hydrologic model [SWAT] development based on open source data

[1] Hydrological Modeling:

Hydrologic models are simplified, conceptual representations of a part of the hydrologic cycle [1A]. They are primarily used for hydrologic prediction and for understanding hydrologic processes. Two major types of hydrologic models can be distinguished. (A) Stochastic models: these are black-box systems, based on data, that use mathematical and statistical concepts to link a certain input (for example, rainfall) to the model output (for instance, runoff). Commonly used techniques are regression, transfer functions, neural networks and system identification. These models are known as stochastic hydrology models. (B) Process-based models: these try to represent the physical processes observed in the real world. Typically, such models contain representations of surface runoff, subsurface flow, evapotranspiration and channel flow, but they can be far more complicated. These models are known as deterministic hydrology models, and they can be subdivided into single-event models and continuous simulation models. Various types of hydrological model are nowadays available in the public domain [1B][1C].
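To make the "black box" idea in (A) concrete, here is a minimal sketch in R that links daily rainfall to runoff with a simple linear regression. The data frame and its values are purely illustrative, not taken from any real catchment.

# Minimal sketch of a black-box (stochastic) rainfall-runoff link in R.
# 'hydro' is an illustrative data frame of daily rainfall (mm) and runoff (m3/s).
hydro <- data.frame(
  rainfall = c(0, 5, 12, 3, 0, 20, 8, 0, 2, 15),
  runoff   = c(1, 2,  6, 2, 1,  9, 4, 1, 1,  7)
)

fit <- lm(runoff ~ rainfall, data = hydro)   # input -> output, no physics involved
summary(fit)                                 # slope and intercept of the fitted relation

predict(fit, newdata = data.frame(rainfall = 10))   # runoff estimate for 10 mm of rain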

Citations:


[2] SWAT model and its application:
The SWAT model (Arnold et al., 1998) is a semi-distributed, continuous watershed simulator operating on a daily time step [2A]. It was developed jointly by the USDA and Texas A&M University for assessing the impact of management and climate on water supplies, sediment, and agricultural chemical yields in watersheds and larger river basins. The model is semi-physically based and allows simulation of a high level of spatial detail by dividing the watershed into a large number of sub-watersheds. The major components of SWAT include hydrology, water supply, water quality, weather, erosion, plant growth, nutrients, pesticides, land management, and stream routing. The application of SWAT has spread all over the world because of its versatility [2B]. Government agencies such as the EPA and USDA use SWAT for tracking environmental problems like water quantity, water quality, nutrient cycling and in-stream processes [2C, 2D]. The widest application of SWAT is in government agencies and research organizations in the USA and the EU. SWAT model output is also often used indirectly, since the model can be coupled flexibly with other models. A large number of land use and climate change studies have been carried out all over the world with the SWAT model [2E, 2F], and large-scale European projects frequently use SWAT as their central hydrologic model [2G]. Here is an example of a SWAT application in the Himalayan region for impact assessment of climate change [2H].






Citations



[3] Watershed Delineation
Watershed delineation is required to provide a boundary for the watershed. SWAT uses the ArcHydro algorithm for watershed delineation [3A]. The watershed delineation tool carries out advanced GIS functions to help the user segment the watershed into several hydrologically connected sub-watersheds for use in SWAT modeling [3B]. There are two methods for watershed delineation in the SWAT model: one is the DEM-based method, which uses the DEM of the study area, and the other is the pre-defined method, in which users define the reaches and sub-basins manually. Most researchers currently use the first method, which gives high precision only in areas with sufficient terrain slope [3C]. During watershed delineation, flow direction and flow accumulation are computed from the DEM.
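As a rough illustration of the DEM-based step (not the ArcSWAT procedure itself), the R sketch below derives a D8 flow direction grid from a DEM using the raster package; the file name is hypothetical, and flow accumulation, stream definition and outlet selection are normally completed inside the ArcSWAT delineation tool.

# Sketch: derive a D8 flow-direction grid from a DEM in R (raster package).
# "dem.tif" is a hypothetical input file.
library(raster)

dem     <- raster("dem.tif")               # load the digital elevation model
flowdir <- terrain(dem, opt = "flowdir")   # D8 flow direction (coded as powers of 2)
plot(flowdir, main = "D8 flow direction")

# Flow accumulation, stream definition and sub-basin outlets would follow,
# which ArcSWAT handles through its watershed delineation interface.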





Citations:





[4] HRU analysis
Hydrological response units (HRUs) are areas within a watershed that respond hydrologically in a similar way to a given input. They are a means of representing the spatial heterogeneity of a watershed. With the introduction of HRUs it becomes possible to assume similar hydrologic behaviour within each unit, which can then be modeled easily; in SWAT, an HRU is a unique combination of land use, soil type and slope class within a sub-basin. Plenty of hydrological models use the HRU as the unit of response within a sub-basin [4B][4C].
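As a conceptual sketch (again, not the ArcSWAT HRU definition dialog itself), the R snippet below cross-tabulates a land-use raster against a soil raster to count the unique land-use/soil combinations that would become candidate HRU classes; both file names are hypothetical and the rasters are assumed to share the same grid.

# Sketch: count unique land-use/soil combinations (candidate HRU classes) in R.
# "landuse.tif" and "soil.tif" are hypothetical rasters on the same grid.
library(raster)

landuse <- raster("landuse.tif")
soil    <- raster("soil.tif")

combos <- crosstab(landuse, soil)   # cell counts for every land-use x soil pair
combos

sum(combos > 0)                     # number of distinct combinations in the watershed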





Citations


[5] Weather Inputs and First simulation
The weather data definition window is divided into six tabs: (1) Weather Generator Data, (2) Rainfall Data, (3) Temperature Data, (4) Solar Radiation Data, (5) Wind Speed Data and (6) Relative Humidity Data. The first tab defines the weather generator and the location of its stations. The interface will not allow the user to perform any other input data processing until the Weather Generator Data are defined. The other five tabs let the user choose between simulated and measured data for each type of input. We will use only measured precipitation and temperature data for this case study and let the model generate the other variables.
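For readers preparing their own rain gauge files, here is a hedged sketch that writes a daily precipitation series to the plain-text gage format ArcSWAT is assumed to accept (first line = start date as yyyymmdd, then one value per line, -99 for missing); the data frame, column names and output file name are hypothetical, so check the format against the SWAT input documentation.

# Sketch: write a daily precipitation series to an assumed ArcSWAT text-gage format.
# 'pcp' is a hypothetical data frame with a 'date' (Date) and 'precip_mm' column.
pcp <- data.frame(
  date      = seq(as.Date("1996-01-01"), as.Date("2010-12-31"), by = "day"),
  precip_mm = NA_real_          # replace with the observed series for the station
)

vals <- ifelse(is.na(pcp$precip_mm), -99, round(pcp$precip_mm, 1))
out  <- c(format(min(pcp$date), "%Y%m%d"), vals)    # start date, then one value per day

writeLines(as.character(out), "pcp_station1.txt")   # hypothetical output file name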





[6] Primary results and the need for calibration
We obtained a daily discharge time series from 1996 to 2010 for the Mendoza River watershed at its downstream outlet (located with GPS in sub-basin no. 7). We will run the model for six years, from 1999 to 2004. The first three years, 1996-1998, are kept as a warm-up period, which stabilizes some of the initial model parameters, and the remaining period, 2005 to 2010, is reserved for validation. We will look at results at daily and monthly time steps with some hydrograph analysis and, together with visual inspection, discuss the statistical performance of the hydrological model. In this post we will not go into the details of the calibration process, which will be covered in a separate, complete post.
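As a small sketch of the daily-to-monthly comparison described above, the following R code aggregates daily observed and simulated flows to monthly means and plots the two hydrographs; the file and column names are hypothetical.

# Sketch: aggregate daily observed/simulated discharge to monthly means and plot.
# "flow_daily.csv" is a hypothetical file with columns date, obs, sim (m3/s).
flows <- read.csv("flow_daily.csv", stringsAsFactors = FALSE)
flows$date  <- as.Date(flows$date)
flows$month <- format(flows$date, "%Y-%m")

monthly <- aggregate(cbind(obs, sim) ~ month, data = flows, FUN = mean)

plot(monthly$obs, type = "l", col = "blue", xlab = "Month index",
     ylab = "Discharge (m3/s)", main = "Monthly mean hydrograph")
lines(monthly$sim, col = "red")
legend("topright", legend = c("Observed", "Simulated"), col = c("blue", "red"), lty = 1)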





[7] Running SWAT in the R environment
The statistical programming tool R is also open source [7A]. In order to get better results from SWAT, we need to test ranges of parameter values, and for manual calibration we have to re-run the model each time. If we use Excel, we have to re-plot the results after every run, which is time-consuming and tedious. Instead, we can program the statistical performance measures such as NSE, MSE or percent bias in R; then, after changing a basin parameter, we can re-run the model and re-evaluate it without plotting in Excel. Recently, an R package and a multi-objective genetic algorithm tool for SWAT calibration in R have also become available [7B][7C].
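As an example, here is a minimal base-R sketch of the performance measures mentioned above (the hydroGOF package offers ready-made equivalents such as NSE and pbias):

# Sketch: hand-coded goodness-of-fit measures for observed vs simulated series.
nse <- function(obs, sim) {
  # Nash-Sutcliffe efficiency: 1 = perfect fit, below 0 = worse than the mean of obs
  1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
}

pbias <- function(obs, sim) {
  # Percent bias: positive values indicate the model underestimates on average
  100 * sum(obs - sim) / sum(obs)
}

mse <- function(obs, sim) mean((obs - sim)^2)   # mean squared error

# Usage with hypothetical observed/simulated vectors:
# nse(flows$obs, flows$sim); pbias(flows$obs, flows$sim); mse(flows$obs, flows$sim)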





Citations:
[7C] Multi-objective Automatic Calibration of SWAT Using NSGA-II in R


In the next post I will focus on three issues: [1] SWAT calibration in the MATLAB environment, [2] application of genetic algorithms for SWAT calibration, and [3] spatial map preparation from SWAT output. SO PLEASE STAY TUNED...


Acknowledgements: The contribution of Dr. John Joseph (South Texas Uni, USA) is gratefully acknowledged for sharing his ideas on calibrating SWAT in the R environment, and special thanks to Rocio Julia and Gissela for sharing the observed data file of the Mendoza River watershed, which is part of their M.Sc. thesis work.

Thursday, June 21, 2012

Utilizing online resources for climatic research [part 2]



Introductory words:
Climate change and its impact assessment have attracted great attention these days, especially after Al Gore's team and the IPCC won the Nobel Peace Prize in 2007. A huge number of universities and research organizations have started working on climate impact assessment. Unlike geographic data, climatic data are not as straightforward to extract from online resources. There are various reasons for this; most often, climatic data are published as gridded points at a variety of temporal and spatial scales. Therefore, in this post we are going to see where we can get online data for climatic research and how we can use them. Once more, all of these data are completely "FREE OF CHARGE".


[1] National Climatic Data Center, USA (NCDC) 
The National Climatic Data Center (NCDC) of the USA provides climatic data at various spatial and temporal scales for the entire globe [1A]. NCDC is the world's largest active archive of weather data. NCDC produces numerous climate publications and responds to data requests from all over the world. NCDC operates the World Data Center for Meteorology, which is co-located with NCDC in Asheville, North Carolina, and the World Data Center for Paleoclimatology, which is located in Boulder, Colorado. Here we will focus on time series at a daily time step. Note that many variables are available on the website, but for this post we will focus on precipitation and temperature (min and max). We will only look at the Summary of the Day product [1B]. Summary of the Day (Office of Hydrology Format) is the historical data set DSI-9655 archived at NCDC. The Office of Hydrology (O/H) format was designed for use in implementing the National Weather Service River Forecast System (NWSRFS) nationwide, and for calibration of hydrologic models. Parameters in this summary-of-the-day data set include temperature, precipitation, winds, surface pressure, evaporation, and snow depth. The period of record is 1948-1976.
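Once a station file has been downloaded, a hedged R sketch for reading it might look like the following; it assumes a daily CSV export with DATE, PRCP, TMAX and TMIN columns, but the actual column names, units and missing-value codes vary by product, so always check the documentation that ships with the order.

# Sketch: read a hypothetical NCDC daily CSV export into a clean data frame.
# Column names (DATE, PRCP, TMAX, TMIN), units and missing codes depend on the product.
ncdc <- read.csv("ncdc_station_daily.csv", stringsAsFactors = FALSE)

ncdc$DATE <- as.Date(as.character(ncdc$DATE), format = "%Y%m%d")

for (v in c("PRCP", "TMAX", "TMIN")) {
  ncdc[[v]][ncdc[[v]] %in% c(-9999, 9999)] <- NA   # flag assumed missing-value codes
}

summary(ncdc[, c("PRCP", "TMAX", "TMIN")])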




[2] Precipitation data from TRMM website
The Tropical Rainfall Measuring Mission (TRMM) is a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA) designed to monitor and study tropical rainfall. A good number of studies have assessed the reliability of TRMM data and found it dependable for data-sparse regions; in addition, a large number of studies have applied TRMM data for various purposes, among them the development of agricultural information systems for crop production and irrigation needs. The mission aims to monitor tropical and subtropical precipitation and to estimate its associated latent heating (GES DISC, 2010). The daily product TRMM 3B42 was used in this study. The purpose of the 3B42 algorithm is to produce TRMM-adjusted merged-infrared (IR) precipitation and root-mean-square (RMS) precipitation-error estimates. Version 3B42 has a 3-hourly temporal resolution and a 0.25° by 0.25° spatial resolution. The spatial coverage extends from 50° S to 50° N and 0° to 360° E. The temporal scale considered for this video clip is daily.
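Given the 0.25° grid and the 50° S to 50° N, 0° to 360° E coverage quoted above, here is a small sketch of how one might locate the 3B42 grid cell for a station; it assumes cell centres at 0.125° offsets, which should be verified against the latitude/longitude variables stored in the actual file.

# Sketch: row/column index of the TRMM 3B42 0.25-degree cell containing a point.
# Assumes a 400 x 1440 grid (50S-50N, 0-360E) with cell centres at 0.125-degree offsets.
trmm_cell <- function(lat, lon) {
  stopifnot(lat > -50, lat < 50)
  lon360 <- ifelse(lon < 0, lon + 360, lon)   # convert -180..180 longitudes to 0..360
  row <- floor((lat + 50) / 0.25) + 1         # 1..400, counted from south to north
  col <- floor(lon360 / 0.25) + 1             # 1..1440, counted eastward from 0 E
  c(row = row, col = col)
}

trmm_cell(lat = -32.9, lon = -68.8)   # e.g. a point near the Mendoza watershed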


Citations:

[3] Working with netCDF data
NetCDF is a data abstraction for array-oriented data access and a software library that provides a concrete implementation of the interfaces supporting that abstraction [3A]. The implementation provides a machine-independent format for representing arrays. Although the netCDF file format is hidden below the interfaces, some understanding of the current implementation and associated file structure may help to make clear why some netCDF operations are more expensive than others. For a detailed description of the netCDF format, see File Structure and Performance. Knowledge of the format is not needed for reading and writing netCDF data or for understanding most efficiency issues. Programs that use only the documented interfaces and make no assumptions about the format will continue to work even if the netCDF format is changed in the future, because any such change will be made below the documented interfaces and will support earlier versions of the netCDF file format. ArcGIS and MATLAB have built-in libraries for dealing with netCDF files [3B][3C].
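In R, the ncdf4 package does the same job; here is a minimal sketch where the file and variable names are hypothetical (print(nc) lists the real ones):

# Sketch: open a netCDF file in R, inspect it, and read one variable (ncdf4 package).
library(ncdf4)

nc <- nc_open("precipitation.nc")   # hypothetical file name
print(nc)                           # lists dimensions, variables and attributes

lon  <- ncvar_get(nc, "lon")        # variable names depend on the dataset;
lat  <- ncvar_get(nc, "lat")        # use the names reported by print(nc)
prcp <- ncvar_get(nc, "precip")     # typically a 3-D array: lon x lat x time

nc_close(nc)

dim(prcp)                           # check the array dimensions before subsetting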



Citations:

[4] Climatic data for Europe [PRUDENCE Project]
PRUDENCE stands for Prediction of Regional Scenarios and Uncertainties for Defining European Climate Change Risks and Effects (Christensen et al. 2002) [4A]. A large number of decision-making studies have been carried out with climate-model-generated variables [4B]; among them, Schelali et al. (2007) [4C] studied the influence on hydropower, and Beniston et al. (2010) studied the impacts of the changing climate on a mountainous watershed. The main objective of the PRUDENCE project was to provide high-resolution climate change scenarios for Europe at the end of the twenty-first century by means of dynamical downscaling (regional climate modeling) of global climate simulations. The PRUDENCE project provides two time slices: a control period covering 1960-1990 and a scenario period covering 2070-2100.
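One common way to summarize such paired time slices is to compute monthly change factors between the control and scenario runs; the R sketch below does this for temperature using dummy series (this is an illustration of the general delta approach, not a PRUDENCE deliverable).

# Sketch: monthly change ("delta") between a control and a scenario time slice.
# 'ctrl' and 'scen' stand for one grid cell from the control (1960-1990) and
# scenario (2070-2100) runs; here they are filled with dummy seasonal values.
make_series <- function(start, end) {
  d <- seq(as.Date(start), as.Date(end), by = "day")
  data.frame(date = d,
             temp = 10 + 8 * sin(2 * pi * as.numeric(format(d, "%j")) / 365))
}
ctrl <- make_series("1960-01-01", "1990-12-31")
scen <- make_series("2070-01-01", "2100-12-31")
scen$temp <- scen$temp + 3      # dummy warming signal for the scenario slice

monthly_mean <- function(df) aggregate(temp ~ format(date, "%m"), data = df, FUN = mean)

ctrl_m <- monthly_mean(ctrl)
scen_m <- monthly_mean(scen)

data.frame(month = ctrl_m[[1]],
           dT_degC = scen_m$temp - ctrl_m$temp)   # scenario minus control, per month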






Citations:

In the next post we will see how to build the open-source hydrological model SWAT with all the geographic and climatic data obtained from various sources. So please stay tuned.


Acknowledgement: Support from Dr. Chetan Maringanti and Dr. Mamuka Gvilava is duly acknowledged in preparing this post, especially on the script-writing part and in compiling the information on possible data sources.