Spectroscopy-assisted satellite water quality monitoring
Monitoring water quality typically involves costly and time-consuming in situ boat surveys, in which water samples are collected and returned to a laboratory. There, the samples are analyzed and water quality indicators—such as chlorophyll-a (Chl-a), particulate organic carbon (POC), and suspended solids (SS)—are measured. This two-stage process allows accurate measurement within a body of water, but only at the discrete points where samples were obtained.1–4 To circumvent physical sampling, satellite remote sensing provides complete, synoptic geographical coverage of water quality in inland freshwater systems, such as lakes, reservoirs, rivers, and dams. Here, we spectroscopically determine the optimal spectral region for water quality indicators before satellite images are processed, with the aim of improving predictive models for water quality.
Many satellite systems are used to monitor inland water quality, including the Land Satellite (Landsat) with its thematic mapper (TM) and enhanced thematic mapper (ETM+) sensors, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), and the Medium Resolution Imaging Spectrometer (MERIS). Images from these satellites are combined with in situ water sample quality measurements to determine pollution levels.1, 3 Statistical techniques are then used to identify correlations between satellite spectral bands, or band combinations, and the desired water quality indicators.2, 5–9 These correlations are used to develop predictive equations for Chl-a, POC, and SS levels.
Although there have been many reports on satellite remote sensing of inland water bodies, relatively few studies have considered atmospheric interference with the satellite signal.10 Radiation from the Earth's surface interacts significantly with the atmosphere before it reaches the satellite sensor, and these interactions are most severe when the target surfaces are water bodies,10–12 particularly in multi-spectral time series satellite data. Importantly, atmospheric effects constitute the majority (60–90%) of the at-satellite measured radiance in the visible bands over inland bodies of water. A reliable and efficient atmospheric correction during pre-processing of digital satellite data is therefore essential.
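One widely used family of corrections is dark object subtraction, on which the darkest pixel approach applied later in this work is based: the darkest pixel in each band is assumed to have near-zero true reflectance, so whatever signal it records is attributed to atmospheric path radiance and subtracted from the whole band. The following is a minimal sketch of that idea using synthetic digital numbers, not the authors' actual processing chain.

```python
import numpy as np

# Synthetic stand-in for one satellite band of digital numbers (DNs);
# a real band would be read from a Landsat scene.
rng = np.random.default_rng(1)
band = rng.integers(40, 200, size=(100, 100)).astype(float)

def dark_object_subtract(band_dn):
    """Estimate atmospheric haze as the per-band minimum (darkest pixel)
    and subtract it from every pixel in that band."""
    haze = band_dn.min()
    return band_dn - haze

corrected = dark_object_subtract(band)
```

In practice the haze value is often taken from a histogram inflection point rather than the absolute minimum, and the subtraction is done in radiance units after sensor calibration; the sketch above shows only the core assumption.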
We conducted ground measurements to determine the reflectance at the water surface, i.e., at approximately zero depth. We used a spectroradiometer (GER1500, Spectra Vista Corporation), which has a full-width half-maximum of 3nm and covers the spectral range 350–1050nm. After selecting the Lower Thames Valley water treatment reservoirs (United Kingdom) as our sites (see Figure 1), we acquired measurements of both the target reservoir and a white Spectralon control panel (used to measure the incoming solar irradiance). With the field spectroradiometer fitted with a fiber-optic probe, we measured reflectance values at water depths of 0–6m (see Figure 2). Next, we calculated the surface reflectance values equivalent to Landsat-5 TM bands 1–4, where band 1 detects wavelengths of 0.45–0.52μm, band 2 detects 0.52–0.60μm, band 3 detects 0.63–0.69μm, and band 4 detects 0.76–0.90μm. To yield the in-band reflectance, we filtered our experimental measurements through the Landsat-5 TM relative spectral response functions and averaged within the limits of the first four TM bands.
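The band-averaging step described above amounts to a response-weighted mean of the measured spectrum over each band. The sketch below illustrates this with a synthetic spectrum and a simple boxcar stand-in for the TM band 1 response; real relative spectral response curves are smooth and would be loaded from the published sensor tables.

```python
import numpy as np

# Wavelength grid (nm) matching the GER1500 coverage, 1 nm sampling.
wavelengths = np.arange(350, 1051)
# Synthetic stand-in for a measured surface reflectance spectrum.
reflectance = 0.02 + 0.01 * np.sin(wavelengths / 80.0)

def in_band_reflectance(refl, rsr):
    """Relative-spectral-response-weighted average of a reflectance
    spectrum over one sensor band."""
    return np.sum(refl * rsr) / np.sum(rsr)

# Boxcar approximation of Landsat-5 TM band 1 (0.45-0.52 um).
rsr_b1 = ((wavelengths >= 450) & (wavelengths <= 520)).astype(float)
band1 = in_band_reflectance(reflectance, rsr_b1)
```

Repeating this for the band 2–4 response functions yields the four TM-equivalent reflectance values used in the regressions.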
We identified possible predictors for both Chl-a and POC using linear regression analysis between the mean measured reflectance values across our spectroradiometer spectral range and the concentrations (μg/L) of Chl-a and POC in the water samples. Each regression model—512 in total—corresponded to one measured wavelength, and the wavelengths with the highest coefficient-of-determination (r2) values were taken as the best predictors. Specifically, we found correlations for Chl-a at 400–450nm (r2=0.60–0.80) and 730–735nm (r2>0.60: see Figure 3). For POC we found correlations at 400–530nm (r2=0.60–0.80) and 728–735nm (r2>0.60: see Figure 4).1 These are the optimal wavelengths at which Chl-a and POC can be retrieved by satellite: the best Landsat TM spectral region for Chl-a is TM band 1 and, for POC, TM bands 1 and 2. Finally, we applied the darkest pixel atmospheric correction algorithm to our regression models,10, 12 and processed archived and recent Landsat TM/ETM+ images to further calibrate and validate our pre-processing.
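The wavelength-selection step can be sketched as a scan over single-wavelength regressions, ranking channels by r2. The example below uses entirely synthetic data (a signal planted at one arbitrary channel), not the field measurements, and serves only to show the structure of the computation.

```python
import numpy as np

# Synthetic stand-ins: 30 water samples, 512 spectral channels.
rng = np.random.default_rng(0)
n_samples, n_channels = 30, 512
chl_a = rng.uniform(1.0, 20.0, n_samples)           # hypothetical Chl-a (ug/L)
spectra = rng.normal(0.02, 0.005, (n_samples, n_channels))
spectra[:, 60] += 0.005 * chl_a                     # plant a signal at channel 60

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

# One regression per channel; rank channels by r^2.
scores = np.array([r_squared(spectra[:, i], chl_a) for i in range(n_channels)])
best = int(np.argmax(scores))
```

With real data, the channels scoring highest would then be mapped onto the satellite bands (here, TM bands 1 and 2) whose response functions cover those wavelengths.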
In summary, we have developed a new method for defining the optimal spectral region—using in situ field spectroradiometric measurements—in which water quality parameters can be retrieved from satellite image data. We applied several linear and multiple linear regression models relating atmospherically corrected remotely sensed reflectance values to Chl-a and POC concentrations. Our future work will use a field spectroradiometer that covers a wider range of the electromagnetic spectrum (i.e., 250–2500nm) to further optimize the retrieval bands for additional water quality parameters, such as turbidity and SS.
Department of Civil Engineering and Geomatics
Cyprus University of Technology
Diofantos Hadjimitsis is an associate professor with more than 200 publications on Earth observation for environmental surveillance.
University of Southampton
Chris Clayton is a professor of Infrastructure Engineering and former professor and head of the School of Engineering and the Environment at the University of Surrey (UK). He has published over 200 journal and conference papers, books, and reports.