When viewing underwater scenes with a camera, the user is hampered by low contrast, color shifts, and interference caused, for example, by camera noise and floating particles such as sea snow. Especially with cameras mounted on remotely operated vehicle (ROV) systems, common in both military and civil applications, these effects make observation more difficult and tiring. Applying image enhancement significantly eases the task of observation. Here, we describe a software solution for enhancing underwater imagery. Wherever possible, we have reused algorithms developed earlier for compound security1 and UAV (unmanned aerial vehicle) imagery enhancement,2 transforming and extending them for the underwater case. Our method comprises three techniques.
The first technique, which we call noise reduction, is a smart temporal average filter that simultaneously reduces camera noise and sea snow. It requires first estimating the camera motion at subpixel accuracy; adaptive image integration over the motion-compensated frames then reduces temporal noise, which in turn suppresses sea snow. Further details on image registration for enhancement purposes are available elsewhere.1
Underwater imagery may also suffer from low contrast, especially in murky waters and when observing over larger distances. The second technique, local adaptive contrast enhancement, amplifies the small contrast variations to achieve a higher, evenly distributed contrast across the entire image.3–5 Many hardware solutions for contrast enhancement exist, but a main drawback of such methods is that applying contrast enhancement alone also amplifies camera noise and sea snow.
Finally, imagery may undergo a shift in hue with respect to the original scene due to the color-filtering effect of the water. The third technique, color correction, restores the original colors of the scene as well as possible and is applied just before the imagery is displayed. Note that color correction must be designed from scratch for the specific underwater case.6
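The three filters can be illustrated with a minimal NumPy sketch. These are deliberately simplified stand-ins, not our actual algorithms: a plain recursive average that assumes the frames have already been motion-registered, a naive box-filter local mean, and gray-world white balance; the gain, window size, and blend factor are illustrative values.

```python
import numpy as np

def temporal_average(prev_avg, frame, alpha=0.2):
    """Recursive temporal average. Assumes frames are already
    motion-registered, so static scene content reinforces while
    uncorrelated noise and sea snow average out."""
    return (1.0 - alpha) * prev_avg + alpha * frame

def local_adaptive_contrast(img, gain=2.0, win=15):
    """Amplify small local variations: subtract a local mean and
    boost the residual. A naive box-filter mean (nested loops) keeps
    the sketch dependency-free; a real implementation would
    vectorize or run on the GPU."""
    pad = win // 2
    padded = np.pad(img, pad, mode='reflect')
    local_mean = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            local_mean[i, j] = padded[i:i + win, j:j + win].mean()
    return np.clip(local_mean + gain * (img - local_mean), 0.0, 1.0)

def gray_world_color_correction(rgb):
    """Gray-world white balance as a color-correction stand-in:
    scale each channel so the channel means become equal, countering
    the color cast introduced by the water."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    scale = means.mean() / np.maximum(means, 1e-6)
    return np.clip(rgb * scale, 0.0, 1.0)
```

In this sketch the images are floating-point arrays in [0, 1]; a real pipeline would also handle sensor bit depth and color spaces.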
We have implemented the combination of noise reduction, local adaptive contrast enhancement, and color correction on a commercial off-the-shelf (COTS) laptop, used to control an ROV, that is equipped with an NVIDIA Compute Unified Device Architecture (CUDA)-capable graphics processing unit (GPU). The local adaptive contrast enhancement and color correction run on the GPU, and the noise reduction on the central processing unit (CPU). The result is displayed directly on the laptop, although other implementations could make alternative hardware choices. Figure 1 shows the flowchart of our method.
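The stage order and CPU/GPU device mapping can be expressed as a small pipeline description. This is a plain-Python illustration of the structure only (the `Stage` type and `build_pipeline` helper are our sketch, not part of the actual system, and no CUDA code is shown):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stage:
    name: str
    device: str       # 'cpu' or 'gpu', following the mapping in the text
    apply: Callable   # frame -> frame

def build_pipeline(noise_reduction, contrast, color_correction):
    """Stage order and device mapping as described above; the stage
    functions themselves are supplied by the caller."""
    return [
        Stage('noise reduction', 'cpu', noise_reduction),
        Stage('local adaptive contrast enhancement', 'gpu', contrast),
        Stage('color correction', 'gpu', color_correction),
    ]

def run(pipeline, frame):
    """Apply the stages in order to one frame."""
    for stage in pipeline:
        frame = stage.apply(frame)
    return frame
```

Keeping the device tag explicit makes it straightforward to move a stage between CPU and GPU in other implementations.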
Figure 1. Flowchart of the software filter module for remote image enhancement. (Image of the remote operating vehicle courtesy of VideoRay LLC.)
We developed and tested our approach in an operational navy environment, in collaboration with Dutch Ministry of Defence ROV operators. For development and tuning of the algorithms, we used reference video footage acquired in the murky waters of the Netherlands, and we tested the software on data from other, similar underwater scenarios. Figure 2 shows snapshots of the original footage of a harbor bottom (left) and the result after local adaptive contrast enhancement (right). Note that not only is the contrast of the object enhanced, but the sea snow also becomes more visible. The effects of the algorithms are best observed in the videos.7, 8 Figure 3 shows additional results. The upper row shows the original image (left) and the same image after local adaptive contrast enhancement (right). The treated image is clearly improved, but again the noise and sea snow are amplified. The bottom row shows the effect of all three filters (noise reduction, color correction, and local adaptive contrast enhancement) applied to the original image.
Figure 2. Still shots from reference video footage taken of a harbor bottom under water. Left: Original image. Right: Same image following contrast enhancement.
Figure 3. Effects of image enhancement. Upper left: Original image. Upper right: Following local adaptive contrast enhancement. Bottom: All three filters applied.
Several improvements could further enhance the system. For example, because noise reduction uses motion estimation that relies solely on image content, further optimization could be achieved by employing an appropriate inertial measurement unit (IMU). Together with industrial partners, we have initiated work based on inexpensive MEMS (microelectromechanical systems)-based IMU solutions with active drift reduction, which we expect to be operationally deployable at the end of 2015. Image-based motion estimation is difficult for frames with little or no structure, so another step forward would be to apply it only when the images contain sufficient structure. We are currently seeking additional collaboration with industry to develop and implement these improvements.
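Such a structure gate could be sketched as follows. The gradient-energy score, the threshold value, and the fallback logic are illustrative assumptions of ours, not part of the fielded system:

```python
import numpy as np

def structure_score(img):
    """Mean gradient magnitude as a simple measure of image structure;
    low values indicate a frame that offers image-based motion
    estimation little to lock onto."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.hypot(gx, gy).mean())

def choose_motion_source(img, threshold=0.02, imu_available=False):
    """Use image-based motion estimation only when the frame contains
    sufficient structure; otherwise fall back to an IMU estimate if
    one is available. The threshold is an illustrative value."""
    if structure_score(img) >= threshold:
        return 'image'
    return 'imu' if imu_available else 'none'
```

With an IMU present, the gate degrades gracefully: featureless frames still receive a motion estimate instead of an unreliable image-based one.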
The authors wish to thank the Ministry of Defence and the Ministry of the Interior and Kingdom Relations of the Netherlands. This work was carried out under the government project Intensification of Civil-Military Cooperation.
Rob Kemp, Judith Dijk, Klamer Schutte
TNO
The Hague, The Netherlands
Rob Kemp received his master's degree from the Open University of the Netherlands, and has worked at TNO since 1981. He is currently a business developer in the area of above- and underwater image enhancement, robust navigation, and electro-optic atmospheric propagation modeling.
Judith Dijk is a senior research scientist in the Intelligent Imaging Department and has worked at TNO since 2004. Her current research includes image enhancement, color processing and correction, airborne and land-based scene analysis, and image-based behavior analysis.
Klamer Schutte is a senior research scientist in the Intelligent Imaging Department. He has been at TNO since 1996. His current research includes image enhancement, scene analysis, and image-based behavior analysis.
Royal Netherlands Navy
Den Helder, The Netherlands
Niels Visser works as a diving supervisor at the Research and Development Department of the Armed Forces Diving Group.
1. P. B. W. Schwering, R. A. W. Kemp, K. Schutte, Image enhancement technology research for army applications, Proc. SPIE 8706, p. 87060O, 2013. doi:10.1117/12.2017855
2. J. Dijk, A. W. M. van Eekeren, O. R. Rojas, G. J. Burghouts, K. Schutte, Image processing in aerial surveillance and reconnaissance: from pixels to understanding, Proc. SPIE 8897, p. 88970A, 2013. doi:10.1117/12.2029591
3. J. Dijk, R. J. M. den Hollander, J. G. M. Schavemaker, K. Schutte, Local adaptive contrast enhancement for color images, Proc. SPIE 6575, p. 65750A, 2007. doi:10.1117/12.716886
4. J. Dijk, R. den Hollander, Image enhancement for noisy color imagery, Proc. SPIE 7113, p. 71131A, 2008. doi:10.1117/12.800274
5. K. Schutte, D. J. J. de Lange, S. P. van den Broek, Signal conditioning algorithms for enhanced tactical sensor imagery, Proc. SPIE 5076, p. 92, 2003. doi:10.1117/12.487720
6. J. Dijk, K. Schutte, Image color correction, European Patent Appl. 13164740.6, 2013.