Proceedings Volume 2823

Statistical and Stochastic Methods for Image Processing

Edward R. Dougherty, Francoise J. Preteux, Jennifer L. Davidson
View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 8 October 1996
Contents: 4 Sessions, 26 Papers, 0 Presentations
Conference: SPIE's 1996 International Symposium on Optical Science, Engineering, and Instrumentation
Volume Number: 2823

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.

Sessions:
  • Estimation
  • Filters
  • Algorithms I
  • Algorithms II
  • Filters
  • Estimation
  • Algorithms II
  • Filters
  • Algorithms II
Estimation
Estimation of noise parameters on sonar images
Francoise Schmitt, Max Mignotte, Christophe Collet, et al.
We use the Markov random field model to segment sonar images, i.e. to localize the sea-bottom areas and the projected shadow areas corresponding to objects lying on the sea floor. This model requires, on the one hand, knowledge of the statistical distributions of the different zones and, on the other hand, estimation of the parameters of those laws. The Kolmogorov criterion or the chi-squared criterion allows the distribution laws to be identified. The estimation-maximization (EM) algorithm or the stochastic EM algorithm is used to determine the maximum-likelihood estimate of the law parameters. These algorithms are initialized with the K-means algorithm. Results are shown on real sonar images.
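As a concrete illustration of this pipeline, the following minimal sketch initializes a two-class model with K-means and refines the parameters with EM. It assumes Gaussian distribution laws for the two zones; the paper selects the actual laws with the Kolmogorov or chi-squared tests, so this is a stand-in, not the authors' implementation.

```python
import numpy as np

def kmeans_init(x, iters=20):
    """Crude 1D two-means clustering used only to initialize EM."""
    c = np.array([x.min(), x.max()], dtype=float)
    for _ in range(iters):
        labels = np.abs(x[:, None] - c[None, :]).argmin(axis=1)
        for k in range(2):
            if np.any(labels == k):
                c[k] = x[labels == k].mean()
    return labels

def em_gaussian_mixture(x, n_iter=50):
    """Maximum-likelihood parameters of a 2-component Gaussian mixture."""
    labels = kmeans_init(x)
    pi = np.array([np.mean(labels == k) for k in range(2)])
    mu = np.array([x[labels == k].mean() for k in range(2)])
    var = np.array([x[labels == k].var() + 1e-9 for k in range(2)])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each zone for each pixel value
        p = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = p / (p.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: maximum-likelihood update of the law parameters
        pi = r.mean(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0) + 1e-9
    return pi, mu, var
```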
Imposed measure approach to stochastic clutter characterization
George W. Rogers, Tim E. Olson, Carey E. Priebe, et al.
Stochastic clutter can often be modeled as a piecewise stationary random field. The individual stationary subregions of homogeneity in the field can then be characterized by marginal density functions. This level of characterization is often sufficient for determination of clutter type on a local basis. We present a technique for the simultaneous characterization of the subregions of a random field based on semiparametric density estimation on the entire random field. This technique is based on a borrowed strength methodology that allows the use of observations from potentially dissimilar subregions to improve local density estimation and hence random process characterization. This approach is illustrated through an application to a set of digitized mammogram images which requires the processing of five million observations. The results indicate that there is sufficient similarity between images, in addition to the more intuitively obvious within-image similarities, to justify such a procedure. The results are analyzed for the utility of such a procedure to produce superior models in terms of 'stochastic clutter characterization' for target detection applications in which there are variable background processes.
Genetic algorithms for texture model identification and synthesis
Cory J. Engebretson, Jennifer L. Davidson, Dan Ashlock
This paper presents research on texture modeling and regeneration. We view a texture as a large pattern created from regular repetitions of a small, basic texture element, or texel. Given a texture image, the problem was to find the 'best' texel for that data, regenerate the texture represented by that texel, and compare the original image and the regenerated one. The texel-finding problem was posed as an optimization procedure. We used a genetic algorithm to do the optimization. To regenerate the texture, we used a Metropolis-like algorithm. The textures regenerated from the texels found by the genetic algorithm were difficult to visually distinguish from the original data. Research efforts are continuing to improve the efficiency and accuracy of the method and to extend the method to different types of data.
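To make the optimization setup concrete, here is a toy sketch of the texel search as a genetic algorithm. The chromosome encoding (a raw gray-level patch), the mean-squared-error fitness against a tiled reconstruction, and all rates are assumptions for illustration; the paper's own encoding and fitness are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def tile(texel, shape):
    """Tile the texel to cover an image of the given shape."""
    reps = (-(-shape[0] // texel.shape[0]), -(-shape[1] // texel.shape[1]))
    return np.tile(texel, reps)[:shape[0], :shape[1]]

def fitness(texel, image):
    # hypothetical fitness: negative MSE between image and tiled texel
    return -np.mean((tile(texel, image.shape) - image) ** 2)

def ga_texel(image, size=8, pop=30, gens=100):
    population = [rng.random((size, size)) for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda t: fitness(t, image), reverse=True)
        parents = scored[: pop // 2]              # truncation selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.choice(len(parents), 2, replace=False)
            mask = rng.random((size, size)) < 0.5  # uniform crossover
            child = np.where(mask, parents[a], parents[b])
            # sparse Gaussian mutation
            child = child + rng.normal(0, 0.05, child.shape) * (rng.random(child.shape) < 0.05)
            children.append(np.clip(child, 0, 1))
        population = parents + children
    return max(population, key=lambda t: fitness(t, image))
```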
Maximum-likelihood estimation for the discrete Boolean random function
Gray-scale textures can be viewed as random surfaces in gray-scale space. One method of constructing such surfaces is the Boolean random function model wherein a surface is formed by taking the maximum of shifted random functions. This model is a generalization of the Boolean random set model in which a binary image is formed by the union of randomly positioned shapes. The Boolean random set model is composed of two independent random processes: a random shape process and a point process governing the placement of grains. The union of the randomly shifted grains forms a binary texture of overlapping objects. For the Boolean random function model, the random set or grain is replaced by a random function taking values among the admissible gray values. The maximum over all the randomly shifted functions produces a model of a rough surface that is appropriate for some classes of textures. The Boolean random function model is analyzed by viewing its behavior on intersecting lines. Under mild conditions in the discrete setting, 1D Boolean random set models are induced on intersecting lines. The discrete 1D model has been completely characterized in previous work. This analysis is used to derive a maximum-likelihood estimation for the Boolean random function.
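A minimal sketch of how such a surface can be synthesized: grain centers follow a Poisson point process, and the surface is the pointwise maximum of the randomly shifted grains. The cone-shaped grain with random height used here is a hypothetical choice of random function, not the paper's.

```python
import numpy as np

def boolean_random_function(shape=(128, 128), intensity=0.01, radius=8, rng=None):
    """Surface = pointwise max of randomly placed, randomly scaled grains."""
    rng = rng or np.random.default_rng(0)
    surface = np.zeros(shape)
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    n_grains = rng.poisson(intensity * shape[0] * shape[1])  # Poisson point process
    for _ in range(n_grains):
        cy, cx = rng.integers(0, shape[0]), rng.integers(0, shape[1])
        height = rng.integers(1, 256)
        grain = np.maximum(0, height * (1 - np.hypot(yy, xx) / radius))  # cone grain
        y0, y1 = max(0, cy - radius), min(shape[0], cy + radius + 1)
        x0, x1 = max(0, cx - radius), min(shape[1], cx + radius + 1)
        gy0, gx0 = y0 - (cy - radius), x0 - (cx - radius)
        sub = grain[gy0:gy0 + (y1 - y0), gx0:gx0 + (x1 - x0)]
        surface[y0:y1, x0:x1] = np.maximum(surface[y0:y1, x0:x1], sub)
    return surface
```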
Filters
FMH postcorrection filters for the interpolation of quincunx sampled images
Mika P. Helsingius, Petri Haavisto, Jaakko T. Astola
In principle, the interpolation of quincunx subsampled images is a simple process where the neighboring pixels are used to approximate the missing ones. Several linear or non-linear interpolation filters have been proposed for this purpose. However, areas with fine details cause difficulties, because part of the original information has been lost due to downsampling. In this paper we propose two FIR-median hybrid filters that, when combined with 4-point median filters, give very good subjective and numerical interpolation results. These filters can be used for both monochrome and color images. The performance of the filters has been evaluated using several natural and artificial test images.
Rank order filtering on bit-serial mesh-connected computers
Hongchi Shi, Hongzheng Li
Rank order filters have a wide variety of applications in image processing as an effective tool for removing noise in images without severely distorting abrupt changes in the images. The k-th rank filter with an m × m window sets each pixel to the k-th smallest value of the m² pixels in its m × m neighborhood. The median filter is the most commonly used special case of rank order filters. Rank order filtering requires intensive computation. In this paper, we consider implementation of rank order filters on bit-serial mesh-connected computers such as the Lockheed Martin CISP computer. We design rank filtering algorithms using the threshold decomposition and radix splitting techniques, and present some experimental results of the implementation of those algorithms on the CISP computer.
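The threshold decomposition idea can be sketched in a few lines of sequential code: the output exceeds level t exactly when at least m² − k + 1 window pixels exceed t, so each binary slice is filtered by a simple count and the slices are stacked. This illustrates only the decomposition principle, not the bit-serial mesh mapping that is the paper's actual contribution.

```python
import numpy as np

def rank_filter_threshold_decomposition(img, m=3, k=5, levels=256):
    """k-th rank filter over an m x m window via threshold decomposition."""
    pad = m // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.zeros(img.shape, dtype=np.int64)
    for t in range(1, levels):
        binary = (padded >= t).astype(np.int64)
        # count of window pixels >= t, via a sliding-window sum
        counts = sum(binary[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                     for dy in range(m) for dx in range(m))
        # k-th smallest >= t  iff  at least m*m - k + 1 pixels are >= t
        out += (counts >= m * m - k + 1).astype(np.int64)  # stack the slices
    return out

# m=3, k=5 reproduces the 3 x 3 median filter, the most common special case.
```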
High-speed systolic architectures for median-type filtering
A new efficient parallel algorithm and an architecture for order statistic, weighted order statistic and stack filters are suggested in this paper. The design is based on coding the order relations between input samples within the filter's window by so-called binary ordering P-matrices. A simple scheme utilized in the design allows updating the current binary matrix from the previous one using just one parallel step of comparisons and simple, regular bit shifts. The architecture is simple and well suited for implementation in systolic arrays. It is unified for all the above filters, meaning that in all cases the structure is the same and only one block, the output former, differs for each of these filters.
Adaptive smoothing of images with local weighted regression
Mark S. Levenson, David S. Bright, Jayaram Sethuraman
We present a weighting scheme for local weighted regression designed to achieve two goals: (1) to reduce noise within image regions of smoothly varying intensities; and (2) to maintain sharp boundaries between image regions. Such a procedure can function as a preprocessing step in an image segmentation problem or simply as an image enhancement technique.
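A minimal sketch of edge-preserving local weighted regression: each pixel becomes the intercept of a weighted least-squares plane fitted over its neighborhood. The particular weights used here, a spatial Gaussian multiplied by an intensity-difference penalty, are an assumption standing in for the paper's weighting scheme.

```python
import numpy as np

def lwr_smooth(img, radius=2, sigma_s=1.5, sigma_r=20.0):
    out = np.empty(img.shape, dtype=float)
    H, W = img.shape
    dy, dx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_spatial = np.exp(-(dy ** 2 + dx ** 2) / (2 * sigma_s ** 2))
    # design matrix for the local plane z = a + b*dy + c*dx
    A = np.stack([np.ones_like(dy), dy, dx], axis=-1).reshape(-1, 3).astype(float)
    for y in range(H):
        for x in range(W):
            ys = np.clip(y + dy, 0, H - 1)
            xs = np.clip(x + dx, 0, W - 1)
            patch = img[ys, xs].astype(float)
            # spatial closeness times intensity similarity keeps edges sharp
            w = w_spatial * np.exp(-(patch - float(img[y, x])) ** 2 / (2 * sigma_r ** 2))
            sw = np.sqrt(w).reshape(-1)
            coef, *_ = np.linalg.lstsq(A * sw[:, None], patch.reshape(-1) * sw, rcond=None)
            out[y, x] = coef[0]  # intercept = smoothed value at the center
    return out
```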
Behavior of adaptive digital erosions
Clara Cuciurean-Zapan, Edward R. Dougherty, Yidong Chen
Design of statistically optimal erosion-based filters is problematic due to the complexity of the search process. Specialized search techniques and constraints on optimality are used to mitigate the full search. Adaptation of structuring elements has also been employed. The present paper looks at the behavior of an adaptive filter relative to the actual optimal filter for a single erosion in two models, signal-union-noise and dilation. It does so in the context of state transitions, where filter states are stacks that determine the structuring element upon thresholding.
Algorithms I
Stereo-matching algorithm based on energy minimization principle in Markov random field model
Tsuneo Saito, Hiroyuki Kudo, Taizo Anan, et al.
In this paper, we develop a new intensity-based stereo matching algorithm using maximum a posteriori estimation in the framework of Markov random fields. The intensity-based stereo matching process is formulated as a search for the minimum of an energy function, which maximizes the a posteriori probability. We introduce an objective cost function, the energy of a piecewise smooth disparity field, in which discontinuities and occlusions are explicitly taken into account. In order to minimize this non-convex energy function for disparity estimation, we propose a relaxation algorithm called mean field annealing, which provides results nearly as good as simulated annealing but with much faster convergence. Unlike conventional correlation matching or feature matching, the proposed method provides a dense array of disparities, eliminating the need for interpolation in 3D structure reconstruction. Several experimental results with synthetic and real stereo images are presented to evaluate the performance of the proposed algorithm.
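The flavor of such an energy can be sketched as a data term plus a smoothness term whose truncated quadratic permits disparity discontinuities. The paper's explicit occlusion terms and the mean-field-annealing minimizer are omitted; this is only an illustrative stand-in for its energy function.

```python
import numpy as np

def stereo_energy(left, right, disparity, lam=1.0, t=4.0):
    """Energy of a disparity field: data fidelity + piecewise smoothness."""
    H, W = left.shape
    # warp the right image toward the left using the disparity field
    xs = np.clip(np.arange(W)[None, :] - disparity, 0, W - 1).astype(int)
    data = ((left.astype(float) - right[np.arange(H)[:, None], xs]) ** 2).sum()
    # truncated quadratic: smooth within surfaces, cheap to break at edges
    dh = np.minimum((disparity[:, 1:] - disparity[:, :-1]) ** 2, t ** 2)
    dv = np.minimum((disparity[1:, :] - disparity[:-1, :]) ** 2, t ** 2)
    return data + lam * (dh.sum() + dv.sum())
```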
Signal reconstruction with a small set of Gabor filters
Jos H. van Deemter, Gabriel Cristobal
A normalization algorithm is proposed that improves the reconstruction of signals. After decomposing a signal into even linear bandpass-filtered signals and a low-pass residual, it can be reconstructed reasonably well for a good choice of filters. However, to obtain good results the filter parameters must be chosen in such a way that they cover the frequency domain sufficiently well. This is often difficult for a small set of filters. We derive and demonstrate that in many situations it can be profitable to normalize the reconstructed image with respect to two global statistical parameters of the original image.
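A minimal sketch of the normalization step, assuming the two global statistical parameters are the mean and standard deviation of the original image (the paper does not name them in this abstract):

```python
import numpy as np

def normalize_reconstruction(recon, original):
    """Match the reconstruction's global mean and std to the original's."""
    r_mean, r_std = recon.mean(), recon.std()
    o_mean, o_std = original.mean(), original.std()
    return (recon - r_mean) * (o_std / (r_std + 1e-12)) + o_mean
```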
Radar image modeling: a 3D spectral domain approach
Francois Meunier, Jean Meunier
We developed a 3D model to simulate the radar image formation process for natural surfaces. This 3D model is divided into two main parts. First, the radar system is modeled using a 3D point spread function (PSF). Second, the observed terrain is modeled using 3D randomly distributed scatterers as well as predetermined scatterer positions. The image formation process involves the convolution of the PSF with the terrain model, the extraction of the signal envelope, and sampling in the range and azimuthal directions. To enhance the computational efficiency of the convolution process we transpose the problem into the Fourier domain. Our 3D model is useful for studying the parameters involved in the formation and analysis of radar images.
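A simplified 2D sketch of this formation chain: complex scatterers placed at random, convolution with the system PSF carried out in the Fourier domain, then envelope extraction. The Gaussian PSF is a hypothetical stand-in for the paper's 3D radar point spread function, and the sampling stage is omitted.

```python
import numpy as np

def simulate_radar_image(shape=(256, 256), n_scatterers=2000, rng=None):
    rng = rng or np.random.default_rng(0)
    # terrain: random complex scatterers (amplitude and phase)
    terrain = np.zeros(shape, dtype=complex)
    ys = rng.integers(0, shape[0], n_scatterers)
    xs = rng.integers(0, shape[1], n_scatterers)
    terrain[ys, xs] = rng.normal(size=n_scatterers) * np.exp(2j * np.pi * rng.random(n_scatterers))
    # hypothetical Gaussian PSF centered in the grid
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    psf = np.exp(-(((yy - shape[0] // 2) / 3.0) ** 2 + ((xx - shape[1] // 2) / 3.0) ** 2))
    # convolution in the Fourier domain, then envelope extraction
    image = np.fft.ifft2(np.fft.fft2(terrain) * np.fft.fft2(np.fft.ifftshift(psf)))
    return np.abs(image)
```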
Three-dimensional reconstruction from incomplete Fourier spectra: an extrapolation approach
Etienne P. Payot, Francoise J. Preteux, Yves L. Trousset, et al.
3D reconstruction from an incomplete data set is an ill-posed problem. To overcome this drawback, an approach based on constrained optimization is introduced. This approach provides a powerful mathematical framework for selecting a specific solution from the set of feasible solutions; this is done by minimizing some criteria depending on prior densitometric information. We propose a global optimization scheme using a deterministic relaxation algorithm based on Bregman's algorithm associated with half-quadratic minimization techniques. When used for 3D vascular reconstruction from 2D digital subtracted angiography data, such an approach allows reconstructing a well-contrasted 3D vascular network, in comparison with results obtained using standard algorithms.
New stochastic sampling method for region extraction: theory and experiments
Taizo Anan, Makoto Ohtsu, Hiroyuki Kudo, et al.
We propose a region extraction method based on a new energy function and a new stochastic sampling method. The new energy function is based on the mixed density description derived by clustering an input image using the ISODATA algorithm. Our energy function is suitable for natural images. We developed a new stochastic sampling method by modifying the conventional Gibbs sampler. The conventional Gibbs sampler converges to the global optimum of the energy function, but it cannot be applied to region extraction because of its inability to preserve the topological properties of the initial region during its state transition process. To overcome this drawback, our sampling process is driven by 'dynamic site selection', which preserves the topology of the initial region in the state transition process. We prove the global convergence property of our proposed sampling method by extending the existing stochastic sampling theories. We demonstrate the performance of our method by simulation studies on both synthetic and natural images.
Object detection and recognition using evidences-based image analysis
Yury V. Visilter, Sergey Yu. Zheltov, Alexander A. Stepanov
The generic technique called 'Evidences-based Image Analysis' is proposed for model-based object detection. Real images to be analyzed are considered as sources of evidences generated by low-level image processing procedures. These evidences support or refute hypotheses connected with different objects and their features. Bayes' theorem is used for testing hypotheses against evidences. The unknown parameters of the probabilistic model serve as the internal tuning parameters of the algorithm. This approach provides a uniform and efficient way to fuse any available image information: intensity and contour, 2D and 3D, multispectral, multisensor and so on. Our technique takes into account three principal points: the object/background model, the registration model and the corruption model. This paper concentrates mainly on the estimation of the registration parameters, especially on the problem of geometrically invariant object detection. It is shown that Hough-like accumulation methods in fact implement maximum a posteriori estimation of the parameters of the registration model under the assumption of statistical independence of evidences. The reduction and separation of models are proved to be legitimate ways of speeding up invariant object detection. The use of complex hierarchical models of objects is considered as another way to achieve fast invariant detection and recognition.
Algorithms II
Segmentation-based wavelet transform for still-image compression
Gerard Mozelle, Abdellatif Seghier, Francoise J. Preteux
In order to simultaneously address the content-based scalability functionalities required by MPEG-4, we introduce a segmentation-based wavelet transform (SBWT). SBWT takes into account both the mathematical properties of multiresolution analysis and the flexibility of region-based approaches for image compression. The associated methodology has two stages: 1) image segmentation into convex and polygonal regions; 2) 2D wavelet transform of the signal corresponding to each region. In this paper, we mathematically study a method for constructing a multiresolution analysis $(V_j^\Omega)_{j \in \mathbb{N}}$ adapted to a polygonal region $\Omega$, which provides adaptive region-based filtering. The explicit construction of scaling functions, pre-wavelets and orthonormal wavelet bases defined on a polygon is carried out, and an expansion of the scaling functions is established by using the theory of Toeplitz operators. The corresponding expression can be interpreted as a localization property which allows interior and boundary scaling functions to be defined. Concerning orthonormal wavelets and pre-wavelets, a similar expansion is obtained by taking advantage of the properties of the orthogonal projector $P_{(V_j^\Omega)^\perp}$ from the space $V_{j+1}^\Omega$ onto the space $(V_j^\Omega)^\perp$. Finally, the mathematical results provide a simple and fast algorithm adapted to polygonal regions.
Morphological approach to multiband synthetic aperture radar (SAR) image segmentation
Silvina J. Loccisano, Marta Mejail, Julio Cesar Jacobo Berlles
The present work reports the results obtained using M-estimators, the Mahalanobis distance, and openings and closings in the supervised segmentation of multiband SAR images. Once the training zones on the SAR images are established, the M-estimators robust estimation method is used to determine the mean value and the covariance matrix for each segment. Then, to carry out the first stage of the segmentation process, the Mahalanobis distance is used. To improve the segmentation obtained in the previous stage, a sequence of openings and closings with a structuring element of growing size is applied. Real and synthetic images were used to evaluate the results.
Shape features for recognition of Pap smear cells
Shelly D. D. Goggin, Scott D. Janson
Automated cytology relies on the use of features extracted from cell images to classify cells. This paper examines the classification capability of a number of shape features on a database of normal, abnormal and endocervical cell nuclei images. The features include the chain code, the directed Hausdorff distance, measures of the length of the radii of the cell, and measures of ellipticity. The area under the receiver operating characteristic curve is used as a figure of merit. For the calculation of the directed Hausdorff distance, the images are filtered using the Sobel gradient and erosion. The feature in the image with the largest chain code is considered to be the nucleus. The other features use images thresholded at a percentage of the maximum intensity in the image. The best feature for discriminating between normal cells and either abnormal or endocervical cells was the directed Hausdorff distance, but this feature is computationally expensive. The minimum diameter as determined by the chain code was the second best feature for recognizing abnormal cells and is less computationally expensive. Ellipticity was the second best feature for recognizing endocervical cells, and is also less computationally expensive than the directed Hausdorff distance. An optical design for the calculation of the directed Hausdorff distance feature is included, which could reduce the computational expense.
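For reference, the directed Hausdorff distance between boundary point sets A and B is h(A, B) = max over a in A of min over b in B of ||a − b||, which a brute-force sketch computes directly (the paper's Sobel/erosion preprocessing that extracts the point sets is not shown):

```python
import numpy as np

def directed_hausdorff(A, B):
    """A, B: (n, 2) and (m, 2) arrays of boundary point coordinates."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
    return d.min(axis=1).max()  # worst-case nearest-neighbor distance from A to B
```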
New statistical procedure for the segmentation of contiguous nonhomogeneous regions based on the Ozturk algorithm
Mohamed-Adel Slamani, Donald D. Weiner, Vincent C. Vannicola
Using thresholding techniques it is possible to separate contiguous non-homogeneous patches with different power levels. When the power levels of the patches are similar, if not equal, the global histogram of the patches is unimodal and the thresholding approach becomes very difficult, if not impossible. In this paper, we propose a statistical procedure to separate contiguous non-homogeneous patches with similar power levels but different data statistics. The procedure separates different regions by distinguishing between their data probability distributions. It is based on the Ozturk algorithm, which uses the sample order statistics to approximate univariate distributions.
Multiscale methods applied to the analysis of SAR images
Albert Bijaoui, Yanling Fang, Yves Bobichon, et al.
The analysis of SAR images requires, as a first step, reducing the speckle noise that is due to the coherent character of the radar signal. Application of the minimum variance bound estimator leads to processing the energy image instead of the amplitude image for the reduction of this multiplicative noise. The proposed analysis methods are based on a multiscale vision model in which the image is described only by its significant structural features at a set of dyadic scales. The multiscale analysis is performed by a redundant discrete wavelet transform, the à trous algorithm. The filtering algorithm is iterative. At each step we compute the ratio between the observed energy image and the restored one. We detect the significant structures at each scale, taking into account the exponential probability distribution function of the energy to determine the significant wavelet coefficients. The ratio is restored from its significant coefficients, and the restored image is updated. The iterations are stopped when no significant structure is detected in the ratio. We then extract and analyze the objects contained in the image. The multiscale analysis allows an approach well adapted to diffuse objects without contrasted edges. An object is defined as a local maximum in the wavelet transform space (WTS). All the structures form a 3D connected set which is hierarchically organized; this set gives the description of an object in the WTS. The image of each object is restored by an inverse algorithm. The comparison between images taken at different epochs is done using the multiscale vision model, which allows us to enhance the features at a given scale that have significantly varied. The correlation coefficients between the structures detected at each scale are far from the ones obtained between the pixel energies. This method is, for example, very suitable for detecting and describing faint large-scale variations.
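The à trous analysis step can be sketched as follows: each smoothing pass convolves with a B3-spline kernel dilated by inserting 2^j − 1 zeros, and the wavelet plane at scale j is the difference of successive smoothings. The significance test against the exponential law of the energy is not shown.

```python
import numpy as np
from scipy.ndimage import convolve1d

def a_trous(image, n_scales=4):
    """Redundant 'a trous' wavelet transform with a B3-spline kernel."""
    kernel = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0  # B3 spline
    planes, smooth = [], image.astype(float)
    for j in range(n_scales):
        holed = np.zeros(4 * 2 ** j + 1)
        holed[::2 ** j] = kernel                    # insert the "holes"
        s = convolve1d(convolve1d(smooth, holed, axis=0, mode='reflect'),
                       holed, axis=1, mode='reflect')
        planes.append(smooth - s)                   # wavelet plane at scale j
        smooth = s
    return planes, smooth                           # image = sum(planes) + smooth
```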
Deep seafloor characterization with multibeam echosounders by image segmentation using angular acoustic variations
Samantha Dugelay, Christine Graffigne, J. M. Augustin
The new generation of low-frequency echosounders, primarily used for bathymetric purposes, is also able to record acoustic images of the sea floor. Reflected energy, as a function of the incidence angle, is known to be strongly dependent on seabed type, and therefore stands as a potential tool for sea floor characterization. On the other hand, acoustic images of the reflected energy illustrate the variability of the acoustic interface and are invaluable for sea floor cartography. In this paper we describe a method of semi-automatic mosaic interpretation in which the two different aspects are considered simultaneously. This is achieved by supervised segmentation using a Markov random field model whose neighborhood system and energies have been carefully studied in order to comply with a priori knowledge. We present results obtained with this method, highlighting the possibility of using such a technique for low-frequency echosounders.
Filters
Nonlinear locally adaptive and iterative algorithms of image restoration
Vladimir P. Melnik, Vladimir V. Lukin, Alexander A. Zelensky, et al.
An adaptive approach to restoration of images corrupted by blurring, additive, impulsive and multiplicative noise is proposed. It is based on the combination of nonlinear filters, iterative filtering procedures, and the principles of local adaptation. Finally, numerical simulations and test images illustrating the efficiency of the approach are presented.
Estimation
Noisy fractional Brownian motion for detection of perturbations in regular textures
Herve Guillemet, Habib Benali, Francoise J. Preteux, et al.
A generic method for detecting the presence of a perturbing signal in model-based textures is presented. An index quantifying the accuracy of the texture model is defined from maximum-likelihood or maximum a posteriori estimates. The index is computed locally and a threshold value is used to detect those parts of the texture that depart from the model. We investigate the particular case of fractal textures based on a noisy fractional Brownian motion model. A specific accuracy index is derived from the likelihood of a heuristic synthesis model known as the random midpoint displacement algorithm. The method is applied to the problem of detecting microcalcifications in digital mammography. Results show that 95 percent of the breast tissue can be classified as not containing microcalcifications, in a short computation time and without significant error, thus proving the relevance of the method.
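For orientation, the random midpoint displacement synthesis that the likelihood is built on can be sketched in 1D: midpoints are displaced by Gaussian perturbations whose scale shrinks by 2^(−H) at each level. The additive observation noise of the paper's "noisy" model is omitted here.

```python
import numpy as np

def midpoint_displacement(n_levels=8, H=0.5, sigma=1.0, rng=None):
    """1D fractional-Brownian-motion-like trace with Hurst exponent H."""
    rng = rng or np.random.default_rng(0)
    signal = np.array([0.0, rng.normal(0, sigma)])
    scale = sigma
    for _ in range(n_levels):
        scale *= 2 ** (-H)                      # displacement shrinks each level
        midpoints = 0.5 * (signal[:-1] + signal[1:]) + rng.normal(0, scale, len(signal) - 1)
        out = np.empty(2 * len(signal) - 1)
        out[::2], out[1::2] = signal, midpoints  # interleave old points and midpoints
        signal = out
    return signal
```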
Algorithms II
Statistical characteristics of waves in random layered medium in presence of regular refraction
Michael A. Guzev, Gennadii V. Popov
We consider a model stationary problem of wave propagation in a layered halfspace with regular and random inhomogeneities. The choice of regular perturbation corresponds to a linear waveguide near the right boundary of the halfspace. Random inhomogeneities are simulated in the framework of the white noise model. We analyze the influence of the inhomogeneities on the probability distribution of the reflection coefficient phase. Keywords: layered medium, white noise model, statistical characteristics, waves in random media.
Filters
Noise-immune phase-shifting interferometric system based on Markov nonlinear filtering method
Igor P. Gurov, D. V. Sheynikhovich
Phase-shifting interferometric systems are widely used for precise measurement of the geometric characteristics of objects. Simple data processing algorithms are usually realized, but it is difficult to optimize such systems for increased accuracy over the whole population of measured data because of the nonlinear data transformation. Another important problem is noise-immune phase unwrapping over intervals greater than 2π rad. The proposed interferometric system is free from these disadvantages. In the system, a new method and algorithm of phase estimation are realized, based on the Markov theory of optimal nonlinear filtering. The main advantages of the proposed system are the following: data processing in real time, solution of the phase unwrapping problem, and minimization of phase errors under the influence of phase fluctuations and noise correlated with the signal. The phase restoration error in typical measurement conditions does not exceed 0.15 rad by the peak-to-valley criterion, while the rms error does not exceed 0.05 rad. The system provides the possibility to solve synthesis and optimization problems for a wide class of multidimensional, non-stationary and nonlinear systems.
Algorithms II
Method of single-fiber multimode interferometer speckle signal processing
Yuri N. Kulchin, Oleg B. Vitrik, Oleg V. Kirichenko, et al.
A method of correlation processing of speckle signals formed by a multimode interferometer is theoretically and experimentally investigated. The method makes it possible to transform the modulation of a speckle pattern into an optical or electrical signal that depends linearly on the correlation coefficient of the reference and current speckle patterns. The functional dependence of the correlation coefficient on the elongation of the interferometer is studied. The results allow quantitative measurements of the elongation value. A mathematical algorithm for determining the sign of the elongation is developed. A correlated change of local areas of speckle patterns formed by a multimode fiber is discovered. An analog computing system is developed which can perform the correlation measurements in real time over a wide bandwidth. The optical and electrical circuits are made of standard components, which makes the method suitable, for example, for industrial use.
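The core measurement reduces to the normalized cross-correlation coefficient between the reference and current speckle patterns, sketched below; relating its value (and sign of change) to the fiber elongation is the paper's calibration step, not shown here.

```python
import numpy as np

def speckle_correlation(reference, current):
    """Normalized cross-correlation coefficient of two speckle patterns."""
    r = reference - reference.mean()
    c = current - current.mean()
    return (r * c).sum() / np.sqrt((r ** 2).sum() * (c ** 2).sum())
```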