Proceedings Volume 0575

Applications of Digital Image Processing VIII


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 19 December 1985
Contents: 1 Session, 37 Papers, 0 Presentations
Conference: 29th Annual Technical Symposium 1985
Volume Number: 0575

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.

All Papers
A Comparative Analysis Of Digital Filters For Image Decimation And Interpolation
Isaac Ajewole
It is sometimes necessary to increase or decrease the sampling rate of previously sampled images. Technically speaking, the process of increasing the sampling rate is called interpolation and the process of decreasing the sampling rate is called decimation. In this paper we present both linear and higher-order polynomial interpolation techniques in terms of linear filtering operations (convolutions) and discuss the associated impulse responses. Several simple filters for decimation are also discussed. All these filters are compared vis-à-vis their suitability for decimation and interpolation, particularly the degree to which the balance between sharpness and prevention of aliasing is maintained. Images which have been decimated or interpolated using these filters are shown.
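As the abstract notes, interpolation and decimation are linear filtering operations. A minimal 1-D sketch of both (illustrative only; the kernels and function names here are assumptions, not the paper's filters): linear interpolation as zero-stuffing followed by convolution with a triangular kernel, and decimation as a boxcar anti-alias filter followed by subsampling.

```python
import numpy as np

def upsample_linear(x, factor):
    """Interpolate a 1-D signal: zero-stuff, then convolve with the
    triangular impulse response of the linear interpolator."""
    stuffed = np.zeros(len(x) * factor)
    stuffed[::factor] = x
    k = np.concatenate([np.arange(1, factor + 1),
                        np.arange(factor - 1, 0, -1)]) / factor
    # Trim the convolution so the original samples stay on the grid.
    return np.convolve(stuffed, k)[factor - 1 : factor - 1 + len(x) * factor]

def decimate_mean(x, factor):
    """Decimate: boxcar anti-alias filter, then subsample."""
    filtered = np.convolve(x, np.ones(factor) / factor, mode="same")
    return filtered[::factor]
```

Higher-order polynomial interpolators of the kind the paper compares would simply replace the triangular kernel `k`; the sharpness/aliasing trade-off lives entirely in that impulse response.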
Evaluation Of Selected 3-D Imaging And 3-D Image Processing Techniques
K. Balasubramanian
Certain earlier proposed 3-D imaging and 3-D image processing techniques have been evaluated and some improvements are suggested. A method of registering blur-free sectional images of the object for the development of a volumetric 3-D TV is proposed. Also, a method for perception of depth effect from 2-D mono video image is described.
Graphical Concepts In Image Processing - A Bridge Between Two Worlds
Barbara L. Filkins
As the computer graphics world has become "rasterized", it has become increasingly similar in fundamental algorithms and concepts to image processing. This closure is evident in the definition of underlying data structures and hardware design. Unfortunately, the two fields remain rather disjoint in communication. This paper deals with the interface between image processing and graphics systems. SDC has incorporated key aspects of a graphics architecture as a natural extension to its image processing package. A system architecture is described with examples of how graphics capabilities enhance image processing system functionality for specific applications.
The Fast Hartley Transform
H. S. Hou
The Fast Hartley Transform (FHT) is similar to the Cooley-Tukey Fast Fourier Transform (FFT) but performs much faster because it requires only real arithmetic computations, compared to the complex arithmetic computations required by the FFT. Through use of the FHT, Discrete Cosine Transforms (DCT) and Discrete Fourier Transforms (DFT) can be obtained. The recursive nature of the FHT algorithm derived in this paper enables us to generate the next higher-order FHT from two identical lower-order FHTs. In practice, this recursive relationship offers flexibility in programming different sizes of transforms, while the orderly structure of its signal flow graphs indicates an ease of implementation in VLSI.
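The Hartley transform underlying the FHT, and the recovery of the DFT from it, can be sketched directly from the definitions (an O(N²) reference version under standard conventions, not the paper's fast recursive algorithm; function names are illustrative):

```python
import numpy as np

def dht(x):
    """Discrete Hartley transform: H[k] = sum_n x[n] * cas(2*pi*n*k/N),
    where cas(t) = cos(t) + sin(t).  O(N^2) reference version."""
    N = len(x)
    n = np.arange(N)
    t = 2 * np.pi * np.outer(n, n) / N
    return (np.cos(t) + np.sin(t)) @ x

def dft_from_dht(H):
    """The DFT follows from the even/odd split of the DHT:
    F[k] = (H[k] + H[N-k])/2 - 1j*(H[k] - H[N-k])/2."""
    Hr = np.roll(H[::-1], 1)   # H[(N - k) mod N]
    return (H + Hr) / 2 - 1j * (H - Hr) / 2
```

Because `dht` is purely real, this is the property that lets a fast Hartley routine stand in for a complex FFT.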
Algorithms for Mathematical Morphological Operations with Flat Top Structuring Elements
Yong H. Lee
The mathematical morphological operations on grey scale images require the selection of the minimum or the maximum value within the windows set by structuring elements. This paper deals with structuring elements which are three-dimensional with a flat top and infinite height. The flat top region can take various one- or two-dimensional shapes such as a line segment, hexagon, octagon, or circle. This paper describes an optimal implementation of the morphological operations with such structuring elements: an iterative method which combines controlled image shifts and comparisons between the original and shifted images. It is an effective image processing method that does not require a cytocomputer architecture [1,2].
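The shift-and-compare scheme the abstract describes can be sketched as follows for a flat structuring element given by a list of support offsets (a toy version; `np.roll` wraps at the borders, a simplification the real method would not make):

```python
import numpy as np

def erode(img, offsets):
    """Flat grey-scale erosion: running minimum over copies of the image
    shifted by each offset of the structuring element's support.
    Circular (wrap-around) boundary handling, for brevity."""
    out = None
    for dy, dx in offsets:
        s = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        out = s if out is None else np.minimum(out, s)
    return out

def dilate(img, offsets):
    """Dual operation: running maximum over oppositely shifted copies."""
    out = None
    for dy, dx in offsets:
        s = np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)
        out = s if out is None else np.maximum(out, s)
    return out
```

Openings and closings follow by cascading the two, one shift-and-compare pass per offset in the flat-top support.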
Progressive Transmission Of Digital Diagnostic Images
S. E. Elnahas, R. G. Jost, J. R. Cox, et al.
Progressive transmission of digital pictures permits the receiver to reconstruct an approximate picture first and then gradually improve the quality of the image reconstruction. A performance criterion is formulated for the evaluation of alternative schemes. The use of transform coding techniques to achieve progressive transmission is discussed. Application of the concept of progressive transmission to electronic radiology is introduced, and simulation results for individual images and panels of digital diagnostic images are presented. The relative quality of intermediate image reconstructions seems to be superior to that of other progressive transmission techniques.
Deaf Phone: Sign Language Telephone
R. Hsing, Thomas P. Sosnowski
Using the technologies of image processing, spatial/temporal resolution reduction and picture coding, a terminal has been designed and constructed which allows the transmission of deaf sign language over low bandwidth telephone lines in real time. Using the hardware, six deaf subjects have evaluated this system. A series of psycho-physical experiments were conducted to determine the minimum image quality required to convey meaning in hand-sign language as used by the deaf.
A 10MHz Data Compression System for Real-Time Image Storage and Transmission
Anil K. Jain, Daniel G. Harrington
The importance of video image transmission and storage is growing rapidly, as medical, business, and military technologies utilizing images are developed. Low cost image data compression systems offer image communication opportunities in personal computers, remote imaging, medical imaging, and engineering workstation environments. Utilizing a predictive coding technique, a real-time (10MHz throughput) special purpose machine has been built to accomplish image data compression economically. The basic machine contains less than 100 chips and has demonstrated high quality compression at ratios of 2:1, 4:1, and 8:1. Further compression via the use of add-on modules is possible.
New Solution for Frequency and Pixel Domain Coding Using Convex Sets
Pete Santago, Sarah A. Rajala
This paper describes a new algorithm for combining spatial and frequency domain information about an image in order to improve coding. The algorithm utilizes successive projections onto two non-intersecting sets in order to determine the optimal frequency domain coefficients to transmit given some known spatial information. This technique is best suited to fixed block size transform coders and builds on the techniques used in previous work [2,4].
Motion Compensated Coder For Videoconferencing
Ram Srinivasan, K. R. Rao
A codec for videoconferencing purposes [8-12,31] is proposed based on the one-at-a-time search (OTS) motion compensation algorithm [1,30-31]. The coder is implemented in an interframe hybrid mode using the C-matrix transform (CMT) [13-15]. The motion estimation algorithm employed is simple and reduces temporal redundancy while the CMT reduces spatial redundancy in the transmitted prediction errors. The operation of the codec is presented. Simulation results based on three different scenes with varying levels of motion are presented. The coder operates in a feedback mode.
Image Restoration Via The Shift-And-Add Algorithm
William G. Bagnuolo Jr.
A new method for image restoration based on the shift-and-add algorithm is presented, the main advantages of which appear to be speed and simplicity. The shift-and-add pattern produced by an object is given by the object correlated by a non-linear replica of itself whose intensity distribution is strongly weighted toward the brighter pixels. A method of successive substitutions analogous to Fienup's algorithm can then be used to "decorrelate" the SAA pattern and recover the object. The method is applied to the case of the extended chromosphere of Betelgeuse.
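The basic shift-and-add operation (align each short-exposure frame on its brightest pixel, then average) can be sketched as below. This is a toy version using circular shifts; it produces the SAA pattern only, not the paper's subsequent Fienup-style decorrelation step.

```python
import numpy as np

def shift_and_add(frames):
    """Align each frame on its brightest pixel (circular shift to the
    array centre) and average the results."""
    acc = np.zeros_like(frames[0], dtype=float)
    cy, cx = frames[0].shape[0] // 2, frames[0].shape[1] // 2
    for f in frames:
        py, px = np.unravel_index(np.argmax(f), f.shape)
        acc += np.roll(np.roll(f, cy - py, axis=0), cx - px, axis=1)
    return acc / len(frames)
```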
Histogram Specification Techniques For Enhancement Of High Altitude Aerial Digital Images
Joseph D. Biegel
The brightness histogram of low altitude aerial images can be used to develop a transformation function which reduces the atmospheric degradation in satellite imagery. The general approach is to histogram specify the satellite image to the low altitude reference image in a spectral band serial fashion. The reference histogram(s) need not be derived from an image context similar to the satellite image. The nature of the transformations is examined using the first several moments of the brightness distributions. Results are shown in which Landsat Thematic Mapper "visible images" (bands 1, 2, and 3) are processed using low altitude image histograms as the specification functions. This transformation technique can be more effective than many interactive point processing methods, without requiring the time and expense of an expert image analyst. The limitations of the context-free nature of the reference distribution are discussed. The potential for the application of this technique to general interactive point processing is described, along with possible application to other image classes.
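The underlying histogram-specification step (remap source grey levels so the output histogram approximates a reference) can be sketched via the standard CDF-matching construction; this is illustrative only, and in the paper the reference histograms come from low-altitude imagery rather than being synthetic.

```python
import numpy as np

def histogram_specify(img, ref_hist):
    """Remap 8-bit grey levels so the output histogram approximates
    `ref_hist` (a length-256 array of reference counts or frequencies):
    map each source CDF value to the smallest reference level whose
    CDF reaches it."""
    src_cdf = np.cumsum(np.bincount(img.ravel(), minlength=256)) / img.size
    ref_cdf = np.cumsum(ref_hist) / np.sum(ref_hist)
    lut = np.searchsorted(ref_cdf, src_cdf).clip(0, 255).astype(np.uint8)
    return lut[img]
```

Applied band by band, this is exactly the "spectral band serial" specification the abstract describes.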
An Object-Pass Filter For Image Processing
J. Steven Mott, James A. Roskind
An object-pass filter for image processing is defined as a filter that passes or enhances objects in an image that are larger than a minimum threshold and smaller than a maximum threshold. Objects outside the pass band, i.e. too large or too small, are suppressed by replacing the pixel gray scale level with zero. An object-pass filter that is realizable in today's technology is presented. It is shown that an object-pass filter can be realized with cascaded sort-selection filters, which select the Nth largest element in a neighborhood of elements. The object-pass filter's pass region can be determined by selecting the appropriate sort-selection filter window sizes. Due to recent advances in sort-selection filter architectures and VHSIC level technology, such an object-pass filter can be built using three of the recently developed Harris custom VHSIC Phase I level chips. Such a configuration can process imagery at a rate of better than 10 Mpixels/second. Simulation results are shown for object-pass filtering on binary images and also on several forward looking infrared images.
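One way to realize an object-size band-pass with cascaded rank filters, in the spirit the abstract describes, is as a difference of two flat openings: the small window removes objects below the minimum size, the large window isolates objects above the maximum size, and subtracting the two passes the band in between. A 1-D sketch (illustrative only, with odd window widths assumed; not the Harris VHSIC implementation):

```python
import numpy as np

def _rank(x, w, fn):
    """Sliding-window rank filter (min or max) with edge clamping."""
    pad = np.pad(x, w // 2, mode="edge")
    return np.array([fn(pad[i:i + w]) for i in range(len(x))])

def opening(x, w):
    """Flat opening of width w: a minimum filter then a maximum filter,
    i.e. two cascaded sort-selection filters.  Removes bright objects
    narrower than w."""
    return _rank(_rank(x, w, np.min), w, np.max)

def object_pass(x, w_min, w_max):
    """Pass bright objects at least w_min but less than w_max wide."""
    return opening(x, w_min) - opening(x, w_max)
```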
Entropy-Constant Image Enhancement by Histogram Transformation
Lawrence O'Gorman, Lynne Shapiro Brotman
Image enhancement techniques are described for increasing visual contrast in an intensity range (or ranges) of interest. Enhancement is accomplished by histogram transformation where (except for quantization error) there is no change in entropy between the input and output images. This preservation of entropy is desired for some applications where it is imperative that no artifacts be introduced - as may result from such enhancement techniques as spatial filtering and adaptive histogram transformation. The contribution of this work is toward the compilation and organization of useful image enhancement techniques and toward an understanding of matching particular transformations with the desired purposes. Two types of methods for obtaining histogram transformations are distinguished here. In one type the output histogram is specified as a ramp-shaped function, and the transformation is found. This type is useful for automatically enhancing low or high intensity ranges, and for histogram equalization. For the second type, the general form of the transformation function is defined to be a third order polynomial function, which is specified by the choice of the intensity region of interest and the desired amount of contrast enhancement. This type is useful for interactive enhancement. Examples are shown which illustrate where each of the transformations is applicable, and what results are obtained from such transformations.
Digital Image Processing For Instantaneous Frequency Mapping
D. A. Seggie, Mark Doherty, S. Leeman, et al.
A hybrid demodulation technique for ultrasonic pulse-echo imaging is demonstrated. A digital implementation of the technique is used to investigate various mappings of experimentally obtained backscattered data. Results illustrating the processing options offered by the method are presented and discussed.
Architectures For Parallel Image Processing
Robert Y. Wong
Images taken by airborne and laboratory sensors have been used for many purposes including autonomous guidance, robot vision, medical diagnosis, automated inspection and automated measurements. Scene recognition operations are relatively costly in computing time due to the large amount of data to be analyzed. To obtain real-time operation, image processing and scene recognition can be done with several microprocessors operating in parallel. Since many of the operations are repetitive in nature, the use of multiprocessors can greatly enhance the speed of operations. Architectures of several multiprocessor systems capable of performing image processing and scene matching are described. The design of a new architecture is also described. This system is designed to take advantage of the computing capability of a single-stream/multi-data-stream structure and the architectural simplicity of a pipeline computer.
Correlation Synthetic Discriminant Functions for Object Recognition and Classification in High Clutter
David Casasent, William Rozzi, Donald Fetterly
Correlation synthetic discriminant functions (SDFs) represent a practical and novel extension of matched spatial filter (MSF) correlators for distortion-invariant multi-object and multi-class pattern recognition. This paper reviews the off-line synthesis of such filters and the advantageous features of correlation shape control that they provide. We then concentrate on extensive tests performed with these filters to assess their performance in the identification of ship images subjected to 3-D distortions. The pattern recognition problem addressed involves multi-object, multi-class recognition with aspect distortion-invariance in the presence of clutter. An adaptive threshold is shown to allow recognition of objects in the presence of spatially-varying modulation. The noise performance of these filters is also found to be excellent. Correct classification rates approaching 98% can be obtained with these correlation SDFs.
Model-Based Matching of Line Segments in Aerial Images
Z. Chen, K. H. Tsai, K. R. Lu, et al.
An edge-oriented scheme for matching a sensed aerial image with a reference image is proposed. The scheme automatically extracts linear line segments from the image. Each segment forms a high-level unit with two other segments in its neighborhood. Topological and geometrical features which are invariant under translation and rotation are defined to derive the line mapping between the sensed and reference line segments. An iterative matching process rooted in Shafer's mathematical theory of evidence is formulated. The whole process leads to a rather successful scene matching scheme.
Detection Of Maneuvering Target Tracks
Mary L. Padgett, Sarah A. Rajala, Wesley E. Snyder, et al.
This paper presents a new technique for the detection of moving target tracks, where those tracks are linear paths or segments of circles[1,2]. The images used as input represent a time-varying sequence of noisy satellite images of terrain and a moving target(s). Preprocessing of the image sequence involves use of third order differencing to remove stationary points and produce a sequence of intermediate images containing only the target track(s) and noise from various sources[3]. The new procedure described in this paper begins by selection of a window from the preprocessed image sequence. A generalized Hough transform technique is then employed to obtain the equation for the line traveled by any target, and an extension of the linear technique is used to detect circular tracks. New strategies for reduction of the dimensionality of the Hough transformation are also described. The method has been shown to be robust when tested on simulated noisy target tracks.
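The linear-track detection step can be sketched with a standard (ρ, θ) Hough accumulator over candidate points. This is illustrative only; the paper's generalized transform, the circular-track extension, and its dimensionality-reduction strategies are not reproduced here.

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=200, rho_max=100.0):
    """Vote in a (rho, theta) accumulator using the normal form
    rho = x*cos(theta) + y*sin(theta); return the best-voted line."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_rho, n_theta), dtype=int)
    cols = np.arange(n_theta)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        # Quantize rho in [-rho_max, rho_max] onto n_rho accumulator rows.
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[idx[ok], cols[ok]] += 1
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return r / (n_rho - 1) * 2 * rho_max - rho_max, thetas[t]
```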
Adaptive Filtering Of Target Features In A Time-Sequence Of Forward-Looking Infrared (FLIR) Images
R. N. Strickland, M. R. Gerber, D. W. Tipper
This paper addresses the problem of estimating the profile of a ship target from a sequence of forward-looking infrared (FLIR) imagery obtained at video rates. Signal processing takes place in two stages: first, a profile vector is extracted from each image frame by edge-detection processing; second, the resulting sequence of profile vectors is adaptively filtered in order to increase the signal-to-noise ratio. Several factors peculiar to FLIR data preclude the use of an otherwise straightforward temporal averaging approach. First, target resolution increases slowly over a sequence of image frames, gradually revealing details of the target profile. Second, over the course of a long sequence of frames a target may be affected by FLIR noise - random speckle, occlusions, and flaring - caused by various atmospheric and background phenomena. These factors must be met by a reliable, adaptive, nonlinear vector smoothing technique. In this paper we discuss techniques for simulating long sequences of realistic ship profiles, based on actual FLIR ship imagery. Profile extraction by hypothesis testing is also discussed. A number of adaptive and nonadaptive vector filtering techniques are considered. These are based on recursive filters, median filters, LMS filters, the shift-and-add method, segmenting algorithms, and combinations of these. We suggest operating constraints under which these algorithms are likely to be successful.
Pattern Recognition Through Dynamic Programming
B. Burg, Ph. Missakian, B. Zavidovique
The aim of this study is to adapt a speech recognition time warping algorithm to picture analysis. Our goal is to recognize patterns despite variations in scale and orientation. We may recognize objects regardless of whether they are embedded in other parts or they are distorted. The programs input real pictures, extract the contours and then encode and compare them to a pattern dictionary. The computer time is particularly short for such a recognition process.
Merging Images Through Pattern Decomposition
Peter J. Burt, Edward H. Adelson
It is frequently desirable to combine different sources of image information into a composite image prior to undertaking image analysis. For example, multiple images may be merged to extend the field of view or resolution, or to eliminate foreground obstacles. Or stereo images may be combined so that regions occluded in one camera's view are filled smoothly with regions seen by the other camera. The essential problem in image merging is "pattern conservation": important details of the component images must be preserved in the composite while no spurious pattern elements are introduced by the merging process. Simple approaches to image merging often create edge artifacts between regions taken from different source images, and these may confound subsequent image analysis. We describe an approach to image merging based on pattern decomposition. Each source image is first transformed into a set of primitive pattern elements. Pattern sets for the various source images are then combined to form a single set for the composite image. Finally the composite is reconstructed from its set of primitives. We illustrate the pattern decomposition technique with several practical applications. These include image merging to eliminate foreground objects, and merging to extend the depth of field. In all cases the Laplacian pyramid is used to encode images in terms of sets of primitives which resemble Gaussians of many scales.
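The idea of merging sharp detail bands across a sharp seam while merging low-pass content across a soft seam can be sketched with a two-band "Laplacian stack" on 1-D signals. This is a toy version under assumed conventions; the paper uses full multi-level Laplacian pyramids on images.

```python
import numpy as np

def blur(x, passes=4):
    """Repeated [1 2 1]/4 smoothing - a cheap Gaussian approximation."""
    for _ in range(passes):
        p = np.pad(x, 1, mode="edge")
        x = (p[:-2] + 2 * p[1:-1] + p[2:]) / 4.0
    return x

def pyramid_blend(a, b, mask):
    """Two-band blend: combine the high-pass detail with the sharp mask
    and the low-pass base with a blurred mask, then sum.
    mask == 1 selects signal a."""
    la, lb = blur(a), blur(b)
    high = mask * (a - la) + (1 - mask) * (b - lb)
    soft = blur(mask)
    return soft * la + (1 - soft) * lb + high
```

The blurred mask is what removes the edge artifacts that the abstract says simple merging creates, while the sharp mask keeps fine detail from smearing.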
Knowledge-Based Tactical Terrain Analysis
John Gilmore, David Ho, Steve Tynor, et al.
The performance of autonomous vehicle systems is currently limited by their inability to accurately analyze their surrounding environment. In order to function in a dynamic real world environment, an autonomous vehicle system must be capable of interpreting terrain based upon predetermined mission goals. This paper describes a concept of knowledge-based terrain analysis currently being developed to support the information needs of an autonomous helicopter system. The terrain analysis system consists of five integrated processing stages. Each process is discussed in detail and supported by a number of mission oriented examples.
CCD For Two-Dimensional Transform
A. M. Chiang, B. B. Kosicki, R. W. Mountain, et al.
With the ever increasing demand for image transmission and image storage, various algorithms for image data compression have been developed [1]. To transmit pictures at lower bandwidth or to minimize the memory size for storage, the image must be compressed by the removal of redundant information. Transform image coding has been proven to be an efficient method for image compression [2,3]. In the basic transform image coding concept, an image is divided into small blocks of pixels and each block undergoes a two-dimensional transformation to produce an equal-sized array of transform coefficients. The coding process is then performed on the transformed block image. It has been shown that the compression factor of the discrete cosine transform (DCT) compares closely with that of the Karhunen-Loeve transform, which is considered to be the optimal [4]. But, comparatively, using the fast cosine transform (FCT) algorithm, the implementation of the DCT is much simpler. Therefore, in many transform coding systems a large amount of digital hardware is dedicated to performing the 2-D FCT, because it is believed to be the only practical approach to get close to optimal performance. This consideration has led recent research efforts on transform image coding to concentrate on the improvement of the coding process only.
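The block transform at the heart of this scheme, the 2-D DCT-II, can be sketched in its separable matrix form (a reference implementation for clarity, not the fast cosine transform or the CCD realization discussed in the paper):

```python
import numpy as np

def dct2(block):
    """Orthonormal 2-D DCT-II of an N x N block in separable matrix
    form: C @ block @ C.T."""
    N = block.shape[0]
    n = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0] /= np.sqrt(2.0)   # rescale the DC row for orthonormality
    return C @ block @ C.T
```

Because C is orthonormal, block energy is preserved and the coefficients concentrate it in the low-frequency corner, which is what makes the subsequent coding step effective.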
Practical Realisation Of Arithmetic Coding Unit For Document Treatment
S. Desmet, D. Vandaele, A. Oosterlinck
We describe a practical realisation of a coding unit for binary document treatment. The coding procedure used is a modification of an arithmetic code discussed by Langdon and Rissanen, which reaches a compression factor almost 30% higher than the two-dimensional CCITT code with k = 4. Hardware has been developed capable of encoding or decoding an A4 binary document, scanned at a resolution of 200 pixels/inch, in less than 1 second.
An Interactive Digital Image Processing Workstation For The Earth Sciences
Michael Guberek, Stephen Borders
An interactive digital image processing workstation has been developed for oceanographic, meteorological, and geophysical applications (Fig. 1). The turn-key system provides the capability to process imagery from commonly used ocean observation spacecraft, in conjunction with in situ data sets. The system is based on the Hewlett-Packard 9000, a high-performance 32-bit processor (CPU), with a direct address range of 500 Megabytes. The Metheus Omega series of display controllers are used to drive the color CRT display. The controller memory may be configured to hold up to 1280x1024x32-bit images. The workstation provides the Global Applications Executive, which standardizes the link between the user and applications programs under the UNIX operating system. The user can operate the system in three modes. In the menu mode, the user is asked to make a selection from a list of menus and applications. In the command mode, the user communicates with the system via simple English-like commands. Finally, in tutor mode, the user is prompted for all parameters which must be supplied to a program. The applications software includes programs to perform geometric correction, earth location, and registration of remotely sensed data. These programs handle imagery from the Advanced Very High Resolution Radiometer (AVHRR), the Coastal Zone Color Scanner (CZCS), the Multispectral Scanner (MSS), the Scanning Multichannel Microwave Radiometer (SMMR), and the Visual and Infrared Spin Scan Radiometer (VISSR). Other programs permit displaying monochrome and true-color images. Line graphics, such as contoured data, can be overlaid onto the displayed image in different colors. Interactive manipulation of these images is possible via the digital tablet provided. Interactive functions include panning, histogram normalization and pseudocolor manipulation.
PC-Based Image Processing With Half Toning
John A. Saghri, Hsieh S. Hou, Andrew G. Tescher
A versatile and useful personal computer-based procedure to generate halftone representations of digitized image data is presented. This feature provides limited interactive image processing capability for a basic personal computer work station without having to acquire a video monitor and a digital-to-analog converter board. The generated halftone images can be displayed on the standard dot-addressable graphics monitor and/or they can be printed out on regular paper via an associated printer. The halftoning procedure is based on an error diffusion technique in which dots are produced randomly, one at a time, based on both the local grey level value and some accumulated measure of the error due to previous assignments. The key elements in halftoning, such as dynamic, spatial, and display medium resolutions, are defined and the mathematical correspondence among them is established.
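A minimal sketch of error diffusion in the classic Floyd-Steinberg style: threshold one pixel at a time in raster order and push the quantization error onto unvisited neighbours. The weights and raster scan here are the standard textbook ones, not necessarily the paper's variant, which additionally randomizes dot placement.

```python
import numpy as np

def error_diffuse(img):
    """Binarize a grey-scale image in [0, 1]: threshold each pixel and
    distribute the quantization error to unvisited neighbours with the
    Floyd-Steinberg weights 7/16, 3/16, 5/16, 1/16."""
    f = img.astype(float).copy()
    out = np.zeros_like(f)
    h, w = f.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = 1.0 if f[y, x] >= 0.5 else 0.0
            err = f[y, x] - out[y, x]
            if x + 1 < w:
                f[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    f[y + 1, x - 1] += err * 3 / 16
                f[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    f[y + 1, x + 1] += err * 1 / 16
    return out
```

The carried error is the "accumulated measure" the abstract refers to; it is what makes the local dot density track the local grey level.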
A Multiprocessor System For Image Processing
L. Van Eycken, P. Wambacq, J. Rommelaere, et al.
Use of our first image computer revealed a number of specific needs that it could not meet. These shortcomings led to a new image computer, which is presented in this paper. The basic hardware, namely the interconnection network and its interfaces, is kept modular and extensible so as to anticipate future needs. Beyond this design framework, the most relevant features of some existing processor modules are given in order to illustrate the possibilities of the image computer.
Fast Adaptive Filtering With Special Hardware
P. Wambacq, L. Van Eycken, J. Rommelaere, et al.
In this paper, a few aspects concerning the hardware implementation of adaptive filtering algorithms are discussed. Two different approaches to adaptive filtering are considered, each having their own hardware implications. The ideas that are developed are illustrated with some practical examples.
Image Processing on Photon-counting Imaging
T. Kurono, T. Kawashima, M. Katoh, et al.
A photon-counting image acquisition system (PIAS) [1] has been developed for the detection of two-dimensional low-light-level phenomena. PIAS consists of a photon-counting imager, a position analyzer, and an image processor. When a photoelectron is detected, its position is stored in a frame memory and immediately displayed as an image on a TV monitor. An acquired photon-counting image has ideal input-output linearity and an S/N at the theoretical limit, because the noise of PIAS is the photon noise itself. Confirmation of the interference in Young's experiment [4] at photon-counting levels has been obtained in real time using the PIAS system. PIAS has also been utilized for applications such as astronomical observations, energy analysis of charged particles, and optical microscopy. For analyzing these acquired images, it is necessary to elaborate on how to smooth and display the photon-counting images. We propose variable subregion filtering (VSF), with which it is possible to alter the shape of the subregion in neighboring windows according to the status of the original image. This algorithm is also found to be useful for edge-preserving smoothing. The results of that experiment are shown.
Material Stress Inspection By Digital Thermographic Image Processing
R. A. Fiorini, P. Coppa, O. Salvatore
Within the elastic range, a body subjected to tensile or compressive stresses experiences a reversible conversion between mechanical and thermal forms of energy. If adiabatic conditions are maintained, the relationship between the reversible temperature change and the corresponding change in the sum of the principal stresses is linear and independent of loading frequency. Inspection of the thermoelastic effect on spot weldings using infrared-based systems can give significant data for nondestructive evaluation of welding defects. A specimen was first modeled by the finite element technique and then tested on an MTS unit monitored by an infrared camera. A digital image processing computer facility was used to compare sequences of images to investigate the dynamic behaviour of the temperature differences. Theoretical and experimental results are compared.
Digital image processing with the Texas Instruments Professional Computer (TIPC) CCD camera system
Robert John Gove
The TIPC has been combined with a CCD Microcamera to produce a unique image acquisition and processing system which has applications in widely diverse areas. This is a fully integrated system which utilizes important features of each component, including the CCD camera, the computer, the video digitizer/processor and the displays. The system is a complete solution for many applications, but can also be used as an integral component of more complex systems. The structure, operation and examples of applications of this digital image processing system are discussed in this paper.
Color Coding Medical Ultrasound Tissue Images With Frequency Information
W. T. Mayo, P. V. Sankar, L. A. Ferrari
Pseudocolor enhancement of gray scale images from ultrasound, X-ray, or any other single-parameter imaging modality has not gained widespread clinical use. Color has found some acceptance where more than one parameter is imaged, as in Doppler blood flow visualizations. Recently, the frequency variation of ultrasound echoes from tissue has been shown to be useful both for measuring attenuation and for enhancing the apparent resolution of amplitude gray scale images of small tissue structure. In order to observe these effects separately, we have treated the backscatter amplitude and frequency as independent variables and have created a color display which observers find both pleasing and useful.
A Perimetric Sampling Technique Applied to Biological Images
C. Katsinis, A. D. Poularikas, H. P. Jeffries
This paper examines the classification potential of a cyclical sampling approximation applied to binary images of zooplankton. Image pixels are rearranged into a one dimensional sequence by selecting samples in a spiral manner starting from the edge of the image and proceeding toward the center. The properties of this sample sequence are examined by Fourier transform techniques, using images from six major zooplankton categories of varying orientation. The ability of features extracted from spiral sequences to classify zooplankton samples, and their classification accuracy are investigated.
Image Processing and Geographic Information
Ronald G. McLeod, Julie Daily, Kenneth Kiss
A Geographic Information System, which is a product of System Development Corporation's Image Processing System and a commercially available Data Base Management System, is described. The architecture of the system allows raster (image) data type, graphics data type, and tabular data type input and provides for the convenient analysis and display of spatial information. A variety of functions are supported through the Geographic Information System including ingestion of foreign data formats, image polygon encoding, image overlay, image tabulation, costatistical modelling of image and tabular information, and tabular to image conversion. The report generator in the DBMS is utilized to prepare quantitative tabular output extracted from spatially referenced images. An application of the Geographic Information System to a variety of data sources and types is highlighted. The application utilizes sensor image data, graphically encoded map information available from government sources, and statistical tables.
Eye Spacing Measurement for Facial Recognition
Mark Nixon
Few approaches to automated facial recognition have employed geometric measurement of characteristic features of a human face. Eye spacing measurement has been identified as an important step in achieving this goal. Measurement of spacing has been made by application of the Hough transform technique to detect the instance of a circular shape and of an ellipsoidal shape which approximate the perimeter of the iris and both the perimeter of the sclera and the shape of the region below the eyebrows respectively. Both gradient magnitude and gradient direction were used to handle the noise contaminating the feature space. Results of this application indicate that measurement of the spacing by detection of the iris is the most accurate of these three methods with measurement by detection of the position of the eyebrows the least accurate. However, measurement by detection of the eyebrows' position is the least constrained method. Application of these techniques has led to measurement of a characteristic feature of the human face with sufficient accuracy to merit later inclusion in a full package for automated facial recognition.
Preliminary Study Of Alternative Tomographic System For Radiotherapy Planning
Tzouh-Liang Sun
Computerized axial tomography (CT) is a remarkable scientific invention that is well established in the realm of diagnostic radiology. But how can CT contribute to radiation oncology? One of the leading applications is the use of the CT scanner in radiotherapy treatment planning, because a CT image can show not only the precise location of a tumour but also very subtle differences in tissue heterogeneity on the same cross-sectional level of the body. This information is essential for increasing the accuracy of dose delivery in curative radiotherapy treatment planning. However, the high cost of and heavy demand for CT scanning make routine tumour localization for curable cancer patients difficult. Under these circumstances, the idea of linking available facilities, such as a treatment planning computer, an isocentric simulator, and an isotope machine or linear accelerator, to serve as an alternative tomographic system has already been carried out successfully by several foreign institutes. In this preliminary study, both the theory and a practical test were examined in order to prepare for further development of a true alternative tomographic system for radiotherapy planning.