Proceedings Volume 0786

Applications of Artificial Intelligence V

View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 11 May 1987
Contents: 1 Session, 80 Papers, 0 Presentations
Conference: 1987 Technical Symposium Southeast on Optics, Electro-Optics, and Sensors
Volume Number: 0786

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.

All Papers
A Survey Of Diagnostic Expert Systems
John F. Gilmore, Kurt Gingher
Diagnostic expert systems are an area of growing interest in the application of expert system technology. Diagnosis is a step-by-step process that attempts to ascertain the internal characteristics of a physical system. The first generation of knowledge-based systems for hardware diagnosis is based primarily on the largely empirically derived knowledge of human experts. A number of diagnostic expert systems dealing with electronic and computer hardware have been developed over the last several years. The majority of these systems have utilized a variety of diagnostic techniques and concepts to solve a specific hardware application. This paper surveys four of the most successful diagnostic expert systems in the areas of computers and electronic hardware and analyzes each in a comparative manner. Several additional systems are summarized to provide the reader with a relatively extensive overview of existing research and development activities in the area of diagnostic expert systems.
A Model-Based Expert System For Digital Systems Design
J. G. Wu, W. P. C. Ho, Y. H. Hu, et al.
In this paper, we present a model-based expert system for automatic digital systems design. The goal of digital systems design is to generate a workable and efficient design from high level specifications. The formalization of the design process is a necessity for building an efficient automatic CAD system. Our approach combines model-based, heuristic best-first search, and meta-planning techniques from AI to facilitate the design process. The design process is decomposed into three subprocesses. First, the high-level behavioral specifications are translated into sequences of primitive behavioral operations. Next, primitive operations are grouped to form intermediate-level behavioral functions. Finally, structural function modules are selected to implement these functions. Using model-based reasoning on the primitive behavioral operations level extends the solution space considered in design and provides more opportunity for minimization. Heuristic best-first search and meta-planning techniques control the decision-making in the latter two subprocesses to optimize the final design. They also facilitate system maintenance by separating design strategy from design knowledge.
A Model-Based Expert System For Component-Level Fault Diagnosis
Peter J. Angeline, Thomas W. D'Onofrio
This paper summarizes research completed for the development of a prototype expert system to diagnose failures of a complex computer down to an individual component. Our research goal was to model the diagnostic process followed by expert technicians to achieve a component-level diagnosis. The development goal was to construct an initial prototype system for a single board of the computer which could be used as a model for the construction of a diagnostic system for the entire computer.
Capital Expert System
Laurie Dowell, Jack Gary, Bill Illingworth, et al.
Gathering information, necessary forms, and financial calculations needed to generate a "capital investment proposal" is an extremely complex and difficult process. The intent of the capital investment proposal is to ensure management that the proposed investment has been thoroughly investigated and will have a positive impact on corporate goals. Meeting this requirement typically takes four or five experts a total of 12 hours to generate a "Capital Package." A Capital Expert System was therefore developed using "Personal Consultant." The completed system is hybrid and as such does not depend solely on rules but incorporates several different software packages that communicate through variables and functions passed from one to another. This paper describes the use of expert system techniques, methodology in building the knowledge base, contexts, LISP functions, data base, and special challenges that had to be overcome to create this system. The Capital Expert System is the successful result of a unique integration of artificial intelligence with business accounting, financial forms generation, and investment proposal expertise.
Minc: A Deniable Expert System That Reasons With Simplifying Assumptions
Sankar Virdhagriswaran, Niki Afshartous
Proposing specific computer systems to meet customer requirements expressed at a very high level is an iterative process. Performance prediction rules are used to generate intermediate configurations based on highly abstract inputs. These intermediate configurations are then used interactively to further define the requirements. Conclusions reached on the basis of incomplete information are denied, and new information is input into the system. This paper describes an expert system under development which uses an Extended Contradiction Resolution Mechanism based on a Truth Maintenance System (TMS) to implement this interactive computer configuration generation process.
An Expert System And Simulation Approach For Sensor Management & Control In A Distributed Surveillance Network
Barbara D. Leon, Paul R. Heller
A surveillance network is a group of multiplatform sensors cooperating to improve network performance. Network control is distributed as a measure to decrease vulnerability to enemy threat. The network may contain diverse sensor types such as radar, ESM (Electronic Support Measures), IRST (Infrared Search and Track) and E-O (Electro-Optical). Each platform may contain a single sensor or suite of sensors. In a surveillance network it is desirable to control sensors to make the overall system more effective. This problem has come to be known as sensor management and control (SM&C). Two major facets of network performance are surveillance and survivability. In a netted environment, surveillance can be enhanced if information from all sensors is combined and sensor operating conditions are controlled to provide a synergistic effect. In contrast, when survivability is the main concern for the network, the best operating status for all sensors would be passive or off. Of course, improving survivability tends to degrade surveillance. Hence, the objective of SM&C is to optimize surveillance and survivability of the network. The voluminous data in various formats and the required quick response time are two characteristics of this problem that make it an ideal application for Artificial Intelligence. A solution to the SM&C problem, in the form of a computer simulation, is presented in this paper. The simulation is a hybrid program written in LISP and FORTRAN. It combines the latest conventional computer programming methods with Artificial Intelligence techniques to produce a flexible state-of-the-art tool to evaluate network performance. The event-driven simulation contains environment models coupled with an expert system. These environment models include sensor (track-while-scan and agile beam) and target models, local tracking, and system tracking. These models are used to generate the environment for the sensor management and control expert system. The expert system, driven by a forward-chaining inference engine, makes decisions based on the global database. The global database contains current track and sensor information supplied by the simulation. At present, the rule base emphasizes the surveillance features, with rules grouped into three main categories: maintaining and enhancing track on prioritized targets; filling coverage holes and countering jamming; and evaluating sensor status. The paper describes the architecture used for the expert system and the reasons for selecting the chosen methods. The SM&C simulation produces a graphical representation of sensors and their associated tracks such that the benefits of the sensor management and control expert system are evident. Jammer locations are also part of the display. The paper describes results from several scenarios that best illustrate the sensor management and control concepts.
An Expert System For Multispectral Threat Assessment And Response
Alan N. Steinberg
A concept has been defined for an automatic system to manage the self-defense of a combat aircraft. Distinctive new features of this concept include: a. the flexible prioritization of tasks and coordinated use of sensor, countermeasures, flight systems and weapons assets by means of an automated planning function; b. the integration of state-of-the-art data fusion algorithms with event prediction processing; c. the use of advanced Artificial Intelligence tools to emulate the decision processes of tactical EW experts. Threat Assessment functions (a) estimate threat identity, lethality and intent on the basis of multi-spectral sensor data, and (b) predict the time to critical events in threat engagements (e.g., target acquisition, tracking, weapon launch, impact). Response Management functions (a) select candidate responses to reported threat situations; (b) estimate the effects of candidate actions on survival; and (c) coordinate the assignment of sensors, weapons and countermeasures with the flight plan. The system employs Finite State Models to represent current engagements and to predict subsequent events. Each state in a model is associated with a set of observable features, allowing interpretation of sensor data and adaptive use of sensor assets. Defined conditions on state transitions allow prediction of times to critical future states and are used in planning self-defensive responses, which are designed either to impede a particular state transition or to force a transition to a lower threat state.
Fiber Optic Network Design Expert System
Timothy J. Artz, Roy M. Wnek
The Fiber Optic Network Design Expert System (FONDES) is an engineering tool for the specification, design, and evaluation of fiber optic transmission systems. FONDES encompasses a design rule base and a data base of specifications of system components. This package applies to fiber optic design work in two ways: as a design-to-specification tool and as a system performance prediction model. The FONDES rule base embodies the logic of design engineering. It can be used to produce a system design given a requirement specification, or it can be used to predict system performance given a system design. The periodically updated FONDES data base contains performance specifications, price, and availability data for current fiber optic system components. FONDES is implemented in an artificial intelligence language, TURBO-PROLOG, and runs on an IBM-PC.
Expert System Issues In Automated, Autonomous Space Vehicle Rendezvous
Mary Ann Goodwin, Daniel C. Bochsler
The Rendezvous Expert (RENEX) program simulates autonomous rendezvous and proximity operations during spaceflight for selected active vehicles and target vehicles. Space Shuttle mission flight rules were used as the basis for developing the knowledge base for trajectory planning/replanning and monitoring. Emphasis was on so-called "day of rendezvous" activities. The RENEX expert system software simulates real-time system monitoring as well as trajectory planning and software control functions. Conventional software simulates the vehicle guidance, navigation, and control functions. RENEX was developed to support streamlining operations for the Space Shuttle and Space Station programs and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The goal was to develop expert system technology which could be applied to operational decision support systems and/or to automating appropriate vehicle operations. The expert system, and the insight it has provided into the development and use of expert systems to achieve greater automation of rendezvous operations, are discussed.
Coupled Probabilistic And Possibilistic Uncertainty Estimation In Rule-Based Analysis Systems
L. Tsoukalas, M. Ragheb
A methodology is developed for estimating the performance of monitored engineering devices. Inferencing and decision-making under uncertainty are considered in Production-Rule Analysis systems where the knowledge about the system is both probabilistic and possibilistic. In this case uncertainty is considered as consisting of two components: randomness, describing the uncertainty of occurrence of an object, and fuzziness, describing the imprecision of the meaning of the object. The concepts of information granularity and of the probability of a fuzzy event are used. Propagation of the coupled probabilistic and possibilistic uncertainty is carried out over model-based systems using the Rule-Based paradigm. The approach provides a measure of both the performance level and the reliability of a device.
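The "probability of a fuzzy event" invoked here has a standard formulation due to Zadeh, given below as orientation for the reader; the notation is an assumption, since the abstract does not spell out the exact form the paper uses:

    % Probability of a fuzzy event \tilde{A} with membership function
    % \mu_{\tilde{A}}, under probability measure P (Zadeh's definition):
    P(\tilde{A}) = \int \mu_{\tilde{A}}(x)\, dP(x)
    \qquad \text{or, discretely,} \qquad
    P(\tilde{A}) = \sum_{i} \mu_{\tilde{A}}(x_i)\, p(x_i)

The measure P carries the randomness component and the membership function carries the fuzziness component, which is exactly the coupling the abstract describes.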
A Methodology For Applying Expert Systems To Process Plan Execution
Mark C. Maletz
This paper describes a methodology for applying expert systems to the task of executing process plans as a control mechanism for the factory floor. This methodology includes an architecture for representing both declarative and procedural knowledge. Declarative knowledge describes the state of the factory floor and the process plan operations that can be performed on parts on the factory floor (e.g., machining and assembly operations). Declarative knowledge is represented using both facts and schemata. Procedural knowledge identifies when and how process plan operations are to be executed. Procedural knowledge is represented using data-driven rules with an asynchronous event handling capability. Rules that determine when process plan operations can be performed embody largely domain-independent scheduling knowledge. Rules that determine how to execute process plan operations require domain-dependent knowledge about particular operations. Both of these types of rules also require information about the current state of the factory floor. A context graph mechanism is included in the methodology to represent temporal precedence constraints among process plan operations (i.e., the structure of a context graph indicates process plan operation precedences). Contexts are also used to represent a part configuration relative to a process plan operation. A prototype expert system based on the methodology presented in this paper has been developed using the Automated Reasoning Tool (ART).
A Modifiable Approach To Expert Systems Development
James C. Sanborn
Rule-based expert system programmers experience similar difficulties in developing and maintaining large application programs: rules become instantiated when they shouldn't; the execution order of rules is undesirably nondeterministic, or worse, simply incorrect; and modifications to program behavior are difficult or unwieldy. All of these problems arise from the control strategies used by the development language, their implementation, and the programmer's control over (and awareness of) them. This paper explores the impact of rule-based program control on overall program modifiability. We present a language designed with efficiency, modifiability, and ease of use in mind. Throughout, we discuss traditional control strategies, improvements made through our research, and directions for further study.
RESKB: A Relational Solution To An Expert System's Knowledge Base
Frank D. Anger, Rita V. Rodriguez, Douglas D. Dankel II
Despite the obvious equivalence between many of the problems and concepts of the knowledge bases in intelligent systems and standard databases, important differences exist which make a unified treatment difficult. This paper presents a combination of expert-system concepts with database management techniques that creates a more responsive and flexible rule-based system. The complete design, known as RESKB (Relational Expert System Knowledge Base) provides a widely applicable model for the actual construction and management of a rule base. The presentation of RESKB uses database design tools to organize the knowledge base and extends to the handling of "procedural inclusions" and their control. The whole development focuses on maintaining system transparency, simplicity, and expandability. Two possible methods of implementing the RESKB model are also presented.
Implementation Of A Generic Blackboard Architecture
Stephen D. Tynor, Stefan P. Roth, John F. Gilmore
The development of general purpose expert system tools has led to a decrease in the development time of application oriented expert systems. Recently, the need for communicating expert systems has spurred interest in the blackboard architecture. Until blackboard systems are relatively easy to implement, however, their use will be restricted to knowledge engineers willing to write their systems from the ground up. This paper describes the development of a blackboard architecture in the Generic Expert System Tool (GEST) developed by the Artificial Intelligence Branch of the Georgia Tech Research Institute. GEST has been developed as a general purpose tool applicable to a wide variety of application domains. Recently, GEST has been enhanced by incorporating a blackboard architecture which allows several GEST expert systems to cooperate with one another. This paper outlines GEST's software architecture, including its knowledge representation schemes, control structures, and blackboard.
Traditional Versus Rule-Based Programming Techniques: Application To The Control Of Optional Flight Information
Wendell R. Ricks, Kathy H. Abbott
The software design community is greatly concerned with the costs associated with a program's execution time and implementation. It is always desirable, and sometimes imperative, that the programming technique chosen minimizes all costs for a given application or type of application. This paper describes a study that compared the cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.
A Data Analysis Expert System For Large Established Distributed Databases
Anne-Marie Gnacek, Y. Kim An, J. Patrick Ryan
The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large, isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups. In addition, current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments which allow cross-vendor compatibility. The goal of the DANMDS system speaks to the central dilemma confronting most large companies and institutions in America: the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.
Model-Based Analysis System For Tracking Of Limiting Conditions For Operation And Surveillance Requirements In Power Plants
M. Ragheb, M. Abdelhai
A Model-Based Production-Rule Analysis System is developed for the tracking of the Limiting Conditions for Operation (LCOs) and the Surveillance Requirements in power plants. These LCOs and Surveillance Requirements are delineated in a power station's Technical Specifications. The system is constructed as a Production-Rule Analysis System having the same structure as the E-Mycin system, with a Natural Language Interface. The Goal-Tree constituting the Knowledge-Base is generated and is searched by the Inference Engine using Backward-Chaining with a Deduction-Oriented Antecedent-Consequent Logic. The functioning of the system is demonstrated in the identification and tracking of the LCOs and Surveillance Requirements and the associated applicable actions for maintaining the operability status of the components in the plant reactivity control systems.
LEE: Loan Evaluation Expert, An Expert System For The Assessment Of Loan Applications
Angelo G. Bravos
This paper describes LEE, a knowledge-based system for the assessment of loan applications. LEE (Loan Evaluation Expert) attempts to automate the assessment process and provide expert advice regarding the action that a financial institution should take on a loan application. LEE uses a qualitative reasoning approach and provides the basis for: a) the explicit representation of the various symbolic structures (such as customers, bankers, industries, and various types of loan applications), and dependencies between them, b) the explicit representation of the behavior of these structures in terms of symbolic reasoning, and c) end-user-oriented graphic aids to present to the developers and users the representation and reasoning. LEE is implemented in Loops (Lisp Object Oriented Programming System) running on a Xerox 1100 series lisp machine. Loops is a multi-paradigm knowledge programming language developed at Xerox PARC and implemented in and as an extension of Interlisp-D.
A Model Expert System For Machine Failure Diagnosis (MED)
Yin Liqun
MED is a model expert system for machine failure diagnosis that helps a repairer quickly locate electrical failures in milling machines. Its key features are a simple method for handling the "subsequent visit" problem in machine failure diagnosis; a weighted list that intervenes in the control of the AGENDA to imitate an expert's continuous line of thinking and to prevent the erratic questioning and problem drift caused by probabilistic reasoning; a structured AGENDA; and attention to the characteristics of machine failure diagnosis and to human thinking patterns in failure diagnosis. The structured AGENDA suggests a more powerful and flexible control strategy for AGENDA-based best-first search. The "subsequent visit" problem is very complicated to solve exactly, so a simple method is convenient to avoid consuming too much time in urgent situations. The weighted list likewise offers a way to improve control of the expert system's inference. Given the observed failure phenomena, MED determines failure causes through dialogue. MED is written in LISP and runs on UNIVAC 1100/10 and IBM PC/XT computers. The average diagnosis per failure takes 11 seconds of CPU time and 2 minutes of terminal operation, compared with 11 minutes for a skilled repairer.
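The weighted-list control of the AGENDA described above is, in skeletal form, a priority queue in which weights keep the dialogue on one line of questioning. A minimal sketch follows; the weights, task names, and expansion table are hypothetical, not taken from MED:

    import heapq

    def best_first(initial_tasks, expand):
        # Tasks are (weight, name) pairs; expand(name) yields weighted
        # subtasks. Higher weight is pursued (asked about) first, so a
        # strong suspect's follow-up questions pre-empt weaker suspects.
        agenda = [(-w, name) for w, name in initial_tasks]
        heapq.heapify(agenda)
        order = []
        while agenda:
            _, name = heapq.heappop(agenda)
            order.append(name)
            for w, sub in expand(name):
                heapq.heappush(agenda, (-w, sub))
        return order

    suspects = [(0.9, "motor relay"), (0.4, "limit switch"), (0.7, "fuse")]
    subtasks = {"motor relay": [(0.8, "relay coil"), (0.3, "relay contacts")]}
    print(best_first(suspects, lambda n: subtasks.get(n, [])))
    # -> ['motor relay', 'relay coil', 'fuse', 'limit switch', 'relay contacts']

Note that the follow-up question about the relay coil is asked before attention moves to the fuse, which is the kind of continuous, expert-like questioning the abstract attributes to the weighted list.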
Temporal Segmentation Of Image Sequences
I. K. Sethi, V. Salari, S. Vemuri
Detection of motion discontinuity across time is an important issue in motion analysis using extended image sequences. Hitherto, this issue has not received its due attention because of the preoccupation of dynamic scene researchers with two-frame analysis approaches. However, with the increasing use of long image sequences in current motion research, it is desirable that methods to identify motion discontinuities across time be developed. In this paper, we present one such method to perform temporal segmentation of image sequences based on motion coherence. The proposed method is capable of detecting significant changes in the observer's motion as well as in the motion of objects. Experimental results using real image sequences are presented to show the effectiveness of the proposed temporal segmentation process.
A Parametric Training Algorithm For Image Understanding Systems
Mehmet Celenk
This paper describes a new parametric training algorithm for image understanding and computer vision systems. It is developed within the context of human color perception and a color order system. The eventual goal of the research is to obtain a mathematical evaluation criterion to guide the operation of an unsupervised pattern recognition technique for detecting the image clusters or modes in the measurement or color space. For this purpose, peak modality is selected as the mathematical evaluation criterion. Area, mode dispersion, approximated curvature, and steepness are some of the quantities measured for the modality test. Although this systematic procedure is developed primarily for the gray level and color images of natural scenes, it can also be applied effectively to multispectral images. The proposed method is a computational one and is more suitable for array or parallel processors.
Qualitative Constraint Reasoning For Image Understanding
John L. Perry
Military planners and analysts are exceedingly concerned with increasing the effectiveness of command and control (C2) processes for battlefield management (BM). A variety of technical approaches have been taken in this effort. These approaches are intended to support and assist commanders in situation assessment, course of action generation and evaluation, and other C2 decision-making tasks. A specific task within this technology support includes the ability to effectively gather information concerning opposing forces and plan/replan tactical maneuvers. Much of the information that is gathered is image-derived, along with collateral data supporting this visual imagery. In this paper, we describe a process called qualitative constraint reasoning (QCR) which is being developed as a mechanism for reasoning in the mid to high level vision domain. The essential element of QCR is the abstraction process. One of the factors unique to QCR is the level at which the abstraction process occurs relative to the problem domain. The computational mechanisms used in QCR belong to a general class of problems called consistent labeling problems. The success of QCR lies in its ability to abstract from a visual domain a structure appropriate for applying the labeling procedure. An example is given that illustrates the abstraction process for a battlefield management application. Exploratory activities are underway for investigating the suitability of the QCR approach for the battlefield scenario. Further research is required to investigate the utility of QCR in a more complex battlefield environment.
Developing Sensor-Driven Robots For Hazardous Environments
Mohan M. Trivedi, Ralph C. Gonzalez, Mongi A. Abidi
Advancements in robotic technology are sought to provide enhanced personnel safety and reduced costs of operation associated with nuclear power plant manufacture, construction, maintenance, operation, and decommissioning. We describe the main characteristics of advanced robotic systems for such applications and suggest the utilization of sensor-driven robots. Research efforts described in the paper are directed towards developing robotic systems for automatic inspection and manipulation tasks involving a test panel mounted with a variety of switches, controls, displays, meters, and valves.
An Expert Vision System For Autonomous Land Vehicle (ALV) Road Following
Sven J. Dickinson, Jacqueline Le Moigne, Rand Waltzman, et al.
A blackboard model of problem solving is applied in the design of a vision system by which an autonomous land vehicle (ALV) navigates roads. The ALV vision task consists of hypothesizing objects in a scene model and verifying these hypotheses using the vehicle's sensors. Object hypothesis generation is based on an a priori map, a planned route through the map, and the current state of the scene model. Verification of an object hypothesis involves directing the sensors toward the expected location of the object, collecting evidence in support of the object, and depositing the verified object in the scene model. An object is a hierarchy of frames connected by part/whole, spatial, and inheritance relationships; these frames reside on a structured blackboard. Each level of the blackboard corresponds to a class of object in the part/whole hierarchy, with the lowest levels containing primitive sensor image features. In top-down verification, an object hypothesis posted at an upper level activates knowledge sources which generate hypotheses at lower levels representing the object's components. In bottom-up analysis, used when knowledge of the environment is limited, sensor-driven hypotheses posted at lower levels generate multiple hypotheses at higher levels. Each blackboard level is a YAPS production system, whose rules represent the knowledge sources, and whose facts are object frames modeled by Lisp Flavors. The implementation strategy thus integrates object-oriented design and production system methodology. The system has been tested successfully with the single task of building a scene model containing a straight road. New feature extractors, sensors, and object classes are currently being added to the system.
Model Guided Segmentation Of Ground Targets
G. A. Roberts
A technique for segmenting objects in an image guided by a model is shown. Specific methods used for model selection and model guided segmentation are shown along with a method for segmentation evaluation. Examples of segmenting ground targets in infrared imagery are shown.
Knowledge-Based SAR Images Exploitation
David L. Wang
One of the basic functions of a SAR image exploitation system is the detection of man-made objects. The performance of object detection is strongly limited by the performance of the segmentation modules. This paper presents a detection paradigm composed of an adaptive segmentation algorithm based on a priori knowledge of objects, followed by a top-down hierarchical detection process that generates and evaluates object hypotheses. Shadow information and inter-object relationships can be added to the knowledge base to improve performance over that of a statistical detector based only on the attributes of individual objects.
The Need For A Quantitative Model Of Human Preattentive Vision
Richard W. Conners
The goal of every computer vision system is to at least match human perceptual abilities at performing a desired task. Hence an interest in, and knowledge of, human perceptual abilities is implicit in the discipline of computer vision. If one desires to create a general purpose computer vision system, the best place to look for general insights into how to construct such a system would seemingly be what is known about human perceptual mechanisms. With this in mind, the general scene analysis and image analysis strategies of the human visual system are reviewed. Based on an analysis of these strategies, it is argued that what is needed to create a general purpose vision system is a quantitative model of human preattentive vision. Further, it is argued that this quantitative model should be based on what have classically been called texture operators.
A Closed-Form Solution For The Estimation Of Optical Flow
Shankar Moni
A closed-form solution for the determination of optical flow is presented. In addition to being closed-form, this approach does not require the optical flow field to be spatially smooth. By assuming that the m-th order time-derivative of the optical flow is zero (m can be chosen as large as required, depending on the desired accuracy), the relationships between optical flow and its derivatives in a sequence of frames to those at a particular frame of the sequence are obtained. These relations, in conjunction with the brightness constraint equation, are used to form a system of linear equations in which the unknowns are the components of optical flow, which can be recovered by the inversion of a 2m x 2m matrix. The entries of this matrix contain only the first order spatial and temporal brightness derivatives. An alternative solution to the problem is considered in which each component of the optical flow is modelled as a polynomial (in time) of degree m-1.
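In the notation usual for optical flow (assumed here; the paper's own symbols may differ), the construction the abstract describes can be written as:

    % Brightness constancy at a point, with flow (u, v) and image
    % brightness derivatives E_x, E_y, E_t:
    E_x\,u(t) + E_y\,v(t) + E_t = 0
    % A vanishing m-th time derivative makes each flow component a
    % polynomial in time of degree m-1:
    u(t) = \sum_{k=0}^{m-1} a_k t^k, \qquad v(t) = \sum_{k=0}^{m-1} b_k t^k

Substituting the polynomial models into the brightness constraint at 2m or more frames yields a linear system in the 2m coefficients (a_k, b_k), which is why the flow can be recovered by inverting a 2m x 2m matrix built from first-order derivatives alone.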
An Intelligent Pictorial Information System
Edward T. Lee, B. Chang
In examining the history of computer application, we discover that early computer systems were developed primarily for applications related to scientific computation, as in weather prediction, aerospace applications, and nuclear physics applications. At this stage, the computer system served as a big calculator to perform, in the main, manipulation of numbers. Then it was found that computer systems could also be used for business applications, information storage and retrieval, word processing, and report generation. The history of computer application is summarized in Table I. The complexity of pictures makes picture processing much more difficult than number and alphanumerical processing. Therefore, new techniques, new algorithms, and above all, new pictorial knowledge [1] are needed to overcome the limitations of existing computer systems. New frontiers in designing computer systems are the ways to handle the representation, [2,3] classification, manipulation, processing, storage, and retrieval of pictures. Especially, the ways to deal with similarity measures and the meaning of the word "approximate" and the phrase "approximate reasoning" are an important and indispensable part of an intelligent pictorial information system. [4,5] The main objective of this paper is to investigate the mathematical foundation for the effective organization and efficient retrieval of pictures in similarity-directed pictorial databases, [6] based on similarity retrieval techniques [7] and fuzzy languages. [8] The main advantage of this approach is that similar pictures are stored logically close to each other by using quantitative similarity measures. Thus, for answering queries, the amount of picture data that needs to be searched can be reduced and the retrieval time can be improved. In addition, in a pictorial database, it is very often desired to find pictures (or feature vectors, histograms, etc.) that are most similar to or most dissimilar [9] to a test picture (or feature vector). Using similarity measures, one can not only store similar pictures logically or physically close to each other in order to improve retrieval or updating efficiency, one can also use such similarity measures to answer fuzzy queries involving nonexact retrieval conditions. In this paper, similarity-directed pictorial databases involving geometric figures, chromosome images, [10] leukocyte images, cardiomyopathy images, and satellite images [11] are presented as illustrative examples.
Incorporation Of Map Information In Object Detection
Joseph J. Besselman III, Mohan M. Trivedi
This paper elucidates the viability of utilizing map information effectively in the interpretation of aerial images. Motivation for using ancillary information is the desire to harness knowledge intrinsic to a study area. A knowledge-based scheme is presented that resembles post-classification sorting in principle, but exceeds it in sophistication. The prototype is data-driven, with the knowledge configured in a production system format. Information derived from maps can be utilized to support or refute the presence of an object detected by analyzing image data. Several experiments were conducted to evaluate various belief maintenance schemes applied to refining object detection results produced by the analysis of image-only data.
Parallel Fuzzy Reasoning On Mesh-Connected Fine-Grain Computer Architectures
M. A. Eshera, J. W. Lewis
This paper presents a scheme for very high-speed inexact reasoning and fuzzy inference configured for mesh-connected fine-grain parallel computer architectures. The underlying inference mechanism for the proposed scheme is a general rule-based fuzzy inference network. The inference rules are distributed on the different processors of the array of processing elements and executed in parallel. The scheme also operates in a pipeline configuration in the array, where the array is configured into stages and different sets of data move between the stages. The tradeoff between space and time complexity of the proposed technique is discussed, and we describe a space-efficient and fast implementation of the approach on the Geometric Array Parallel Processor, GAPP, which is an SIMD mesh-connected fine-grain architecture developed by Martin Marietta Corporation. The proposed scheme achieves very high-speed performance on the GAPP.
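To illustrate the kind of rule evaluation being parallelized, here is a minimal max-min fuzzy inference step in Python with numpy; the rule matrix and inputs are hypothetical, and this sketches the general technique rather than the GAPP implementation:

    import numpy as np

    # R[i, j] is the strength of the rule linking premise i to
    # conclusion j; x[i] is the degree to which premise i holds.
    # Every (i, j) min is independent of the others, which is what
    # lets the computation map onto a fine-grain SIMD mesh.
    def fuzzy_step(R, x):
        # Max-min composition: each conclusion takes its best support.
        return np.max(np.minimum(R, x[:, None]), axis=0)

    R = np.array([[0.9, 0.2],
                  [0.4, 0.8]])
    x = np.array([0.7, 0.5])
    print(fuzzy_step(R, x))   # -> [0.7 0.5]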
Edge Detection To Subpixel Accuracy
R. Manmatha
The accurate location of edges and boundaries in images is desirable for a number of reasons. The advantages include greater precision in the measurement of object sizes and locations and improved matching in stereo and rigid body motion studies. Here, a method to locate edges to subpixel accuracy is proposed. The imaging device is assumed to consist of an array of rectangular sensors. Edges are modelled as linear step functions of brightness over a window two pixels wide. Pixels are first marked as edge/non-edge using a conventional edge detector. The edge contrast is also obtained. It is shown that there exists a one-to-one mapping from the set of all possible intensities of pairs of adjacent "edge" pixels to the set of all edges passing through those pixels. A functional relation for the position and slope of the edges is thus obtained in terms of the pixel intensities. A two-way ambiguity is resolved using a third pixel. The performance of the algorithm under noise was investigated using synthetic images. It is found that the edge location error is less than 0.05 pixel for signal to noise ratios greater than 15 dB. The method has been used to measure the width of a rectangular metal strip to within 0.05 pixel.
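A one-dimensional analogue conveys the core idea; this simplified sketch recovers only the edge position from a single straddling pixel, not the paper's full position-and-slope solution from pixel pairs:

    def subpixel_edge_1d(i_low, i_edge, i_high):
        # A sensor pixel straddling a step edge integrates the two
        # brightness levels, so its intensity pins down where the step
        # crosses it. i_low and i_high are the pure levels on either
        # side; i_edge is the straddling pixel's measured intensity.
        if i_high == i_low:
            raise ValueError("no edge contrast")
        f = (i_edge - i_low) / (i_high - i_low)   # fraction on the high side
        return 1.0 - f   # edge offset from the low-side border, in pixels

    # 75% of the pixel lies on the bright side, so the edge sits a
    # quarter pixel in from the dark border:
    print(subpixel_edge_1d(10.0, 40.0, 50.0))   # -> 0.25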
Analysis And Implementation Of Direct Passive Navigation
Shantanu K. Shee, E. J. Weldon Jr.
We study the sensitivity of the method for determining the motion parameters of a planar surface, having known orientation, directly from image gradient measurements. A bidirectional sinusoidal texture is assumed, to which Gaussian noise is added. The distribution of the errors in estimates of the motion parameters as a function of signal to noise ratio, motion parameters, parameters of the plane, texture frequency and the field of view is determined. Finally, we present the results of simulations with synthetic data as well as with real images of a sinusoidally textured planar scene.
A Tool To Support Failure Mode And Effects Analysis Based On Causal Modelling And Reasoning
W. E. Underwood, S. L. Laib
A prototype knowledge-based system has been developed that supports Failure Mode & Effects Analysis (FMEA). The knowledge base consists of causal models of components and a representation for coupling these components into assemblies and systems. The causal models are qualitative models. They allow reasoning as to whether variables are increasing, decreasing or steady. The analysis strategies used by the prototype allow it to determine the effects of failure modes on the function of the part, the failure effect on the assembly the part is contained in, and the effect on the subsystem containing the assembly.
EJAUNDICE: A General-Purpose Expert System Building Tool
Li-Min Fu
This paper describes EJAUNDICE, which is designed to be a general-purpose expert system building tool. Considerations behind a number of design decisions made for the sake of generality are examined. EJAUNDICE provides several control schemes, including biphasical control with goal-directed reasoning, data-driven processing, and control blocks, and integrates rule-based, frame-based, and logic-based reasoning paradigms in its framework.
ABEST: The Ada-Based Expert System Tool
Antonio C. Semeco, David Ho
This article describes the design and implementation of ABEST, the Ada-Based Expert System Tool. ABEST is a general purpose software tool intended for use in the development of expert system applications in an Ada environment. Since it may not be clear whether it is feasible (or even desirable) to use Ada in such applications, a discussion of the advantages and disadvantages of the language in that respect is included in the Introduction. An overview of ABEST follows, illustrating its main features through a sample application in digital circuit fault diagnosis. A discussion of planned enhancements and extensions is also provided.
The Arm Editor: A Knowledge-Based Approach To Software Documentation Support
Dirk Ourston, Robert W. McBeth
The Adaptive Response Model (ARM) Editor, a research prototype, is a knowledge-based system that supports the definition and manipulation of documents used in software development. This system represents a new approach to software documentation support in that knowledge about documentation is used to facilitate the automatic production of software documents. Most previous work has been on structure editor systems that allow manipulation of document parts as objects, or on user-embedded code systems, such as formatting systems, requirements traceability systems, and document generator systems. These latter systems work only if the proper codes have been embedded.
Architectural Design Decisions For A Knowledge-Based Distributed Systems Manager
Joseph Pasquale
A major fundamental problem in decentralized resource control in distributed systems is that in general, no decision-making node knows with complete certainty the current global state of the system. We present an architecture for an Expert Manager which provides a framework for dealing with this problem. Expert system techniques are used to infer the global system state using whatever partial state information is at hand, along with mechanisms for reasoning. Decision-making is enhanced by taking into account the uncertainty of observations, and that concurrent decisions made by many Expert Managers may conflict.
The Role Of Knowledge-Based Systems In Communications System Control
Robert A. Meyer, Charles Meyer
The role of knowledge-based systems in communications system control is presented in this paper as a collection of distributed, multiple agent systems serving as intelligent advisors to human controllers. Based on an analysis of system control functions, and interviews with field personnel, we describe a model for communications system control in the Defense Communications System. Using this model we have designed an architecture for a diversely distributed, multiple agent, knowledge-based system. A simulation testbed has been implemented to support experimental development and testing of the components of this architecture in a network of Lisp machines.
A Methodology For Evaluating Knowledge-Based Systems
Rajesh Dube, Naseem A. Khan
This paper presents a method for evaluating expert systems. The evaluation method consists of criteria development, empirical performance assessment and feedback. Included are assessments by the domain expert, knowledge engineers, end users and management. The evaluation criteria include factors traditionally used to evaluate expert systems (quality of advice, correctness of reasoning strategy, user interface, hardware environment, and response time) and factors that concern field deployment as a product (expectations of the end users, domain expert and management). The criteria are developed and applied by incorporating the viewpoints of various parties concerned with the development and field use of the expert system. The problem solving performance of the expert system is evaluated by using a data base of correctly diagnosed cases obtained from the field. The evaluation method was developed to test and refine a prototype knowledge-based system GEMS (Generalized Expert Maintenance System) and has been successfully used to evaluate the first phase of GEMS, the Trunk Trouble Analyzer (TTA). GEMS-TTA analyzes outage codes that can occur on trunks terminating on the 4ESS switch. The evaluation method has shown that GEMS-TTA covers 60% of all possible outage codes that can occur, correctly analyzes 100% of the trouble tickets in the test sample and uses reasoning strategies identical to those used by an expert technician. The evaluation method is comprehensive and general enough to test and refine expert systems in other domains as well.
Using A Blackboard Architecture For Control In A Knowledge-Based Document Understanding System
Debashish Niyogi, Sargur N. Srihari
This paper describes a knowledge-based document understanding system that makes inferences about document images. This document understanding system basically consists of a rule-based reasoning system, a set of image processing modules, and a blackboard control system. The rule-based system controls the classification of the blocks in the document based on the various characteristics of the different blocks that are extracted by the low-level image processing routines. The interaction between the rule-based system and the image processing routines is controlled by the blackboard control architecture. This blackboard control system contains domain and control blackboards that store image data as well as information about intermediate processing states, and a blackboard control mechanism that monitors the invocation of various image processing operations on the document image and keeps track of the current processing status of the rule-based system. The exchange of control between the rule-based system and the blackboard control architecture brings up some interesting issues about opportunistic reasoning versus explicit control in reasoning, which are discussed in this paper.
Plex Phase II: A Truth Maintenance System (TMS)-Based Placement Program For Printed Wire Boards Using Dependency-Directed Backtracking And Human-Aided Machine Design
Susan Pitts, Sankar Virdhagriswaran
The problems inherent in expert systems which model human design processes are discussed. An example of such a system is given with PLEX, a knowledge-based expert system that does initial placement of components on a printed circuit board. A discussion follows on how dependency-directed backtracking and human-aided machine design are being added to PLEX to help overcome some of the difficulties in modeling a human designer.
Interpreting Signals With An Assumption-Based Truth Maintenance System
Rowland R. Johnson, Thomas W. Canales, Darrel L. Lager, et al.
This paper presents an expert system that interprets seismic events. The general problem of signal interpretation is formally described. The system uses an assumption-based truth maintenance system. The advantages of using this approach for this application are described.
Development Of Knowledge Systems For Trouble Shooting Complex Production Machinery
Richard L. Sanford, Thomas Novak, James R. Meigs
This paper discusses the use of knowledge-base system software for microcomputers to aid repairmen in diagnosing electrical failures in complex mining machinery. The knowledge base is constructed to allow the user to input the initial symptoms of the failed machine, and the most probable cause of failure is traced through the knowledge base, with the software requesting additional information such as voltage or resistance measurements as needed. Although the case study presented is for an underground mining machine, the results have application to any industry using complex machinery. Two commercial expert-system development tools (M1™ and Insight 2+™) and an AI language (Turbo Prolog™) are discussed with emphasis on ease of application and suitability for this study.
Situation-Assessment And Decision-Aid Production-Rule Analysis System For Nuclear Plant Monitoring And Emergency Preparedness
D. Gvillo, M. Ragheb, M. Parker, et al.
A Production-Rule Analysis System is developed for Nuclear Plant Monitoring. The signals generated by the Zion-1 Plant are considered. A Situation-Assessment and Decision-Aid capability is provided for monitoring the integrity of the Plant Radiation, the Reactor Coolant, the Fuel Clad, and the Containment Systems. A total of 41 signals are currently fed as facts to an Inference Engine functioning in the backward-chaining mode and built along the same structure as the E-Mycin system. The Goal-Tree constituting the Knowledge Base was generated using a representation in the form of Fault Trees deduced from plant procedures information. The system is constructed in support of the Data Analysis and Emergency Preparedness tasks at the Illinois Radiological Emergency Assessment Center (REAC).
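The backward-chaining, goal-tree control described here (and in the companion LCO-tracking paper above) reduces in skeletal form to proving goals recursively against rules and facts. The sketch below uses hypothetical rule and signal names, not the actual Zion-1 knowledge base:

    # Each rule is (consequent, [antecedents]); facts are signals
    # currently asserted true.
    RULES = [
        ("containment_alarm", ["high_containment_pressure", "high_radiation"]),
        ("high_containment_pressure", ["pressure_signal_above_setpoint"]),
    ]

    def prove(goal, facts, rules=RULES):
        # Backward chaining: a goal holds if it is a known fact, or if
        # some rule concludes it and all of that rule's antecedents
        # can themselves be proved.
        if goal in facts:
            return True
        return any(head == goal and all(prove(a, facts, rules) for a in body)
                   for head, body in rules)

    print(prove("containment_alarm",
                {"pressure_signal_above_setpoint", "high_radiation"}))  # True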
A Self-Organising Dictionary For Conceptual Structures
B. J. Garner, E. Tsui
The provision of greater intelligence in the storage and reorganisation of knowledge bases is seen to be fundamental to future directions in automated knowledge acquisition, discourse understanding and the design of conceptual processors. In this paper, we report the design and implementation of a self-organising conceptual dictionary for conceptual structures as the basis of an intelligent knowledge-base manager. The dictionary is a connected, directed graph with capabilities for storing graphs, indexing incoming structures, generalising between incoming structures and existing (indexed) structures, identification of graph-subgraph relationships, pattern matching and proposing new structures. Existing work on self-organising retrieval systems is compared and contrasted with our new design.
EILS: A Directed-Search-Based Decision Aid For Battlefield Intelligence
Asdrubal Garcia-Ortiz, Keith A. Hickey, John R. Wootton
In the area of electronic intelligence (ELINT), one of the most demanding tasks to be performed in the battlefield is the determination of the emanation point of an intercepted electronic signal. Because this function has to be performed under battlefield conditions, an enormous amount of stress is placed on the intelligence officer. The result is suboptimal performance of the ELINT installation. One way of improving the situation is to provide the intelligence officer with a computer-based decision aid. This paper describes a knowledge-based system prototype of such a decision aid. In performing its function the system makes use of a directed-search strategy. The use of a search strategy represents a departure from, and in this case a better approach than, the more commonly used AI technique of hypothesis generation/testing. The search strategy is dictated by the fact that the dimensionality of the solution space is such that solutions cannot be "pre-canned" for later recall, as is typical of the hypothesis generation/testing strategy. The absence of "pre-canned" solutions also imparts to the system the desirable property of graceful degradation.
A Real-Time Advisory System For Airborne Early Warning
D. B. Kirk, M. E. Cromwell, M. L. Donnell, et al.
Decision speed and quality can be greatly enhanced by the use of decision augmentation software to assist operators in information analysis and tactical problem solving, dynamic resource allocation, and in determining strategies which optimize overall system performance. One example of such software is the real-time advisory system (RTAS) being constructed to assist in tactical decision-making for airborne early warning (AEW) aircraft, particularly the carrier-based Navy E-2C. Using a vector logic approach, the current AEW RTAS is a real-time backward chaining expert system which provides advice for both threat interception and refueling in the complex Outer Air Battle Scenario. This paper describes the current system, discusses a number of design issues for such a system, and describes ongoing modifications to the current AEW RTAS using SAIC's frame-based knowledge representation language (KRL).
Multi-Level Re-Planning For Reconnaissance Drones
Glenn David Blank
The Navy currently uses airborne drones for reconnaissance, controlling them by radio uplink or autopilot technology. Radio contact is not always possible, however, and autopilot control does not monitor non-navigational sensors, such as a CCD camera. We are developing a system that will monitor the autopilot as it follows a pre-planned mission, and re-plans in response to conditions detected by non-navigational sensors. At its heart is a novel approach to real time system control, called Register Vector Grammar. We demonstrate how this modified finite state automaton is able to model both hierarchical decomposition and heterarchical responsiveness, both needed to handle the range and interaction of behaviors susceptible to intelligent control, without calling for inordinate computational complexity.
Minimization In Digital Design As A Meta-Planning Problem
William P. C. Ho, Jung-Gen Wu
In our model-based expert system for automatic digital system design, we formalize the design process into three subprocesses: compiling high-level behavioral specifications into primitive behavioral operations, grouping primitive operations into behavioral functions, and grouping functions into modules. Consideration of design minimization explicitly controls decision-making in the last two subprocesses. Design minimization, a key task in the automatic design of digital systems, is complicated by the high degree of interaction among the time sequence and content of design decisions. In this paper, we present an AI approach which directly addresses these interactions and their consequences by modeling the minimization problem as a planning problem, and the management of design decision-making as a meta-planning problem.
Application Of Dynamic Knowledge In The Design Of Distributed Operating Systems
B. J. Garner, Swamy Kutti
In this paper we report current research on a SYSTEMS MAP based on conceptual graph theory for representing the knowledge requirements of distributed operating systems. This knowledge base has been used on an interactive basis for the design of an UPPER KERNEL (higher level functions such as job management) for a distributed system.2 The SYSTEMS MAP comprises knowledge about the configuration of the distributed system, services available and users enrolled. An interface to the systems map is provided via the Knowledge-Base Editor (KBE), which makes a set of functions available for both users and the system. This facilitates the dynamic creation, modification and accumulation of graphs in maintaining the systems map, as well as the searching for and scheduling of users' requests. The semantics of conceptual graph models are useful in developing a natural language interface. A brief description of the proposed distributed system and the notion of systems objects (whose descriptions form the peripheral knowledge) is also given.
Temporal Aspects Of Modeling System Behavior
Lee A. Appelbaum, Thomas C. Fall
Systems that the authors have developed and delivered determine behavior by utilizing evidence which is uncertain and spread over time. The systems match data to templates of behavior to pick out a thread of activity which matches the evidence. By choosing those threads which are most supported, the systems can predict future activity. Hypotheses about groups of entities can be formed by selecting one thread for each entity of the group. The impact of constraints on groups can thus be evaluated out into the future. This paper describes the effort in building these systems and reviews the lessons learned. The thrust of the discussion is centered on behavior models, the templates which model a system's activity. The methods and representations we employ for temporal reasoning are actually consequences of these behavior models. We also discuss how such systems can easily be expanded to operate on less rigidly defined behavior patterns.
Designing VLSI Computer Architectures With Knowledge
Phillip C. Sheu
One major development that should reach fruition in the near future is a system that integrates design tools and design databases and can design an entire engineering system based on formal specifications. In this paper we introduce a knowledge engineering framework for designing VLSI computer architectures. This framework introduces three core concepts: knowledge abstraction, object-oriented design, and very high-level design programming. The input to the system is a descriptive specification of the behavior of a computing system. The description is then matched against the existing design knowledge in the knowledge base, where the knowledge is abstracted and organized as classes. If a match can be found, the abstract knowledge is instantiated and can be reused; otherwise the designer has to go one level down, decomposing the system into a set of smaller systems, where a structural description among different modules is specified but only the behavioral specification for each module is given. The process is then repeated until a complete structural description is developed.
A Hybrid Architecture For Natural Language Understanding
R. Bruce Loatman
The PRC Adaptive Knowledge-based Text Understanding System (PAKTUS) is an environment for developing natural language understanding (NLU) systems. It uses a knowledge-based approach in an integrated hybrid architecture based on a factoring of the NLU problem into its lexical, syntactic, conceptual, domain-specific, and pragmatic components. The goal is a robust system that benefits from the strengths of several NLU methodologies, each applied where most appropriate. PAKTUS employs a frame-based knowledge representation and associative networks throughout. The lexical component uses morphological knowledge and word experts. Syntactic knowledge is represented in an Augmented Transition Network (ATN) grammar that incorporates rule-based programming. Case grammar is used for canonical conceptual representation with constraints. Domain-specific templates represent knowledge about specific applications as patterns of the form used in logic programming. Pragmatic knowledge may augment any of the other types and is added wherever needed for a particular domain. The system has been constructed in an interactive graphic programming environment. It has been used successfully to build a prototype front end for an expert system. This integration of existing technologies makes limited but practical NLU feasible now for narrow, well-defined domains.
Simplified Pattern Recognition Based On Multiaperture Optics
Richard T. Schneider, Shih-Chao Lin
Multiaperture optics systems are similar in design to the insect eye. Digitizing at the detector level is inherent in these systems. The fact that each eyelet forms one pixel of the overall image lends itself to optical preprocessing. Therefore a simplified pattern recognition scheme can be used in connection with multiaperture optics systems. The pattern recognition system used is based on the conjecture that all shapes encountered can be dissected into a set of rectangles. This is accomplished by creating a binary image and comparing each row of numbers, starting at the top of the frame, with the next row below. A set of rules is established which decides whether the binary ones of the next row are to be incorporated into the present rectangle or start a new rectangle. The number and aspect ratios of the rectangles formed constitute a recognition code. These codes are kept and updated in a library. Since the same shape may give rise to different recognition codes depending on the attitude of the shape with respect to the detector grid, all shapes are rotated and normalized prior to dissecting. The rule is that the pattern is turned to maximize the number of straight edges which line up with the detector grid. The mathematical mechanism for rotating the shape is described. Assuming a priori knowledge of the object's size exists, the normalization procedure can be used for distance determination. A description of the hardware for acquiring the image is provided.
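As an illustration of the row-by-row dissection described above, the following Python sketch scans a binary image and merges runs of ones into axis-aligned rectangles. The merging rule (a run joins the rectangle above it only when the column spans match exactly) and all names are illustrative assumptions, not the authors' actual rule set.

    def runs(row):
        """Return the (start, end) column spans of consecutive ones in a row."""
        spans, start = [], None
        for c, v in enumerate(row):
            if v and start is None:
                start = c
            elif not v and start is not None:
                spans.append((start, c - 1))
                start = None
        if start is not None:
            spans.append((start, len(row) - 1))
        return spans

    def dissect(image):
        """Dissect a binary image (a list of 0/1 rows) into rectangles."""
        open_rects, closed = {}, []                 # open: span -> top row
        for r, row in enumerate(image):
            current = {}
            for span in runs(row):
                # Assumed rule: extend the rectangle above only on an exact
                # span match; otherwise start a new rectangle at this row.
                current[span] = open_rects.pop(span, r)
            for (left, right), top in open_rects.items():
                closed.append((left, top, right, r - 1))
            open_rects = current
        for (left, right), top in open_rects.items():
            closed.append((left, top, right, len(image) - 1))
        return closed                               # (left, top, right, bottom)

    shape = [[0, 1, 1, 0],
             [0, 1, 1, 0],
             [1, 1, 1, 1]]
    rects = dissect(shape)
    ratios = sorted((b - t + 1) / (r - l + 1) for l, t, r, b in rects)
    print(len(rects), ratios)                       # -> 2 [0.25, 1.0]

The rectangle count together with the sorted aspect ratios then plays the role of the recognition code that would be stored in and matched against the library.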
Two-Dimensional Grammars And Their Applications To Artificial Intelligence
Edward T. Lee
During the past several years, the concepts and techniques of two-dimensional grammars1,2 have attracted growing attention as promising avenues of approach to problems in picture generation as well as in picture description,3 representation, recognition, transformation, and manipulation. Two-dimensional grammar techniques serve the purpose of exploiting the structure or underlying relationships in a picture. This approach attempts to describe a complex picture in terms of its components and their relative positions. This resembles the way a sentence is described in terms of its words and phrases, and the terms structural picture recognition, linguistic picture recognition, and syntactic picture recognition are often used. With this approach, the problem of picture recognition becomes similar to that of phrase recognition in a language. However, when pictures are described using a string grammar (a one-dimensional grammar), the only relation between sub-pictures and/or primitives is concatenation; that is, each picture or primitive can be connected only at the left or right. This one-dimensional relation has not been very effective in describing two-dimensional pictures. A natural generalization is to use two-dimensional grammars. In this paper, two-dimensional grammars and their applications to artificial intelligence are presented. Picture grammars and two-dimensional grammars are introduced and illustrated by examples. In particular, two-dimensional grammars for generating all possible squares and all possible rhombuses are presented. The applications of two-dimensional grammars to solving region filling problems are discussed. An algorithm for region filling using two-dimensional grammars is presented together with illustrative examples. The advantages of using this algorithm in terms of computation time are also stated. A high-level description of a two-level picture generation system is proposed. The first level is picture primitive generation using two-dimensional grammars. The second level is picture generation using either string description or entity-relationship (ER) diagram description. Illustrative examples are also given. The advantages of ER diagram description, together with a comparison to string description, are also presented. The results obtained in this paper may have useful applications in artificial intelligence, robotics, expert systems, picture processing, pattern recognition, knowledge engineering, and pictorial database design. Furthermore, examples related to satellite surveillance and identification are also included.
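In the spirit of the square-generating grammars mentioned in the abstract, the following Python sketch applies a single array-rewriting production that grows an n x n square of terminals into an (n+1) x (n+1) square, so repeated derivation steps enumerate squares of every size. The list-of-strings picture representation and the production itself are illustrative assumptions, not the paper's grammar.

    def apply_production(square):
        """Production P: an n x n square of 'a' -> an (n+1) x (n+1) square."""
        n = len(square)
        return [row + "a" for row in square] + ["a" * (n + 1)]

    picture = ["a"]                # axiom: the 1 x 1 square
    for _ in range(2):             # two derivation steps
        picture = apply_production(picture)
    for row in picture:
        print(row)                 # prints a 3 x 3 square of terminals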
Thesaurus Building With Transitive Closures For KADRE
Gautam Biswas, J. C. Bezdek, Li-ya Huang
Currently there is a major thrust toward the development of automated, user-friendly online information retrieval systems, which are software packages that allow a user population to query and receive appropriate information stored in a computer database. Our research adopts a knowledge-based approach to the design and development of KADRE (Knowledge Assisted Document Retrieval Expert), an experimental online document retrieval system. Knowledge-based techniques allow us to incorporate some of an expert librarian's heuristic techniques into the retrieval process. We present the mathematical model of the retrieval system as a five-step process and discuss the system thesaurus in some detail. Six different transitive closures of the relational matrix of term pairs are used to compute the thesaurus, and the retrieval outputs produced by these techniques are compared. Representation of the thesaurus as a transitive closure offers two very significant advantages: this method provides a means for completing partial knowledge represented as numerical relational data, and the ensuing completion has a well-defined property of (mathematical) consistency.
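As a sketch of one member of the family of closures the paper compares, the following Python fragment computes the max-min transitive closure of a fuzzy term-pair relation. The toy matrix and the choice of max-min composition are illustrative assumptions; the paper itself evaluates six different closures.

    import numpy as np

    def max_min_compose(a, b):
        """(a o b)[i, j] = max over k of min(a[i, k], b[k, j])."""
        return np.max(np.minimum(a[:, :, None], b[None, :, :]), axis=1)

    def transitive_closure(r, max_iter=100):
        """Iterate R <- R union (R o R) until the relation stops growing."""
        for _ in range(max_iter):
            r_next = np.maximum(r, max_min_compose(r, r))
            if np.allclose(r_next, r):
                return r_next
            r = r_next
        return r

    # Toy relation over three terms: a strong 0-1 link and a weak 1-2 link.
    R = np.array([[1.0, 0.8, 0.0],
                  [0.8, 1.0, 0.4],
                  [0.0, 0.4, 1.0]])
    print(transitive_closure(R))   # the closure infers a 0-2 link of strength 0.4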
An Intelligent Computerized Stretch Reflex Measurement System For Clinical And Investigative Neurology
P. M. Flanagan, J. G. Chutkow, M. T. Riggs, et al.
We describe the design of a reliable, user-friendly preprototype system for quantifying the tendon stretch reflexes in humans and large mammals. A hand-held, instrumented reflex gun, the impactor of which contains a single force sensor, interfaces with a computer. The resulting test system can deliver sequences of reproducible stimuli at graded intensities and adjustable durations to a muscle's tendon ("tendon taps"), measure the impacting force of each tap, and record the subsequent reflex muscle contraction from the same tendon -- all automatically. The parameters of the reflex muscle contraction include latency; mechanical threshold; and peak time, peak magnitude, and settling time. The results of clinical tests presented in this paper illustrate the system's potential usefulness in detecting neurologic dysfunction affecting the tendon stretch reflexes, in documenting the course of neurologic illnesses and their response to therapy, and in clinical and laboratory neurologic research.
A Mechanism To Automate The Production Of A Keyword Table For An Intelligent Active Assistance System
Ronald L. Sobczak, Manton Matthews, Gautam Biswas
This paper describes the mechanism for generating a keyword table for an Intelligent Active Assistance System for small computers. This system runs under Unix (a trademark of AT&T Bell Laboratories). The mechanism outlined would be applicable to any system where hierarchically organized text in the form of screens is used to explain a variety of concepts to the user. The goal of the Assistance System is to help the user learn more about Unix and its capabilities. It accomplishes this in two ways. First, it allows the user to ask questions about different system features and commands. Second, it monitors the user, maintains a history, and prompts the user on more efficient command sequences. Because of these two modes, this system is considered both intelligent and active. The keyword table is an essential component of the Assistance System's ability to answer user questions. When a user enters a question on a particular Unix feature or asks how to accomplish a specific task, the Assistance System recognizes the key concepts in the question and uses the keyword table to determine which screens contain the appropriate information for answering the question. Since Unix has a large number of features, the Assistance System Knowledge Base has been developed incrementally, and the keyword table needs to be updated frequently to include references to the new information. We have therefore automated the production of the keyword table. Automation includes revising both the list of keywords and the references to screens containing each of the keywords. This is done in three steps: preparing a new list of keywords, pruning the list of unnecessary words, and searching the screens for references to each of the keywords.
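The three-step automation can be pictured with a short Python sketch that gathers candidate keywords from each screen, prunes them against a stop list, and records which screens reference each surviving keyword. The screen texts and the stop list are illustrative assumptions.

    import re
    from collections import defaultdict

    STOP_WORDS = {"the", "a", "an", "to", "of", "and", "is", "in", "for"}

    def build_keyword_table(screens):
        """screens: dict mapping a screen name to its help text."""
        table = defaultdict(set)
        for name, text in screens.items():
            words = set(re.findall(r"[a-z]+", text.lower()))  # step 1: candidates
            for word in words - STOP_WORDS:                   # step 2: prune
                table[word].add(name)                         # step 3: references
        return table

    screens = {
        "grep.1": "grep searches a file for lines matching a pattern",
        "ls.1":   "ls lists the contents of a directory",
    }
    table = build_keyword_table(screens)
    print(sorted(table["file"]))   # -> ['grep.1']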
Fuzzy Sets And Autonomous Navigation
Robert N. Lea
The problems of dealing with imprecise data and inexact models are inherent to navigation systems. These systems classically have data from sensors feeding into them for use in updating a state vector. A human expert observes the system inputs and outputs and decides whether the system is performing properly. He makes the inputs and changes which he considers necessary for the proper processing of the sensor data. Many of the procedures the expert follows are fuzzy procedures in the sense that no algorithm exists that will indicate a precise course of action. These fuzzy procedures use common-sense reasoning to process data when a set of conditions is almost satisfied. In this paper, the use of fuzzy sets is considered in modeling the human expert for certain Space Shuttle navigation problems. Particular areas addressed are onboard and ground console data monitoring tasks historically performed by astronauts and engineers. Specific problems include determining the quality of sensor data and determining the quality of the filter state. Results of the study indicate that the fuzzy models can perform as well as the expert on both nominal and non-nominal data.
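As a sketch of how one such fuzzy procedure might be encoded, the following Python fragment grades a sensor residual with a trapezoidal membership function and combines it with a second membership value using min, the usual fuzzy AND. The function shape and the threshold values are illustrative assumptions, not the paper's model.

    def quality_good(residual):
        """Membership in 'the sensor residual is acceptably small', in [0, 1].

        Assumed shape: fully good below 1.0, fully bad above 3.0,
        and linear in between.
        """
        if residual <= 1.0:
            return 1.0
        if residual >= 3.0:
            return 0.0
        return (3.0 - residual) / 2.0

    def accept_data(residual, bias_stability):
        """Rule: IF residual is small AND bias is stable THEN accept the data."""
        return min(quality_good(residual), bias_stability)

    print(accept_data(1.8, 0.9))   # -> 0.6: the data is accepted to degree 0.6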
Learning Significant Class Descriptions
Joseph F. Blumberg, James A. Hendler
A program using a learning-by-examples algorithm creates descriptions that are used to differentiate between two classes of prosthetic devices. The best descriptions are selected by the learning algorithm based on a "significance" bias. This bias is automatically derived by a rule system which deduces a level of significance for each of the learned descriptions. The basis for deriving a level of significance for a class description depends upon the relationships between the class attributes. Generalized rules are developed which capitalize on the relationships between attributes of a class description in order to deduce a level of significance. It is further hypothesized that the rules are applicable to any domain in which the relationships between class attributes are known a priori. The exchange of information between the learning-by-examples algorithm and the rule system is outlined. The rules are shown along with the representation structure of the class attributes. The results of utilizing three different biases within the learning-by-examples algorithm are also presented. It is shown that the maximum significance bias, the equal cost bias, and the minimal significance bias provide, in that order, decreasingly useful descriptions of the prosthetic devices.
A Technique For Planning In A Blocks World Environment
Vincent Tat, Robert McLaren
A method for moving blocks, referred to as the "blocks world," is described in a book on LISP by Winston and Horn. To simplify that implementation, the basic method avoids using a physical coordinate system for the blocks and the gripper. This trade-off raises the question of how a block is moved from one location to another. This paper presents a method that augments the basic scheme with a three-dimensional coordinate system. The proposed method can generate a path for moving a block from one location to another without encountering any obstacles. This path is described by three-dimensional coordinates. With an appropriate graphics program, one can project the coordinates to produce a dynamic graphics display of a block moving along the path. In the context of a "real world" robotics system, the gripper moves a block autonomously after being given the initial state of the environment, represented possibly by the output of an image processor, and a description of the object to be moved. Appropriate computer programs are included to illustrate how the method is implemented.
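A lift-translate-lower waypoint scheme gives the flavor of coordinate-based path generation. The clearance-plane idea and the function below are illustrative assumptions rather than the paper's method, which also accounts for obstacles along the path.

    def block_path(start, goal, clearance_z):
        """Generate 3-D waypoints that lift a block above a clearance plane,
        translate it, and lower it onto the goal location."""
        x0, y0, z0 = start
        x1, y1, z1 = goal
        return [
            (x0, y0, z0),           # grasp the block
            (x0, y0, clearance_z),  # lift straight up to the clearance plane
            (x1, y1, clearance_z),  # translate above the goal position
            (x1, y1, z1),           # lower onto the goal location
        ]

    for waypoint in block_path((0, 0, 1), (4, 3, 0), clearance_z=5):
        print(waypoint)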
Experiences In The Design And Use Of An Expert System For Computer-Aided Control System Design
J. Douglas Birdwell, J. Robin B. Cockett
An expert system, the Computer-Aided Systems and Control Analysis and Design Environment (CASCADE), has been implemented for the design of multivariable controllers for linear plants, using the linear quadratic Gaussian/loop transfer recovery (LQG/LTR) design methodology [1,2,3]. The expert system was implemented using a novel shell, DECIDE [4,5], which uses a decision expression methodology and incorporates explanation, backtracking, and linkage with the numeric and graphics software necessary for control system analysis and design. CASCADE was evaluated by generating trial designs for several realistic applications in the following areas: high-voltage DC power transmission, robot manipulator path tracking, large-structure active stiffening, aircraft stability augmentation, wind power cogeneration, bioreactor control, and steam power plant control. Since these designs were performed by graduate students, this provided an evaluation of the CASCADE implementation both as an expert system-based design package and as an aid in education. We found that the expert system was not only useful for computer-aided design and education; it also served to demonstrate several subtle fallacies in the theoretical design methodology. Our surmise is that these fallacies had remained undiscovered because of the significant investment required in the mechanics of the design process in previous work. The expert system allowed attention to be focused on a codification of design knowledge rather than on the algorithmic details.
Expert Systems Application In Manufacturing
Pradip Som, Ramesh Chitturi, A.J. G. Babu
Expert systems, a special branch of artificial intelligence, are finding their way into the domain of manufacturing. This paper presents the basic ideas and features of expert systems, problems in manufacturing, and the application of expert systems in manufacturing. Since process planning is an important phase in manufacturing, the suitability of expert systems for the process planning area is highlighted. Several expert systems developed to solve manufacturing problems are also discussed in the paper.
Hierarchical Path Planning In Complex Domains
Glen Pearson, JoLan Yao
This paper describes hierarchical path planning as a technique for planning paths at different levels of abstraction by using a hierarchical representation of the domain. The path planner makes use of a terrain map and a grid-level search algorithm to perform intelligent path planning. The terrain map is made up of three types of objects: pixels, mapels, and maps, each containing information about the terrain. The grid-level search algorithm is a two-pass algorithm that uses these object representations. The results of the path planner show various paths computed through complex terrain.
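The coarse-to-fine idea can be sketched in Python: a search over a "mapel" grid (here, 2 x 2 blocks of pixels) selects a corridor, and a second search over the pixel grid is confined to that corridor. The 2 x 2 coarsening and the use of breadth-first search in place of the paper's two-pass grid-level algorithm are illustrative assumptions.

    from collections import deque

    def bfs(blocked, start, goal, allowed=None):
        """Shortest 4-connected path on a grid; start and goal assumed free."""
        h, w = len(blocked), len(blocked[0])
        prev, queue = {start: None}, deque([start])
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = prev[cell]
                return path[::-1]
            r, c = cell
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                nr, nc = nxt
                if (0 <= nr < h and 0 <= nc < w and not blocked[nr][nc]
                        and nxt not in prev
                        and (allowed is None or nxt in allowed)):
                    prev[nxt] = cell
                    queue.append(nxt)
        return None

    def hierarchical_path(pixels, start, goal):
        # Level 1: a mapel is blocked if any of its 2 x 2 pixels is blocked.
        h, w = len(pixels) // 2, len(pixels[0]) // 2
        mapels = [[any(pixels[2*r + i][2*c + j]
                       for i in range(2) for j in range(2))
                   for c in range(w)] for r in range(h)]
        coarse = bfs(mapels, (start[0] // 2, start[1] // 2),
                     (goal[0] // 2, goal[1] // 2))
        if coarse is None:
            return None
        # Level 2: refine inside the corridor of mapels found above.
        corridor = {(2*r + i, 2*c + j) for r, c in coarse
                    for i in range(2) for j in range(2)}
        return bfs(pixels, start, goal, allowed=corridor)

    pixels = [[0] * 6 for _ in range(6)]
    for r in (2, 3):
        for c in (2, 3):
            pixels[r][c] = 1               # an obstacle in the center mapel
    print(hierarchical_path(pixels, (0, 0), (5, 5)))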
A Generic Controller For Manufacturing Workcells
Jeffrey S. Wright
Many manufacturing companies are studying how the systematic use of computers throughout their operation can improve productivity, with the goal of achieving computer-integrated manufacturing (CIM). It is now realized that computer control should be applied to systems as a whole in addition to individual processes. A manufacturing workcell is composed of a number of stations that perform value-adding processes linked by a material-handling system and computer control. Although the cellular approach to factory automation is widely accepted, there is a shortage of commercial software for controlling cells. The cell control problem involves high level functions of production scheduling and resource management, as well as the monitoring and control of real-time processes. An intelligent control algorithm is presented that can be applied to manufacturing workcells. Since it is easier to add cell-specific knowledge to a knowledge base than to modify a conventional program, a knowledge-based implementation supports the goal of a generic controller. Results from implementing such a controller for a simulated workcell indicate that this approach merits further development.
Automatic Generation Of 3-D Pictorial Drawing From Intensity Image
C. L. Huang, J. T. Tou
This paper presents an approach to generating a pictorial drawing of a 3-D object from its 2-D images. Through preprocessing, the intensity image can be converted to a binary boundary image from which the system extracts the existing features, i.e., junctions, corners, lines, and regions. However, small intensity differences may produce dangling lines and missing lines. These can be identified by using the properties of the surface gradient together with heuristic rules. The system is designed to make the right connections for the dangling lines, to restore the missing lines, and to generate a pictorial drawing.
Security Applications Of Computer Motion Detection
Andrew P. Bernat, Joseph Nelan, Stephen Riter, et al.
An important area of application of computer vision is the detection of human motion in security systems. This paper describes the development of a computer vision system which can detect and track human movement across the international border between the United States and Mexico. Because of the wide range of environmental conditions, this application represents a stringent test of computer vision algorithms for motion detection and object identification. The desired output of this vision system is accurate, real-time locations for individual aliens and accurate statistical data on the frequency of illegal border crossings. Because most detection and tracking routines assume rigid body motion, which is not characteristic of humans, new algorithms capable of reliable operation in our application are required. Furthermore, most current detection and tracking algorithms assume a uniform background against which motion is viewed; the urban environment along the US-Mexican border is anything but uniform. The system works in three stages: motion detection, object tracking, and object identification. We have implemented motion detection using simple frame differencing, maximum likelihood estimation, and mean and median tests, and are evaluating them for accuracy and computational efficiency. Due to the complex nature of the urban environment (background and foreground objects consisting of buildings, vegetation, vehicles, wind-blown debris, animals, etc.), motion detection alone is not sufficiently accurate. Object tracking and identification are handled by an expert system which takes shape, location, and trajectory information as input and determines whether the moving object is indeed representative of an illegal border crossing.
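The simple frame-differencing stage can be sketched in a few lines of Python: pixels whose grey level changes by more than a threshold between consecutive frames are flagged as candidate motion. The threshold and the use of NumPy are illustrative assumptions.

    import numpy as np

    def motion_mask(frame_prev, frame_curr, threshold=25):
        """Flag pixels whose grey-level change exceeds the threshold."""
        diff = np.abs(frame_curr.astype(int) - frame_prev.astype(int))
        return diff > threshold

    rng = np.random.default_rng(0)
    background = rng.integers(0, 40, size=(64, 64))
    frame2 = background.copy()
    frame2[20:30, 20:30] += 120            # a bright figure enters the scene
    mask = motion_mask(background, frame2)
    print(mask.sum(), "changed pixels")    # -> 100 changed pixels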
Robust Motion Vision For A Vehicle Moving On A Plane
Shankar Moni, E. J. Weldon Jr.
A vehicle equipped with a computer vision system moves on a plane. We show that, subject to certain constraints, the system can determine the motion of the vehicle (one rotational and two translational degrees of freedom) and the depth of the scene in front of the vehicle. The constraints include limits on the speed of the vehicle, the presence of texture on the plane, and the absence of pitch and roll in the vehicular motion. It is possible to decouple the problems of finding the vehicle's motion and the depth of the scene in front of the vehicle by using two rigidly connected cameras: one views a field with known depth (i.e., the ground plane) and estimates the motion parameters, and the other determines the depth map knowing the motion parameters. The motion is constrained to be planar to increase robustness. We use a least squares method to fit the vehicle motion to observed brightness gradients. With this method, no correspondence between image points needs to be established, and information from the entire image is used in calculating motion. The algorithm performs very reliably on real image sequences, and these results are included. The results compare favourably to the performance of the algorithm of Negahdaripour and Horn [2], where six degrees of freedom are assumed.
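The least squares fit can be illustrated with the brightness-change constraint Ex*u + Ey*v + Et = 0 accumulated over every pixel, with the image flow of a planar motion modelled as u = U - w*y and v = V + w*x for translation (U, V) and rotation w. Both this flow model and the synthetic test data below are illustrative assumptions rather than the paper's exact formulation.

    import numpy as np

    def fit_planar_motion(Ex, Ey, Et, x, y):
        """Stack one brightness constraint per pixel and solve for (U, V, w)."""
        A = np.column_stack([
            Ex.ravel(),                      # coefficient of U
            Ey.ravel(),                      # coefficient of V
            (-Ex * y + Ey * x).ravel(),      # coefficient of w
        ])
        b = -Et.ravel()
        params, *_ = np.linalg.lstsq(A, b, rcond=None)
        return params

    h, w_img = 16, 16
    ys, xs = np.mgrid[0:h, 0:w_img].astype(float)
    rng = np.random.default_rng(1)
    Ex, Ey = rng.normal(size=(2, h, w_img))          # synthetic gradients
    U, V, w_true = 0.5, -0.2, 0.01
    u, v = U - w_true * ys, V + w_true * xs
    Et = -(Ex * u + Ey * v)                          # exact constraint, no noise
    print(fit_planar_motion(Ex, Ey, Et, xs, ys))     # -> [ 0.5  -0.2   0.01]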
Implementing Viewing Spheres: Automatic Construction Of Aspect Graphs For Planar-Faced, Convex Objects
John H. Stewman, Kevin W. Bowyer
The concept of an aspect graph was described by Koenderink and van Doorn as part of their explanation of the functioning of human vision. Several researchers have subsequently proposed the use of aspect graphs in the development of computer vision systems. This paper details an algorithm for the construction of aspect graphs from boundary surface representations of convex, planar-faced, 3-D objects. Our approach is based on the creation and use of an intermediate data structure which represents the complete parcellation of space based on the geometry of the object. All information necessary for identification of object aspects and corresponding cells is obtained as a result of the parcellation. We introduce a cell numbering system that allows unique identification of each cell/aspect and provides a system for encoding information about the boundary of each cell and about the identity of each object face visible as a part of the aspect. The aspect graph created by this process is used as the basis for our viewing sphere approach to the 3-D object recognition problem.
Segmentation Of Binary Images Into Text Strings And Graphics
Lloyd Alan Fletcher, Rangachar Kasturi
An automated system for document analysis is extremely desirable. A digitized image consisting of a mixture of text and graphics should be segmented in order to represent the text and graphics areas more efficiently. This paper describes the development and implementation of a new algorithm for automated text string separation which is relatively independent of changes in text font style and size and of string orientation. The algorithm does not explicitly recognize individual characters. The principal components of the algorithm are the generation of connected components and the application of the Hough transform to logically group components into character strings, which may then be separated from the graphics. The algorithm outputs two images, one containing text strings and the other graphics. These images may then be processed by suitable character recognition and graphics recognition systems. The performance of the algorithm, in terms of both effectiveness and computational efficiency, was evaluated using several test images. The results of the evaluations are described. The superior performance of this algorithm compared to other techniques is clear from the evaluations.
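The front of the pipeline can be sketched as follows: connected components are labelled, and components whose centroids share a baseline are grouped into candidate text strings. The paper groups collinear components with the Hough transform; this sketch substitutes a simpler same-row quantization and assumes SciPy for the labelling step.

    import numpy as np
    from scipy import ndimage

    def text_rows(binary_image, row_tolerance=2.0):
        """Group connected components whose centroids share a baseline."""
        labels, n = ndimage.label(binary_image)
        centroids = ndimage.center_of_mass(binary_image, labels,
                                           range(1, n + 1))
        rows = {}
        for comp, (cy, cx) in enumerate(centroids, start=1):
            key = round(cy / row_tolerance)        # quantize the baseline
            rows.setdefault(key, []).append(comp)
        # Components sharing a baseline form candidate character strings.
        return [sorted(comps) for comps in rows.values() if len(comps) > 1]

    img = np.zeros((10, 20), dtype=int)
    img[4:7, 2:4] = img[4:7, 6:8] = img[4:7, 10:12] = 1   # three "characters"
    print(text_rows(img))                                 # -> [[1, 2, 3]]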
Automated Fake Color Separation: Combining Computer Vision And Computer Graphics
Deborah Walters
A system is described for the automation of the color separation process. In current color separation systems, humans must visually segment line-art images, and using pen and ink, delineate the segments in a manner that enables a computer graphics system to be used interactively to color in each segment. The goal of this research was to remove the labor-intensive human visual segmentation, by adding rudimentary visual processing capabilities to the computer graphics system. This is possible through the use of computer vision algorithms which incorporate general knowledge about line-art, and are based on image features that are used by the human visual system in the early stages of visual processing. A major color separation company is planning the hardware implementation of a vision-graphics system based on these algorithms, and the State University of New York is applying for two patents based on this research.
A Scale Space Technique For Finding Primitives In Images With Application To Road Following
Arun K. Sood, Mubarak Shah
In this paper, we present an approach to finding primitives in images using scale space. Our scheme has two parts. First, a hypothesis-framing algorithm is used to obtain qualitative support for the presence of a primitive by verifying its behavior at various scales. During this phase we also eliminate noisy contours. Next, an optimization algorithm is applied to fit the model of the primitive to the edge points in scale space. Through fitting we obtain a kind of quantitative support and estimate the unknown parameters in the model of a primitive. In this paper we consider only a pulse primitive, but our approach is valid for other primitives. We have also used a pulse function to model a road: there are two discontinuities in the pulse model, one corresponding to the left edge of the road and the other corresponding to the right edge of the road. We present a new algorithm to identify road edges using scale space. An attraction of our approach is that it reduces two-dimensional analysis to a single dimension.
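The qualitative-support phase for the pulse model can be sketched in one dimension: smooth a scan line with Gaussians of increasing sigma and check that two opposite-signed edges persist near the same positions at every scale. The kernel construction and this persistence test are illustrative assumptions.

    import numpy as np

    def smooth(signal, sigma):
        """Convolve with a truncated, normalized Gaussian kernel."""
        radius = int(3 * sigma)
        x = np.arange(-radius, radius + 1)
        kernel = np.exp(-x**2 / (2 * sigma**2))
        return np.convolve(signal, kernel / kernel.sum(), mode="same")

    def edge_positions(signal, sigma):
        """Positions of the strongest rising and falling edges at one scale."""
        d = np.diff(smooth(signal, sigma))
        return np.argmax(d), np.argmin(d)

    profile = np.zeros(100)
    profile[30:70] = 1.0                   # pulse: "road" between 30 and 70
    for sigma in (1.0, 2.0, 4.0):
        print(sigma, edge_positions(profile, sigma))
    # The two edges stay near 30 and 70 at every scale, which is the kind of
    # qualitative support the hypothesis-framing phase looks for.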
A New Segmentation Method And Its Application To Stereo Vision
Olivier Monga
We present a new approach to the segmentation problem that optimizes a criterion estimating the quality of a segmentation. Our method offers a general framework for solving a large class of segmentation problems. We use a graph-based description of a partition of an image and a merging strategy based on the optimal use of a sequence of criteria. An efficient data structure enables our implementation to have low algorithmic complexity. We show how this method is adapted to segment 2-D natural images, including color images, and how the results are used to solve the stereo matching problem.
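A toy version of criterion-driven merging on a region adjacency graph conveys the strategy: repeatedly merge the adjacent pair that is best under the current criterion until no pair remains acceptable. The mean-grey-level criterion and the threshold are illustrative stand-ins for the paper's sequence of criteria.

    def merge_regions(regions, adjacency, threshold):
        """regions: dict id -> list of pixel values; adjacency: set of id pairs."""
        def cost(a, b):      # criterion: difference of the region means
            mean = lambda r: sum(regions[r]) / len(regions[r])
            return abs(mean(a) - mean(b))

        while adjacency:
            c, a, b = min((cost(a, b), a, b) for a, b in adjacency)
            if c > threshold:                 # the best merge is not acceptable
                break
            regions[a] += regions.pop(b)      # merge region b into region a
            adjacency = {tuple(sorted((a if r == b else r, a if s == b else s)))
                         for r, s in adjacency if {r, s} != {a, b}}
        return regions

    regions = {1: [10, 12], 2: [11], 3: [80, 82]}
    adjacency = {(1, 2), (2, 3)}
    print(merge_regions(regions, adjacency, threshold=5))
    # -> {1: [10, 12, 11], 3: [80, 82]}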
A Comparative Study: Microprogrammed Vs RISC Architectures For Symbolic Processing
J. C. Heudin, C. Metivier, D. Demigny, et al.
It is often claimed that conventional computers are not well suited to human-like tasks: Vision (Image Processing), Intelligence (Symbolic Processing), and so on. In the particular case of Artificial Intelligence, dynamic type-checking is one example of a basic task that must be improved. The solution implemented in most Lisp workstations consists of a microprogrammed architecture with a tagged memory. Another way to gain efficiency is to design an instruction set well suited to symbolic processing, which reduces the semantic gap between the high-level language and the machine code. In this framework, the RISC concept provides a convenient approach to studying new architectures for symbolic processing. This paper compares both approaches and describes our project of designing a compact symbolic processor for Artificial Intelligence applications.
Process Modeling And Simulation With PEPS
Werner Dilger, Andreas Espen, Felix Schuck
PEPS is an expert system for the modeling and simulation of technical processes whose knowledge is gained mainly from experiments, consists of both quantitative and qualitative data, and often depends on the estimates of experts. This is demonstrated by means of the automatic welding process. To represent this kind of knowledge, PEPS uses quasi-quantitative values and plausibility estimates. Processes are defined by their parameters, the rule sets assigned to them, and the influences between the parameters. Elementary processes can be aggregated into larger ones.
EXPO - An Expert System For The Load Dispatching Control Center Of A Power Supply Company
Renate Meyer, Ulrich Pradel, Michael Rieskamp-von der Warth, et al.
EXPO is an expert system which supports the staff of a load dispatching control center. Given any (static) state of a power supply system, it determines whether a malfunction has occurred and, if one exists, which correcting actions can be applied. A first version of EXPO is currently installed as a training system. EXPO is implemented as a rule-based system. It is written mainly in Prolog and therefore takes advantage of most features of Prolog, such as its inference mechanism. The user interface is menu-driven and offers a graphical representation of the state of the power supply system. A knowledge acquisition component allows easy integration of new knowledge, such as additional rules or new equipment. The explanation component supports quasi-natural communication with the user.
Graphically Interactive, Knowledge Assisted Pattern Classification
K.F. Kraiss, H. Kuettelwesch
This paper describes an interactive approach to sensor data classification utilizing interactive graphics and artificial intelligence techniques. In this concept, prominent features are extracted by a human observer. Subsequently, a computer performs knowledge-based feature interpretation and hypothesis generation. Final verification and classification remain with the observer.
Causal Reasoning In Diagnostic Expert Systems
Pietro Torasso, Luca Console
In order to deal efficiently with difficult diagnostic problems, deep models (based on causal knowledge) have been adopted in some experimental diagnostic expert systems. This paper describes a two-level architecture for a diagnostic expert system: CHECK (Combining HEuristic and Causal Knowledge). CHECK is based on the close interaction of two levels of knowledge representation, heuristic and causal. In the heuristic (shallow) level, knowledge is represented by means of a hybrid formalism combining frames and production rules at various levels; in the deep level, knowledge is represented by means of causal networks in which (physical or physiological) states are connected via cause-effect relations. The two levels cooperate closely in the diagnostic process; in particular, the heuristic level is used to focus reasoning, generating diagnostic hypotheses to be refined, confirmed (or disconfirmed), and explained by the deep level. Heuristic (surface) level knowledge is invoked first to generate diagnostic hypotheses. These hypotheses are then passed to the underlying level for deep confirmation (so that they are used to focus reasoning in the causal network). If a hypothesis can be confirmed, a precise explanation is generated, unaccounted-for and/or unexpected data are taken into account, and correlated hypotheses are suggested. If a hypothesis is rejected, alternative hypotheses to be considered are suggested to the surface level. Deep-level knowledge can also be used to provide general explanations about the causal model of the domain, independently of the data of a particular consultation. To validate the architectural choices of CHECK, we have implemented a version of it for diagnostic reasoning in the field of hepatology. Production rules, frames, and causal networks are described by the knowledge engineer in a knowledge representation language we have designed and are then coded, through the use of a preprocessing tool, in Prolog. Particular object-oriented schemes are used to design the features of the causal network.
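The cooperation between the two levels can be caricatured in a few lines of Python: surface rules propose hypotheses from findings, and the causal level confirms a hypothesis only when all the states it should cause are observed. The rules, the network, and the confirmation test below are toy illustrative assumptions, not CHECK's hepatology knowledge base.

    SURFACE_RULES = {      # finding -> candidate diagnostic hypotheses
        "jaundice": ["hepatitis", "cholestasis"],
    }
    CAUSAL_NET = {         # hypothesis -> states it should cause
        "hepatitis":   {"jaundice", "elevated_transaminases"},
        "cholestasis": {"jaundice", "elevated_alk_phosphatase"},
    }

    def diagnose(findings):
        """Surface level proposes; the causal level confirms or rejects."""
        hypotheses = {h for f in findings for h in SURFACE_RULES.get(f, [])}
        confirmed = [h for h in hypotheses if CAUSAL_NET[h] <= findings]
        rejected = sorted(hypotheses - set(confirmed))
        return sorted(confirmed), rejected

    print(diagnose({"jaundice", "elevated_transaminases"}))
    # -> (['hepatitis'], ['cholestasis'])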
Conceptual Transformer And Its Relevance In Semantic Clustering
B. Shekar, M. Narasimha Murty
A new class of concepts called 'Conceptual Transformers' is defined and illustrated with the help of examples. The knowledge structure used to represent this class of concepts is detailed. The role of these transformers in the clustering of objects using a semantic measure is investigated. Salient features of this class are examined and illustrated. A real-world example depicting transformer-based semantic clustering is given.