
Electronic Imaging & Signal Processing

Simulation scientists apply finesse to complex subject

From OE Reports Number 159 - March 1997
28 March 1997, SPIE Newsroom. DOI: 10.1117/2.6199703.0005

Great strides are being made in the area of simulation science, largely because of increased funding and increased communication between researchers. Al Sisti, an Electronics Engineer with the Intelligence and Reconnaissance Directorate at Rome Laboratory (Rome, NY), says that in the past several years, he has awarded about 18 contracts for R&D into enabling technologies for advancing the state-of-the-art in simulation science.

"Research is being done in many areas for increasing the realism and validity of large-scale simulations," says Sisti, " in a way that its results can be made available to military decision- makers in (near) real-time." Simulations of a multiday military campaign, for example, may include representations of hundreds of thousands of entities. "In the past year," says Sisti, "we were able to collect several of the prominent researchers in the field [so they] could hear what each other was doing." The increased exposure of the researchers at conferences and in journals, "has fostered an incredible amount of interest, and great strides have been made in all areas of basic research in the art of simulation." Some enabling technologies are model abstraction, hierarchical simulation, mixed fidelity simulation, object-oriented simulation, and concurrent simulation.

Model management, which can be considered analogous to database management, is relatively mature. Model management provides a framework for the model, a way of integrating submodels. Although there are several commercially available software packages for modeling discrete events, "in order to make decisions based on simulation, one usually needs to run a large number of simulations and then carefully manage all output data collected on a case-by-case basis," says Professor Christos Cassandras of Boston University (Boston, MA), one of the major researchers in the field. Models simulate different levels of detail, ranging from the most specific engineering- and component-level models to platform- and mission-level models to large-scale theater- and campaign-level models. A battle manager using a large-scale simulation to test out attack options, explains Sisti, does not currently have the time to run detailed models. Instead, a coarser model is used, which runs fast enough to provide results but may be too coarse to provide valid ones. By developing a hierarchical simulation, the higher-level (larger-scale) models could be designed to use either lower-level codes or model abstractions in a plug-and-play format. This would allow reuse of codes, faster simulation setup, and more realistic large-scale simulations.
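The plug-and-play idea can be sketched with a toy object-oriented hierarchy. Everything here is an illustrative invention (the `EngagementModel` interface and the attrition formula are made up, not drawn from any Rome Laboratory program); the point is only that campaign-level code written against a common interface can swap a detailed model for a coarse abstraction without change:

```python
from abc import ABC, abstractmethod
import random

random.seed(42)

class EngagementModel(ABC):
    """Common interface so models of different fidelity are interchangeable."""
    @abstractmethod
    def attrition(self, attackers: int, defenders: int) -> float:
        """Return the expected fraction of defenders lost."""

class DetailedModel(EngagementModel):
    """High-fidelity stand-in: averages many stochastic engagements (slow)."""
    def attrition(self, attackers, defenders, trials=10_000):
        hits = sum(random.random() < attackers / (attackers + defenders)
                   for _ in range(trials))
        return hits / trials

class CoarseModel(EngagementModel):
    """Low-fidelity abstraction: a closed-form approximation (fast)."""
    def attrition(self, attackers, defenders):
        return attackers / (attackers + defenders)

def campaign_step(model: EngagementModel, attackers, defenders):
    """Campaign-level code depends only on the interface, not the fidelity."""
    return defenders * (1 - model.attrition(attackers, defenders))

# Either model plugs into the same campaign-level simulation.
fast = campaign_step(CoarseModel(), 300, 200)
slow = campaign_step(DetailedModel(), 300, 200)
```

The two calls return statistically consistent answers, but the coarse model does so in constant time, which is the trade a battle manager under time pressure would make.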

Model abstraction deals with reducing the complexity of models while still capturing the essence that yields statistically accurate results, to provide faster computation. This would allow validated codes to be used at different levels of resolution. Sisti says, "the best minds in the world are taking a wide variety of approaches to solving [the model abstraction problem], while at the same time, it's obviously nowhere near reaching a final solution yet."
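As a toy illustration of what "capturing the essence" means, a per-shot simulation loop can be abstracted into its normal approximation, and both versions checked against the statistic a higher-level model would actually consume. The models and numbers here are hypothetical, chosen only to make the equivalence visible:

```python
import random
import statistics

def detailed(n_shots, p_hit, rng):
    """Detailed model: simulate every shot individually."""
    return sum(rng.random() < p_hit for _ in range(n_shots))

def abstracted(n_shots, p_hit, rng):
    """Abstraction: replace the shot loop with its normal approximation."""
    mean = n_shots * p_hit
    sd = (n_shots * p_hit * (1 - p_hit)) ** 0.5
    return rng.gauss(mean, sd)

rng = random.Random(0)
d = [detailed(100, 0.3, rng) for _ in range(2000)]
a = [abstracted(100, 0.3, rng) for _ in range(2000)]

# Both models should agree on the statistic of interest (mean hits ~ 30),
# but the abstraction costs O(1) per run instead of O(n_shots).
mean_detailed = statistics.mean(d)
mean_abstract = statistics.mean(a)
```

Validating an abstraction means exactly this kind of check: the replaced code must reproduce the statistics the surrounding model depends on, not every intermediate detail.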

Mixed-fidelity simulation, such as that described above, requires the different levels of the simulation to coordinate what level of detail is available to, and needed by, two communicating models. An engineering model may provide too much detail to a higher-level model, whereas a higher-level model is likely to provide too little detail to any lower-level model. At some point, users may want the amount of detail in a simulation to change, in a sort of "software zoom." This area is just beginning to be investigated.

Paul Fishwick's group at the University of Florida (Gainesville, FL) is developing software for building hierarchical models. The Multimodeling Object-Oriented Simulation Environment (MOOSE) and its predecessor, SimPack, are designed for object-oriented physical modeling. The environments are being written using Tcl/Tk, Java, and C++ to be as platform-independent as possible. More information is available on the Web at http://www.cise.ufl.edu/~fishwick

"Qualitative reasoning and fuzzy systems have sound theoretical foundations," says Sisti, and are now being applied to modeling and simulation in order to speed simulations. Cassandras explains, "Often, the purpose of simulation is to compare many alternatives in order to identify the optimal one." Usually each alternative is ranked using one performance objective function, which can be complicated, time consuming, and sometimes impossible to calculate. Qualitative optimization, however, is driven by the relative order of estimates of this function-not the specific values. "Why waste time to get 'good' performance estimates," says Cassandras, "when relatively 'poor' but quickly obtained estimates can be provably adequate to order these performance estimates?"

Another major time-saver could be concurrent simulation. If an analyst wants to answer "what if" questions about a system that has been simulated, then often a further simulation must be run for each possibility being tested. But what if several questions need to be answered in a single simulation? Traditional perturbation theory has often focused on small changes to parameters of a model in order to estimate the sensitivities of particular performance measures. But it can also be used, Cassandras explains, "to develop methodologies and specific algorithms with the capability to obtain answers to multiple 'what if' questions of great generality from a single simulation run." Such concurrent or parallel simulations provide more information about system behavior than a standard simulation.
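A minimal sketch of the concurrent-simulation idea: one pass over a single stream of random events, with a separate copy of the system state tracked for each "what if" variant. The single-server queue and its parameters are illustrative assumptions, not a model from the article; sharing the random draws (common random numbers) is what makes the variants directly comparable:

```python
import random

def queue_waits_concurrent(service_scales, n_customers=5000, seed=7):
    """
    Answer several 'what if the server were this fast?' questions from
    one simulation run of a single-server queue. Each variant scales the
    same shared service-time draws, so all variants see identical traffic.
    """
    rng = random.Random(seed)
    # Per-variant state: [time the last customer departs, total wait so far].
    state = {s: [0.0, 0.0] for s in service_scales}
    arrival = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(1.0)      # shared arrival stream
        base_service = rng.expovariate(1.0)  # shared service draw
        for scale, st in state.items():
            start = max(arrival, st[0])      # wait if server still busy
            st[1] += start - arrival         # accumulate waiting time
            st[0] = start + base_service * scale
    return {s: st[1] / n_customers for s, st in state.items()}

# Three "what if" service speeds answered by a single run.
avg_waits = queue_waits_concurrent([0.5, 0.7, 0.9])
```

One run over the event stream yields the sensitivity of average waiting time to service speed, rather than three independent simulations.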

The overall push in simulation is to avoid depending on brute-force number crunching, despite the technical advances in computers. With the increased interest and communication in this area, the state of the art is likely to advance considerably.

Ed. note: Al Sisti chairs the conference on Enabling Technologies for Simulation Science at AeroSense: Aerospace/Defense Sensing, Simulation, and Controls, 20-25 April in Orlando, Florida. For a copy of the advance program, contact SPIE at 360/676-3290. Fax: 360/647-1445.

Yvonne Carts-Powell

Yvonne Carts-Powell, based in Boston, writes about optoelectronics and the Internet.