Proceedings Volume 5423

Enabling Technologies for Simulation Science VIII



Volume Details

Date Published: 13 August 2004
Contents: 13 Sessions, 45 Papers, 0 Presentations
Conference: Defense and Security 2004
Volume Number: 5423

Table of Contents

  • Model Composability: Problems, Progress, and Prospects
  • M and S for Course-of-Action Generation and Assessment
  • Cognitive Influences and Effects in Decision-Making
  • Modeling and Detecting Deceptive Information
  • M and S for National Defense: Requirements, Issues, and Capabilities
  • Technology for Wargaming Support
  • Dealing with Complexity in Simulation
  • High-Performance Computing Applications for M and S
  • M and S Validation and Testing
  • Joint Synthetic Battlespace (JSB)
  • Collaborative Decision Support Systems and Environments
  • Enterprise Modeling Applications and Techniques
  • Simulation Environments and Frameworks
Model Composability: Problems, Progress, and Prospects
Prospects for composability of models and simulations
Paul K. Davis, Robert B. Anderson
This paper is the summary of a recent RAND study done at the request of the U.S. Defense Modeling and Simulation Office (DMSO). Commissioned in recognition that the last decade's efforts by DoD to achieve model "composability" have had only limited success (e.g., HLA-mediated exercises), and that fundamental problems remain, the study surveyed the underlying problems that make composability difficult. It then went on to recommend a series of improvement measures for DMSO and other DoD offices to consider. One strong recommendation was that DoD back away from an earlier tendency toward overselling composability, moving instead to a more particularized approach in which composability is sought within domains where it makes most sense substantively. Another recommendation was that DoD needs to recognize the shortcomings of standard software-engineering paradigms when dealing with "models" rather than pure software. Beyond this, the study had concrete recommendations dealing with science and technology, the base of human capital, management, and infrastructure. Many recommendations involved the need to align more closely with cutting edge technology and emerging standards in the private sector.
A methodology for integrative multimodeling: connecting dynamic and geometry models
Modeling techniques tend to be found in isolated communities: geometry models in CAD and Computer Graphics and dynamic models in Computer Simulation. When models are included within the same digital environment, the ways of connecting them together seamlessly and visually are not well known even though elements from each model have many commonalities. We attempt to address this deficiency by studying specific ways in which models can be interconnected within the same 3D space through effective ontology construction and human interaction techniques.
Rethinking families of models and games for new circumstances
Analytical organizations have long recognized the desirability of hierarchical families of models. Although good model hierarchies have existed from time to time, practice has often fallen short of the ideal. Further, given changes in the nature of warfare, as well as the advent of new theories and technologies, it is time to rethink the entire issue of model families, to set fresh ambitions, and to go about constructing good mutually informed families of both models and analytically structured human war games. The paper offers initial suggestions about how to do so. It is intended, however, as a starting point for a fresh look by the community-i.e., as a stimulus to a more extended debate.
M and S for Course-of-Action Generation and Assessment
Mining simulation data for insights about a decision space: application to an urban combat COA
B. Chandrasekaran, John R. Josephson, Janet O'May, et al.
We start with a vision of an integrated decision architecture to assist in the various stages and subtasks of decision-making. We briefly describe how the Seeker-Filter-Viewer (S-F-V) architecture for multi-criterial decision support helps realize many components of that vision. The rest of the paper is devoted to one of those components: developing insights about the course of action (COA) decision space from COA simulations. We start with data obtained from multiple simulation executions of an urban combat COA in a specified scenario, where the stochastic nature of different executions produces a range of intermediate events and final outcomes. The Viewer in the S-F-V decision architecture is used to make and visually test hypotheses about how sensitive different events and outcomes are to different aspects of the COA and to various intermediate events. The analyst engages in a cycle of making a hypothesis, visually evaluating it, and making further hypotheses. A set of snapshots illustrates an investigational sequence of abstractions in an example of iterating on hypotheses. The synergy of data mining tools, high-performance computing, and advanced high-resolution combat simulation has the potential to help battle planners make better decisions for imminent combat.
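The abstract does not spell out how the Filter narrows the space of simulated outcomes; one common multi-criterial approach, sketched here purely as an illustration (the data and function names are hypothetical, not from the paper), keeps only the Pareto-nondominated outcome vectors:

```python
def dominates(a, b):
    """True if outcome a is at least as good as b on every criterion and
    strictly better on at least one (higher is taken as better)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_filter(outcomes):
    """Keep only the non-dominated outcome vectors."""
    return [a for a in outcomes
            if not any(dominates(b, a) for b in outcomes if b != a)]

# Each tuple is one simulated COA execution: (objective met, force preserved).
runs = [(0.9, 0.2), (0.7, 0.7), (0.6, 0.5), (0.3, 0.9)]
print(pareto_filter(runs))  # → [(0.9, 0.2), (0.7, 0.7), (0.3, 0.9)]
```

The surviving outcomes are exactly those an analyst might then examine in a Viewer, since each represents a distinct trade-off rather than a strictly worse result.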
An approach to effects-based modeling for wargaming
Effects-based operations (EBO) are proving to be a vital part of current concepts of operations in military missions and consequently need to be an integral part of current-generation wargames. EBO focuses on the effects produced by military activities, as opposed to the direct results of attacking targets. By contrast, conventional wargames emphasize attrition-based modeling and cannot assess effects or their contribution to overall mission objectives. For wargames to be effective, they must allow users to evaluate multiple ways to accomplish the same goal with a combination of direct, indirect, and cascading events (actions). The focus of this paper is to describe the development of a methodology for implementing EBO concepts in modern wargames. The design approach was to develop a generic methodology and demonstrate how simulation objects can incorporate EBO capabilities. The authors illustrate the application of the methodology using an EBO scenario example, which was developed to test the system.
Simulation-based planning for theater air warfare
Douglas A. Popken, Louis A. Cox Jr.
Planning for Theater Air Warfare can be represented as a hierarchy of decisions. At the top level, surviving airframes must be assigned to roles (e.g., Air Defense, Counter Air, Close Air Support, and AAF Suppression) in each time period in response to changing enemy air defense capabilities, remaining targets, and roles of opposing aircraft. At the middle level, aircraft are allocated to specific targets to support their assigned roles. At the lowest level, routing and engagement decisions are made for individual missions. The decisions at each level form a set of time-sequenced Courses of Action taken by opposing forces. This paper introduces a set of simulation-based optimization heuristics operating within this planning hierarchy to optimize allocations of aircraft. The algorithms estimate distributions for stochastic outcomes of the pairs of Red/Blue decisions. Rather than using traditional stochastic dynamic programming to determine optimal strategies, we use an innovative combination of heuristics, simulation-optimization, and mathematical programming. Blue decisions are guided by a stochastic hill-climbing search algorithm, while Red decisions are found by optimizing over a continuous representation of the decision space. Stochastic outcomes are then provided by fast, Lanchester-type attrition simulations. This paper summarizes preliminary results from the top- and middle-level models.
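As a hedged illustration of the approach the abstract describes, the sketch below couples a Lanchester square-law attrition step with a stochastic hill-climbing search over a single Blue allocation parameter. All coefficients, the one-parameter decision space, and the function names are invented for the example, not taken from the paper:

```python
import random

def lanchester_square(blue, red, b_eff, r_eff, dt=0.1, steps=100):
    """Lanchester square-law attrition: each side's loss rate is
    proportional to the size of the opposing force."""
    for _ in range(steps):
        blue, red = (max(blue - r_eff * red * dt, 0.0),
                     max(red - b_eff * blue * dt, 0.0))
    return blue, red

def evaluate(split, seed=0, trials=20):
    """Mean Blue survivors when fraction `split` of Blue effectiveness is
    devoted to attriting Red; per-trial effectiveness is sampled to mimic
    stochastic engagement outcomes (all coefficients are illustrative)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        b_eff = rng.uniform(0.008, 0.012) * split
        survivors, _ = lanchester_square(100.0, 80.0, b_eff, 0.01)
        total += survivors
    return total / trials

# Stochastic hill climbing over the single allocation parameter.
rng, best = random.Random(1), 0.2
for _ in range(40):
    cand = min(1.0, max(0.0, best + rng.uniform(-0.1, 0.1)))
    if evaluate(cand) > evaluate(best):
        best = cand
```

In a fuller model, `split` would be replaced by a time-sequenced role assignment, and common random numbers (the fixed `seed`) keep the hill-climbing comparisons fair across candidates.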
Dynamic course-of-action analysis
Michael L. Curry, John Stucklen, Bill Sexton
Dynamic Course Of Action Analysis integrates the Predictive Battlespace Awareness process with the air battle plan to predict the likelihood of Time Sensitive Target (TST) occurrence, the likelihood of their discovery, the likelihood that, once discovered, they can be successfully attacked, and the overall probability that TSTs can be successfully countered given the configuration of the daily air battle plan. The Dynamic Course Of Action Decision-aid (DCOAD) is a tool that generates predictions and presents them in the form of probabilistic maps, indicating the likelihood of TST occurrence in a given geographical area and the strike and ISR coverage in those areas. These maps are overlaid on a situational display to provide operators with information regarding "gaps" in TST coverage. This paper addresses the problem of properly anticipating TSTs, develops prediction models that assist in the decision-making process, and defines the DCOAD architecture. DCOAD's architecture consists of three major components: the graphical user interface, the shared data services, and the estimator framework. These fit the model-view-controller paradigm used in most graphical applications.
Constructing adversarial models for threat/enemy intent prediction and inferencing
Eugene Santos Jr., Allesandro Negri
We examine an adversary model that captures goals, intentions, biases, beliefs, and perceptions based on a dynamic cognitive architecture that evolves over time. The model manages the uncertainty surrounding the adversary using probabilistic networks. In particular, we consider the challenges of constructing such adversaries and provide solutions towards more effective and efficient engineering of such adversaries. We present the AII Template Generator tool which enables the rapid deployment of adversary models as well as on-demand construction of new adversary components.
Cognitive Influences and Effects in Decision-Making
Toward a synthesis of paradigms for decision support
Michael Egner, Paul K. Davis
Over the past half-century, the study of human decision making has evolved from dry philosophy into a diverse set of experimentally tested, behavior-centered theories. However, the sheer volume of disciplines and sub-disciplines, and the often-esoteric debates that divide them, threatens to obscure the very real advances that have been made in modeling human decision making. This paper, giving preliminary analysis from a longer study [1], begins to address the "so-what" factor in decision-making theory, specifically as related to Air Force modeling, simulation, and decision-support needs. While a general consensus is forming on how humans make decisions (descriptive), there are still major conflicts on how humans should make decisions (normative) and, by extension, how human decision making can be improved (prescriptive). The first half of this paper surveys modern decision science, focusing on two of the most influential sub-disciplines: the heuristics-and-biases paradigm (HBP) and the naturalistic paradigm (NP). The second half attempts to sketch out a normative/prescriptive synthesis between the two schools and chart implications for the design of decision support.
Cognitive architectures, rationality, and next-generation AI: a prolegomenon
Paul Bello, Selmer Bringsjord, Yingrui Yang
Computational models that give us insight into the behavior of individuals and the organizations to which they belong will be invaluable assets in our nation's war against terrorists and state sponsorship of terror organizations. Reasoning and decision-making are essential ingredients in the formula for human cognition, yet the two have almost exclusively been studied in isolation from one another. While we have witnessed the emergence of strong traditions in both symbolic logic and decision theory, we have yet to describe an acceptable interface between the two. Mathematical formulations of decision-making and reasoning have been developed extensively, but both fields make assumptions concerning human rationality that are untenable at best. True to this tradition, artificial intelligence has developed architectures for intelligent agents under these same assumptions. While these digital models of "cognition" tend to perform superbly, given their tremendous capacity for calculation, it is hardly reasonable to develop simulacra of human performance using these techniques. We will discuss some of the challenges associated with the problem of developing integrated cognitive systems for use in modelling, simulation, and analysis, along with some ideas for the future.
Cognitive/emotional models for human behavior representation in 3D avatar simulations
Simplified models of human cognition and emotional response are presented which are based on models of auditory/visual polymodal fusion. At the core of these models is a computational model of Area 37 of the temporal cortex, based on new isocortex models presented recently by Grossberg. These models are trained using carefully chosen auditory (musical sequences), visual (paintings), and higher-level abstract (meta-level) data obtained from studies of how optimization strategies are chosen in response to outside managerial inputs. The software modules developed are then used as inputs to character-generation codes in standard 3D virtual-world simulations. The auditory and visual training data also enable the development of simple music and painting composition generators, which significantly enhance one's ability to validate the cognitive model. The cognitive models are handled as interacting software agents implemented as CORBA objects, allowing the use of multiple language coding choices (C++, Java, Python, etc.) and efficient use of legacy code.
Modeling and Detecting Deceptive Information
Advances in automated deception detection in text-based computer-mediated communication
Mark Adkins, Douglas P. Twitchell, Judee K. Burgoon, et al.
The Internet has provided criminals, terrorists, spies, and other threats to national security a means of communication. At the same time it also provides for the possibility of detecting and tracking their deceptive communication. Recent advances in natural language processing, machine learning and deception research have created an environment where automated and semi-automated deception detection of text-based computer-mediated communication (CMC, e.g. email, chat, instant messaging) is a reachable goal. This paper reviews two methods for discriminating between deceptive and non-deceptive messages in CMC. First, Document Feature Mining uses document features or cues in CMC messages combined with machine learning techniques to classify messages according to their deceptive potential. The method, which is most useful in asynchronous applications, also allows for the visualization of potential deception cues in CMC messages. Second, Speech Act Profiling, a method for quantifying and visualizing synchronous CMC, has shown promise in aiding deception detection. The methods may be combined and are intended to be a part of a suite of tools for automating deception detection.
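As a toy illustration of the document-feature-mining idea, the sketch below extracts two simple linguistic cues and classifies messages by nearest centroid. The cue set, thresholds, training data, and classifier are hypothetical stand-ins, not the authors' method (their work combines many more cues with stronger machine-learning techniques):

```python
import re

# Hypothetical cue set: deception research often associates deceptive text
# with fewer first-person pronouns and greater quantity of language; the
# cues, data, and classifier below are illustrative only.
FIRST_PERSON = {"i", "me", "my", "mine", "we", "our"}

def cues(message):
    """Map a message to a small cue vector: (word count, first-person rate)."""
    words = re.findall(r"[a-z']+", message.lower())
    n = len(words) or 1
    return (len(words), sum(w in FIRST_PERSON for w in words) / n)

def train_centroids(labeled):
    """Nearest-centroid 'training': average the cue vector for each class."""
    sums = {}
    for msg, label in labeled:
        v = cues(msg)
        (s0, s1), c = sums.get(label, ((0.0, 0.0), 0))
        sums[label] = ((s0 + v[0], s1 + v[1]), c + 1)
    return {lab: (s0 / c, s1 / c) for lab, ((s0, s1), c) in sums.items()}

def classify(message, centroids):
    v = cues(message)
    dist = lambda lab: sum((a - b) ** 2 for a, b in zip(v, centroids[lab]))
    return min(centroids, key=dist)

labeled = [
    ("I saw it myself and I am sure", "truthful"),
    ("We did our checks and I signed off", "truthful"),
    ("The shipment was certainly processed correctly by the usual automated handling system", "deceptive"),
    ("That report was definitely submitted on schedule through the standard official channel", "deceptive"),
]
cent = train_centroids(labeled)
print(classify("I checked it and I know", cent))  # → truthful
```

A real system would normalize the cue scales and use a trained classifier; the point here is only the pipeline shape: cue extraction, training on labeled messages, then per-message classification.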
Toward detecting deception in intelligent systems
Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources, including electronic sources such as knowledge-based diagnostic or decision support systems, or through data mining techniques. As the decision maker becomes more dependent on these electronic information sources, detecting deceptive information from these sources becomes vital to making a correct, or at least more informed, decision. This applies to unintentional misinformation as well as intentional disinformation. Our ongoing research focuses on applying models of deception and deception detection from the fields of psychology and cognitive science to these systems, as well as implementing deception detection algorithms for probabilistic intelligent systems. The deception detection algorithms are used to detect, classify, and correct attempts at deception. Algorithms for detecting unexpected information rely upon a prediction algorithm from the collaborative filtering domain to predict agent responses in a multi-agent system.
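The abstract's last sentence can be sketched concretely. Below, a user-based collaborative-filtering predictor estimates what an agent "should" report from similarity-weighted responses of its peers, and flags large deviations as unexpected. The data, tolerance, and function names are illustrative assumptions, not the paper's algorithm:

```python
import math

def cosine(u, v):
    """Similarity of two agents over the items they have both answered."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    num = sum(u[k] * v[k] for k in shared)
    du = math.sqrt(sum(u[k] ** 2 for k in shared))
    dv = math.sqrt(sum(v[k] ** 2 for k in shared))
    return num / (du * dv)

def predict(agent, item, ratings):
    """User-based collaborative filtering: predict `agent`'s response to
    `item` as a similarity-weighted average of the other agents' responses."""
    num = den = 0.0
    for other, resp in ratings.items():
        if other == agent or item not in resp:
            continue
        s = cosine(ratings[agent], resp)
        num += s * resp[item]
        den += abs(s)
    return num / den if den else None

def is_unexpected(agent, item, observed, ratings, tol=0.3):
    """Flag a response that deviates from the peer-based prediction."""
    p = predict(agent, item, ratings)
    return p is not None and abs(observed - p) > tol

# Illustrative agent responses on a 0-1 scale (hypothetical data).
ratings = {
    "a1": {"q1": 0.9, "q2": 0.8},
    "a2": {"q1": 0.85, "q2": 0.75, "q3": 0.9},
    "a3": {"q1": 0.8, "q2": 0.9, "q3": 0.85},
}
print(is_unexpected("a1", "q3", 0.1, ratings))  # → True
```

A flagged response would then be handed to the classification and correction stages the abstract mentions.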
Cognitive hacking and intelligence and security informatics
This paper describes research on cognitive and semantic attacks on computer systems and their users. Several countermeasures against such attacks are described, including a description of a prototype News Verifier system. It is argued that because misinformation and deception play a much more significant role in intelligence and security informatics than in other informatics disciplines such as science, medicine, and the law, a new science of intelligence and security informatics must concern itself with semantic attacks and countermeasures.
M and S for National Defense: Requirements, Issues, and Capabilities
Maritime homeland security modeling: Coast Guard issues and perspective
Timothy R. Girton, Kevin Downer
Under the Department of Homeland Security (DHS), the U.S. Coast Guard (USCG) is the lead agency for maritime homeland security and, as such, is the primary guardian of America's waterways and coastlines. Maritime homeland security presents unique challenges for the USCG operational community and for researchers tasked with finding improved technologies, tactics, and procedures that can enhance mission success. USCG "solutions" are seldom myopic; multiple simultaneous traditional mission program areas cannot be ignored. The USCG Research and Development Center (RDC) has been actively engaged in developing models for decision support across all program areas. This paper discusses some homeland security modeling efforts recently conducted by the RDC and the issues associated with their use. Though they are based on Department of Defense (DoD)-developed models, many of the critical aspects of USCG-specific model applications differ from DoD model applications. Day-to-day USCG operations require sorting through a wide variety of benign (legitimate) traffic and activities to detect and prosecute illegal activities. Sensor performance, tactical processes, environmental characteristics, and traffic considerations illustrate the requirements associated with modeling USCG operations. The complexities of determining how to measure mission success are also discussed.
Technology for Wargaming Support
The road ahead for wargaming: the why and how of achieving the next generation of wargaming
Wargames have historically given their users a competitive edge by enhancing the development of individual strategists and particular strategies, as well as by serving as a catalyst for transformation. While there have always been problems with both the accuracy and speed of wargame forecasts, these problems have become more acute due to recent trends. This paper will describe the enhancements needed to bring wargaming to its next generation, restoring, and even increasing, the benefits wargaming has historically provided. These enhancements will increase both the accuracy and speed of wargaming. Accuracy will be enhanced by explicitly modeling human factors (both differences in human effectiveness and differences in human decision making), effects on physical systems (cascading within and between systems), and the depiction of time (influences on the decision loop of each decision node, differences in decision frequency between parent and child nodes). Speed will be enhanced through more efficient user interfaces, greater deployability, and built-in reach-back.
Theoretical foundations for rational agency in third-generation wargames
Conflict between groups of armed men is as old as recorded history. Effective reasoning and decision-making are fundamental to the successful execution of military operations. These activities are of paramount importance given the high-stakes nature of conflict, most especially in this modern era of asymmetric threats and unconventionally armed rogue states. Yet as high as the stakes are, there does not exist a sufficiently formal military theory of reasoning and decision-making that instantiates modern warfighting doctrine. Large bodies of knowledge on reasoning and decision-making exist, but they are not integrated, and they (to the author's knowledge) have not been cast effectively into a military context. Herein, I describe a new theory of military rationality which fully captures the reasoning and decision-making processes of homo militius, military man. The goal of the third-generation wargaming effort at the Air Force Research Laboratory's Information Directorate is to produce a high-fidelity simulation of conflict environments in order to facilitate a new brand of highly immersive training for our warfighters and supporting personnel. This environment will be populated by a new breed of intelligent agents that we affectionately call ASC-MEs (Advanced Synthetic Characters for Military Environments). I shall briefly highlight the philosophical foundations for the construction of such entities, and the formal techniques by which they may be modelled and engineered.
Warcon: a wargame construction toolset for military simulations
Daniel D. Fu, Ryan Houlette
The use of wargames in the Air Force curriculum to date has been hindered by the significant amount of time and effort required to develop new wargames, as well as by drastically varying interfaces that impose a steep learning curve on the student. We describe an ongoing effort to develop an advanced wargame construction toolset, called Warcon, that will empower Air Force instructors to create small-scale instructional wargames that embody modern warfare principles. The aim of this toolset is to make the authoring process accessible to a wide range of instructors, via an intuitive visual interface and advanced authoring assistance that eliminate the need for programming. The toolset will feature a customizable adjudication engine with advanced features for modeling effects-based operations, psyops, Military Operations Other Than War, and other aspects of modern warfare. The toolset will present students with a standardized interface, allowing them to build experience with one wargame that will carry to the next, thereby enabling them to focus on the content. Using the proposed toolset, Air Force instructors will be able to design and deploy new wargames into their teaching curriculum more rapidly.
The implementation of AI technologies in computer wargames
Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibility space of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force military professionals to avoid routine or set-piece approaches to situations and cause them to think more deeply about military situations before taking action. A good AI opponent would also incorporate national characteristics of the opponent being simulated, providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.
Cyberwar XXI: quantifying the unquantifiable: adaptive AI for next-generation conflict simulations
Joseph Miranda, Peter von Kleinsmid, Tony Zalewski
The era of the "Revolution in Military Affairs," "4th Generation Warfare," and "Asymmetric War" requires novel approaches to modeling warfare at the operational and strategic levels of modern conflict. For example: "What if, in response to our planned actions, the adversary reacts in such-and-such a manner? What will our response be? What are the possible unintended consequences?" Next-generation conflict simulation tools are required to help create and test novel courses of action (COAs) in support of real-world operations. Conflict simulations allow non-lethal and cost-effective exploration of the "what-ifs" of COA development. The challenge has been to develop an automated decision-support software tool that allows competing COAs to be compared in simulated dynamic environments. Principal Investigator Joseph Miranda's research is based on modeling an integrated political, military, economic, social, infrastructure, and information (PMESII) environment. The main effort was to develop an adaptive AI engine that models agents operating within an operational-strategic conflict environment. This was implemented in Cyberwar XXI, a simulation that models COA selection in a PMESII environment. Within this framework, agents simulate decision-making processes and provide predictive capability regarding the potential behavior of command entities. A 2003 Iraq scenario is the first ready for V&V testing.
Dealing with Complexity in Simulation
XML-based resources for simulation
Robert L. Kelsey, Jane M. Riese, Ginger A. Young
As simulations and the machines they run on become larger and more complex the inputs and outputs become more unwieldy. Increased complexity makes the setup of simulation problems difficult. It also contributes to the burden of handling and analyzing large amounts of output results. Another problem is that among a class of simulation codes (such as those for physical system simulation) there is often no single standard format or resource for input data. To run the same problem on different simulations requires a different setup for each simulation code. The eXtensible Markup Language (XML) is used to represent a general set of data resources including physical system problems, materials, and test results. These resources provide a "plug and play" approach to simulation setup. For example, a particular material for a physical system can be selected from a material database. The XML-based representation of the selected material is then converted to the native format of the simulation being run and plugged into the simulation input file. In this manner a user can quickly and more easily put together a simulation setup. In the case of output data, an XML approach to regression testing includes tests and test results with XML-based representations. This facilitates the ability to query for specific tests and make comparisons between results. Also, output results can easily be converted to other formats for publishing online or on paper.
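The "plug and play" conversion step the abstract describes can be sketched with the standard library. The XML schema and the target keyword format below are invented for illustration; the paper's actual resource formats are not given in the abstract:

```python
import xml.etree.ElementTree as ET

# Hypothetical material record; the actual schema is not given in the abstract.
MATERIAL_XML = """
<material name="copper">
  <density units="g/cc">8.96</density>
  <conductivity units="W/m-K">401</conductivity>
</material>
"""

def to_native(xml_text):
    """Convert an XML material resource into a keyword-style input-deck
    fragment for one (made-up) simulation code's native format."""
    root = ET.fromstring(xml_text)
    lines = ["MATERIAL " + root.get("name").upper()]
    for prop in root:
        lines.append(f"  {prop.tag} = {prop.text.strip()}  ! {prop.get('units')}")
    lines.append("END MATERIAL")
    return "\n".join(lines)

print(to_native(MATERIAL_XML))
```

One converter per simulation code lets a single XML resource database feed many native input formats, which is the essence of the approach.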
An automated parallel simulation execution and analysis approach
State-of-the-art simulation computing requirements continually approach and then exceed the performance capabilities of existing computers. This trend holds even with huge yearly gains in processing power and general computing capabilities, because simulation scope and fidelity often increase as well. Accordingly, simulation studies often expend days or weeks executing a single test case. Compounding the problem, stochastic models often require execution of each test case with multiple random-number seeds to provide valid results. Many techniques have been developed to improve the performance of simulations without sacrificing model fidelity: optimistic simulation, distributed simulation, parallel multiprocessing, and the use of supercomputers such as Beowulf clusters. An approach and prototype toolset have been developed that augment existing optimization techniques to improve multiple-execution timelines. This approach, similar in concept to the SETI@home experiment, makes maximum use of unused licenses and computers, which can be geographically distributed. Using a publish/subscribe architecture, simulation executions are dispatched to distributed machines for execution. Simulation results are then processed, collated, and transferred to a single site for analysis.
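The dispatch-and-collate pattern can be sketched in miniature with in-process queues standing in for the distributed publish/subscribe layer. The job structure, worker count, and `run_case` stub are illustrative assumptions, not the prototype's actual design:

```python
import queue
import threading

def run_case(case, seed):
    """Stand-in for one simulation execution; a real worker would launch
    the simulation binary with this test case and random-number seed."""
    return case, seed, (hash((case, seed)) % 1000) / 1000.0

jobs, results = queue.Queue(), queue.Queue()

def worker():
    # Each subscriber drains jobs until the queue is empty, mimicking a
    # remote machine that picks up published executions.
    while True:
        try:
            case, seed = jobs.get_nowait()
        except queue.Empty:
            return
        results.put(run_case(case, seed))

# Publish every (test case, seed) combination as an independent job.
for case in ("baseline", "excursion"):
    for seed in range(5):
        jobs.put((case, seed))

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Collate per-case results at a single site for analysis.
collated = {}
while not results.empty():
    case, seed, outcome = results.get()
    collated.setdefault(case, []).append(outcome)
```

Because each (case, seed) job is independent, the same structure scales from threads to geographically distributed machines with a message broker in place of the local queues.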
Multiple strategy generation for wargaming
Timothy Revello, Robert McCartney, Eugene Santos Jr.
In this paper we present a framework for the automated generation of strategies that accounts for the multiple kinds of uncertainty found in war games, provides for a domain independent approach to strategy generation, and results in robust strategies. Our approach is to sample over multiple trials for varying victory conditions, different threat profiles, and variable system performance to achieve a degree of independence in the resulting strategy. This allows a search for robust strategies versus those that are effective only under specific conditions. War games have uncertainty in what is needed to achieve victory, in system performance, and in threat behavior. There are multiple options for forces, employment, and warfare styles. All these factors combine to produce a large, complex space of possible solutions or strategies. Through the use of powerful search techniques like evolutionary computation and modern computing assets it has become practical to search this space for strategies with robust performance. Our framework is modular in nature, allowing a variety of search techniques, warfare scenarios, system models, and other parameters to be interchanged. In the paper the framework described above is demonstrated using an antisubmarine warfare scenario. Evolutionary programming techniques are used to search the space of possible strategies.
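A minimal sketch of the robustness idea, with all numbers and the one-dimensional "strategy" invented for illustration (the paper's framework uses full warfare scenarios and system models): each candidate strategy is scored across randomly sampled threat behaviors, and an evolutionary-programming loop keeps the fitter half of parents plus mutated children.

```python
import random

rng = random.Random(42)

def fitness(strategy, trials=30):
    """Score one strategy parameter across randomly sampled threat
    behaviors, so selection favors strategies that are robust across
    conditions rather than tuned to a single one."""
    total = 0.0
    for _ in range(trials):
        threat = rng.gauss(0.5, 0.2)           # uncertain threat behavior
        total += 1.0 - abs(strategy - threat)  # payoff for matching it
    return total / trials

pop = [rng.random() for _ in range(20)]
for _ in range(30):
    # Evolutionary programming: every parent yields one mutated child,
    # then the fitter half of parents plus children survives.
    children = [min(1.0, max(0.0, p + rng.gauss(0, 0.1))) for p in pop]
    pop = sorted(pop + children, key=fitness, reverse=True)[:20]

best = max(pop, key=fitness)
```

Because fitness is re-sampled on every evaluation, strategies that score well only under one threat draw are eventually weeded out, which is the robustness property the framework targets.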
Integrated development of light armored vehicles based on wargaming simulators
Marc Palmarini, John Rapanotti
Vehicles are evolving into vehicle networks through improved sensors, computers and communications. Unless carefully planned, these complex systems can result in excessive crew workload and difficulty in optimizing the use of the vehicle. To overcome these problems, a war-gaming simulator is being developed as a common platform to integrate contributions from three different groups. The simulator, OneSAF, is used to integrate simplified models of technology and natural phenomena from scientists and engineers with tactics and doctrine from the military and analyzed in detail by operations analysts. This approach ensures the modelling of processes known to be important regardless of the level of information available about the system. Vehicle survivability can be improved as well with better sensors, computers and countermeasures to detect and avoid or destroy threats. To improve threat detection and reliability, Defensive Aids Suite (DAS) designs are based on three complementary sensor technologies including: acoustics, visible and infrared optics and radar. Both active armour and softkill countermeasures are considered. In a typical scenario, a search radar, providing continuous hemispherical coverage, detects and classifies the threat and cues a tracking radar. Data from the tracking radar is processed and an explosive grenade is launched to destroy or deflect the threat. The angle of attack and velocity from the search radar can be used by the soft-kill system to carry out an infrared search and track or an illuminated range-gated scan for the threat platform. Upon detection, obscuration, countermanoeuvres and counterfire can be used against the threat. The sensor suite is completed by acoustic detection of muzzle blast and shock waves. Automation and networking at the platoon level contribute to improved vehicle survivability. Sensor data fusion is essential in avoiding catastrophic failure of the DAS. 
The modular DAS components can be used with Light Armoured Vehicle (LAV) variants including: armoured personnel carriers and direct-fire support vehicles. OneSAF will be used to assess the performance of these DAS-equipped vehicles on a virtual battlefield.
High-Performance Computing Applications for M and S
icon_mobile_dropdown
Toward accelerated line-of-sight intervisibility calculations using clusters of GPUs
Guy A. Schiavone, Judd Tracy, Eric Woodruff, et al.
The processing power of graphics processing units (GPUs) has in recent years been increasing at a rate that exceeds the so-called "Moore's Law" for general-purpose CPUs, while the prices of these GPUs have dropped precipitously. Beginning in the late 1990s, researchers realized that this enormous processing power could be used to solve problems other than image generation alone. Almost in parallel with these developments, other researchers began using dedicated networks of commodity computers and supporting network hardware to construct low-cost supercomputers (Beowulf clusters) capable of solving particular problems that formerly required much more expensive proprietary supercomputers. In this paper we examine combining these two concepts with the eventual intention of rapidly accelerating intervisibility calculations for CGF and constructive simulations. We present initial experimental results on the computation time and scalability of using clustered GPUs to calculate intervisibility over densely populated terrain databases. We also discuss intervisibility correlation between CGF and GPU-based approaches, and present an example of differences in intervisibility calculations that are inherent in the different systems.
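The per-pair line-of-sight test at the heart of such intervisibility calculations can be sketched as follows. This is a minimal CPU illustration over a hypothetical gridded heightmap, not the authors' GPU implementation; the function name, parameters, and sampling scheme are assumptions for the example.

```python
import numpy as np

def line_of_sight(terrain, observer, target, eye_height=2.0):
    """Return True if `target` is visible from `observer` over a heightmap.

    terrain  : 2D numpy array of elevations (meters)
    observer : (row, col) grid position of the observer
    target   : (row, col) grid position of the target
    Illustrative sketch of the per-pair test a cluster of GPUs would
    evaluate in bulk for every entity pair in a CGF simulation.
    """
    r0, c0 = observer
    r1, c1 = target
    n = max(abs(r1 - r0), abs(c1 - c0))
    if n == 0:
        return True
    h0 = terrain[r0, c0] + eye_height
    h1 = terrain[r1, c1] + eye_height
    for i in range(1, n):
        t = i / n
        # Sample the terrain along the ray and compare against the
        # interpolated sightline elevation at the same parameter t.
        r = round(r0 + t * (r1 - r0))
        c = round(c0 + t * (c1 - c0))
        sightline = h0 + t * (h1 - h0)
        if terrain[r, c] > sightline:
            return False
    return True

# A flat terrain with a single ridge blocking the diagonal sightline.
grid = np.zeros((10, 10))
grid[5, :] = 50.0
print(line_of_sight(grid, (0, 0), (9, 9)))  # ridge intervenes: False
print(line_of_sight(grid, (0, 0), (4, 4)))  # same side of ridge: True
```

The sampling step is where CGF and GPU-based systems typically diverge: a depth-buffer render and a ray-marched grid walk quantize the terrain differently, which is one source of the intervisibility correlation differences the paper discusses.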
SIMPAR: a portable object-oriented simulation-science-based metamodel framework for performance modeling, prediction, and evaluation of HPC systems
We present a novel, portable, platform-independent, object-oriented, simulation-science-based metamodel framework (SimPar) for performance evaluation, estimation, and prediction of High-Performance Computing (HPC) systems. This UML-based parallel metamodel enhances the Bulk Synchronous Parallel (BSP) computation model. The UML activity diagram is used to model the computation, communication, and synchronization operations of an application. We also identify the UML building blocks that characterize the message-passing and shared-memory parallel paradigms. This helps in modeling large and complex parallel applications. Using the collaboration diagram concept, parallel applications are mapped onto different multiprocessor architecture topologies such as hypercube, 2D mesh, ring, tree, star, etc. We present unique UML structural and behavioral extensions for modeling the inter-object interactions in the BSP model. Communication semantics such as BROADCAST, GATHER, and SCATTER are incorporated in the metamodel using UML building blocks. In its present form, UML cannot satisfy all of these modeling needs. In addition, none of the currently available tool sets supports UML-based modeling. This underscores the uniqueness of the parallel, cluster-based, UML-enhanced framework presented here. We have validated the proposed model through benchmarks, simulation-science case studies, and real-time parallel applications.
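The BSP structure the metamodel builds on can be illustrated with a short sketch: each superstep consists of local computation, all-to-all communication, and a barrier synchronization. Threads stand in for processors here; this illustrates the computation model only, not SimPar itself, and all names are invented for the example.

```python
from threading import Barrier, Thread

P = 4
barrier = Barrier(P)               # the bulk-synchronous barrier
inbox = [[] for _ in range(P)]     # per-processor message queues
results = [0] * P

def superstep(pid, value):
    # Local computation phase: each processor squares its value.
    local = value * value
    # Communication phase: a SCATTER-like send to every processor.
    for dest in range(P):
        inbox[dest].append(local)
    # Synchronization phase: no processor proceeds until all have sent.
    barrier.wait()
    # Next superstep: a GATHER-like reduction over received messages.
    results[pid] = sum(inbox[pid])

threads = [Thread(target=superstep, args=(p, p + 1)) for p in range(P)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # each processor holds 1 + 4 + 9 + 16 = 30
```

The barrier is what makes BSP cost analysis tractable: a superstep's cost is the maximum local work plus the communication volume plus the synchronization latency, which is exactly the structure a UML activity diagram can capture.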
Spatial multibody modeling and vehicle dynamics analysis of advanced vehicle technologies
Michael D. Letherwood, David D. Gunter, David J. Gorsich, et al.
The US Army vision, announced in October of 1999, encompasses people, readiness, and transformation. The goal of the Army vision is to transition the entire Army into a force that is strategically responsive and dominant at every point of the spectrum of operations. The transformation component will be accomplished in three ways: the Objective Force, the Legacy (current) Force, and the Interim Force. The Objective Force is not platform driven; rather, the focus is on achieving capabilities that will operate as a “system of systems.” As part of the Objective Force, the US Army plans to begin production of the Future Combat System (FCS) in FY08 and field the first unit by FY10 as currently defined in the FCS solicitation (1). As part of the FCS program, the Future Tactical Truck System (FTTS) encompasses all US Army tactical wheeled vehicles, and its initial efforts will focus only on the heavy class. The National Automotive Center (NAC) is using modeling and simulation to demonstrate the feasibility and operational potential of advanced commercial and military technologies with application to new and existing tactical vehicles and to describe potential future vehicle capabilities. This document will present the results of computer-based, vehicle dynamics performance assessments of FTTS concepts with such features as hybrid power sources, active suspensions, skid steering, and in-hub electric drive motors. Fully three-dimensional FTTS models are being created using commercially available modeling and simulation methodologies such as ADAMS and DADS, and limited vehicle dynamics validation studies will be performed.
M and S Validation and Testing
icon_mobile_dropdown
Creating a flexible environment for testing scientific software
Mark C. Smith, Robert L. Kelsey, Jane M. Riese, et al.
When writing scientific modeling and simulation software, frequent regression tests can expose bugs that would otherwise create future obstacles. For this reason, regression testing should be a fundamental part of any development process in medium- to large-sized projects. In order to implement a flexible solution to this problem, a software testing framework based on simple one-to-one comparisons was designed. The comparisons are performed between two different representations of a simulation, with one representation considered valid and the other unknown. Using a simple framework has proven to be advantageous in several ways. One of the biggest advantages is portability for testing other software. Implementing standardized design patterns allows a degree of flexibility that keeps the framework from being bound to specific software. For output, the framework is designed to use the eXtensible Markup Language (XML). This results in the ability to publish results in several different formats, archive them into a database, and maintain compatibility with other simulation outputs. The preliminary results of implementing this framework have proven promising. Using object-oriented design has not only simplified development but has allowed for a more user-friendly approach to testing. Future improvements include user-customized test cases, ad hoc queries for archived results, and automatic test result publication.
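The one-to-one comparison idea can be sketched in a few lines: two result sets, one treated as the validated baseline and one unknown, are compared quantity by quantity, and the outcome is emitted as XML, echoing the framework's choice of output format. The data structures and element names below are illustrative stand-ins, not the framework's actual API.

```python
import xml.etree.ElementTree as ET

def compare_runs(valid, unknown, tolerance=1e-9):
    """Compare an unknown simulation output against a validated baseline.

    Both inputs are dicts mapping quantity names to numeric values, a
    hypothetical stand-in for two representations of one simulation.
    Returns an XML element recording each one-to-one comparison.
    """
    report = ET.Element("regression-report")
    for key in sorted(set(valid) | set(unknown)):
        case = ET.SubElement(report, "comparison", name=key)
        if key not in valid or key not in unknown:
            case.set("status", "missing")
            continue
        diff = abs(valid[key] - unknown[key])
        case.set("status", "pass" if diff <= tolerance else "fail")
        case.set("difference", repr(diff))
    return report

baseline = {"energy": 1.0, "mass": 5.0}
candidate = {"energy": 1.0, "mass": 5.5}
report = compare_runs(baseline, candidate)
print(ET.tostring(report, encoding="unicode"))
```

Because the report is plain XML, the same comparison result can be styled for publication, loaded into a database for archiving, or diffed against other simulation outputs, which is the portability argument the abstract makes.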
Joint Synthetic Battlespace (JSB)
icon_mobile_dropdown
Analysis of technical alternative technologies for the development of context-driven composable environmental representations for JSB
John R. Hummel, Jeff J. Bergenthal, William F. Seng, et al.
The Joint Synthetic Battlespace for the Air Force (JSB-AF) is being developed to provide realistic representations of friendly and threat capabilities and the natural environmental conditions to support a variety of Department of Defense missions including training, mission rehearsal, decision support, acquisition, deployment, employment, operations, and the development of Courses of Action. This paper addresses three critical JSB issues associated with providing environmental representations to Modeling and Simulation (M&S) applications. First, how should the requirements for environmental functionality in a JSB-AF application be collected, analyzed, and used to define an Authoritative Environmental Representation (AER)? Second, how can JSB-AF AERs be generated? Third, once an AER has been generated, how should it be “served up” to the JSB-AF components? Our analyses of these issues will be presented from a general M&S perspective, with examples given from a JSB-AF-centered view. In the context of this effort, the term “representations” is meant to incorporate both basic environmental “data” (e.g., temperature, pressure, slope, elevation, etc.) and “effects”, properties that can be derived from these data using physics-based models or empirical relationships from the fundamental data (e.g., extinction coefficients, radiance, soil moisture strength, etc.). We present a state-of-the-art review of the existing processes and technologies that address these questions.
JSB composability and web services interoperability via extensible modeling and simulation framework (XMSF) and model driven architecture (MDA)
Donald P. Brutzman, Andreas Tolk
This paper summarizes research work conducted jointly by academic and commercial organizations concerned with interoperable distributed information technology (IT) applications. Although the application focus is distributed modeling and simulation (M&S), the results and findings are in general easily applicable to other distributed mission operations. The core idea of this paper is to show the necessity of applying open standards for component description, implementation, and integration, accompanied by aligned management processes and procedures, to enable continuous interoperability for legacy and M&S components of the live, virtual, and constructive domain within the USAF Joint Synthetic Battlespace (JSB). The applied methods are derived from the Extensible Modeling and Simulation Framework (XMSF) and the Model Driven Architecture (MDA), ensuring reuse, composability, and orchestration of heterogeneous components to fulfill user-driven requirements for persistent and ad-hoc IT federations.
A flexible simulation environment for command and control
A 1995 vision statement for Air Force Modeling and Simulation (M&S) highlighted the need for a Joint Synthetic Battlespace (JSB): an environment wherein warfighters could train and exercise on their real-world equipment while immersed in a realistic contingency or wartime environment. This paper describes our efforts to develop a Joint Synthetic Battlespace for Research and Development (JSB-RD), which will provide a realistic environment within which technologies being developed at AFRL's Information Directorate can be analyzed and tested. Where possible, this environment will attach to operational systems in order to provide military realism that will ultimately improve and shorten the tech transition process. This reconfigurable testbed will provide scalability and evolve over time, building upon previous federations and attaching to other federations while incorporating lessons learned along the way.
Full-spectrum CSE prototype for sensor analysis
The emerging Common Synthetic Environment (CSE) will soon contain specialized models for sensor performance covering the infrared, visual, and tactical RF bands. Sufficient GFE models exist to cover the full exploited electromagnetic spectrum, supporting systems from long wavelengths (thousands of meters) to ultraviolet (10 nanometers). The CSE can easily be expanded to cover the full spectrum, since it is our premise that the available government models overlap and provide complete coverage. The issues of consistent representation of environmental effects, correlated behavior of effects toward sensors, and “fair fight” all point toward maximizing the use of a centralized Environment Server, containing in one place the air, space, terrain, and ocean data needed to support effects models, and the use of a broad range of models covering various frequencies, levels of detail, and application domains. The beginnings of such servers already exist in the JWARS Environment Server and the emerging CSE. Careful management of the populating of such a server will ensure that all models in a JSB federation are compatible both with the basic environment data and with the correlated effects models each federate will use.
Complex simulation system infrastructure supporting composability and performance
Matthew Dorsch, Barbara Hannibal, Victor Skowronski, et al.
The increased emphasis on the need for composability in simulations has led to architectures with server simulations that provide a service to several other simulations. One such architecture was the Joint Synthetic Battlespace (JSB) Spring Experiment 2002 architecture, which included a Common Synthetic Environment (CSE) server for atmospheric effects. The intent of using a CSE was to provide a standard model for all atmospheric effects. The common model would assist in the evaluation of sensor systems by removing any bias due to different environmental effects models. When the CSE server was proposed, it became apparent that the number of requests for environmental effect calculations might overwhelm the server or cause excessive network activity. This paper examines the ability of the current architecture to scale to operational levels. The paper also proposes modifications to the current architecture that can enhance its scalability without impairing its composability.
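The scaling concern can be made concrete with a back-of-the-envelope queueing estimate: if F federates each issue r environment requests per second and the server processes mu requests per second, an M/M/1 model predicts response time blowing up as the utilization F*r/mu approaches 1. The traffic numbers below are purely illustrative, not measurements from the experiment.

```python
def mm1_mean_response(arrival_rate, service_rate):
    """Mean response time (seconds) of an M/M/1 queue: 1 / (mu - lambda)."""
    rho = arrival_rate / service_rate
    if rho >= 1.0:
        return float("inf")  # unstable: the request queue grows without bound
    return 1.0 / (service_rate - arrival_rate)

federates, per_federate = 50, 4.0       # 200 requests/s of offered load
for mu in (250.0, 210.0, 199.0):        # server capacities to compare
    print(mu, mm1_mean_response(federates * per_federate, mu))
```

The nonlinearity is the point: going from 25% to 5% headroom multiplies the mean response time fivefold, which is why a CSE server sized for average load can still stall a federation at peak request rates.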
Collaborative Decision Support Systems and Environments
icon_mobile_dropdown
The collaboration grid: trends for next-generation distributed collaborative environments
Distributed collaboration will be a pervasive technology that will significantly change how decisions are made in the 21st century. Advanced collaborative technologies are evolving rapidly with changes in the underlying computer and information technology. Collaboration is typically defined as two or more geographically dispersed entities working together to share and exchange data, information, knowledge, and actions. This paper will address how evolving technologies and new trends such as web services and grid computing will impact distributed collaborative environments. A new conceptual environment called the Collaboration Grid based on these new standards is evolving. The marriage of advanced information, collaboration, and simulation technologies will provide the decision maker with a new generation of collaborative virtual environments for planning and decision support.
Multidomain operations science and technology
Matthew J. Kochan, Timothy A. Farrell
It is said that information superiority is perhaps the greatest asset in war. In this era of both network-centric and coalition-centric warfare, each aimed at establishing information dominance, technologies that support multi-domain operations are more critical than ever. This paper examines the components of multi-domain operations and presents two complementary methods, cross-domain information sharing and multi-domain windowing, to achieve its four major functions: dissemination, discovery & retrieval, collaboration, and resource management. Analysis of these method-based solutions reveals the opportunity to devise a collection of services that align with DoD and industry migration towards service-oriented architectures as well as more diversity in secure networking schemes.
Architecture for a simulation assembly language supporting sequential, distributed, and hardware operational modes
We define an assembly-layer block language, the Dynamics eXchange Language (DXL), and discuss methods for supporting sequential simulation and distributed simulation by varying the target code generator. DXL is an XML-based language that positions itself between higher-level modeling languages and programming code. Through the use of the XML Document Object Model (DOM), we demonstrate a translation approach that yields target code in two languages, for simulation and for distributed computing.
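The translation approach can be sketched with a toy example: an XML model description is walked as a document tree (Python's built-in ElementTree standing in for a DOM traversal) and target code for a sequential simulator is emitted; swapping the generator function would yield a different target. The element names, attributes, and generated code below are hypothetical, not the actual DXL schema.

```python
import xml.etree.ElementTree as ET

# A hypothetical DXL-like fragment: a gain block feeding an integrator.
# The schema is invented for illustration.
DXL = """
<model name="decay">
  <block id="gain" op="scale" factor="-0.5"/>
  <block id="integ" op="integrate" input="gain"/>
</model>
"""

def generate_sequential(source):
    """Walk the XML document tree and emit target code for a
    sequential simulator; a distributed-computing generator could
    emit different code from the same tree."""
    root = ET.fromstring(source)
    lines = [f"def step_{root.get('name')}(state, dt):"]
    for block in root.iter("block"):
        if block.get("op") == "scale":
            lines.append(f"    {block.get('id')} = {block.get('factor')} * state")
        elif block.get("op") == "integrate":
            lines.append(f"    state = state + dt * {block.get('input')}")
    lines.append("    return state")
    return "\n".join(lines)

code = generate_sequential(DXL)
print(code)
namespace = {}
exec(code, namespace)  # compile and load the generated target code
print(namespace["step_decay"](1.0, 0.1))  # one Euler step of dx/dt = -0.5x
```

Keeping the model in a neutral tree representation is what makes the two-target demonstration cheap: the block semantics live in the document, and only the code generator changes per target.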
Virtual collaboration: face-to-face versus videoconference, audioconference, and computer-mediated communications
Lynne Wainfan, Paul K. Davis
As we increase our reliance on mediated communication, it is important to be aware of the media's influence on group processes and outcomes. A review of more than 40 years of research shows that all such media (videoconference, audioconference, and computer-mediated communication) change the context of the communication to some extent, reducing the cues used to regulate and understand conversation, indicate participants' power and status, and move the group towards agreement. Text-based computer-mediated communication, the “leanest” medium, reduces status effects, domination, and consensus. This has been shown useful in broadening the range of inputs and ideas. However, it has also been shown to increase polarization, deindividuation, and disinhibition, as well as the time to reach a conclusion. For decision-making tasks, computer-mediated communication can increase choice shift and the likelihood of more risky or extreme decisions. In both videoconference and audioconference, participants cooperate less with linked collaborators and shift their opinions toward extreme options, compared with face-to-face collaboration. In videoconference and audioconference, local coalitions can form in which participants tend to agree more with those in the same room than with those on the other end of the line. There is also a tendency in audioconference to disagree with those on the other end of the phone. This paper is a summary of a much more extensive forthcoming report; it reviews the research literature and proposes strategies to leverage the benefits of mediated communication while mitigating its adverse effects.
Collaborative effects-based planning using adversary models and target set optimization
Nicholas J. Pioch, Troy Daniels, Bradford Pielech
The Strategy Development Tool (SDT), sponsored by AFRL-IFS, supports effects-based planning at multiple levels of war through three core capabilities: plan authoring, center of gravity (COG) modeling and analysis, and target system analysis. This paper describes recent extensions to all three of these capabilities. The extended plan authoring subsystem supports collaborative planning in which a user delegates elaboration of objectives to other registered users. A suite of collaboration tools allows planners to assign planning tasks, submit plan fragments, and review submitted plans, while a collaboration server transparently handles message routing and persistence. The COG modeling subsystem now includes an enhanced adversary modeling tool that provides a lightweight ontology for building temporal causal models relating enemy goals, beliefs, actions, and resources across multiple types of COGs. Users may overlay friendly interventions, analyze their impact on enemy COGs, and automatically incorporate the causal chains stemming from the best interventions into the current plan. Finally, the target system analysis subsystem has been extended with option generation tools that use network-based optimization algorithms to select candidate target set options to achieve specified effects.
An application of JBI technology to distributed simulation
The Air Force Research Laboratory (AFRL) is continually conducting research into new technologies for future aerospace command and control. The Virtual Testbed for Advanced Command and Control (VTAC), an AFRL/IFSD effort under development, provides a realistic context for developing, demonstrating, and assessing information technologies for command and control. VTAC incorporates collaborative technologies with modeling and simulation to drive and evaluate proposed command and control systems. The most recent phase of VTAC development focuses upon applying the AFRL Joint Battlespace Infosphere (JBI) to distribute information efficiently throughout the VTAC. To this end, efforts are under way to integrate JBI with two key components of the VTAC: the AFRL Collaborative Enterprise Environment (CEE) and the High Level Architecture (HLA). CEE is the collaborative framework for VTAC, while HLA provides VTAC a capability for distributed simulation. These integration efforts promise new and interesting information and decision flows within the VTAC. Areas under investigation include data representation methodologies, intelligent agents, and user interfaces. The integration of JBI, CEE, and HLA promises to enhance VTAC capabilities in support of key initiatives such as Agile Research and Development/Science and Technology, Predictive Battlespace Awareness, and Effects-Based Operations.
Enterprise Modeling Applications and Techniques
icon_mobile_dropdown
End-to-end network models encompassing terrestrial, wireless, and satellite components
Chandler L. Boyarko, John S. Britton, Phil E. Flores, et al.
Development of network models that reflect true end-to-end architectures, such as the Transformational Communications Architecture, needs to encompass terrestrial, wireless, and satellite components to truly represent all of the complexities in a worldwide communications network. Use of best-in-class tools, including OPNET, Satellite Tool Kit (STK), and Popkin System Architect, with their well-known XML-friendly definitions, such as OPNET Modeler's Data Type Description (DTD), or socket-based data transfer modules, such as STK/Connect, enables the sharing of data between applications for more rapid development of end-to-end system architectures and a more complete system design. By sharing the results of and integrating best-in-class tools we are able to (1) promote sharing of data, (2) enhance the fidelity of our results, and (3) allow network and application performance to be viewed in the context of the entire enterprise and its processes.
Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies
The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of individual and combinational effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help in addressing JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.
SimBOX: a scalable architecture for aggregate distributed command and control of spaceport and service constellation
Guru Prasad, Sanjay Jayaram, Jami Ward, et al.
In this paper, Aximetric proposes a decentralized Command and Control (C2) architecture for a distributed control of a cluster of on-board health monitoring and software enabled control systems called SimBOX that will use some of the real-time infrastructure (RTI) functionality from the current military real-time simulation architecture. The uniqueness of the approach is to provide a “plug and play environment” for various system components that run at various data rates (Hz) and the ability to replicate or transfer C2 operations to various subsystems in a scalable manner. This is possible by providing a communication bus called “Distributed Shared Data Bus” and a distributed computing environment used to scale the control needs by providing a self-contained computing, data logging and control function module that can be rapidly reconfigured to perform different functions. This kind of software-enabled control is very much needed to meet the needs of future aerospace command and control functions.
Simulation Environments and Frameworks
icon_mobile_dropdown
A customizable approach to visual programming using dynamic multimodeling
There has always been a fine line separating the art of programming from dynamic modeling for simulation. This research attempts to bridge the two by explicitly defining how dynamic multimodeling methods, from the field of simulation, can be used to create computer programs. We investigate the relation between modeling and programming, especially from the perspective of UML. We also cover how programming language principles and theory can be manifested in dynamic systems, and illustrate this using a customizable visual modeling system.
Integration and cooperation of Army logistics simulations for multiphased military deployments
Richard J. Love, Dariusz Blachowicz, Mark Bragen, et al.
Military deployment planners and analysts must consider the constraints, options, and available infrastructure of a network of installations and ports, from the beginning of the transportation system in the United States to the end of the deployment in the host country. Argonne National Laboratory developed a suite of models that simulate and visualize these deployments. There are discrete event simulations (the Enhanced Logistics Intra-theater Support Tool, the Transportation System Capability model, and the Port Simulation model) as well as several data editing and visualization tools. This paper presents the models, and discusses how they interact and leverage their shared data and technologies, to facilitate deployment analysis.
Software framework in support of dynamic situation assessment and predictive capabilities for JSB-RD
Robert M. McGraw, Craig Lammers, Jeffrey S. Steinman
Recent technological advances and emerging threats greatly compress the timeline between target detection and action to on the order of a few minutes. As such, decision support tools for today's C4I systems must assist commanders in anticipating potential outcomes by providing predictive assessments of alternate Courses Of Action (COAs). These assessments are supported by faster-than-real-time predictive simulations that analyze possible outcomes and re-calibrate with real-time sensor data or extracted knowledge. This capability is known as Dynamic Situation Assessment and Prediction (DSAP). It allows decision-makers to assess the effects of re-tasking opportunities, providing tremendous latitude to make time-critical, mid-course decisions. This paper details the development of a software infrastructure that supports a DSAP capability for decision aids as applied to a Joint Synthetic Battlespace for Research and Development (JSB-RD). This infrastructure supports capabilities that allow objects to be dynamically created, deleted, and reconfigured, allows simulations to be calibrated with live data feeds, and reduces simulation overheads so that simulations can execute faster than real time to provide a predictive capability. In particular, this paper will focus on a Multiple Replication Framework that can be used to support a DSAP infrastructure.
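The multiple-replication idea can be illustrated with a toy sketch: many independent stochastic replications of a simplified model are run ahead of real time and aggregated into a predictive estimate. The model, its parameters, and the sequential execution are invented for illustration; a real framework would manage replications in parallel and re-calibrate them from live feeds.

```python
import random
import statistics

def replicate(seed, detect_prob=0.05, horizon=60):
    """One stochastic replication of a hypothetical detection model:
    count simulated minutes until a threat is detected, up to `horizon`."""
    rng = random.Random(seed)
    for minute in range(horizon):
        if rng.random() < detect_prob:
            return minute
    return horizon  # never detected within the prediction horizon

# Each replication is independent, so a multiple-replication framework
# would farm these out across processors; here they run sequentially.
times = [replicate(seed) for seed in range(1000)]
print(statistics.mean(times))    # predicted mean time-to-detection
print(max(times))                # worst case seen across replications
```

Aggregating over many replications is what turns a single stochastic run into a decision aid: the commander sees a distribution of outcomes per COA rather than one sample path.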