Proceedings Volume 6578

Defense Transformation and Net-Centric Systems 2007


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 30 April 2007
Contents: 9 Sessions, 46 Papers, 0 Presentations
Conference: Defense and Security Symposium 2007
Volume Number: 6578

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 6578
  • Net-Centric Systems, Architectures and Services
  • Information Management Architectures and Experimentation
  • Sensor Networks
  • Communications and Networks
  • Self-organizing Collaborative ISR Robotic Teams I: Joint Session with Conference 6561
  • GeoInt Systems
  • Predictive Analytic Modeling
  • MDA Session
Front Matter: Volume 6578
Front Matter: Volume 6578
This PDF file contains the front matter associated with SPIE Proceedings Volume 6578, including the Title Page, Copyright information, Table of Contents, Introduction, and the Conference Committee listing.
Net-Centric Systems, Architectures and Services
Tactical service-oriented architecture over wireless communications
This paper reports the results of testing General Dynamics AIS' Tactical Service-Oriented Architecture over wireless communications during flight tests conducted under the Air Force's Airborne Networking CRADA in 2006. The wireless, tactical domain presents a number of challenges; in particular, efficiency, reliability, and interoperability are concerns for this solution. This paper discusses why these concerns are relevant and how the approach taken addresses them in a tactical domain, and shows how this approach differs from a traditional enterprise Service-Oriented Architecture. Finally, the test results are discussed, along with the steps that can be taken in the future and the challenges that must be overcome.
Testbed for large volume surveillance through distributed fusion and resource management
Pierre Valin, Adel Guitouni, Éloi Bossé, et al.
DRDC Valcartier has initiated, through a PRECARN partnership project, the development of an advanced simulation testbed for evaluating the effectiveness of Network Enabled Operations in a coastal large-volume surveillance situation. The main focus of this testbed is to study concepts such as distributed information fusion, dynamic resource and network configuration management, and self-synchronizing units and agents. This article presents the requirements, design, and first implementation builds, and reports on some preliminary results. The testbed allows the modeling of distributed nodes performing information fusion, dynamic resource management planning and scheduling, and configuration management, given multiple constraints on the resources and their communications networks. Two situations are simulated: cooperative and non-cooperative target search. A cooperative surface target behaves in ways to be detected (and rescued), while an elusive target attempts to avoid detection. The current simulation consists of a networked set of surveillance assets including aircraft (UAVs, helicopters, maritime patrol aircraft) and ships. These assets have electro-optical and infrared sensors, as well as scanning and imaging radar capabilities. Since full data sharing over datalinks is not feasible, own-platform data fusion must be simulated to evaluate the implementation and performance of distributed information fusion. A special emphasis is put on higher-level fusion concepts using knowledge-based rules, with level 1 fusion already providing tracks. Surveillance platform behavior is also simulated in order to evaluate different dynamic resource management algorithms. Additionally, communication networks are modeled to simulate different information exchange concepts. The testbed allows the evaluation of a range of control strategies, from independent platform search, through various levels of platform collaboration, up to centralized control of search platforms.
An evaluation of case-based classification to support automated web service discovery and brokering
Roy Ladner, Elizabeth Warner, Fred Petry, et al.
In this paper we evaluate the use of case-based classification to resolve a number of questions related to information sharing in the context of an Integrated Web services Brokering System (IWB). We are developing the IWB to independently decompose and analyze ad hoc Web services interface descriptions in order to identify Web services of interest. Our approach is to have the IWB cache information about each service in order to support an autonomous mediation process. In this mediation process, the IWB independently matches the user's data request to the correct method within the appropriate Web service, translates the user's request to the correct syntax and structure of the Web service request, dynamically invokes the method on the service, and translates the Web service response. We use case-based classification as a means of automating the IWB's analysis of relevant services and operations. Case-based classification retrieves and reuses decisions based on training data. We use sample Web Services Description Language (WSDL) files and schema from actual Web services as training data in our approach and do not require the service to pre-deploy an OWL-S ontology. We present our evaluation of this approach and performance ratings in the context of meteorological and oceanographic (MetOc) Web services as it relates to the IWB.
Enabling dynamic interoperability with multiple community of interest (COI) systems
A range of Community of Interest (COI) Infospheres and systems is being independently developed and deployed by separate elements of U.S. forces and potential coalition partners. Because future operations will increasingly rely on seamless exchange of information between coalition partners, it is critical that all tactical and command elements be able to dynamically interact with these diverse systems. Solving this issue requires that each network element (platform, commander, war-fighter, etc.) be able to span, dynamically join, and leave different COI systems as operational requirements dictate. The COI Interoperability Agent (CIA) is the centerpiece of our solution. It will enable each battle space entity to join, interact with, and leave multiple COIs. Each CIA consists of a common core containing the Information Router, COI Initiator (COIN) factory, Platform Initiator (PIN) factory, and Security Manager components, along with one or more platform modules and COI modules. Bi-directional information flow is directed by the Information Router. The COIN enables dynamic connection to a COI. A COIN consists of two parts: 1) a Java Jar file containing the COI Module code establishing a COI connection and 2) a data component that configures the COI Module. The CIA uses the COIN factory to load and configure new COI Modules. The PIN factory fills a similar role for Platform Modules. The Platform Module contains code to link to a specific tactical entity. The CIA concept provides a path for the war-fighter to dynamically connect to multiple COIs without a priori knowledge of which COIs will be needed.
An investigative analysis of information assurance issues associated with the GIG's P&P architecture
B. S. Farroha, R. G. Cole, D. L. Farroha, et al.
The Global Information Grid (GIG) is a collection of systems, programs and initiatives aimed at building a secure network and set of information capabilities modeled after the Internet. The GIG is expected to facilitate DoD's transformation by allowing warfighters, policy makers and support personnel to engage in rapid decision making. The roadmap is designed to take advantage of converged services of voice, data, video, and imagery over common data links. The vision is to have commanders identify threats more effectively, make informed decisions, and respond with greater precision and lethality. The information advantage gained through the GIG and network-centric warfare (NCW) allows a warfighting force to achieve dramatically improved information positions, in the form of common operational pictures that provide the basis for shared situational awareness and knowledge, and a resulting increase in combat power. The GIG Precedence and Preemption (P&P) requirements stem from the need to utilize scarce resources at critical times in the most effective way in support of national security, the intelligence community and the war-fighter. Information Assurance (IA) enables all information and data to be available end-to-end to support any mission without delay, in accordance with the sensitivity of the task. Together, P&P and IA ensure data availability, integrity, authentication, confidentiality, and non-repudiation. This study addresses and analyzes the QoS and P&P requirements and architecture for the GIG. Threat scenarios are presented and used to evaluate the reference architectures. The goal of the study is to assess the Information Assurance concerns associated with implementing Precedence and Preemption within the GIG and to guarantee an acceptable minimum level of security and protection for DoD networks.
Embedded instrumentation systems architecture
This paper describes the operational concept of the Embedded Instrumentation Systems Architecture (EISA) that is being developed for Test and Evaluation (T&E) applications. The architecture addresses such future T&E requirements as interoperability, flexibility, and non-intrusiveness. These are the ultimate requirements that support continuous T&E objectives. In this paper, we demonstrate that these objectives can be met by decoupling the Embedded Instrumentation (EI) system into an on-board and an off-board component. An on-board component is responsible for sampling, pre-processing, buffering, and transmitting data to the off-board component. The latter is responsible for aggregating, post-processing, and storing test data as well as providing access to the data via a clearly defined interface including such aspects as security, user authentication and access control. The power of the EISA architecture approach is in its inherent ability to support virtual instrumentation as well as enabling interoperability with such important T&E systems as Integrated Network-Enhanced Telemetry (iNET), Test and Training Enabling Architecture (TENA) and other relevant Department of Defense initiatives.
Widely distributed C4ISR
Advances in networking and communications make the dream of a highly connected mobile war fighter, persistent networked sensors, and distributed command and control a reality. However, being able to communicate is only the first part of the problem. The ability to easily communicate with a wide variety of highly distributed sensors and systems presents significant new problems that need to be addressed. First, an application must discover what services are available and establish communications with the desired services. Second, time synchronization across all of the networked systems is critical to correctly correlating the information into a coherent picture. In addition, maintaining data consistency in a highly distributed environment is an extremely challenging problem. Given the amount of data available, clients must be able to subscribe to specific data in order to avoid information/system overload. Finally, the information must be presented to the user in a form and on a platform well suited to the task at hand. All of these problems, and many more, must be solved in order to deliver a truly effective net-centric C4ISR system. A software architecture will be presented that attempts to solve the issues described above. The architecture inherently includes many features designed to address these issues. In addition, the user can select data from a wide variety of services, both local and remote, and control how it is accessed, processed, and displayed. A detailed analysis of each of these techniques and how it impacts the effectiveness of the system will be discussed.
Measuring kill chain performance in complex environments
Scott James, Andrew Coutts, Colin Stanford, et al.
Offensive Support (OS) modelling has generally not been implemented within a closed simulation in such a way that its contribution to the overall mission performance can be captured, measured and integrated. However, the issue of realistically measuring OS performance becomes more critical as new technologies are proposed to improve or compress the Kill Chain, particularly in the context of complex environments. A study is being conducted to determine and implement an explicit Kill Chain in CASTFOREM such that it can be configured to use a variety of components and its impact on performance can be measured and compared. To assess the Kill Chain, six measures have been adopted from the original research done by the Royal Australian Air Force and Air Operations Division, DSTO. These are Timeliness, Appropriateness, Precision, Discrimination, Orchestration and Survivability, referred to as TAPDOS. These performance measures will allow the study to align with accepted standards of OS usage in the Australian joint fires environment, and to facilitate the use of Subject Matter Experts to support the study and promulgate performance results. The outcome of the study will be a closed simulation capable of identifying and reporting specific Kill Chain events and measures associated with the target performance demands, system performance availability, system selection and performance delivered.
Information Management Architectures and Experimentation
Evaluating technologies for tactical information management in net-centric systems
Ming Xiong, Jeff Parsons, James Edmondson, et al.
Recent trends in distributed real-time and embedded (DRE) systems motivate the development of tactical information management capabilities that ensure the right information is delivered to the right place at the right time to satisfy quality of service (QoS) requirements in heterogeneous environments. A promising approach to building and evolving large-scale and long-lived tactical information management systems is the use of standards-based QoS-enabled publish/subscribe (pub/sub) platforms that enable applications to communicate by publishing information they have and subscribing to information they need in a timely manner. Since there is little existing evaluation of how well these platforms meet the performance needs of tactical information management, this paper provides two contributions: (1) it describes three common architectures for the OMG Data Distribution Service (DDS), which is a QoS-enabled pub/sub platform standard, and (2) it evaluates three implementations of these architectures to investigate their design tradeoffs and to compare their performance. Our results show that DDS implementations perform well in general and are well-suited for certain classes of data-critical tactical information management systems.
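The publish/subscribe pattern these platforms build on can be illustrated with a small sketch. This is a toy in-process example, not the DDS API: the `Broker` class, the topic name, and the `history_depth` parameter are invented here for illustration.

```python
# Minimal in-process illustration of the publish/subscribe pattern that
# underlies QoS-enabled platforms such as the OMG DDS (toy sketch only).
from collections import defaultdict, deque

class Broker:
    def __init__(self, history_depth=10):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks
        # Bounded per-topic history, loosely analogous to a DDS HISTORY QoS.
        self.history = defaultdict(lambda: deque(maxlen=history_depth))

    def subscribe(self, topic, callback, replay_history=False):
        self.subscribers[topic].append(callback)
        if replay_history:                     # late joiners can catch up
            for sample in self.history[topic]:
                callback(sample)

    def publish(self, topic, sample):
        self.history[topic].append(sample)
        for callback in self.subscribers[topic]:
            callback(sample)

received = []
broker = Broker()
broker.publish("TrackData", {"id": 1, "pos": (10, 20)})   # no subscriber yet
broker.subscribe("TrackData", received.append, replay_history=True)
broker.publish("TrackData", {"id": 1, "pos": (11, 21)})
print(len(received))  # 2: one replayed sample plus one live sample
```

Real pub/sub middleware layers typed topics, peer discovery, and per-topic QoS policies (reliability, deadline, history depth) on top of this basic flow, which is where the architectural and performance tradeoffs evaluated in the paper arise.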
Dynamic policy enforcement in JBI information management services with the KAoS policy and domain services
Justin Donnelly, Jacob Madden, Alden Roberts, et al.
English-language policies about the desired behavior of computer systems often suffer from translation errors when implemented as a proliferation of low-level rules governing access control, resource allocation, and configuration. To solve this, Dynamic Policy Enforcement systems replace these low-level rules with a relatively small number of semantically grounded, machine-understandable policy statements. These statements use domain terms defined in an ontology; the terms are formally defined so that they can be enforced by the system, yet remain meaningful to human administrators, ensuring that the statements accurately represent organizational policies. In this paper, we describe the application of one such Dynamic Policy Enforcement system, KAoS, to the control of distributed information-management services defined by the Air Force Research Laboratory's Joint Battlespace Infosphere (JBI) program. Our research allows administrators to define the desired behavior of the participants in the system, both human and software, with one collection of well-defined policies. As a result, a single set of tools for the definition, analysis, control, and monitoring of policy can be used to implement access control, service configuration, and service delivery prioritization.
A QoS management system for dynamically interoperating network-centric systems
Joseph P. Loyall, Praveen K. Sharma, Matthew Gillen, et al.
Effective network-centric warfare requires information exchange with suitable quality of service (QoS) to meet the warfighter's needs. Information delivered too late or with the wrong resolution, form, or precision is insufficient for the user to perform his role in a warfighting scenario. Key characteristics of network-centric warfare environments, as instantiated by the Global Information Grid, are dynamic reconfiguration and interoperability, in which Communities of Interest (COIs) can be formed and reformed dynamically to respond to real-time threats and unfolding situations. There is a need for a QoS management capability that can support the dynamic interoperability and real-time requirements of network-centric warfare. In order to be effective, this QoS management capability must manage the production, delivery, and consumption of information within available resources, mediate competing demands for resources, and adjust to dynamic conditions. In this paper, we describe the architecture for a QoS Management System (QMS) that works alongside information management systems in dynamic COIs. The QMS provides QoS management (including resource management and quality of information management) in dynamically changing, mission-driven environments for interoperating assets within a COI and for assets and resources shared among COIs. The QMS provides mechanisms for QoS policy specification, QoS enforcement and monitoring, dynamic resource allocation, and application adaptation in dynamic COIs. It is based on a layered architecture that maps mission requirements to QoS policies and enforcement. We describe the QMS architecture, prototype implementation, demonstration, and evaluation. Based on these experiences, we also discuss future research directions.
AIMS taking on roles to support tactical information dominance
Military solutions to enable information sharing are being developed that will fundamentally change future concepts of operation. The development of sophisticated approaches to managing this information is a key element to reliably disseminate valued information to the tactical edge. This paper will describe the merging of two such systems to support these tactical edge users: the Air Force Research Laboratory (AFRL) Joint Battlespace Infosphere Reference Implementation (JBI/RI) and the Northrop Grumman Advanced Information Architecture (AIATM). The newly formed system is called the Advanced Information Management System (AIMS). The resulting technology, rooted in a service-oriented approach, provides a managed information dissemination approach through the use of publish, subscribe, and query services. Information can be collected and shared among Communities of Interest (COI) without specific involvement from the tactical users. Persistence (via archiving to repositories) is a new capability added to the existing AIATM. Extreme care is taken to effectively manage the information within this dynamic environment. For example, information resulting from queries and subscriptions is cached to mitigate potential bandwidth challenges at critical locations within the system. AIMS improves security by allowing the establishment of roles for retrieval/publishing of information. The access to information is controlled not only at the message level but also by specified elements within the metadata tags. Lastly, the fortification of AIMS with Web Services allows for a highly cohesive, loosely coupled design. AIMS utilizes a Universal Description, Discovery, and Integration (UDDI)[2] registry to describe and register services within the architecture. The UDDI allows implementations outside of AIMS (3rd party) to invoke any of the registered services for use within their own applications.
iFUSE: a development environment for composable easy-to-assemble information transforms
Robert A. Joyce, Jennifer P. Cormier
A crucial component of a net-centric information management system is a set of simple programs or scripts - fuselets - that effect small transformations on available data. Individual fuselets can perform tasks such as filtering, aggregation, monitoring, format conversion, and simple image manipulation. The global effect of a collection of cooperating fuselets is to add value to the system: to transform data into knowledge. Fuselets are also adept at bridging heterogeneous systems, providing consumers the data they need in the format they require - not necessarily the format that was convenient for the original data producer. ATC-NY has created an extensible fuselet development environment, iFUSE, that provides the support fuselet developers need in order to create and discover fuselets, avoid design and efficiency pitfalls, and ensure the appropriate factorization of fuselet code. For the individual fuselet, iFUSE lets the user focus on the information being transformed, not the code needed to implement the transformation. iFUSE also helps the designer understand the environment in which the fuselet operates, automatically detecting potential data flow problems and providing visualization tools such as "fuselet slicing," which allows fuselet authors and infosphere maintainers to assess the effects of additions and changes in context.
Semantic mediation and transformation services: perspectives on military application areas
Semantic technology is becoming an increasingly viable solution to interoperability problems that arise as user communities seek to interact within and across information spaces such as those that ride on the Global Information Grid (GIG). Semantically-aware cross-domain information transformation and service mediation capabilities can be used to improve interactions between diverse Communities of Interest (COIs) and the software applications that service them. This paper is not so much about the details of the technology itself, but is intended rather to focus on the operational domains in which interoperability problems and needs exist for the warfighter that are likely to benefit from the application of semantic technologies. It attempts to unveil military application areas against which technological solutions could be developed by academia, industry, and other technology experts to forge a path toward semantic interoperability and, ultimately, information superiority. It then provides an overview of on-going and possible future areas of semantic technology research and development being pursued by the Air Force Research Laboratory's Information Directorate (AFRL/IF) located in Rome, NY.
Pedigree management and assessment in a net-centric environment
Marisa M. Gioioso, S. Daryl McCullough, Jennifer P. Cormier, et al.
Modern Defense strategy and execution is increasingly net-centric, making more information available more quickly. In this environment, the intelligence agent or warfighter must distinguish decision-quality information from potentially inaccurate, or even conflicting, pieces of information from multiple sources - often in time-critical situations. The Pedigree Management and Assessment Framework (PMAF) enables the publisher of information to record standard provenance metadata about the source, manner of collection, and the chain of modification of information as it passed through processing and/or assessment. In addition, the publisher can define and include other metadata relevant to quality assessment, such as domain-specific metadata about sensor accuracy or the organizational structure of agencies. PMAF stores this potentially enormous amount of metadata compactly and presents it to the user in an intuitive graphical format, together with PMAF-generated assessments that enable the user to quickly estimate information quality. PMAF has been created for a net-centric information management system; it can access pedigree information across communities of interest (COIs) and across network boundaries and will also be implemented in a Web Services environment.
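The kind of provenance chain the abstract describes (source, manner of collection, and chain of modification) can be sketched minimally as follows. The `Pedigree` class and its field names are hypothetical illustrations, not PMAF's actual schema.

```python
# Toy sketch of provenance ("pedigree") metadata: each derived product
# carries its source, collection method, and an append-only chain of
# processing steps. Field names are invented for illustration.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Pedigree:
    source: str
    collection_method: str
    modifications: tuple = field(default_factory=tuple)

    def derive(self, step, actor):
        """Return a new pedigree extending the modification chain.

        The original record is never mutated, so every intermediate
        product keeps an accurate history.
        """
        return Pedigree(self.source, self.collection_method,
                        self.modifications + ((step, actor),))

raw = Pedigree(source="UAV-7 EO sensor", collection_method="imaging")
fused = (raw.derive("georegistration", "GeoInt cell")
            .derive("track fusion", "fusion node"))
print(len(fused.modifications))  # 2 steps in the modification chain
```

A consumer inspecting `fused` can walk the chain back to the original source, which is the basis for the kind of automated quality assessment the framework generates.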
Composition modeling framework (CMF)
In this paper, we propose the Composition Modeling Framework (CMF) as a standards-based information engineering methodology that tackles existing and emerging Department of Defense (DoD) interoperability problems using a bottom-up approach. We introduce CMF capabilities within the context of an information space composed of repositories and a catalog that enables consumers' and producers' information requirements to be met. This information space supports dynamic and unscripted interaction among various producers and consumers, but its power as an information management tool is best harnessed when its participants share common goals operating in unison as a Community of Interest (COI). We present the CMF as one approach for representing the structure, meaning and abstract implementation of the underlying information space that services a COI and its participants. In addition to facilitating intra-COI interoperability, we demonstrate how CMF concepts can be used to construct cross-domain interoperability solutions by supporting inter-COI communication and understanding.
Sensor Networks
Effectively networking unattended ground sensors
DARPA has conducted a number of field measurement, modeling, and system design trade study efforts to determine the most effective way of linking Unattended Ground Sensors (UGS). These network links are needed for both collaborative operations and data and information exfiltration to users. This paper provides an informal summary of the findings of this work.
Methods for calculating the probability of detection and target location error of unattended ground sensors
Methodologies for analyzing the detection and tracking performance of Unattended Ground Sensors are developed. These are then applied to the case of the Massively Deployed Unattended Ground Sensor dataset generated during the Network Sensors for the Future Force Advanced Technology Demonstration conducted in the summer of 2004. For detection, results from three methods are compared, all based on the consistency and accuracy of Line of Bearing reports. For tracking, a more straightforward statistical analysis is conducted.
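The abstract does not detail the three methods, but a standard way to compute a target fix from Line of Bearing reports is a least-squares intersection of the bearing lines. The following is a generic sketch under that assumption; the function name and scenario values are invented for illustration.

```python
import math

def lob_fix(sensors, bearings_deg):
    """Least-squares position fix from Line of Bearing (LOB) reports.

    Each sensor at (x, y) reports a bearing in degrees clockwise from
    north toward the target. The fix minimizes the sum of squared
    perpendicular distances to the bearing lines, i.e. it solves
    A x = b with A = sum(I - d d^T) and b = sum((I - d d^T) p).
    """
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), brg in zip(sensors, bearings_deg):
        t = math.radians(brg)
        dx, dy = math.sin(t), math.cos(t)        # unit vector along bearing
        # Projector onto the perpendicular of d: I - d d^T (symmetric 2x2)
        m11, m12, m22 = 1 - dx * dx, -dx * dy, 1 - dy * dy
        a11 += m11; a12 += m12; a22 += m22
        b1 += m11 * px + m12 * py
        b2 += m12 * px + m22 * py
    det = a11 * a22 - a12 * a12                  # singular if bearings parallel
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Two sensors with exact bearings to a target at (100, 100): the sensor
# at the origin sees it at 45 deg; the sensor at (200, 0) at 315 deg.
x, y = lob_fix([(0, 0), (200, 0)], [45.0, 315.0])
print(round(x), round(y))  # 100 100
```

With noisy bearings from many sensors, the residual of this fit gives a natural basis for the consistency checks and target location error statistics the paper analyzes.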
A novel framework for command and control of networked sensor systems
Genshe Chen, Zhi Tian, Dan Shen, et al.
In this paper, we propose a highly innovative advanced command and control framework for sensor networks used for future Integrated Fire Control (IFC). The primary goal is to enable and enhance target detection, validation, and mitigation for future military operations through graphical game theory and advanced knowledge information fusion infrastructures. The problem is approached by representing distributed sensor and weapon systems as generic warfare resources which must be optimized in order to achieve the operational benefits afforded by enabling a system of systems. This paper addresses the importance of achieving a Network Centric Warfare (NCW) foundation of information superiority (shared, accurate, and timely situational awareness) upon which advanced automated management aids for IFC can be built. The approach uses the Data Fusion Information Group (DFIG) Fusion hierarchy of Level 0 through Level 4 to fuse the input data into assessments of the enemy target system threats in a battlespace to which military force is being applied. Compact graph models are employed across all levels of the fusion hierarchy to accomplish integrative data fusion and information flow control, as well as cross-layer sensor management. The functional block at each fusion level will have a set of innovative algorithms that not only exploit the corresponding graph model in a computationally efficient manner, but also permit combined functional experiments across levels by virtue of the unifying graphical model approach.
Communications and Networks
Interference multiple access communications
L. Reggie Brothers, James A. DeBardelaben, Joshua Niedzwiecki, et al.
The implementation of network centric warfare on the battlefield has driven the growing demand for high capacity warfighter communication systems. Although new high capacity SATCOM systems such as WGS are being introduced in the near term, these systems use the interference avoidance paradigm, which fundamentally limits overall network performance. This paper introduces a new wireless networking paradigm called Interference Multiple Access (IMA), developed under the auspices of DARPA. The interference multiple access paradigm exploits multi-access interference to enable revolutionary improvements in wireless communication capacity and latency without the need for infrastructure, coordination, or spectrum preplanning. Simulation and over-the-air test results suggest that greater than 3X increases in network throughput (especially in low SNR scenarios) can be achieved over traditional contention-based and schedule-based spectrum access approaches when applied to WIN-T NCW terminals communicating in a mesh topology over the WGS constellation.
Throughput of 802.11g wireless devices in ad hoc mode
Brian B. Luu, Rommie L. Hardy
The U.S. Army Research Laboratory has used IEEE 802.11g standard wireless LANs for implementation in mobile ad hoc networks (MANET). One common problem with the use of 802.11g wireless devices is maintaining high operational throughput over distance. In this paper, we assess the throughput performance of four 802.11g wireless network interface cards (NIC) operating in ad hoc mode in an outdoor environment. This assessment was based on characteristics of the NICs, such as chipset, signal amplification, and antenna diversity, over various operating distances. The assessment showed that antenna diversity provided no throughput improvement in the outdoor environment, and that amplification did not always improve data rates. Wireless communication with a small buffer size minimizes fluctuation of the throughput data rate over a large range of distances.
The airborne network definition project: a network architecture effort for future battlefield networks that enable network-centric warfare
Bishwaroop Ganguly, Steven Finn, Jeffrey McLamb, et al.
The Airborne Network Definition (AND) project had the goal of creating and testing a robust, efficient network architecture networking all elements of the battlefield. This effort has since been generalized to fit the goals of the Mobile Edge Network System Architecture (MENSA) effort. The network is designed to be self-contained, attaching to fixed backbone infrastructure whenever possible. The fundamental building block of the architecture is the Small Combat Network (SCN), which integrates heterogeneous ground and air platforms, facilitating collaborative applications such as Blue Force Tracking and Cooperative Sensing and Targeting. The architecture uses the concept of an IP core to network unlike domains (radio types) and to connect the SCN to the backbone and the Global Information Grid (GIG). This paper describes the requirements of the network and outlines the technical design of the SCN architecture. We present step-by-step descriptions of a communication on the SCN that highlight some of the key features of the architecture. We present results of a simulation that applies our proposed architecture to realistic warfighting scenarios. Results show that the architecture enables cooperative applications and point to future work that will design and evaluate a deployable AN network architecture.
Live-flight demonstration of agent technology for connecting the tactical edge to the global information grid
Eric J. Martens, David E. Corman
The Boeing Company in conjunction with the Information Directorate of the Air Force Research Laboratory (AFRL) has developed an agent-based technology for connecting tactical edge platforms with the emerging Global Information Grid. The technology was demonstrated as part of a live-flight test using The Boeing Company's F-15E Advanced Technology Demonstrator aircraft. The core technology developed and demonstrated as part of the live-flight test is called the Platform Adaptor Agent (PA). The Platform Adaptor Agent provides a mechanism for a tactical platform like the F-15E to receive GIG information. GIG information provides improved tactical awareness and information that enables the war-fighter to increase mission effectiveness. The Platform Adaptor Agent also enables information produced by the tactical edge to be published to the GIG. These published data provide improved real-time situational awareness in command elements such as the Combined Air Operations Center (CAOC).
Demonstration of high-data-rate wavelength division multiplexed transmission over a 150-km free space optical link
David W. Young, Joseph E. Sluz, Juan C. Juarez, et al.
A 150 km free-space optical (FSO) communication link between Maui (Haleakala) and Hawaii (Mauna Loa) was demonstrated by JHU/APL and AOptix Technologies, Inc. in September 2006. Over a 5-day period, multiple configurations were demonstrated, including single-channel 2.5 Gbps transmission, single-channel 10 Gbps, and four wavelength division multiplexed (WDM) 10 Gbps channels for an aggregate data rate of 40 Gbps. Links at data rates from 10 to 40 Gbps were run in excess of 3 contiguous hours. Data on the received power, frame synchronization losses, and bit error rate were recorded. This paper reports on the data transfer performance (bit error rates, frame synchronization issues) of this link over the 5-day period. A micropulse lidar was run concurrently on a parallel path with the FSO link, recording data on scattering loss and visibility. Comparisons between the state of the link due to weather and the data transfer performance will be described.
Long distance laser communications demonstration
Malcolm J. Northcott, A. McClaren, J. E. Graves, et al.
AOptix demonstrated a simulated air-to-air laser communications (laser-com) system over a 147 km distance by establishing a laser communication link between the islands of Hawaii and Maui. We expect the atmospheric conditions encountered during this demonstration to be representative of the worst seeing conditions that could be expected for an actual air-to-air link. AOptix utilized a laser-com terminal incorporating Adaptive Optics (AO) to perform high-speed tracking and aberration correction to reduce the effects of the seeing. The demonstration showed the feasibility of establishing high-data-rate point-to-point laser-com links between aircraft. In conjunction with Johns Hopkins University Applied Physics Laboratory networking equipment, we were able to demonstrate a 40 Gbit/s DWDM link, providing significantly more data throughput than is available using RF technologies. In addition to its very high data rate, the link exhibits very low beam spread, which provides a high degree of covertness and data security. Since the link is based on 1550 nm optical wavelengths, it is inherently resistant to jamming.
A framework for assessing and predicting network loads and performance for network-centric operations and warfare
Being able to accurately analyze and predict the performance of networks in Network-Centric Operations and Warfare (NCO/NCW) is critical to realistic resource utilization. In this paper, we introduce a theoretical framework to address such issues. The framework is decomposable and component-based, with plug-and-play functionality. One focal point of the framework is the assessment and prediction of the effectiveness of networks, as well as network performance and network loads in NCO/NCW. The framework also allows for varying levels of specificity in order to provide important analyses, including pinpointing bottlenecks and proposing potential corrections. Furthermore, the framework allows modeling at different and multiple scales, and allows for decoupling of networks and sub-networks in order to understand and predict performance issues and propagation effects of sub-networks within the NCO/NCW environment. This leads to a much better understanding of, and capability to determine and restructure, NCO/NCW infrastructures for overall performance, efficiency, and efficacy.
Synchronization for wireless multi-radar covert communication networks
Shrawan C. Surender, Ram M. Narayanan
The motivation for our current work is the need for a covert wireless communication network between multi-site radars. Such radars form an effective network-centric architecture that has intrinsic properties such as LPI, LPD, and good data dissemination capabilities. Our continuing work indicates that a notched UWB noise signal within which OFDM data symbols are embedded can be used as a secure communication channel between individual noise radars. The receiver performance in such systems depends heavily on the timing of the DFT window for detecting the message symbols concealed within the noise-OFDM waveform. Performance is therefore severely limited due to the effects of timing and frequency offsets on the noise-data signal. These synchronization errors bring unwanted noise into this window in the form of ICI, ISI, etc. In this paper, we show that most of the techniques developed for a simple OFDM system do not suit the covert noise-OFDM system requirements. We further propose a packet/frame detection and timing estimation technique for the noise-data signal used between the random noise radars. This technique is unique as it is applied to OFDM symbols embedded in UWB noise. With no preprocessing required in the transmitter and no knowledge about the source noise signal, the correlation properties of band-limited white noise are exploited to achieve synchronization.
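As a deliberately simplified stand-in for the timing-estimation problem above, the sketch below locates a frame by cross-correlating against a known noise-like preamble. Note this is a generic known-reference correlator for illustration only; the paper's receiver exploits the correlation properties of band-limited white noise without any transmitter-side preprocessing or knowledge of the source noise signal:

```python
import numpy as np

rng = np.random.default_rng(0)

def detect_frame_start(rx, preamble):
    """Estimate frame timing by cross-correlating the received
    signal with a known noise-like reference sequence."""
    corr = np.abs(np.correlate(rx, preamble, mode="valid"))
    return int(np.argmax(corr))

preamble = rng.standard_normal(128)       # noise-like reference
offset = 300                              # true frame start (samples)
rx = 0.1 * rng.standard_normal(1024)      # channel noise
rx[offset:offset + 128] += preamble       # embed the frame
print(detect_frame_start(rx, preamble))   # → 300
```

The correlation peak stands well above the noise floor here because white noise sequences are nearly orthogonal to shifted copies of themselves, which is the same property the noise-OFDM receiver leverages.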
A network-centric robust resource allocation strategy for unmanned systems: stability analysis
It is widely understood that communication is a critical technological factor in designing autonomous unmanned networks consisting of a large number of heterogeneous nodes that may be configured in ad hoc fashions and incorporate intricate architectures. In fact, one of the challenges in this field is to recognize the entire network as a heterogeneous collection of physical and information systems with complicated interconnections and interactions. Using the high data rates that are essential for real-time interactive command and control, these networks require optimal integration of local feedback loops into a scheduling and resource allocation system. This integration becomes particularly problematic in the presence of latencies and delays. Given that the dynamics of a network of unmanned systems can easily become unstable depending on the interconnections among nodes, in this paper the stability of the resulting time-delayed controlled network under configuration changes is studied. We also formally investigate sufficient conditions for our proposed robust resource allocation strategies to cope with these interconnections and time-delays in an optimal fashion. Our time-delay-dependent network consists of three nodes that can be configured into different architectures. To model our traffic and network we use a fluid flow model that is of low order and simpler than a detailed Markovian queueing probabilistic model. Using sliding mode-based variable structure control (SM-VSC) techniques, which enjoy robustness capabilities, we design our proposed robust nonlinear feedback-based control approaches on the basis of an inaccurate/uncertain model. The results presented are analyzed analytically to guarantee stability of the known/unknown time-delay-dependent network of unmanned systems for different configurations.
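A low-order fluid flow traffic model of the kind mentioned above is commonly written as dx/dt = λ − C·x/(1+x) for queue length x, arrival rate λ, and link capacity C. The paper's exact equations are not reproduced here, so this Euler-integration sketch of the common form is an assumption:

```python
def fluid_queue(lam, C, x0=0.0, dt=0.01, steps=5000):
    """Euler integration of dx/dt = lam - C*x/(1+x), a low-order
    fluid flow queue model (much simpler than a full Markovian
    queueing model)."""
    x = x0
    for _ in range(steps):
        x += dt * (lam - C * x / (1.0 + x))
    return x

# the steady state solves lam = C*x/(1+x), i.e. x* = lam/(C - lam)
x_final = fluid_queue(lam=0.5, C=1.0)
print(round(x_final, 3))   # → 1.0
```

For λ = 0.5 and C = 1.0 the queue settles at x* = 0.5/(1 − 0.5) = 1.0, matching the closed-form fixed point; control design then regulates deviations around such equilibria.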
Node compromise attacks and network connectivity
Kevin Chan, Faramarz Fekri
Net-centric warfare requires widespread, highly reliable communications even in the face of adversarial influences. Maintaining connectivity and secure communications among network entities is vital to mission readiness and execution. In this work, we examine the required communication range of nodes in a wireless sensor network. Several parameters of wireless networks, such as key predistribution schemes and node compromise attacks, are studied in terms of how they influence overall network connectivity. In many battlespace situations for networks of unmanned ground sensor nodes, communication range is limited by resources, hardware ability, and unpredictable terrain. Additionally, networks attempt to minimize the transmission power of each node to conserve power, as the radio is oftentimes the largest drain on available energy resources. Furthermore, such networks are vulnerable to physical node compromise, and an adversary's attack can destroy connectivity. What is studied here is overall network connectivity and its relationship to key predistribution schemes and node compromise attacks. In networking situations with an adversarial presence, it may be possible to continue pursuing mission objectives with the remaining uncompromised network resources after some reconfiguring of network parameters. We derive a single expression to determine the required communication radius for wireless sensor networks that covers these situations.
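The connectivity-versus-radius question can be illustrated with the random geometric graph model: n nodes placed uniformly in the unit square are asymptotically connected once the communication radius exceeds roughly sqrt(ln n / (π n)). The paper's single expression additionally accounts for key predistribution and compromised nodes, which this sketch omits:

```python
import numpy as np
from math import log, pi, sqrt

def is_connected(points, r):
    """Flood fill over the geometric graph whose edges join nodes
    within communication radius r of each other."""
    n = len(points)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    adj = dist <= r
    seen = {0}
    frontier = [0]
    while frontier:
        i = frontier.pop()
        for j in np.nonzero(adj[i])[0]:
            if int(j) not in seen:
                seen.add(int(j))
                frontier.append(int(j))
    return len(seen) == n

rng = np.random.default_rng(1)
n = 200
nodes = rng.random((n, 2))                # uniform in the unit square
r_crit = sqrt(log(n) / (pi * n))          # asymptotic threshold radius
print(is_connected(nodes, 2 * r_crit))
```

Sweeping the radius and repeating over random deployments, with a fraction of nodes deleted to model compromise, gives the empirical analogue of the required-radius expression the paper derives analytically.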
Self-organizing Collaborative ISR Robotic Teams I: Joint Session with Conference 6561
Multiplatform information-based sensor management: an inverted UAV demonstration
Chris Kreucher, John Wegrzyn, Michel Beauvais, et al.
This paper describes an experimental demonstration of a distributed, decentralized, low-communication sensor management algorithm. We first review the mathematics underlying the method, which includes a novel combination of particle filtering for predictive density estimation and information theory for maximizing information flow. Earlier work demonstrated the method's utility via Monte Carlo simulations. Here we present a laboratory demonstration to illustrate that utility and to provide a stepping stone toward full-up implementation. To that end, we describe an inverted Unmanned Aerial Vehicle (UAV) test-bed developed by the General Dynamics Advanced Information Systems (GDAIS) Michigan Research and Development Center (MRDC) to facilitate and promote the maturation of the research algorithm into an operational, fieldable system. Using a modular design with wheeled robots as surrogates for UAVs, we illustrate how the method is able to detect and track moving targets over a large surveillance region by tasking a collection of limited field-of-view sensors.
Agent-based multi-platform control, collaboration, and target hand-off
Deploying a worldwide force that is strategically responsive and dominant at every point on the spectrum of conflict involves the cooperative development and use of advanced technologies that yield revolutionary capabilities to support the war-fighter's needs. This presentation describes an agent-based control architecture and prototype implementation developed by ARDEC that enables command and control of multiple unmanned platforms and associated mission packages for collaborative target hand-off/engagement. Current prototypes provide the ability to remotely locate, track, and predict the movement of enemy targets on the battlefield using a variety of sensor systems hosted on multiple, non-homogeneous SUAVs and UGVs.
Formation control in multi-player pursuit evasion game with superior evaders
In this paper, we consider a multi-pursuer multi-evader pursuit evasion game in which some evaders' maximal speeds are higher than those of all pursuers. In multi-player pursuit evasion games, a hierarchical framework is widely applied to decompose the original complicated multi-player game into multiple small-scale games, i.e., one-pursuer one-evader games and multi-pursuer single-evader games. The latter are especially required for superior evaders. Although usually only suboptimal results are obtained, the resulting decentralized approaches are favored by researchers from a communication standpoint for practical applications. Based on our previous work, for a multi-pursuer single-superior-evader game on a plane, we first study the number of pursuers needed to guarantee capture. Regarding each player as a mass point, a moving planar coordinate system is fixed on the evader. Formation control is then used in deriving the pursuers' strategies to 1) avoid collision between pursuers; 2) reduce the distance between each pursuer and the evader over the evolution of the game; and 3) keep the pursuers' angular distribution around the evader invariant during the pursuit and enclose the superior evader within the union of each pursuer's capture domain at the end of the game. The validity of our method is illustrated by two simulation examples.
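A toy sketch of requirement 3) above: pursuers track slots evenly spaced in angle on a contracting ring centred on the evader, so their angular distribution stays invariant while the encirclement closes. The gains and geometry here are illustrative assumptions, not the paper's control law:

```python
import numpy as np

def formation_targets(evader, n_pursuers, radius):
    """Slots evenly spaced in angle on a circle centred on the
    evader, keeping the angular distribution invariant."""
    ang = 2 * np.pi * np.arange(n_pursuers) / n_pursuers
    return evader + radius * np.column_stack([np.cos(ang), np.sin(ang)])

def pursuit_step(pursuers, evader, radius, gain=0.3):
    """Move each pursuer a fraction of the way toward its slot."""
    targets = formation_targets(evader, len(pursuers), radius)
    return pursuers + gain * (targets - pursuers)

evader = np.zeros(2)                      # evader-fixed frame origin
pursuers = np.array([[3.0, 0.0], [0.0, 3.0], [-3.0, -3.0]])
for r in np.linspace(2.0, 0.5, 50):       # contract the ring
    pursuers = pursuit_step(pursuers, evader, r)
dists = np.linalg.norm(pursuers - evader, axis=1)
print(dists.round(2))                     # all pursuers near the final ring
```

Assigning pursuers to fixed angular slots also satisfies requirement 1) implicitly, since the slots are separated by 2π/n and the pursuers never need to cross paths.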
Collaborative multi-target tracking using networked micro-robotic vehicles
Subir Biswas, Sonny Gupta, Fan Yu, et al.
This paper presents a collaborative target tracking framework in which distributed mechanisms are developed for tracking multiple mobile targets using a team of networked micro robotic vehicles. Applications of such a framework include detection of multi-agent intrusion, network-assisted attack localization, and other collaborative search scenarios. The key idea of the framework is to design distributed algorithms that can be executed by tracking entities using a mobile ad hoc network. The paper comprises the following components. First, the software and hardware architectural details of the Swarm Capable Autonomous Vehicle (SCAV) system used as the mobile platform in our target tracking application are presented. Second, the details of an indoor self-localization and Kalman filter based navigation system for the SCAV are presented. Third, a formal definition of the collaborative multi-target tracking problem and a heuristic-based networked solution are developed. Finally, the performance of the proposed tracking framework is evaluated on a laboratory test-bed with a fleet of SCAV vehicles. A detailed system characterization in terms of localization, navigation, and collaborative tracking performance is performed on the SCAV test-bed. In addition to valuable implementation insights about the localization, navigation, filtering, and ad hoc networking processes, a number of interesting conclusions about the overall tracking system are presented.
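The Kalman filter based navigation component can be illustrated with a minimal constant-velocity filter driven by noisy position fixes; the matrices below are generic textbook choices, not the SCAV system's actual tuning:

```python
import numpy as np

# State x = [px, py, vx, vy]; position fixes would come from the
# indoor self-localization system.  Matrix values are illustrative.
dt = 0.1
F = np.eye(4)
F[0, 2] = F[1, 3] = dt                              # position += velocity*dt
H = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.]])  # observe position only
Q = 0.01 * np.eye(4)                                # process noise
R = 0.25 * np.eye(2)                                # fix noise (0.5 m std)

def kf_step(x, P, z):
    x = F @ x                                # predict state
    P = F @ P @ F.T + Q                      # predict covariance
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # correct with position fix z
    P = (np.eye(4) - K @ H) @ P
    return x, P

rng = np.random.default_rng(2)
x, P = np.zeros(4), np.eye(4)
truth = np.array([0.0, 0.0, 1.0, 0.5])       # vehicle moving at (1, 0.5) m/s
for _ in range(100):
    truth[:2] += truth[2:] * dt
    z = truth[:2] + rng.normal(0.0, 0.5, 2)  # noisy localization fix
    x, P = kf_step(x, P, z)
print(np.round(x, 2))                        # estimate approaches true state
```

The filtered state (position plus an inferred velocity the sensors never measure directly) is what the navigation layer would hand to the collaborative tracking algorithms.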
Hunter standoff killer team (HSKT) ground and flight test results
Balinda Moreland, Mark Ennis, Robert Yeates, et al.
Since the inception of powered flight, manned aerial vehicles have been a force multiplier on the battlefield. With the emergence of new technology, the structure of the military battlefield is changing. One such technology, the Unmanned Aerial Vehicle (UAV), has emerged as a valuable asset for today's war fighter. UAVs have traditionally been operated from ground control stations, yet minimal research has been targeted toward UAV connectivity. Airborne Manned Unmanned System Technology Baseline (AMUST-Baseline) was a concept that demonstrated the battlefield synergy gained by manned and unmanned vehicle teaming. AMUST-Baseline allowed an Apache Longbow's (AH-64D) co-pilot gunner (CPG) to have Level IV control of a Hunter fixed-wing UAV. Level IV control of a UAV includes payload control, flight control, and direct data receipt. With the success of AMUST-Baseline, AATD, Lockheed Martin, Northrop Grumman, and the Boeing Company worked toward enhanced manned and unmanned connectivity through a technology investment agreement. This effort, named Airborne Manned Unmanned System Technology Demonstration (AMUST-D), focused on the connectivity between two manned platforms, the Apache Longbow (AH-64D) and the Command and Control (C2) Blackhawk, and the Hunter UAV. It allows robust communication from the UAV to each platform through the Tactical Common Data Link (TCDL). AMUST-D used decision aiding technology developed under the Rotorcraft Pilots Associate (RPA) Advanced Technology Demonstration (ATD) to assist in control of the Hunter UAV, as well as to assist the pilot in regularly performed duties. Through the use of decision aiding and UAV control, the pilot and commander were better informed of potential threats and targets, thus increasing their situational awareness. The potential benefits of improved situational awareness are increased pilot survivability, increased lethality, and increased operational effectiveness.
Two products were developed under the AMUST-D program, the Warfighter's Associate (WA) which was integrated onto the Apache Longbow, and the Mobile Commanders Associate (MCA) which was integrated onto the Army Airborne Command and Control System (A2C2S) UH-60 Blackhawk. In this paper we will discuss what WA and MCA provided to the warfighter, and the results of the HSKT ground and flight testing.
GeoInt Systems
UrbanScape
Brian Leininger, Richard E. Nichols, Casey Gragg
UrbanScape is a cutting-edge 3D collection and processing system directed at the lowest echelon of war-fighters and command elements that deal with urban environments. The goal of the program is to provide war-fighters patrolling in an urban environment with near real-time, up-to-date, high-resolution, fully textured 3D models of the urban terrain that can be viewed, analyzed, and manipulated in an immersive environment, making target areas inherently familiar to the war-fighter. Through revolutionary model generation and facade reconstruction algorithms, the UrbanScape system cuts the gap between data collection and fully processed 3D model databases from days to hours. In half the time a collection mission takes, soldiers are provided a fully immersive 3D model database, making a foreign city as "familiar as the soldier's backyard".
Geospatial challenges in a net centric environment: actionable information technology, design, and implementation
Michael R. Hieb, Sean Mackay, Michael W. Powers, et al.
Terrain and weather effects represent fundamental battlefield information supporting situation awareness and the decision-making processes for Net Centric operations. Sensor information can have a greater impact when placed within a terrain and weather contextual framework. Realizing the promised potential of Net Centric operations is challenging with respect to these effects, since they can either enhance or constrain force tactics and behaviors, platform performance (ground and air), system performance (e.g., sensors), and the soldier. We have defined a methodology that starts with military objectives and determines the most useful terrain products to support these missions, taking into account weather effects and sensors. From this methodology we have designed a number of technical standards and components. A key standard is the geospatial Battle Management Language (geoBML), which represents mission input to geospatial and sensor products. Examples of components for creating these products are those in the Battlespace Terrain Reasoning and Awareness (BTRA) system. These standards and components enable interoperability between force elements that addresses not only syntactic consistency, but also consistency of lexical and semantic representation to realize shared, coherent awareness. This paper presents a systemic approach for successful resolution of these challenges and describes an Actionable Geo-environmental Information Framework (AGeIF).
Orchestrating and optimizing multi-source ISR assets
The application of commercial Business Process Management (BPM) techniques alongside traditional systems management architectures for intelligence creation chains such as TCPED (Task, Collection, Processing, Exploitation and Dissemination) and TPPU (Task, Post, Process, Use) offers the potential for optimized and adaptive enterprise Netcentric Intelligence, Surveillance and Reconnaissance (ISR). Computing platforms, assets and their agents can and should cooperate via enterprise resource management infrastructure and middleware. Baseline BPM can further be augmented with non-invasive agent-based machine-learning techniques which can, overall, contribute to roll-up views of enterprise performance and will, therefore, be used to further refine the TCPED/TPPU execution strategy. This paper presents an overarching architecture framework which combines these features and operational drivers under a unifying system perspective.
Predictive Analytic Modeling
Geographic information systems (GIS) approaches for geographic dynamics understanding and event prediction
This paper discusses the drivers and observables of geographic dynamics, as well as GIS approaches that facilitate understanding of geographic dynamics. Geographic domains exhibit diverse dynamics across multiple spatiotemporal scales. Such diverse and complex geographic dynamics pose challenges in connecting relevant information and inferring the underlying mechanisms. Central to the paper is the premise that activities, events, and processes are drivers of geographic dynamics, and these drivers give rise to changes and movements in geographic space. The changes and movements become observables from which we can measure and assess the properties of geographic dynamics. GIS frameworks provide opportunities to integrate spatial and temporal data to examine the drivers and observables of geographic dynamics. The paper highlights key GIS visualization, analysis, and modeling techniques for understanding geographic dynamics.
Detecting space-time cancer clusters using residential histories
Geoffrey M. Jacquez, Jaymie R. Meliker
Methods for analyzing geographic clusters of disease typically ignore the space-time variability inherent in epidemiologic datasets, do not adequately account for known risk factors (e.g., smoking and education) or covariates (e.g., age, gender, and race), and do not permit investigation of the latency window between exposure and disease. Our research group recently developed Q-statistics for evaluating space-time clustering in cancer case-control studies with residential histories. This technique relies on time-dependent nearest neighbor relationships to examine clustering at any moment in the life-course of the residential histories of cases relative to those of controls. In addition, in place of the widely used null hypothesis of spatial randomness, each individual's probability of being a case is instead based on his/her risk factors and covariates. Case-control clusters will be presented using residential histories of 220 bladder cancer cases and 440 controls in Michigan. In preliminary analyses of this dataset, smoking, age, gender, race, and education were sufficient to explain the majority of the clustering of the cases' residential histories. Clusters of unexplained risk, however, were identified surrounding the business address histories of 10 industries that emit known or suspected bladder cancer carcinogens. The clustering of 5 of these industries began in the 1970s and persisted through the 1990s. This systematic approach for evaluating space-time clustering has the potential to generate novel hypotheses about environmental risk factors. These methods may be extended to detect differences in the space-time patterns of any two groups of people, making them valuable for security intelligence and surveillance operations.
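A simplified, single-time-slice analogue of the Q-statistic can be sketched as follows. The published method uses time-dependent nearest neighbors over full residential histories, with each individual's case probability adjusted for covariates; this sketch substitutes a single snapshot and a plain label-permutation null, so it is an illustration of the counting idea only:

```python
import numpy as np

def q_statistic(coords, is_case, k=3):
    """For each case, count how many of its k nearest neighbours
    are also cases; sum over cases.  High values relative to a
    permutation null indicate spatial clustering of cases."""
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)              # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]        # k nearest neighbours
    q_i = is_case[nn].sum(axis=1)            # case neighbours per person
    return q_i[is_case].sum()

rng = np.random.default_rng(3)
# 20 tightly clustered cases, 40 dispersed controls (synthetic data)
cases = rng.normal(0.0, 0.05, (20, 2))
controls = rng.random((40, 2))
coords = np.vstack([cases, controls])
labels = np.array([True] * 20 + [False] * 40)
q_obs = q_statistic(coords, labels)

# permutation null: reshuffle the case/control labels
null = [q_statistic(coords, rng.permutation(labels)) for _ in range(99)]
print(q_obs, float(np.mean(null)))
```

Because the synthetic cases are clustered, the observed statistic far exceeds the permutation-null mean; in the published method this comparison is made at each moment of the residential-history timeline.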
MDA Session
GEOINT for MDA
Christian Andreasen, Chung Hye Read
Marine geospatial data has been collected over hundreds of years to varying degrees of accuracy, sometimes with errors of several miles in relation to modern satellite navigation and remote sensing accuracies. This paper details research and production activities by the National Geospatial-Intelligence Agency (NGA) aimed at improving the worldwide accuracy of coastal data and NGA future directions for better GEOINT support of Maritime Domain Awareness.
Maritime domain awareness community of interest net centric information sharing
Mark Andress, Brian Freeman, Trey Rhiddlehover, et al.
This paper highlights the approach taken by the Maritime Domain Awareness (MDA) Community of Interest (COI) in establishing an approach to data sharing that seeks to overcome many of the obstacles to sharing both within the federal government and with international and private sector partners. The approach uses the DOD Net-Centric Data Strategy employed through the Net-Centric Enterprise Services (NCES) Service-Oriented Architecture (SOA) foundation provided by the Defense Information Systems Agency (DISA), but is unique in that the community is made up of more than just Defense agencies. For the first pilot project, the MDA COI demonstrated how four agencies from DOD, the Intelligence Community, the Department of Homeland Security (DHS), and the Department of Transportation (DOT) could share Automatic Identification System (AIS) data in a common format using shared enterprise service components.
Determinants for global cargo analysis tools
M. Wilmoth, W. Kay, C. Sessions, et al.
The purpose of Global TRADER (GT) is not only to gather and query supply-chain transactional data for facts but also to analyze that data for hidden knowledge for the purpose of useful and meaningful pattern prediction. The application of advanced analytics provides benefits beyond simple information retrieval from GT, including computer-aided detection of useful patterns and associations. Knowledge discovery, offering a breadth and depth of analysis unattainable by manual processes, involves three components: repository structures, analytical engines, and user tools and reports. For a large and complex domain like supply-chains, there are many stages to developing the most advanced analytic capabilities; however, significant benefits accrue as components are incrementally added. These benefits include detecting emerging patterns; identifying new patterns; fusing data; creating models that can learn and predict behavior; and identifying new features for future tools. The GT Analyst Toolset was designed to overcome a variety of constraints, including lack of third party data, partial data loads, non-cleansed data (non-disambiguation of parties, misspellings, transpositions, etc.), and varying levels of analyst experience and expertise. The end result was a set of analytical tools that are flexible, extensible, tunable, and able to support a wide range of analyst demands.
Comprehensive maritime awareness (CMA) joint capabilities technology demonstration (JCTD)
Serious gaps exist in identifying and prioritizing worldwide maritime threats. Maritime security and defense forces lack the capabilities and capacities to provide timely and accurate maritime situational awareness. They lack automatic tools to identify and prioritize relevant and actionable information to avoid information overload. The inability to acquire, fuse, and manage disparate information limits timely cueing and focus. Information sharing is inhibited by technical, cultural, and policy barriers. The Comprehensive Maritime Awareness Joint Capabilities Technology Demonstration attempts to address these problems by developing a "culture of sharing" between international partners and the U.S. and between U.S. agencies. It is our vision that we will be able to automatically 1) monitor 100% of the maritime movements within an area of responsibility, 2) identify threats, and 3) prioritize them for action. The overall objective is to improve maritime security by acquiring, integrating, and exchanging relevant maritime activity information, identifying possible threats using available information, and then focusing limited interdiction and inspection assets on the most probable threats.
Automated detection of objects in sidescan sonar data
John M. Irvine, Steven A. Israel, Stuart M. Bergeron
Detection and mapping of subsurface obstacles is critical for safe navigation of littoral regions. Sidescan sonar data offers a rich source of information for developing such maps. Typically, data are collected at two frequencies using a sensor mounted on a towfish. The major features of interest depend on the specific mission, but often include: objects on the bottom that could pose hazards for navigation, linear features such as cables or pipelines, and the bottom type, e.g., clay, sand, rock, etc. A number of phenomena can complicate the analysis of the sonar data: surface return, vessel wakes, and fluctuations in the position and orientation of the towfish. Developing accurate maps of navigation hazards based on sidescan sonar data is generally labor intensive. We propose an automated approach, which employs commercial software tools, to detect these objects. This method offers the prospect of substantially reducing production time for maritime geospatial data products.
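A minimal stand-in for the object detection step is thresholding plus connected-component grouping on an intensity image; the actual approach relies on commercial software tools and real sidescan imagery, so the synthetic image and thresholds below are assumptions for illustration:

```python
import numpy as np

def detect_objects(image, thresh, min_pixels=4):
    """Crude detector for bright bottom returns: threshold the
    intensity image, group bright pixels into 4-connected blobs,
    and keep blobs larger than min_pixels."""
    mask = image > thresh
    labels = np.zeros(image.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1                       # start a new blob
        stack = [seed]
        labels[seed] = current
        while stack:                       # flood fill the blob
            r, c = stack.pop()
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                        and mask[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = current
                    stack.append((nr, nc))
    sizes = np.bincount(labels.ravel())[1:]
    return [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]

rng = np.random.default_rng(4)
img = rng.normal(0.0, 1.0, (64, 64))       # background speckle
img[10:14, 20:24] += 8.0                   # synthetic bottom object
img[40:43, 50:53] += 8.0                   # second object
print(len(detect_objects(img, thresh=4.0)))  # → 2
```

The `min_pixels` filter plays the role of rejecting isolated speckle hits, the same reason real sidescan processing must contend with surface return and wake clutter before declaring a contact.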
SeeCoast: persistent surveillance and automated scene understanding for ports and coastal areas
Bradley J. Rhodes, Neil A. Bomberger, Todd M. Freyman, et al.
SeeCoast is a prototype US Coast Guard port and coastal area surveillance system that aims to reduce operator workload while maintaining optimal domain awareness by shifting their focus from having to detect events to being able to analyze and act upon the knowledge derived from automatically detected anomalous activities. The automated scene understanding capability provided by the baseline SeeCoast system (as currently installed at the Joint Harbor Operations Center at Hampton Roads, VA) results from the integration of several components. Machine vision technology processes the real-time video streams provided by USCG cameras to generate vessel track and classification (based on vessel length) information. A multi-INT fusion component generates a single, coherent track picture by combining information available from the video processor with that from surface surveillance radars and AIS reports. Based on this track picture, vessel activity is analyzed by SeeCoast to detect user-defined unsafe, illegal, and threatening vessel activities using a rule-based pattern recognizer and to detect anomalous vessel activities on the basis of automatically learned behavior normalcy models. Operators can optionally guide the learning system in the form of examples and counter-examples of activities of interest, and refine the performance of the learning system by confirming alerts or indicating examples of false alarms. The fused track picture also provides a basis for automated control and tasking of cameras to detect vessels in motion. Real-time visualization combining the products of all SeeCoast components in a common operating picture is provided by a thin web-based client.