Anaheim Convention Center
Anaheim, California, United States
26 - 30 April 2020
Plenary Events
Monday Plenary Session
Date: Monday 27 April 2020
Time: 5:00 PM - 6:45 PM
Location: Ballroom B/C (Level 3)
Welcome, Awards, and Acknowledgements
5:00 PM - 5:15 PM

Presentation of the 2020 Rising Researchers
The Rising Researchers program is designed to recognize early career professionals who are conducting outstanding work in product development or research in the defense, commercial, and scientific sensing, imaging, optics, or related fields.

5:15 PM - 6:00 PM: Operationalizing Autonomy and AI for the Air Force

Jean-Charles Ledé, Technical Advisor, Air Force Research Lab. (United States)

Autonomy and AI have made tremendous progress in recent years, to the point where operational applications of these technologies can provide decisive advantages. This presentation will discuss the recommended approach to rapidly fielding autonomy and AI capabilities at scale, including the development of a common platform, addressing trust issues, and agile methodology. Examples in sensor exploitation and business processes will be used to demonstrate the operational value of the current generation of AI. This generation has limitations, however, and the talk will conclude with the future research required to expand the safe, ethical, and effective use of these technologies.

Mr. Ledé is the Air Force Research Laboratory Autonomy Technical Advisor, overseeing the AFRL Autonomy and AI portfolio and making recommendations on new programs leveraging internal and external research. Mr. Ledé is the senior Air Force representative in the OSD Autonomy Community of Interest and the AFRL representative on the Air Force Artificial Intelligence Cross Functional Team.
Imaging and Analytics Track Plenary Session
Date: Tuesday 28 April 2020
Time: 8:30 AM - 9:15 AM
Location: Ballroom C (Level 3)
Tri-modal imaging spectroscopy of paintings


John Delaney, National Gallery of Art (United States)

The scanner's three imaging modalities are reflectance (400 - 2500 nm, 2.5 nm sampling), molecular fluorescence (400 - 1000 nm), and X-ray fluorescence. The first two modalities provide molecular information, and the third elemental information, about artists' materials (pigments and paint binders). The resulting material maps offer insight into how artworks were constructed and modified. The type of information that can be obtained from this scanner will be presented through case studies such as Leonardo da Vinci's Ginevra de' Benci and Pablo Picasso's Le Gourmet, both in the collection of the National Gallery of Art, Washington DC.
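Material maps of this kind are commonly built by comparing each pixel's spectrum against reference spectra of known pigments. Below is a minimal sketch of one standard technique, spectral angle mapping (not necessarily the Gallery's own pipeline; the function name, array sizes, threshold, and random stand-in data are all illustrative assumptions):

import numpy as np

def spectral_angle_map(cube, reference):
    """Spectral angle (radians) between each pixel spectrum and a
    reference spectrum; smaller angles mean closer matches.

    cube:      (rows, cols, bands) reflectance image cube
    reference: (bands,) reference reflectance spectrum
    """
    dots = cube @ reference                                # per-pixel dot products
    norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
    cos = np.clip(dots / norms, -1.0, 1.0)                 # guard against rounding
    return np.arccos(cos)

# 400-2500 nm sampled every 2.5 nm gives 841 bands.
rng = np.random.default_rng(0)
cube = rng.random((64, 64, 841))        # stand-in for a measured reflectance cube
reference = rng.random(841)             # stand-in for a library pigment spectrum
angles = spectral_angle_map(cube, reference)
pigment_mask = angles < 0.1             # threshold chosen purely for illustration

Thresholding the angle image yields a per-pigment map; in practice a library of reference spectra would be tested and each pixel assigned to its best match.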

Dr. John K. Delaney is senior imaging scientist in the scientific research department of the conservation division of the National Gallery of Art, Washington. His research focuses on the adaptation of remote sensing sensors and processing methods for the study of paintings and works on paper. https://www.nga.gov/press/biographies/bio-delaney.html
Materials and Devices Track Plenary Session
Date: Tuesday 28 April 2020
Time: 8:30 AM - 10:00 AM
Location: Ballroom B (Level 3)
8:30 AM - 9:15 AM: Multi-micron silicon photonics platform for imaging and sensing


Aaron Zilkie, Rockley Photonics (United States)

Silicon photonics is poised to bring photonic integrated circuit (PIC) technologies to a range of imaging and sensing solutions, providing the advantages of increased power efficiency, reduced size, and lower-cost packaging. In contrast to conventional silicon photonics architectures, our platform uses larger, multi-micron waveguides, which hold the key to optimizing PIC performance, power efficiency, manufacturability, and versatility. Our technology brings performance advantages such as low propagation losses, efficient integration of III-V actives, and high power handling, and we have demonstrated photonic integrated circuits with high levels of integration and a full suite of devices, including hybrid lasers, compact arrayed waveguide grating filters, high-speed modulators and detectors, and efficient out-couplers. We will discuss applications ranging from consumer devices to healthcare to automotive.

Aaron Zilkie is Co-founder and VP of R&D at Rockley Photonics, where he currently leads silicon photonics R&D developing next-generation products and solutions for the company. Aaron has 15 years of experience in research and development, IP development, and product development in optical datacom, optical communications, and photonics. Prior to Rockley Photonics, Aaron held technology development roles at Kotura Inc., Oclaro Technology Inc., and Nortel Networks.

9:15 AM - 10:00 AM: Silicon photonics for LIDAR


Jonathan Doylend, Intel (United States)

LIDAR (Light Detection and Ranging) is emerging as a necessity for fully automated self-driving automotive applications. In order to sample the far field with sufficient resolution for this application, the system must incorporate many optical elements, leading to challenges in manufacturability and size. Because of the density of optical components required, LIDAR is well suited to photonic integration as a route to miniaturization and scalable manufacturing. This talk will give an overview of LIDAR, the components required for a chip-scale solution, and silicon photonics progress toward this goal.
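To make the element-count point concrete: in one chip-scale approach, an optical phased array (the architecture explored under DARPA SWEEPER, per the bio below), every emitter needs an individually set phase, and range follows from the echo's time of flight. A minimal sketch, with all numbers purely illustrative assumptions:

import numpy as np

C = 299_792_458.0  # speed of light, m/s

def round_trip_range(delay_s):
    """Target range from a time-of-flight echo delay: range = c * t / 2."""
    return C * delay_s / 2.0

def opa_phases(n_elements, pitch_m, wavelength_m, steer_deg):
    """Per-element phases (radians) steering a uniform linear optical phased
    array to a far-field angle: phi_n = 2*pi*n*d*sin(theta) / lambda."""
    n = np.arange(n_elements)
    return 2 * np.pi * n * pitch_m * np.sin(np.radians(steer_deg)) / wavelength_m

# Hypothetical array: 64 emitters at 2 um pitch, 1550 nm light, steered 10 degrees.
phases = opa_phases(64, 2e-6, 1550e-9, 10.0) % (2 * np.pi)
print(round_trip_range(667e-9))  # a 667 ns echo corresponds to roughly 100 m

Each added emitter sharpens the far-field beam but adds another phase shifter to calibrate and drive, which is why integrating the whole array on a silicon photonics chip matters for size and manufacturability.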

Jonathan Doylend is the Technical Lead for Silicon Photonics LIDAR development at Intel Corporation. Prior to Intel, he worked on silicon photonics LIDAR for the DARPA SWEEPER program at UCSB, held roles at JDS Uniphase / Lumentum and Phiar Corporation, and founded Diakaris.
Advanced Sensing and Imaging Track Plenary Session
Date: Wednesday 29 April 2020
Time: 8:30 AM - 9:30 AM
Location: Ballroom C (Level 3)
A National Laboratory “crystal ball” look into the future of sensing and imaging systems

G. Andrew "Andy" Erickson, Director, Global Security Programs, Los Alamos National Laboratory (United States)

Los Alamos National Laboratory has been involved in advanced remote sensing since its inception in 1943. From detecting gamma-ray bursts in space to generating real-time geo-rectified imagery in Iraq to zapping rocks on Mars, Los Alamos has developed numerous advanced sensing and imaging systems for a myriad of missions. However, getting the data is often the easy part of the problem. How we collect, manage, and process large disparate data sets, maintain end-to-end data integrity, and generate low-latency, actionable knowledge with minimal human oversight is generally the bigger part of the puzzle. So do we have a crystal ball for the future? By looking at how these various technology areas have been developing and how they are being used in conjunction with one another, we can start to understand what technology gaps exist and just what might be possible in the future.

Andy Erickson is the director for global security programs at Los Alamos National Laboratory. In this role, he oversees all of the defense, intelligence, counterterrorism, nonproliferation, space, and emerging threats programs at Los Alamos. Erickson is responsible for matching the needs of sponsors with multi-disciplinary science, technology, and engineering teams at the Laboratory to provide timely and effective solutions to national security problems. He also works closely with the Laboratory's Feynman Center for Innovation to transfer novel laboratory technologies to industry.
Next Generation Sensor Systems and Applications Track Plenary Session
Date: Wednesday 29 April 2020
Time: 8:30 AM - 9:30 AM
Location: Ballroom B (Level 3)
Artificial Intelligence for Maneuver and Mobility (AIMM) Essential Research Project (ERP)

Stuart Young, Chief of the Information Sciences Division, U.S. Army Combat Capabilities Development Command (United States)

The future operational environment will be contested in all domains across an increasingly lethal and expanded battlefield, with operations conducted in complex environments against challenged deterrence. In order to prevail in the Multi-Domain Operations (MDO) phases of dis-integration, exploitation, and re-entry to competition, the Army will need to employ teams of highly dispersed warfighters and agents (robotic and software), including Robotic Combat Vehicles (RCVs). To operate as a high-functioning team, Soldiers will need to be able to coordinate with RCVs as if they were teammates (i.e., fellow Soldiers) rather than tools (i.e., tele-operated robots capable of performing limited tasks). To enable this human-agent teamwork, the Artificial Intelligence for Maneuver and Mobility (AIMM) Essential Research Project (ERP) aims to revolutionize AI-enabled systems for autonomous maneuver that can rapidly learn, adapt, reason, and act in MDO.
The program is divided into two main Lines of Effort (LoEs): Mobility and Context-Aware Decision Making (CADM). The Mobility LoE is focused on developing resilient autonomous off-road navigation for combat vehicles at operational speed that can autonomously move to a position of advantage. The CADM LoE is focused on enabling autonomous systems to reason about the environment for scene understanding, with the ability to incorporate multiple sources of information and quantify uncertainty.
Ultimately, the Mobility and CADM LoEs will culminate in autonomous maneuver: the ability of unmanned vehicles to autonomously maneuver on the ground against a near-peer adversary within the MDO battlespace. This capability will enable autonomous vehicles to team with Soldiers more seamlessly (reducing Soldier cognitive burden), conduct reconnaissance to develop the enemy situation at standoff (creating options for the commander), and enable the next generation of combat vehicles to fight and win against a near-peer adversary.

Dr. Stuart H. Young has over 27 years of experience managing and performing basic and applied research for autonomous military robotic systems at the U.S. Army Research Laboratory (ARL) where he is currently Chief of the Information Sciences Division. His research focus is in the development of intelligent behaviors for collaborating robotic teams operating in militarily relevant environments.