Automated scheduling for space-based exoplanet observatories
In the last 20 years, researchers from around the world have discovered nearly 1000 planets orbiting stars in our galaxy and identified several thousand more candidates. Some of these planets are similar to those in our solar system, while others are completely alien and previously unsuspected, with formation mechanisms that are not yet well understood. Direct imaging of exoplanets has the potential to greatly increase our knowledge of planets of all types and to solve some of these mysteries. Unfortunately, imaging from the ground is greatly complicated by the effects of the atmosphere. While next-generation ground-based systems will be able to detect giant extrasolar planets,[1, 2] our best bet for imaging smaller, more Earth-like planets is a space observatory.
Because a space telescope is a major undertaking in terms of both time and resources, it is vital to ensure that any proposed mission will not only generate the data we need to advance our scientific goals but also be operated as efficiently as possible. Both goals can be achieved through detailed modeling of the proposed instrument and spacecraft, and through simulation of entire missions. By adding an automated mission scheduler, we can generate thousands of possible mission scenarios and analyze them to find both the most likely outcomes and the outliers that could lead to mission failure. We can also use this framework to demonstrate optimal mission scheduling strategies.
Exoplanet imaging requires a method of blocking light from the star while capturing light emitted or reflected by the planet. This can be an internal coronagraph[3–5] or a separate spacecraft (an occulter, or ‘starshade’) positioned between the target and the telescope to block the starlight.[6, 7] Observation scheduling is a dynamic traveling salesman problem: there are target stars to be visited, and a cost associated with transitioning between each pair of targets that changes as the observatory moves along its orbit.[8] The cost is a weighted sum of heuristics encoding the scientific goals and engineering constraints, with the weights tuned to match different mission goals.
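The weighted-sum cost can be sketched as follows. This is a minimal illustration, not the mission's actual model: the heuristic terms and target fields (`det_prob`, `angle`, `fuel_cost`) are hypothetical stand-ins for the real science and engineering models.

```python
def transition_cost(current, candidate, weights):
    """Return a scalar cost for retargeting from `current` to `candidate`.

    Lower cost means a more attractive next target; tuning `weights`
    shifts the balance between science return and engineering limits.
    """
    heuristics = {
        # Science: prefer targets with a high probability of detection.
        "science": 1.0 - candidate["det_prob"],
        # Engineering: penalize large retargeting angles (slew time).
        "slew": abs(candidate["angle"] - current["angle"]) / 180.0,
        # Engineering: penalize fuel-hungry starshade repositioning.
        "fuel": candidate["fuel_cost"],
    }
    return sum(weights[name] * heuristics[name] for name in weights)


# Two candidate targets seen from the same current pointing:
current = {"det_prob": 0.0, "angle": 0.0, "fuel_cost": 0.0}
a = {"det_prob": 0.8, "angle": 90.0, "fuel_cost": 0.2}  # promising but far
b = {"det_prob": 0.3, "angle": 10.0, "fuel_cost": 0.1}  # modest but nearby

science_first = {"science": 1.0, "slew": 0.1, "fuel": 0.1}
fuel_limited = {"science": 0.1, "slew": 1.0, "fuel": 1.0}
# Under science-heavy weights, target `a` is cheaper; under fuel-limited
# weights, the preference flips to the nearby, inexpensive target `b`.
```

Retuning the weight vector is how the same machinery can be pointed at different mission goals, such as maximizing detections early in the mission versus conserving starshade fuel.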
The main science heuristic is based on the probability of detecting a planet at the next target,[9] while the engineering constraints include total and per-target observation time allocations and, in the case of starshades, the available fuel. The next target is selected from the minimum-cost path, which is found via a fixed-depth search of a tree whose edge lengths are defined by the cost function. Once an observation has been simulated, the cost function is re-evaluated for all of the targets, and the process is repeated.
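The fixed-depth search can be sketched as below: evaluate every path of `depth` observations through the remaining targets and commit only to the first step of the cheapest one. The pairwise `cost` function here is a placeholder for the full (time-varying) mission cost model described above.

```python
def best_next_target(current, targets, cost, depth):
    """Return the first target on the minimum-cost path of length `depth`."""

    def cheapest_path(node, remaining, d):
        # Total cost of the cheapest d-step path starting from `node`.
        if d == 0 or not remaining:
            return 0.0
        return min(cost(node, nxt) + cheapest_path(nxt, remaining - {nxt}, d - 1)
                   for nxt in remaining)

    return min(targets,
               key=lambda t: cost(current, t)
                             + cheapest_path(t, targets - {t}, depth - 1))


# Toy example: targets as positions on a line, cost = distance slewed.
def dist(a, b):
    return abs(a - b)

targets = frozenset([3.0, 1.0, 2.0])
nxt = best_next_target(0.0, targets, dist, depth=3)  # first step of 1 -> 2 -> 3
```

After each simulated observation, the scheduler would recompute the costs (the orbit has advanced, and time and fuel budgets have changed) and call the search again with the remaining targets.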
Figure 1 compares the number of unique planetary detections obtained with this scheduling technique to the results of multiple randomized observation schedules (shown in the blue histogram) and to those of a simplistic scheduler that selects the next target using only the probability of detection. The randomized schedules allow us to map the range of possible outcomes for a fixed population of planets and targets. The automated scheduler does not achieve the global maximum, because other mission goals compete for finite observation time, but it finds local optima based on the priorities assigned to each goal.
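A toy version of this ensemble experiment is sketched below: many randomized schedules are compared against a schedule that simply observes the most promising targets first. The target names and detection probabilities are invented for illustration; the real simulations use full mission and planet population models.

```python
import random

# Hypothetical per-target detection probabilities.
det_prob = {"A": 0.6, "B": 0.4, "C": 0.3, "D": 0.1, "E": 0.05}


def expected_detections(schedule):
    """Expected unique detections, assuming one independent visit per target."""
    return sum(det_prob[t] for t in schedule)


def random_schedule(rng, n_obs):
    """Draw a random schedule of n_obs distinct targets."""
    return rng.sample(sorted(det_prob), n_obs)


# Map the range of outcomes over many randomized schedules.
rng = random.Random(42)
ensemble = [expected_detections(random_schedule(rng, 3)) for _ in range(1000)]

# A detection-probability-only scheduler: observe the top targets first.
greedy = sorted(det_prob, key=det_prob.get, reverse=True)[:3]
```

In this simplified setting the greedy schedule bounds the randomized ensemble from above; in the full simulations, competing mission goals and time-varying costs are what pull the automated scheduler away from that idealized maximum.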
Figure 2 is a visualization of an ensemble of 1000 mission simulations. Each line represents a transition between targets; the lines are color coded by observation order, and their thicknesses are determined by the number of simulations containing that transition. We can clearly see the emergence of dominant observation patterns and the evolution of competing strategies, as illustrated in Figure 3, which plots two simulations exhibiting distinct observing strategies. In both cases, the paths are generated deterministically from our cost function rather than from a priori guidelines.
We have already applied the simulation framework described here to a variety of proposed mission concepts and designs[8, 10, 11] and are currently using it to study the potential exoplanet science gained by adding a coronagraph to the proposed Wide-Field InfraRed Survey Telescope-Astrophysics Focused Telescope Assets mission.[12] These tools will be used to design and operate the next generation of space observatories, which will greatly increase our knowledge of exoplanets. We will also use the methods described here to analyze the new data and to account for instrument-specific biases. This will allow us to test existing and future theories of planetary formation and evolution, and ultimately reveal how our solar system came to be and where it is headed.
I am very grateful for the many contributions of my various collaborators, including N. Jeremy Kasdin, Stuart Shaklan, Eric Cady, Bruce Macintosh, and Doug Lisman. Portions of this work were performed under the auspices of the US Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344. This research has made use of the NASA Exoplanet Archive, which is operated by the California Institute of Technology, under contract with NASA under the Exoplanet Exploration Program.
Dmitry Savransky received his PhD in mechanical and aerospace engineering from Princeton University and is currently a postdoctoral researcher, assisting in the integration and testing of the Gemini Planet Imager.