Searching for optical transient and variable sources with the Palomar Transient Factory
The Palomar Transient Factory (PTF) is a comprehensive astronomical transient detection system that includes a wide-field survey camera, an automated real-time data reduction pipeline, a dedicated photometric follow-up telescope, and a full archive of all detected sources. The PTF survey camera is mounted on the robotic 48-inch Samuel Oschin Telescope at the Palomar Observatory in southern California and is used to scan the sky every night for optical transient and variable astronomical sources.1, 2 The majority of synoptic optical surveys are tuned to maximize discoveries of selected source populations (e.g., microlenses, classical novae, or supernovae). These surveys are crucial for specialized science investigations, but they leave much of the time-domain phase space unexplored. The PTF represents a next-generation transient survey that can be used to systematically explore the variable sky on a number of timescales.
PTF first light was achieved on 13 December 2008, and commissioning of the survey was completed on 1 March 2009. Although the original survey finished on 31 December 2012, operations are continuing until 2016 as part of the re-tooled Intermediate PTF (iPTF) survey.3 With the PTF, simultaneous discoveries of well-studied populations (e.g., classical novae or supernovae) and poorly constrained events (e.g., luminous red novae or tidal disruption flares) have been made. In addition, several phenomena that had previously only been predicted have now been observed for the first time (e.g., orphan afterglows of gamma-ray bursts and supernova precursor explosions). Over 2300 supernovae have been discovered with the PTF since 2009, including several examples that have been used to define a new class of astronomical transients. The discoveries that have been made with the PTF are summarized in Figure 1.

Data obtained with the PTF camera is immediately transferred (via the National Science Foundation's High Performance Wireless Research and Education Network and the Department of Energy's Energy Sciences Network) to the National Energy Research Scientific Computing Center (NERSC), where it is processed through two automated reduction pipelines.6 In the real-time transient detection pipeline (run at NERSC), machine-learning algorithms scan through the data to identify astronomical events that require further investigation. Our aim is to make systematic spectroscopic and photometric measurements for these targets, in as close to real time as possible. In addition, we use a near-real-time image subtraction pipeline to identify optical transients within minutes of the images being obtained. The output of this pipeline is sent to the University of California, Berkeley, where a source classifier—based on all available time-series and context data—produces a set of probabilistic statements about the scientific classification of the transient objects.7
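The candidate-vetting step can be illustrated with a toy scorer that turns a few detection features into a probability of being a real transient rather than a subtraction artifact. The feature names, weights, and bias below are hypothetical stand-ins, not the trained classifier the pipeline actually uses:

```python
import math

def real_bogus_score(features, weights, bias):
    """Logistic score in [0, 1]: higher means more likely a real transient."""
    z = bias + sum(w * features[name] for name, w in weights.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights -- illustrative only, chosen so that sharp,
# high-significance detections score well and artifacts score poorly.
WEIGHTS = {
    "snr": 0.8,           # detection signal-to-noise ratio
    "ellipticity": -2.0,  # elongated detections are often subtraction residue
    "n_bad_pixels": -1.5, # masked or saturated pixels near the candidate
}
BIAS = -2.0

real = {"snr": 12.0, "ellipticity": 0.1, "n_bad_pixels": 0.0}
bogus = {"snr": 5.5, "ellipticity": 0.9, "n_bad_pixels": 3.0}

print(real_bogus_score(real, WEIGHTS, BIAS))   # close to 1: likely real
print(real_bogus_score(bogus, WEIGHTS, BIAS))  # close to 0: likely bogus
```

In the production system the weights come from training on labeled examples, and many more features (and richer models) are used; the point here is only that each candidate is reduced to a probability that downstream follow-up can act on.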
About 100GB of raw optical imaging data is obtained each night at the Palomar Observatory. Existing reference images of the same parts of the sky are subtracted from this data to reveal new astrophysical transients (e.g., supernovae, variable stars, and other cataclysmic explosions). Our data processing at NERSC (i.e., detrending, alignment, and subtraction) produces about 500GB of new, reference, and subtraction imaging data. We use our machine-learning algorithms to scan this data set for the objects that are of scientific interest (approximately one per million detections). We then compare these detections against a 1TB database—containing over 1.5 billion objects—built from previous observations of the same portions of the sky. We publish the results on the Internet less than 40 minutes after the original observations are made at Palomar.
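A minimal sketch of the subtraction step, assuming the new and reference images are already astrometrically aligned and flux-matched (the real pipeline must do both first, and uses far more sophisticated detection than a simple threshold). The synthetic images, injected source, and 5-sigma cut are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for an aligned, flux-matched image pair.
shape = (64, 64)
reference = rng.normal(100.0, 5.0, shape)       # sky background + noise
new = reference + rng.normal(0.0, 5.0, shape)   # fresh noise realization
new[30, 40] += 200.0                            # an injected "transient"

# The difference image removes everything static; only noise and
# genuinely new (or variable) sources remain.
diff = new - reference

# Flag pixels that stand out well above the noise in the difference image.
sigma = np.std(diff)
candidates = np.argwhere(diff > 5.0 * sigma)
print(candidates)  # pixel coordinates of candidate transients
```

Everything that did not change between the two epochs cancels in `diff`, which is why subtraction surveys can find one interesting object among millions of static stars and galaxies.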
Follow-up measurements of detected transients are a vital component of any successful transient survey. The photometric and spectroscopic measurements we make of PTF objects allow us to learn more about the physics of the events and to determine their nature and distance more definitively. We use the Palomar 60-inch telescope to automatically generate colors and light curves for the interesting transients that are detected. As part of the PTF collaboration, we are also able to leverage 15 other telescopes for additional photometric and spectroscopic observations. Once we have spectroscopically or photometrically confirmed a target, the data is provided to the appropriate PTF consortium science group. We use an automated system to collate the detections from the Berkeley classification engine, make them available to the various follow-up facilities, coordinate the observations, and report the results.
In August 2011, a Type Ia supernova (SN 2011fe) was discovered through PTF observations. This supernova (see Figure 2) is closer to Earth (about 21 million light years) than any other supernova of this type observed in several decades. The real-time capabilities of the PTF pipeline allowed astronomers to catch this supernova within hours of its explosion, which is a rare feat. The astronomical community quickly began to observe this supernova with as many telescopes (and even binoculars) as possible, including the Hubble Space Telescope. The early observations of SN 2011fe were used to confirm some assumptions about the physics of Type Ia supernovae, and allowed a number of possible theoretical models to be ruled out. By examining the evolution of SN 2011fe's brightness, as well as its early spectral features, the PTF team constrained the size of the exploding star, its moment of explosion, what happened during the explosion, and the type of binary star system involved. This investigation provided the first direct evidence that Type Ia supernovae originate from carbon-oxygen white dwarf stars.8

We use automated real-time data reduction pipelines to process wide-field survey astronomical data from the PTF. Together with follow-up photometric and spectroscopic observations, these pipelines enable discoveries of new transient objects to be confirmed and studied. In 2016, the iPTF will cease operations and make way for the Zwicky Transient Facility (ZTF).9 This new collaboration will feature a camera we are currently building at the Palomar Observatory. With the ZTF, the entire northern sky will be surveyed every night, continuing the search for objects such as supernovae, black holes, and near-Earth asteroids. The ZTF will shoot one frame every 30 seconds, at 18 gigabits per frame. There will thus be an order of magnitude increase in data from the ZTF compared with the PTF. We are currently working to couple our astronomical pipelines with ever-improving high-performance computing resources to deal with the associated computational challenges.
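The quoted frame rate can be turned into a rough nightly data volume. The figures for frame size and cadence come from the text above; the 8-hour observing night is an assumption for this back-of-the-envelope estimate:

```python
# ZTF data-rate estimate from the quoted figures:
#   18 gigabits per frame, one frame every 30 seconds,
#   over an assumed 8-hour observing night.
bits_per_frame = 18e9
frames_per_night = 8 * 3600 / 30                     # 960 frames
bytes_per_night = frames_per_night * bits_per_frame / 8

print(f"{bytes_per_night / 1e12:.2f} TB per night")
```

Roughly 2TB of raw data per night, compared with the PTF's roughly 100GB, is consistent with the order-of-magnitude increase described above.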
The PTF is an international collaboration of scientists and engineers from the California Institute of Technology, the Department of Energy's National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory, NASA's Infrared Processing and Analysis Center, the University of California at Berkeley, Las Cumbres Observatory Global Telescope Network, the University of Oxford, Columbia University, the Weizmann Institute of Science in Israel, and the Pennsylvania State University. The principal investigator of the PTF is S. R. Kulkarni at the California Institute of Technology.
Peter Nugent is the division deputy for scientific engagement in the Computational Research Division. In his research he uses high-performance computers to tackle data analysis problems, as well as theoretical simulations in cosmology and astrophysics.