

Effective computing for radio astronomy instruments

A reconfigurable computing platform and software approach significantly improves development times for high-performance, real-time digital signal processing instrumentation.
26 February 2014, SPIE Newsroom. DOI: 10.1117/2.1201402.005360

There is an ever-increasing need for high-performance, real-time digital signal processing computing power for radio astronomy instrumentation applications such as beam forming, imaging, radio frequency interference mitigation, and wide-band high-resolution spectroscopy. Until recently, such instruments (e.g., the Atacama Large Millimeter/submillimeter Array correlator1) were custom designed using specialized boards, backplanes, interconnects, and monitor and control software. A typical cycle of designing, constructing, and debugging such an instrument, however, takes several years, by which time the implementation technology has often become obsolete as the electronics industry advances in step with Moore's law.
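To make the spectroscopy use case concrete, the following toy sketch (not any actual CASPER or GUPPI code) shows the core operation of an FFT spectrometer: a sample stream is split into blocks, each block is channelized with an FFT, and the channel powers are accumulated over many spectra. All parameters here are illustrative.

```python
import numpy as np

def accumulate_spectrum(samples, n_chan, n_spectra):
    """Toy FFT spectrometer: channelize a sample stream block by block
    and accumulate the power in every frequency channel."""
    acc = np.zeros(n_chan)
    for i in range(n_spectra):
        block = samples[i * n_chan:(i + 1) * n_chan]
        acc += np.abs(np.fft.fft(block)) ** 2  # channel power, summed
    return acc / n_spectra

# A test tone placed exactly in channel 32 of a 256-channel spectrometer.
n_chan, n_spectra = 256, 16
t = np.arange(n_chan * n_spectra)
tone = np.cos(2 * np.pi * (32 / n_chan) * t)
spec = accumulate_spectrum(tone, n_chan, n_spectra)
print(int(np.argmax(spec[:n_chan // 2])))  # the tone lands in channel 32
```

Real instruments such as VEGAS use polyphase filterbanks rather than plain FFTs to reduce spectral leakage, but the channelize-and-accumulate structure is the same.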

Since 2006, a group (led by Dan Werthimer of the Space Sciences Laboratory at the University of California, Berkeley) has been developing a scalable and upgradable field-programmable gate array (FPGA) computing platform and software design method, which has a wide variety of applications in radio astronomy. This project—the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER)2—relies on a small number of modular, connectible, upgradable hardware components and platform-independent software libraries that can be reused and rescaled as hardware capabilities expand. With this approach, existing designs can easily be migrated to new generations of hardware.

We developed the Green Bank Ultimate Pulsar Processing Instrument (GUPPI)3 using CASPER technology (the Matlab/Simulink-based tool set and the first generation of CASPER hardware). This is one of the instruments on the National Radio Astronomy Observatory's Robert C. Byrd Green Bank Telescope (GBT).4 The GBT is the world's largest fully steerable single-dish radio telescope, with receivers that are sensitive in the 300MHz to 100GHz frequency range. With GUPPI, we are able to process the entire 800MHz bandwidth of the GBT 1.4GHz receiver. With colleagues at the University of California, Berkeley, we have also been developing the Versatile Green Bank Astronomical Spectrometer (VEGAS)5 instrument since 2010. This instrument is based on the latest generation of CASPER hardware (see Figure 1).

Figure 1. A Collaboration for Astronomy Signal Processing and Electronics Research (CASPER) Roach II electronics board used in the Versatile Green Bank Astronomical Spectrometer (VEGAS) instrument. The field-programmable gate array (FPGA), two analog-to-digital converters (ADCs), and the ADC clock synthesizer can be seen.

We were commissioned in 2013 to build a combined pulsar and spectral line backend for the Shanghai Astronomical Observatory, which we call the Digital Backend System (DIBAS). We were able to provide the required spectral line capability for DIBAS from our VEGAS instrument design. This was therefore a good test case to determine the utility of the CASPER approach because it was, in principle, simple to port our GUPPI pulsar modes to the new generation of hardware.

We were able to port ∼60% of the original GUPPI design elements directly into the equivalent DIBAS modes. The remaining ∼40% required only minimal modifications or additions, all a direct result of the 16 simultaneous analog-to-digital converter (ADC) samples produced in each DIBAS clock cycle, compared with the eight simultaneous ADC samples produced in each GUPPI clock cycle (i.e., DIBAS has twice the data production rate of GUPPI). For the timing-mode designs, we were able to port only ∼30% of the GUPPI design elements directly into their DIBAS derivatives. The remaining ∼70% required substantial modifications and additions to accommodate the higher DIBAS data rate.
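The demultiplexing factors above (8 and 16 samples per clock) determine the aggregate sample rate the FPGA pipeline must absorb. A small illustrative calculation, assuming a 200MHz fabric clock (an assumed, typical figure, not a quoted specification):

```python
# Illustrative only: relate the FPGA fabric clock and the demux factor
# (parallel ADC samples per clock cycle) to the aggregate sample rate.
# Demux factors of 8 (GUPPI) and 16 (DIBAS) come from the text; the
# 200 MHz clock is an assumption for illustration.

def aggregate_sample_rate(fpga_clock_hz, samples_per_clock):
    """Samples per second the DSP pipeline must process."""
    return fpga_clock_hz * samples_per_clock

fpga_clock = 200e6  # assumed fabric clock, Hz
guppi = aggregate_sample_rate(fpga_clock, 8)
dibas = aggregate_sample_rate(fpga_clock, 16)
print(guppi / 1e9, dibas / 1e9)  # 1.6 and 3.2 GS/s: DIBAS absorbs 2x
```

Doubling the per-clock parallelism is why even "ported" design elements needed rework: every datapath downstream of the ADC interface must be widened to match.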

The original GUPPI FPGA hardware design—see Figure 2(a)—comprised six different FPGAs distributed over three distinct hardware units. This arrangement required us to use four 10Gb/s XAUI (attachment unit interface) high-speed links to connect the samplers to the Berkeley Emulation Engine (BEE-2) computing platform. We also needed dozens of embedded high-speed interconnects (printed circuit board traces) to connect the various FPGAs on the BEE-2. Implementing these connections proved to be our greatest challenge during the development of GUPPI. In comparison, our basic DIBAS FPGA design—see Figure 2(b)—comprises only one FPGA located on a single printed circuit board, to which two high-speed ADCs are directly connected via ultra-high-speed connectors. This arrangement completely eliminated the interconnect issues of the GUPPI design and greatly simplified our overall development task. Although considerable effort was required to port our designs to the new hardware platform, our development of the DIBAS FPGA benefited greatly from the GUPPI heritage: the original GUPPI development took ∼18 months, whereas the conversion to DIBAS took only four months. This improvement is due to increased familiarity with the CASPER tools and the considerable reuse of existing code.

Figure 2. Block diagrams of (a) the original GUPPI design and (b) the updated Digital Backend System (DIBAS) design. XAUI: 10Gb/s attachment unit interface. SMA: SubMiniature version A connector. GPU: Graphics processing unit. PPS: Pulse per second. MS/s: Mega samples per second. GbE: Gigabit Ethernet. TB: Terabyte.

Using CASPER, we have been able to move existing designs—for both hardware and software—to more modern, capable hardware implementations in an efficient way that would be impossible without these tools. The CASPER toolkit involves a steep learning curve and the development environment requires care and attention, but we have found that the promise of the project has been realized. Our next task is to port the pulsar modes developed for DIBAS back to the VEGAS instrument. The work we have described here will be presented in more detail in a forthcoming SPIE article.6

The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities Inc.

Richard Prestage, John Ford
National Radio Astronomy Observatory (NRAO)
Green Bank, WV

Richard Prestage is currently a scientist at NRAO Green Bank, having previously been the site director, as well as head of technical services for the Atacama Large Millimeter/submillimeter Array Observatory. He has a BSc from the University of Leeds and a PhD from the University of Edinburgh.

John Ford has been working at NRAO since 1995 as monitor and control system engineer, and is now the head of the Electronics Division. His technical interests include embedded control systems and digital signal processing design. He holds a BSEE and an MSEE from the University of Memphis.

1. R. P. Escoffier, G. Comoretto, J. C. Weber, A. Baudry, C. M. Broadwell, J. H. Greenberg, R. R. Treacy, The ALMA correlator, Astron. Astrophys. 462, p. 801-810, 2007.
2. A. Parsons, D. Werthimer, D. Backer, T. Bastian, G. Bower, W. Brisken, H. Chen, et al., Digital instrumentation for the radio astronomy community, arXiv:0904.1181 [astro-ph.IM], 2009.
3. R. DuPlain, S. Ransom, P. Demorest, P. Brandt, J. Ford, A. L. Shelton, Launching GUPPI: the Green Bank Ultimate Pulsar Processing Instrument, Proc. SPIE 7019, p. 70191D, 2008. doi:10.1117/12.790003
4. R. M. Prestage, K. T. Constantikes, T. R. Hunter, L. J. King, R. J. Lacasse, F. J. Lockman, R. D. Norrod, The Green Bank Telescope, Proc. IEEE 97, p. 1382-1390, 2009.
5. D. A. Roshi, M. Bloss, P. Brandt, S. Bussa, H. Chen, P. Demorest, G. Desvignes, Advanced multi-beam spectrometer for the Green Bank Telescope, Proc. Int'l Union Radio Sci. General Assembly, 2011. doi:10.1109/URSIGASS.2011.6051280
6. R. M. Prestage, J. Ford, Experiences with the design and construction of wideband spectral line and pulsar instrumentation with CASPER hardware and software: the digital backend system, Proc. SPIE 9152, 2014. (Presentation at SPIE Astronomical Telescopes + Instrumentation 2014.)