
Proceedings Paper

High-performance data processing using distributed computing on the SOLIS project
Author(s): Stephen Wampler

Paper Abstract

The SOLIS solar telescope collects data at a high rate, producing 500 GB of raw data each day. The SOLIS Data Handling System (DHS) is designed to quickly process this data down to 156 GB of reduced data. The DHS design uses pools of distributed reduction processes that are allocated to different observations as needed. A farm of 10 dual-CPU Linux boxes hosts the pools of reduction processes. Control is through CORBA, and data is stored on a Fibre Channel storage area network (SAN). Three other Linux boxes are responsible for pulling data from the instruments using SAN-based ring buffers. Control applications are Java-based, while the reduction processes are written in C++. This paper presents the overall design of the SOLIS DHS and provides details on the approach used to control the pooled reduction processes. The various strategies used to manage the high data rates are also covered.
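The pooled-allocation scheme described above can be sketched as follows. This is a hypothetical illustration in Java (the language the abstract names for the control applications), not the actual DHS implementation: the class name `ReductionPool`, the integer worker IDs, and the allocation policy (block for the first worker, claim additional ones only if immediately idle) are all assumptions made for the sketch.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

/**
 * Hypothetical sketch of a pool of reduction processes that a control
 * application hands out to observations as needed and reclaims afterwards.
 */
class ReductionPool {
    // Worker IDs not currently assigned to any observation.
    private final BlockingQueue<Integer> idle;

    ReductionPool(int size) {
        idle = new ArrayBlockingQueue<>(size);
        for (int id = 0; id < size; id++) {
            idle.add(id);
        }
    }

    /**
     * Claim up to n workers for an observation. Blocks until at least one
     * worker is free; further workers are taken only if immediately idle,
     * so a busy pool degrades gracefully instead of deadlocking.
     */
    List<Integer> allocate(int n) {
        List<Integer> claimed = new ArrayList<>();
        try {
            claimed.add(idle.take()); // wait for the first free worker
            while (claimed.size() < n) {
                Integer next = idle.poll(); // non-blocking: grab only idle ones
                if (next == null) {
                    break;
                }
                claimed.add(next);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return claimed;
    }

    /** Return an observation's workers to the pool when it finishes. */
    void release(List<Integer> workers) {
        idle.addAll(workers);
    }

    int idleCount() {
        return idle.size();
    }
}
```

In the real system the pools span a farm of Linux boxes and the handles would be CORBA object references rather than local integers, but the allocate/release life cycle is the same idea.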

Paper Details

Date Published: 13 December 2002
PDF: 10 pages
Proc. SPIE 4848, Advanced Telescope and Instrumentation Control Software II, (13 December 2002); doi: 10.1117/12.460917
Stephen Wampler, National Solar Observatory (United States)

Published in SPIE Proceedings Vol. 4848:
Advanced Telescope and Instrumentation Control Software II
Hilton Lewis, Editor(s)

© SPIE.