Ultra-high-resolution walls for visualizing very large datasets
In recent years, the ability to acquire, generate, store, process, interlink, and query data has increased spectacularly. This has had a profound impact on many scientific disciplines, such as astronomy, molecular biology, and particle physics. These increasingly data-driven domains also raise several challenges, however, spanning numerous fields of research within computer science, e.g., databases, data mining, communication networks, as well as human–computer interaction and, more specifically, interactive data visualization. Indeed, scientists are faced with very large amounts of data that are difficult to understand and analyze in depth because of their sheer size and complexity. Users therefore require effective tools to freely (yet efficiently) explore, make sense of, and interactively manipulate their data.
One data visualization approach is the use of ultra-high-resolution wall displays, i.e., displays that have a very high pixel density over a large physical surface. Early work in this field focused mainly on the technical aspects of the displays, i.e., how to build such platforms, how to display complex graphics, and how to stream data across the nodes of the computer clusters that drive them.1 These early studies paid little attention to issues related to interacting with the display surfaces. The platforms therefore generally offered only limited interaction capabilities, e.g., a wireless mouse and keyboard on a stand, or gyroscopic mice. To make wall displays truly interactive, however, they are increasingly being coupled with input devices such as touch frames, motion-tracking systems, and wireless multitouch devices.2 This enables multidevice and multiuser interaction with the displayed data.
As part of our research in the field of data visualization, we have investigated the design, engineering, and evaluation of ultra-high-resolution wall-sized displays.2 In this research, we focus on designing and empirically evaluating novel interaction techniques tailored to wall display environments. For example, we have developed high-precision remote pointing techniques that allow users to interact with the wall even when they are not within reach of the display.3 In addition, our mid-air panning and zooming techniques can be used to navigate maps, images, and datasets that exceed the display capacity of the walls.4
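The core idea behind high-precision remote pointing is to vary the control-display gain with hand speed, so that large movements cross the wall quickly while slow movements give pixel-level precision. The sketch below is purely illustrative; the function name and all thresholds are hypothetical and do not reproduce the published techniques.3

```python
# Hypothetical sketch of a dual-mode control-display (CD) gain for remote
# pointing on a wall display: fast hand motion maps to coarse cursor jumps,
# slow motion switches to a low gain for fine target acquisition.
# All gains and thresholds below are illustrative, not the actual technique.

def cursor_delta(hand_delta_mm: float,
                 coarse_gain: float = 40.0,
                 precise_gain: float = 2.0,
                 speed_threshold_mm: float = 5.0) -> float:
    """Map a per-frame hand displacement (mm) to a cursor displacement (px)."""
    if abs(hand_delta_mm) > speed_threshold_mm:
        # Coarse mode: a small arm sweep traverses a large portion of the wall.
        return hand_delta_mm * coarse_gain
    # Precision mode: slow motion yields near-pixel-level control.
    return hand_delta_mm * precise_gain
```

A real implementation would smooth the speed estimate over several frames to avoid abrupt mode switches; this sketch only shows the two-regime mapping.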
The first wall display that we set up in our laboratory—WILD (wall-sized interaction with large datasets)—has a total resolution of 20,480 × 6400 (i.e., 131 megapixels) over a surface area of 5.5 × 1.8 m. Part of the ‘GLIMPSE360’ IR image of the Milky Way (from the Spitzer Space Telescope)5 is shown, displayed on our newer wall—known as WILDER—in Figure 1. Although WILDER has a somewhat lower resolution (14,400 × 4800) than our original wall, it features a touch-sensitive frame and has much narrower screen bezels. With platforms such as WILD and WILDER, we can represent data with a high level of detail while simultaneously retaining context. In other words, users can transition from an overview of the data to a detailed view simply by moving in front of the wall display. Wall displays thus support collaborative work, as they enable multiple users to simultaneously visualize and interact with the data (as long as the appropriate input devices and interaction techniques are available).
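The figures quoted above can be checked directly from the stated dimensions; the short computation below derives the megapixel counts and, for WILD (whose physical size is given), the approximate pixel density.

```python
# Resolution and pixel-density figures computed from the dimensions quoted
# in the text: WILD is 20,480 x 6,400 pixels over 5.5 x 1.8 m; WILDER is
# 14,400 x 4,800 pixels (its physical size is not stated in the article).

wild_px = (20480, 6400)
wild_m = (5.5, 1.8)

wild_megapixels = wild_px[0] * wild_px[1] / 1e6   # 131.072 -> "131 megapixels"
px_per_metre = wild_px[0] / wild_m[0]             # ~3724 px/m horizontally
wild_ppi = px_per_metre * 0.0254                  # ~94.6 pixels per inch

wilder_px = (14400, 4800)
wilder_megapixels = wilder_px[0] * wilder_px[1] / 1e6   # 69.12 megapixels
```

Roughly 95 ppi viewed from arm's length or farther is what allows the overview-plus-detail behavior described above: fine detail up close, full context from a few steps back.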
Our research also concerns the engineering of this specific type of interactive system. Ultra-high-resolution wall displays pose significant data-sharing and graphics-rendering challenges because they are typically driven by clusters of computers (for instance, our WILD platform uses 32 + 1 graphics processing units in 16 + 1 computers). The multiple input channels from the heterogeneous devices involved in user interactions with a wall display (e.g., motion trackers, tablets, smartphones, and laptops) add further complexity. We have thus designed and implemented software toolkits to ease the rapid prototyping and development of advanced interactive visualizations running on cluster-driven display surfaces.6, 7
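One basic problem such toolkits must solve is partitioning the global display surface across cluster nodes, so that each node renders only its own sub-viewport of a shared scene rather than receiving full-wall pixels over the network. The sketch below illustrates the idea only; the function name and the 8 × 2 tiling are assumptions for illustration, and the actual toolkits6, 7 handle this far more generally.

```python
# Minimal sketch of mapping the global wall surface to per-node viewports
# on a cluster-driven display. The 8x2 tiling and function name are
# illustrative assumptions, not the actual toolkit API.

def node_viewport(node_col: int, node_row: int,
                  cols: int = 8, rows: int = 2,
                  wall_px: tuple = (20480, 6400)) -> tuple:
    """Return (x, y, w, h) of the wall region rendered by one cluster node."""
    tile_w = wall_px[0] // cols
    tile_h = wall_px[1] // rows
    return (node_col * tile_w, node_row * tile_h, tile_w, tile_h)

# Each node loads the same scene description but offsets its camera to its
# own sub-viewport, so only scene state (not rendered pixels) crosses the
# network, and all tiles stay synchronized frame to frame.
```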
The applications of our approach range from the monitoring of complex infrastructures and crisis-management situations to tools for the exploratory visualization of scientific data. With our latest application—FITS-OW (flexible image transport system-on wall)8—we enable astronomers to visualize and interact with very large FITS images (and collections of these images), as shown in Figure 2. Users can pan and zoom images that can measure several hundred thousand pixels in width and height. They can also overlay the results of data analyses, and fetch and display additional images of a specific object or region in the sky. In this way it is possible to show observations from different parts of the electromagnetic spectrum or from different times. With FITS-OW, astronomers can also query databases, e.g., the SIMBAD (set of identifications, measurements, and bibliography for astronomical data) server, and visualize the results of their queries in situ (i.e., right next to the corresponding source in the image). Furthermore, detailed information can be shown for multiple sources simultaneously (including multiple measurements and documents). To perform these operations, astronomers use interaction techniques that have been designed specifically for wall displays, i.e., using direct manipulation and gestures on the wall's surface or on handheld tablets.9 Some of the specific challenges that we have addressed with FITS-OW include the generation of FITS tile pyramids and their multiscale rendering, queries to sky catalogs, dynamic adjustments of scale, color mapping, and graphics-compositing settings, as well as the underlying input-management framework.8
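To make images of that size pannable and zoomable, they are pre-cut into a tile pyramid: each level halves the resolution until the whole image fits in a single tile, and the renderer fetches only the tiles visible at the current scale. The sketch below shows only the level-and-tile arithmetic, under assumed names and a 512-pixel tile size; the actual FITS-OW formats and APIs8 differ.

```python
import math

# Hedged sketch of tile-pyramid bookkeeping for a very large image, in the
# spirit of FITS-OW's multiscale rendering. The 512 px tile size and the
# function names are illustrative assumptions, not the actual implementation.

TILE = 512  # tile edge in pixels (assumed)

def pyramid_levels(width: int, height: int, tile: int = TILE) -> int:
    """Number of pyramid levels until the whole image fits in one tile."""
    levels = 1
    while max(width, height) > tile:
        width = math.ceil(width / 2)
        height = math.ceil(height / 2)
        levels += 1
    return levels

def tile_index(x: int, y: int, level: int, tile: int = TILE) -> tuple:
    """Tile coordinates holding full-resolution pixel (x, y) at a level,
    where level 0 is full resolution and each level halves both dimensions."""
    scale = 2 ** level
    return (x // (tile * scale), y // (tile * scale))
```

For a hypothetical 200,000 × 100,000-pixel image, ten levels suffice with 512-pixel tiles, and at any moment the wall only needs the handful of tiles intersecting each node's viewport at the current zoom level.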
In summary, our research is focused on the design and evaluation of novel interaction techniques for the visualization of large datasets. For such purposes, we investigate the use of ultra-high-resolution walls, whose high pixel density enables the display of highly detailed data while retaining context information. We have developed a set of techniques to enhance user interaction with wall displays, such as high-precision remote pointing and multiscale navigation techniques. We have also created new software to facilitate the rapid prototyping and development of applications running on such cluster-based interactive systems. Our work is applicable in a range of fields, including crisis management and scientific data analysis. Our future work will focus on the design of gesture-based interaction techniques to enable users to perform more advanced processing and interactive data manipulation tasks directly on wall displays.
Emmanuel Pietriga is a senior research scientist. In his research he focuses on interactive visualization techniques for large datasets, including multiscale user interfaces and ultra-high-resolution wall-sized displays. He collaborates with the Atacama Large Millimeter Array (ALMA) radio telescope group on user interfaces for operations monitoring and control.