Award Abstract # 0959979
MRI-R2: Development of an Immersive Giga-pixel Display

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: THE RESEARCH FOUNDATION FOR THE STATE UNIVERSITY OF NEW YORK
Initial Amendment Date: April 23, 2010
Latest Amendment Date: February 9, 2012
Award Number: 0959979
Award Instrument: Standard Grant
Program Manager: Rita Rodriguez
CNS
 Division Of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: May 1, 2010
End Date: April 30, 2014 (Estimated)
Total Intended Award Amount: $1,400,000.00
Total Awarded Amount to Date: $1,400,000.00
Funds Obligated to Date: FY 2010 = $1,400,000.00
ARRA Amount: $1,400,000.00
History of Investigator:
  • Arie Kaufman (Principal Investigator)
    ari@cs.stonybrook.edu
  • Amitabh Varshney (Co-Principal Investigator)
  • Hong Qin (Co-Principal Investigator)
  • Dimitrios Samaras (Co-Principal Investigator)
  • Klaus Mueller (Co-Principal Investigator)
Recipient Sponsored Research Office: SUNY at Stony Brook
W5510 FRANKS MELVILLE MEMORIAL LIBRARY
STONY BROOK
NY  US  11794-0001
(631)632-9949
Sponsor Congressional District: 01
Primary Place of Performance: SUNY at Stony Brook
W5510 FRANKS MELVILLE MEMORIAL LIBRARY
STONY BROOK
NY  US  11794-0001
Primary Place of Performance Congressional District: 01
Unique Entity Identifier (UEI): M746VC6XMNH9
Parent UEI: M746VC6XMNH9
NSF Program(s): Major Research Instrumentation
Primary Program Source: 01R00910DB RRA RECOVERY ACT
Program Reference Code(s): 6890
Program Element Code(s): 118900
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

"This award is funded under the American Recovery and Reinvestment Act of 2009 (Public Law 111-5)."
Proposal #: 09-59979
PI(s): Kaufman, Arie E., Mueller, Klaus, Qin, Hong, Samaras, Dimitrios, Varshney, Amitabh
Institution: SUNY at Stony Brook
Title: MRI-R2: Development of an Immersive Giga-pixel Display
Project Proposed:
This project, developing a next-generation immersive display instrument (called the 'Reality Deck'), aims to explore and visualize data from many fields. To satisfy the need driven by the explosive growth of data sizes and environments already at the institution, the work builds on existing experience with immersive environments (e.g., the 'Immersive Cabin', a current-generation device using projectors). This unique project generates a one-of-a-kind exploration theater, using 308 high-resolution 30" LCD display monitors, by contributing an environment whose visual resolution is at the limit of the human eye's acuity. Within this environment, investigators can interact with the data/information displayed.
The instrument serves many groups, including visual computing, virtual and augmented reality, human-computer interfaces, computer vision and image processing, data mining, physics, scientific computing, chemistry, marine and atmospheric sciences, climate and weather modeling, material science, etc.
Collaborating scientists' applications will be ported to the RealityDeck, including applications in nanoelectronics, climate and weather modeling, biotoxin simulations, microtomography, astronomy, atmospheric science, a G-pixel camera for intelligence gathering, architectural design and disaster simulations, the smart energy grid, and many others.
A unique assembly of displays, a GPU cluster, sensors, communication/networking, computer vision, and human-computer interaction technologies, the RealityDeck is an engineering feat with user studies to deliver a holistic system with significant societal and research value. It is a one-of-a-kind pioneering G-pixel display approaching the limits of visual cognition that provides functionalities to a diverse user base. Its resolution at the eye's visual acuity and its field of view will always exceed those of a human user (wherever a human chooses to look), satisfying visual queries into the data in a very intuitive way. This visual interaction is tightly coupled with physical navigation.
This surround virtual environment consists of inertial sensors and six cameras mounted around the top corners of the RealityDeck room to allow interaction with the displays. The display system is driven by a cluster of about 80 high-end computer nodes, each equipped with two high-end GPUs. A small-scale video wall has already been constructed as an experimental platform for the RealityDeck, consisting of 9 high-resolution 30" LCD panels in a 3×3 configuration.
Broader Impacts:
The instrument will be used for research, education, and outreach across many departments at the institution, the University of Maryland, and Brookhaven National Laboratory (BNL). It fosters collaborations across disciplines attracting faculty, researchers, and students. RealityDeck significantly enriches the quality of visual thinking and data exploration. It substantially enhances the infrastructure of research and education and has the potential to alter the way computer graphicists, engineers, and scientists work and/or conduct scientific discoveries.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


K. Petkov, C. Papadopoulos, M. Zhang, A.E. Kaufman, and X. Gu "Conformal Visualization for Partially-Immersive Platforms" IEEE Virtual Reality, 2011, p. 143
Z. Zheng, N. Ahmed, K. Mueller "iView: A Feature Clustering Framework for Suggesting Informative Views in Volume Visualization" IEEE Transactions on Visualization and Computer Graphics, v.17, 2011, p. 1959

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.



Project Title: Development of an Immersive Giga-pixel Display

 

Project Outcomes Report for the General Public

Introduction

The goal of this Major Research Instrumentation (MRI) project was the design, engineering, and implementation of a unique super-high-resolution visualization facility, termed the Reality Deck, targeted at the display of big data. Past platforms targeted at big-data visualization extended to approximately 300 megapixels of resolution. Nowadays, however, it is trivial to acquire datasets (e.g., high-resolution imagery) that span multiple gigapixels in size.

Constructed to answer the challenge of big-data visualization in large immersive display environments, the Reality Deck is housed in a 40'x30' lab where 416 LCD monitors are tiled and mounted in a four-wall arrangement. The total work space enclosed by the monitors is 33'x19'. Overall, the Reality Deck provides an aggregate resolution of more than 1.5 gigapixels. Additionally, it offers full horizontal immersion by means of its four walls and a mechanized door. This means that data with a panoramic component can be mapped very naturally to the display, and users can explore it simply by physical navigation, without the need to "pan" the visualization using a controller. The facility's size and form factor invite users to walk around, examining different aspects of the visualized data by approaching different sections of the display space. We leverage this characteristic of the facility in several research contributions outlined below.
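The aggregate-resolution figure above can be sanity-checked with a quick back-of-the-envelope calculation, assuming each of the 416 panels runs at 2560×1440 (the native QHD resolution of the Samsung S27A850D monitors described in the Construction section):

```python
# Rough check of the Reality Deck's aggregate resolution, assuming
# 416 panels at 2560x1440 (QHD) each.
PANELS = 416
WIDTH, HEIGHT = 2560, 1440

pixels_per_panel = WIDTH * HEIGHT       # 3,686,400 pixels per monitor
aggregate = PANELS * pixels_per_panel   # total pixels across all four walls

print(f"{aggregate:,} pixels ~= {aggregate / 1e9:.2f} gigapixels")
# -> 1,533,542,400 pixels ~= 1.53 gigapixels
```

This agrees with the "more than 1.5 gigapixels" figure reported above.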

 

Construction

Similar to other room-sized displays, the Reality Deck is constructed by tiling multiple LCD monitors in a precise fashion. In the Reality Deck's case, we utilize commodity, off-the-shelf hardware with cost-efficient modifications to drive construction costs down and to ease maintenance. Each Samsung S27A850D monitor used was modified to reduce bezel size and to reroute all distracting electronics to its rear.

Images and 3D visuals displayed in the Reality Deck are generated by a cluster of 20 workstation-class computers, each utilizing 4 GPUs (Graphics Processing Units). For most cluster nodes, each GPU is connected to 6 monitors, for a total of 24 monitors per node. The visualization cluster is connected to the Reality Deck using active fiber-optic DisplayPort extender cables; 7 miles of cable run throughout the Reality Deck.
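The display-to-cluster mapping above can be sketched numerically. Note that 20 nodes × 4 GPUs × 6 outputs yields 480 possible outputs, more than the 416 panels, which is consistent with only "most" nodes driving a full 24 monitors. The exact assignment of the leftover panels to nodes is an assumption here, shown purely for illustration:

```python
# Sketch of the Reality Deck display-to-cluster mapping.
# 20 nodes x 4 GPUs x 6 DisplayPort outputs = 480 outputs for 416 panels,
# so not every node carries a full complement of 24 monitors.
NODES, GPUS_PER_NODE, OUTPUTS_PER_GPU = 20, 4, 6
PANELS = 416

total_outputs = NODES * GPUS_PER_NODE * OUTPUTS_PER_GPU   # 480
per_node = GPUS_PER_NODE * OUTPUTS_PER_GPU                # 24 monitors/node
full_nodes, leftover = divmod(PANELS, per_node)

print(f"{total_outputs} outputs available for {PANELS} panels")
print(f"{full_nodes} fully loaded nodes; {leftover} panels on the remaining nodes")
```

Under this assumed layout, 17 nodes would drive 24 monitors each, with the remaining 8 panels spread across the other nodes.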

The facility is equipped with numerous interaction peripherals. A 24-camera infrared tracking system provides positional information for people and objects, while an ambisonic sound system allows audio cues to be tied into the visualization. A touch-enabled table computer is available in the middle of the facility, enabling natural user interaction.

The design process and challenges behind the construction of the Reality Deck have been documented and are available for future system builders [3, 5].

 

Visualization and Interaction Techniques

Visualization software for the Reality Deck supports the visualization of multiple data formats (videos, 3D models, gigapixel images, geospatial data, medical volumes, etc.). We have also developed several visualization and interaction techniques that enable natural interfacing with the data and improve system performance.

The Infinite Canvas [4] is a walking-based interface that allows users to explore data that extends arbitrarily along one dimension. Our novel frameless visualization scheme [8] allows reconstruction of high-resolution and high-frame-rate images from multi-t...
