
NSF Org: CNS Division of Computer and Network Systems
Initial Amendment Date: August 13, 2003
Latest Amendment Date: August 29, 2006
Award Number: 0321377
Award Instrument: Standard Grant
Program Manager: Rita Rodriguez (CNS Division of Computer and Network Systems, CSE Directorate for Computer and Information Science and Engineering)
Start Date: September 1, 2003
End Date: August 31, 2008 (Estimated)
Total Intended Award Amount: $0.00
Total Awarded Amount to Date: $400,000.00
Recipient Sponsored Research Office: 3720 S Flower St Fl 3, Los Angeles, CA 90033, US; (213) 740-7762
Primary Place of Performance: 3720 S Flower St Fl 3, Los Angeles, CA 90033, US
NSF Program(s): Major Research Instrumentation
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
This project, developing the technology for Distributed Immersive Performance (DIP), deals with live, interactive musical performances in which participants in different physical locations are interconnected by very high fidelity multichannel audio and video links. DIP, a specialized realization of broader immersive technology, creates the complete aural and visual ambience that places a group in a virtual space to experience events occurring at a remote site or to communicate naturally regardless of location.

The assembled DIP experimental system will have three sites in different locations on the USC campus. The sites will have different types of equipment to test the effects of video and audio fidelity on the ease of use and functionality of different applications. Two will have high-definition (HD) video or digital video (DV) quality images projected onto wide-screen wall displays, completely integrated with an immersive audio reproduction system for a seamless, fully three-dimensional aural environment with correct spatial sound localization for participants. The system will be capable of storing and playing back the many streams of synchronized audio and video data (immersidata) and will use novel protocols for the low-latency, seamless, synchronized real-time delivery of immersidata over local-area and wide-area networks such as the Internet.

Partners in the project include the New World Symphony (NWS) of Miami Beach, the University of Maryland, and Georgia Tech; the latter two contribute Internet2 infrastructure as server sites. The enabled research addresses the following challenges:
- Low-latency continuous media (CM) stream transmission, synchronization, and data-loss management
- Low-latency, real-time video and multichannel immersive audio acquisition and rendering
- Real-time continuous media stream recording, storage, and playback
- Human-factors studies: psychophysical, perceptual, artistic, and performance evaluation
- Robust integration of all these technical areas into a seamless presentation to participants
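The first of these challenges, synchronized low-latency delivery across streams with differing network delays, is commonly handled by timestamping frames at the source and holding them briefly in a jitter buffer that releases frames in timestamp order and drops data that arrives too late. The sketch below illustrates that general idea only; it is not the DIP project's actual protocol, and all class, method, and parameter names (`JitterBuffer`, `max_skew_ms`, etc.) are hypothetical.

```python
import heapq

class JitterBuffer:
    """Illustrative multi-stream jitter buffer: frames from each stream
    are buffered briefly so that streams arriving with different network
    delays can be released together in timestamp order. This is a sketch
    of a standard technique, not the award's actual implementation."""

    def __init__(self, stream_ids, max_skew_ms=30):
        self.max_skew_ms = max_skew_ms             # frames lagging by more are dropped
        self.queues = {s: [] for s in stream_ids}  # per-stream min-heap on timestamp

    def push(self, stream_id, timestamp_ms, frame):
        heapq.heappush(self.queues[stream_id], (timestamp_ms, frame))

    def pop_synchronized(self):
        """Release one frame per stream once every stream has buffered
        something; frames whose timestamps trail the newest head frame
        by more than max_skew_ms are discarded (late data is dropped
        rather than replayed, to keep end-to-end latency low)."""
        if any(not q for q in self.queues.values()):
            return None                            # some stream is still waiting
        newest_head = max(q[0][0] for q in self.queues.values())
        released = {}
        for sid, q in self.queues.items():
            ts, frame = heapq.heappop(q)
            if newest_head - ts > self.max_skew_ms:
                continue                           # too late: drop this frame
            released[sid] = (ts, frame)
        return released
```

For example, pushing an audio frame stamped 0 ms and a video frame stamped 5 ms and then calling `pop_synchronized()` releases both together, since their 5 ms skew is within the tolerance; a frame trailing by more than 30 ms would be silently dropped instead.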
Students participate in the assembly, integration, testing, and research applications of the DIP system through thesis research, directed research projects, and classroom work. To offer students a broad cross-disciplinary education in media systems engineering, undergraduate engineering majors from USC and CSULA are invited to participate.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without a charge during the embargo (administrative interval). Some links on this page may take you to non-federal websites, whose policies may differ from this site's.
Please report errors in award information by writing to: awardsearch@nsf.gov.