
NSF Org: IIS Division of Information & Intelligent Systems
Recipient:
Initial Amendment Date: May 28, 2015
Latest Amendment Date: July 7, 2017
Award Number: 1539534
Award Instrument: Continuing Grant
Program Manager: William Bainbridge, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: August 31, 2014
End Date: January 31, 2020 (Estimated)
Total Intended Award Amount: $420,663.00
Total Awarded Amount to Date: $420,663.00
Funds Obligated to Date: FY 2014 = $84,436.00; FY 2015 = $73,358.00; FY 2016 = $142,615.00; FY 2017 = $114,987.00
History of Investigator:
Recipient Sponsored Research Office: 3112 LEE BUILDING, COLLEGE PARK, MD, US 20742-5100, (301) 405-6269
Sponsor Congressional District:
Primary Place of Performance: Hornbake Building, College Park, MD, US 20742-5141
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01001415DB NSF RESEARCH & RELATED ACTIVIT; 01001516DB NSF RESEARCH & RELATED ACTIVIT; 01001617DB NSF RESEARCH & RELATED ACTIVIT; 01001718DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
This research project addresses the fundamental question of how we can use the existing ecosystem of networked devices in our surroundings to make sense of and exploit massive, heterogeneous, and multi-scale data anywhere and at any time. Assembling these devices into unified sensemaking environments would enable deep analysis in the field. Examples include managing heterogeneous data in scientific lab notebooks, scaffolding undergraduate classroom learning with examples, manuals, and videos, and supporting police investigation by linking facts, findings, and evidence. On a higher level, this concept would stimulate our digital economy by supporting fields such as design and creativity, command and control, and scientific discovery. However, despite this ready access to a myriad of handheld devices as well as those integrated in our physical environments, each individual device is currently designed to be the focus of attention, cannot easily be combined with other devices to improve productivity, and has limited computational and storage resources. This project introduces a comprehensive new approach called ubiquitous analytics (ubilytics) for harnessing these ever-present digital devices into unified environments for anywhere analysis and sensemaking of data.
Ubilytics draws on human-computer interaction, visual analytics, and ubiquitous computing, as well as a synthesis of distributed, extended, and embodied cognition, and rests on three principles. First, universal interaction requires an interaction model that combines several devices into a holistic distributed interface; transparently bridges multiple devices, surfaces, and even physical objects; and unifies interaction with various data types. Second, flexible visual structures must generate representations that adapt to varying device dimensions, resolution, viewing angle, and distance; support space and layout management in ego-centric and world-centric configurations; and can use both novel and appropriated displays for output. Third, an efficient distributed architecture must provide methods for discovering, merging, and synchronizing heterogeneous devices, support a generic component model to facilitate reuse, offload costly computation to the cloud, and mesh ubilytics environments for collaboration.
Sensemaking is often attributed to professional analysts finding meaning from observed data, but this research will take a comprehensive view of sensemaking for both casual and expert users, in both dedicated and mobile settings, and with both large-scale and small-scale datasets. This work will therefore benefit society by focusing on three example domains: (1) scientific discovery, (2) classroom learning, and (3) police investigation. It will also advance discovery and understanding by integrating the research into an undergraduate programming course used as a testbed for learning in ubilytics environments. Another goal is to broaden participation of underrepresented groups by engaging with a women-in-engineering program as well as by mentoring minority undergraduate students during summer research internships. Results, software, and documentation will be disseminated under Open Source and Creative Commons licenses.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
The Ubilytics project posed the fundamental question of how we can use the existing ecosystem of networked devices in our surroundings to understand and exploit massive, heterogeneous, and multi-scale data anywhere and at any time. Assembling these devices into unified sensemaking environments will enable deep analytical reasoning in the field, such as managing heterogeneous data in scientific lab notebooks, scaffolding undergraduate learning, and supporting investigative reporting by linking facts, findings, and evidence. On a higher level, this concept would stimulate our digital economy by supporting fields such as design and creativity, command and control, and scientific discovery.
However, despite this ready access to myriad devices, both small (smartphones, tablets, and smartwatches) and large (desktop computers, multitouch displays, and mixed reality headsets), each individual device is currently designed to be the focus of attention and cannot easily be combined with other devices to improve productivity. Another limiting factor is that interfaces and visual representations are typically designed for a particular form factor and adapt poorly to new settings when migrating between devices. Finally, the computational and storage resources of most mobile devices are insufficient both for the complex analyses required and for persistently storing session state across fluctuating ensembles of devices.
Ubiquitous analytics, or ubilytics, is a comprehensive new approach for harnessing these ever-present digital devices into unified environments for anywhere analysis and sensemaking of data. It draws on human-computer interaction, visual analytics, and ubiquitous computing, as well as a synthesis of distributed, extended, and embodied cognition. These theories of cognition challenge traditional cognitive science by insisting that cognition is not limited to the brain but also involves the body, the physical world, and the sociocultural context. Instead of studying cognitive aids in isolation, ubilytics therefore takes a system-level view of cognition that engages different representational media (such as humans, physical artifacts, mobile devices, and large displays) as well as the interactions used to bring these media into coordination with each other (such as verbal and gestural cues, touching and sketching, partitioning and arranging, and note-taking and annotation). Thus, ubilytic environments benefit sensemaking by distributing cognitive aids in space and time; by off-loading memory, deduction, and reasoning; by harnessing our innate perceptual, cognitive, motor, spatial, and social skills; and by multiplying interaction and display surfaces.
The research in the ubilytics project revolved around three themes: (1) universal interaction, (2) flexible visual structures, and (3) efficient distributed architectures. Below I give examples of ubilytics research projects that span all three themes.
----
Central to ubiquitous analytics is Vistrates, a component model and literate computing platform for developing, assembling, and sharing visualization components. Built on top of the Webstrates and Codestrates open source projects, Vistrates features cross-cutting components for visual representation, interaction, collaboration, and device responsiveness, all maintained in a component repository. The environment is collaborative and allows novices and experts alike to compose component pipelines for specific analytical activities. This makes it easy to create cross-platform and cross-device visualization applications on any device capable of running modern web technology, and to integrate existing web-based visualization resources such as D3, Vega-Lite, and Plot.ly into these applications.
Paper: PDF
Video: https://youtu.be/nMmiWBJoJUc
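To give a flavor of the kind of component pipeline Vistrates enables, the sketch below composes a data source, a filter, and a Vega-Lite view into a small web-based visualization. The DataSource, Filter, and VegaLiteView names and interfaces are invented for illustration and are not the actual Vistrates API; only the vega-embed call at the end is a real library function.

```typescript
// Illustrative component pipeline in the spirit of Vistrates; the DataSource,
// Filter, and VegaLiteView classes are hypothetical and NOT the Vistrates API.
import vegaEmbed, { VisualizationSpec } from "vega-embed";

type Row = Record<string, unknown>;

// Each component consumes the output of the previous one in the pipeline.
interface Component<In, Out> {
  run(input: In): Promise<Out>;
}

// Stage 1: fetch tabular JSON data from any web-accessible source.
class DataSource implements Component<string, Row[]> {
  async run(url: string): Promise<Row[]> {
    const response = await fetch(url);
    return (await response.json()) as Row[];
  }
}

// Stage 2: a simple filter transform.
class Filter implements Component<Row[], Row[]> {
  constructor(private predicate: (row: Row) => boolean) {}
  async run(rows: Row[]): Promise<Row[]> {
    return rows.filter(this.predicate);
  }
}

// Stage 3: render the rows as a Vega-Lite scatterplot in a DOM element.
class VegaLiteView implements Component<Row[], void> {
  constructor(private selector: string, private xField: string, private yField: string) {}
  async run(rows: Row[]): Promise<void> {
    const spec: VisualizationSpec = {
      data: { values: rows },
      mark: "point",
      encoding: {
        x: { field: this.xField, type: "quantitative" },
        y: { field: this.yField, type: "quantitative" },
      },
    };
    await vegaEmbed(this.selector, spec);
  }
}

// Compose the pipeline: source -> filter -> view.
(async () => {
  const rows = await new DataSource().run("data/cars.json"); // placeholder URL
  const fast = await new Filter((r) => (r["Horsepower"] as number) > 100).run(rows);
  await new VegaLiteView("#vis", "Horsepower", "Miles_per_Gallon").run(fast);
})();
```

Because everything runs on standard web technology, a pipeline of this kind could in principle execute unchanged on a phone, a tablet, or a wall display.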
----
With Vistrates as a foundation, the Vistribute project looked at the practicalities of distributing the different views of a visualization application across multiple devices. By characterizing each view based on its preferred size, position, and relation to other views, Vistribute automatically calculates a layout across the available display surfaces in a ubiquitous sensemaking environment. This layout can change as devices enter and leave the environment, such as when powering up a laptop at a meeting or pocketing a smartphone that is no longer needed.
Paper: PDF
Video: https://youtu.be/-ssJ_T-nbdI
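To make the idea of automatic view distribution concrete, here is a minimal greedy re-layout sketch: each view declares a preferred size, an importance, and an optional related view, and the function assigns views to whichever displays are currently available. The heuristic and the names below are assumptions for illustration only, not the published Vistribute algorithm.

```typescript
// Hypothetical greedy view-to-display assignment in the spirit of Vistribute.
interface View {
  id: string;
  preferredWidth: number;   // pixels
  preferredHeight: number;
  importance: number;       // higher values are placed first
  relatedTo?: string;       // id of a view it should be co-located with
}

interface Display {
  id: string;
  width: number;
  height: number;
}

type Layout = Map<string, string>; // view id -> display id

// Place the most important views first, preferring displays that already hold
// a related view and that best match the view's preferred size.
function computeLayout(views: View[], displays: Display[]): Layout {
  const layout: Layout = new Map();
  const sorted = [...views].sort((a, b) => b.importance - a.importance);

  for (const view of sorted) {
    let best: Display | undefined;
    let bestScore = -Infinity;
    for (const display of displays) {
      const fit =
        -Math.abs(display.width - view.preferredWidth) -
        Math.abs(display.height - view.preferredHeight);
      const colocated =
        view.relatedTo && layout.get(view.relatedTo) === display.id ? 1000 : 0;
      const score = fit + colocated;
      if (score > bestScore) {
        bestScore = score;
        best = display;
      }
    }
    if (best) layout.set(view.id, best.id);
  }
  return layout;
}

// Re-run whenever a device enters or leaves the environment.
const displays: Display[] = [
  { id: "laptop", width: 1440, height: 900 },
  { id: "phone", width: 390, height: 844 },
];
const views: View[] = [
  { id: "scatterplot", preferredWidth: 1200, preferredHeight: 800, importance: 2 },
  { id: "legend", preferredWidth: 300, preferredHeight: 400, importance: 1, relatedTo: "scatterplot" },
];
console.log(computeLayout(views, displays));
```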
----
Looking to the future, I have increasingly been investigating how to use ubiquitous analytics for situated sensemaking, such as when in the field and on the go. Several projects have contributed to this development. In There Is No Spoon, collaborators and I deployed the ImAxes virtual reality system for multidimensional analysis at a U.S. federal agency for an entire year and observed how economic analysts and data scientists interacted with it for their own datasets. In VisHive, we studied how to build opportunistic and ad-hoc computational clouds using only web-based technology and mobile devices. And in ongoing work we are making a foray into mixed and augmented reality to support situated visualization for ubiquitous analytics in the recently funded DataWorld NSF project.
There Is No Spoon: [PDF]
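The core idea behind an ad-hoc computational cloud of the kind VisHive studied can be sketched as a coordinator that partitions a job into chunks and fans them out to whatever devices are currently connected. In the sketch below the peers are simulated in-process and the communication layer is abstracted away; the names and interfaces are hypothetical and not the VisHive API.

```typescript
// Hypothetical task-partitioning coordinator for an ad-hoc device cloud.
// Peer communication is simulated; in a real deployment each chunk would be
// sent to another device over a web-based channel such as WebSockets or WebRTC.
type Chunk = number[];

interface Peer {
  name: string;
  compute(chunk: Chunk): Promise<number>;
}

// Split the input data into one chunk per available peer.
function partition(data: number[], parts: number): Chunk[] {
  const size = Math.ceil(data.length / parts);
  return Array.from({ length: parts }, (_, i) => data.slice(i * size, (i + 1) * size));
}

// Fan the chunks out to the peers and reduce the partial results.
async function distributedSum(data: number[], peers: Peer[]): Promise<number> {
  const chunks = partition(data, peers.length);
  const partials = await Promise.all(chunks.map((chunk, i) => peers[i].compute(chunk)));
  return partials.reduce((a, b) => a + b, 0);
}

// Simulated peers standing in for a phone, a tablet, and a laptop.
const peers: Peer[] = ["phone", "tablet", "laptop"].map((name) => ({
  name,
  compute: async (chunk) => chunk.reduce((a, b) => a + b, 0),
}));

distributedSum(Array.from({ length: 1000 }, (_, i) => i), peers).then((total) =>
  console.log("total =", total) // 499500
);
```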
Last Modified: 06/02/2020
Modified by: Niklas E Elmqvist