Award Abstract # 1539534
CAREER: Ubilytics: Harnessing Existing Device Ecosystems for Anywhere Sensemaking

NSF Org: IIS (Division of Information & Intelligent Systems)
Recipient: UNIVERSITY OF MARYLAND, COLLEGE PARK
Initial Amendment Date: May 28, 2015
Latest Amendment Date: July 7, 2017
Award Number: 1539534
Award Instrument: Continuing Grant
Program Manager: William Bainbridge
IIS (Division of Information & Intelligent Systems)
CSE (Directorate for Computer and Information Science and Engineering)
Start Date: August 31, 2014
End Date: January 31, 2020 (Estimated)
Total Intended Award Amount: $420,663.00
Total Awarded Amount to Date: $420,663.00
Funds Obligated to Date: FY 2013 = $5,267.00
FY 2014 = $84,436.00
FY 2015 = $73,358.00
FY 2016 = $142,615.00
FY 2017 = $114,987.00
History of Investigator:
  • Niklas Elmqvist (Principal Investigator)
    elm@umd.edu
Recipient Sponsored Research Office: University of Maryland, College Park
3112 LEE BUILDING
COLLEGE PARK
MD  US  20742-5100
(301)405-6269
Sponsor Congressional District: 04
Primary Place of Performance: University of Maryland College Park
Hornbake Building
College Park
MD  US  20742-5141
Primary Place of Performance Congressional District: 04
Unique Entity Identifier (UEI): NPU8ULVAAS23
Parent UEI: NPU8ULVAAS23
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01001314DB NSF RESEARCH & RELATED ACTIVITIES
01001415DB NSF RESEARCH & RELATED ACTIVITIES
01001516DB NSF RESEARCH & RELATED ACTIVITIES
01001617DB NSF RESEARCH & RELATED ACTIVITIES
01001718DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 1045, 7367
Program Element Code(s): 736700
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

This research project addresses the fundamental question of how we can use the existing ecosystem of networked devices in our surroundings to make sense of and exploit massive, heterogeneous, and multi-scale data anywhere and at any time. Assembling these devices into unified sensemaking environments would enable deep analysis in the field. Examples include managing heterogeneous data in scientific lab notebooks; scaffolding undergraduate classroom learning with examples, manuals, and videos; and supporting police investigation by linking facts, findings, and evidence. On a higher level, this concept would stimulate our digital economy by supporting fields such as design and creativity, command and control, and scientific discovery. However, despite this ready access to a myriad of handheld devices as well as those integrated in our physical environments, each individual device is currently designed to be the focus of attention, cannot easily be combined with other devices to improve productivity, and has limited computational and storage resources. This project introduces a comprehensive new approach called ubiquitous analytics (ubilytics) for harnessing these ever-present digital devices into unified environments for anywhere analysis and sensemaking of data.

Ubilytics draws on human-computer interaction, visual analytics, and ubiquitous computing, as well as a synthesis of distributed, extended, and embodied cognition, and is based on three principles. First, universal interaction requires designing an interaction model that combines several devices into a holistic distributed interface, transparently bridges multiple devices, surfaces, and even physical objects, and unifies interaction with various data types. Second, flexible visual structures must be created to generate representations that adapt to varying device dimensions, resolution, viewing angle, and distance; support space and layout management in ego-centric and world-centric configurations; and can utilize both novel and appropriated displays for output. Third, an efficient distributed architecture must be achieved through methods for discovering, merging, and synchronizing heterogeneous devices, with support for a generic component model to facilitate reuse, offloading costly computation into the cloud, and meshing ubilytics environments for collaboration.

Sensemaking is often associated with professional analysts deriving meaning from observed data, but this research takes a comprehensive view of sensemaking for both casual and expert users, in both dedicated and mobile settings, and with both large-scale and small-scale datasets. This work will therefore benefit society by focusing on three example domains: (1) scientific discovery, (2) classroom learning, and (3) police investigation. It will also advance discovery and understanding by integrating the research in an undergraduate programming course used as a testbed for learning in ubilytics environments. Another goal is to broaden participation of underrepresented groups by engaging in a women-in-engineering program as well as by mentoring minority undergraduate students during summer research internships. Results, software, and documentation will be disseminated under Open Source and Creative Commons licenses.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 19)
Andrea Batch, Andrew Cunningham, Maxime Cordeil, Niklas Elmqvist, Tim Dwyer, Bruce H. Thomas, Kim Marriott. "There Is No Spoon: Evaluating Performance, Space Use, and Presence with Expert Domain Users in Immersive Analytics." IEEE Transactions on Visualization & Computer Graphics, 2020. doi:10.1109/TVCG.2019.2934803
Andrea Batch, Biswaksen Patnaik, Moses Akazue, Niklas Elmqvist. "Scents and Sensibility: Evaluating Information Olfactation." Proceedings of the ACM Conference on Human Factors in Computing Systems, 2020. doi:10.1145/3313831.3376733
Andrea Batch, Hanuma Teja Maddali, Kyungjun Lee, Niklas Elmqvist. "Gesture and Action Discovery for Evaluating Virtual Environments with Semi-Supervised Segmentation of Telemetry Records." Proceedings of the IEEE International Conference on Artificial Intelligence and Virtual Reality, 2018. doi:10.1109/TVCG.2018.2865119
Andrea Batch, Niklas Elmqvist. "'All Right, Mr. DeMille, I'm Ready for My Closeup:' Adding Meaning to User Actions from Video for Immersive Analytics." Proceedings of the Machine Learning from User Interactions Workshop, 2019.
Biswaksen Patnaik, Andrea Batch, Niklas Elmqvist. "Information Olfactation: Harnessing Scent to Convey Data." IEEE Transactions on Visualization and Computer Graphics, v.25, 2019, p.726. doi:10.1109/TVCG.2018.2865237
Biswaksen Patnaik, Andrea Batch, Niklas Elmqvist. "Olfactory Analysis: Exploring the Design Space of Smell for Data Visualization." Proceedings of the Workshop on Multimodal Interaction for Data Visualization, 2018.
J. C. Roberts, P. D. Ritsos, S. K. Badam, D. Brodbeck, J. Kennedy, N. Elmqvist. "Visualization Beyond the Desktop - The Next Big Thing." IEEE Computer Graphics & Applications, v.34, 2014, p.26. doi:10.1109/MCG.2014.82
S. K. Badam, F. Amini, N. Elmqvist, P. Irani. "Supporting Visual Exploration for Multiple Users in Large Display Environments." Proceedings of the IEEE Conference on Visual Analytics Science & Technology, 2016. doi:10.1109/VAST.2016.7883506
S. K. Badam, J. Zhao, N. Elmqvist, D. S. Ebert. "TimeFork: Interactive Prediction of Time Series." Proceedings of the ACM Conference on Human Factors in Computing Systems, 2016, p.5409. doi:10.1145/2858036.2858150
S. K. Badam, Z. Zheng, E. Wall, A. Endert, N. Elmqvist. "Supporting Team-First Visual Analytics through Group Activity Representations." Proceedings of Graphics Interface, 2017, p.208. doi:10.20380/GI2017.26
Sriram Karthik Badam, Andreas Mathisen, Roman Rädle, Clemens Nylandsted Klokmose, Niklas Elmqvist. "Vistrates: A Component Model for Ubiquitous Analytics." IEEE Transactions on Visualization & Computer Graphics, v.25, 2019, p.586. doi:10.1109/TVCG.2018.2865144

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The Ubilytics project posed the fundamental question of how we can use the existing ecosystem of networked devices in our surroundings to understand and exploit massive, heterogeneous, and multi-scale data anywhere and at any time. Assembling these devices into unified sensemaking environments will enable deep analytical reasoning in the field, such as managing heterogeneous data in scientific lab notebooks, scaffolding undergraduate learning, and supporting investigative reporting by linking facts, findings, and evidence. On a higher level, this concept would stimulate our digital economy by supporting fields such as design and creativity, command and control, and scientific discovery.

However, despite this ready access to myriad devices both small—smartphones, tablets, and smartwatches—and large—desktop computers, multitouch displays, and mixed reality headsets—each individual device is currently designed to be the focus of attention, and cannot easily be combined with other devices to improve productivity. Another limiting factor is that interfaces and visual representations are typically designed for a particular form factor, and adapt poorly to new settings when migrating between devices. Finally, the computational and storage resources of most mobile devices are insufficient for the complex analyses necessary as well as for persistently storing session state across fluctuating ensembles of devices.

Ubiquitous analytics--or ubilytics--is a comprehensive new approach for harnessing these ever-present digital devices into unified environments for anywhere analysis and sensemaking of data. It draws on human-computer interaction, visual analytics, and ubiquitous computing, as well as a synthesis of distributed, extended, and embodied cognition. These theories of cognition challenge traditional cognitive science by insisting that cognition is not limited to the brain, but also involves the body, the physical world, and the sociocultural context. Instead of studying cognitive aids in isolation, ubilytics therefore takes a system-level view of cognition that engages different representational media—such as humans, physical artifacts, mobile devices, and large displays—as well as the interactions used to bring these media into coordination with each other—such as verbal and gestural cues, touching and sketching, partitioning and arranging, and note-taking and annotation. Thus, ubilytics environments benefit sensemaking by distributing cognitive aids in space and time; by off-loading memory, deduction, and reasoning; by harnessing our innate perceptual, cognitive, motor, spatial, and social skills; and by multiplying interaction and display surfaces.

The research in the ubilytics project revolved around three themes: (1) universal interaction, (2) flexible visual structures, and (3) efficient distributed architectures. Below I give examples of ubilytics research projects that span all three themes.

----

Central to ubiquitous analytics is Vistrates, a component model and literate computing platform for developing, assembling, and sharing visualization components. Built on top of the Webstrates and Codestrates open source projects, Vistrates features cross-cutting components for visual representation, interaction, collaboration, and device responsiveness, maintained in a component repository. The environment is collaborative and allows novices and experts alike to compose component pipelines for specific analytical activities. This makes it easy to create cross-platform and cross-device visualization applications on any device capable of running modern web technology, and to integrate existing web-based visualization resources such as D3, Vega-Lite, and Plot.ly into these applications.
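
To make the component model concrete, here is a minimal TypeScript sketch of the pipeline idea: a source, a transform, and a view component chained into a dataflow. The interfaces and names are hypothetical illustrations for this report, not the actual Vistrates API.

interface Component<In, Out> {
  name: string;
  run(input: In): Out;
}

// Data source component: would fetch and parse rows from a URL; stubbed here.
const csvSource: Component<string, number[]> = {
  name: "csv-source",
  run: (_url) => [4, 8, 15, 16, 23, 42],
};

// Transform component: normalizes values to the range [0, 1].
const normalize: Component<number[], number[]> = {
  name: "normalize",
  run: (xs) => {
    const max = Math.max(...xs);
    return xs.map((x) => x / max);
  },
};

// View component: renders one text bar per value (a stand-in for a D3 view).
const barView: Component<number[], string> = {
  name: "bar-view",
  run: (xs) => xs.map((x) => "#".repeat(Math.round(x * 20))).join("\n"),
};

// Compose the pipeline: source -> transform -> view.
function pipeline(url: string): string {
  return barView.run(normalize.run(csvSource.run(url)));
}

console.log(pipeline("https://example.org/data.csv")); // placeholder URL

In Vistrates itself, such pipelines are assembled interactively and shared through the component repository; the sketch only shows the composition pattern.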

 Paper: PDF

Video: https://youtu.be/nMmiWBJoJUc 

----

With Vistrates as a foundation, the Vistribute project examined the practicalities of distributing the views of a visualization application across multiple devices. By characterizing each view by its preferred size, position, and relation to other views, Vistribute automatically calculates a layout across the available display surfaces in a ubiquitous sensemaking environment. The layout adapts as devices enter and leave the environment, such as when a laptop is powered up at a meeting or a smartphone that is no longer needed is pocketed.
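
The TypeScript sketch below conveys the flavor of this automatic assignment using a simple greedy best-fit heuristic over view areas. It is an assumed simplification for illustration, not the published Vistribute algorithm, which also reasons about position and inter-view relationships.

interface View { id: string; preferredArea: number; }
interface Display { id: string; area: number; }

// Assign each view to the tightest display that still fits it.
function layout(views: View[], displays: Display[]): Map<string, string> {
  const free = new Map<string, number>();
  for (const d of displays) free.set(d.id, d.area);
  const assignment = new Map<string, string>();
  // Place large views first so they claim the roomiest displays.
  const ordered = [...views].sort((a, b) => b.preferredArea - a.preferredArea);
  for (const v of ordered) {
    let best: string | undefined;
    for (const [id, area] of free) {
      if (area >= v.preferredArea && (best === undefined || area < free.get(best)!)) {
        best = id; // tightest fit so far
      }
    }
    if (best !== undefined) {
      assignment.set(v.id, best);
      free.set(best, free.get(best)! - v.preferredArea);
    }
  }
  return assignment;
}

// Recomputing when a device joins or leaves yields the adaptive behavior:
const views = [{ id: "map", preferredArea: 6 }, { id: "legend", preferredArea: 1 }];
console.log(layout(views, [{ id: "wall", area: 10 }, { id: "phone", area: 2 }]));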

Paper: PDF

Video: https://youtu.be/-ssJ_T-nbdI 

----

Looking to the future, I have increasingly been investigating how to use ubiquitous analytics for situated sensemaking, such as in the field and on the go. Several projects have contributed to this development. In There Is No Spoon, collaborators and I deployed the ImAxes virtual reality system for multidimensional analysis at a U.S. federal agency for an entire year and observed how economic analysts and data scientists interacted with it for their own datasets. In VisHive, we studied how to build opportunistic and ad-hoc computational clouds using only web-based technology and mobile devices. And in ongoing work we are making a foray into mixed and augmented reality to support situated visualization for ubiquitous analytics in the recently funded DataWorld NSF project.
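
As a rough illustration of the VisHive idea, the TypeScript sketch below splits a computation across whichever peer devices are currently connected and merges their partial results. The structure is hypothetical; a real deployment would talk to peers over web channels such as WebSockets rather than local function calls.

// A peer is modeled as an async function that processes one chunk of data.
type Peer = (chunk: number[]) => Promise<number>;

// Simulated peer: sums its chunk after a device-dependent delay.
const makePeer = (delayMs: number): Peer => async (chunk) => {
  await new Promise((resolve) => setTimeout(resolve, delayMs));
  return chunk.reduce((a, b) => a + b, 0);
};

// Split the data evenly across the peers and merge the partial sums.
async function distributedSum(data: number[], peers: Peer[]): Promise<number> {
  const chunkSize = Math.ceil(data.length / peers.length);
  const tasks = peers.map((peer, i) =>
    peer(data.slice(i * chunkSize, (i + 1) * chunkSize))
  );
  const partials = await Promise.all(tasks);
  return partials.reduce((a, b) => a + b, 0);
}

// Three devices of varying speed join the ad-hoc cloud.
distributedSum([...Array(90).keys()], [makePeer(5), makePeer(20), makePeer(50)])
  .then((total) => console.log("sum:", total)); // 0 + 1 + ... + 89 = 4005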

There Is No Spoon: [PDF]


Last Modified: 06/02/2020
Modified by: Niklas E Elmqvist

