Articulate

Demo of interaction with the Articulate speech+gesture input system, natural language interpreter, and visualization presentation system (2019).

Toward Conversational Interaction in Visual Data Exploration

Instead of learning to use a complex data analysis tool, imagine simply posing questions about your data and receiving visualizations in response from a smart assistant. This vision drives our work on the Articulate project, a collaboration between the Electronic Visualization Laboratory at UIC, led by my advisor Andrew Johnson, and the UIC Natural Language Processing group, led by Prof. Barbara Di Eugenio. In this project we explore multi-modal, conversational interaction for visual data exploration in a large display environment.

I have been the lead developer of the visualization and multi-modal speech+gesture input systems, which communicate with a natural language interpreter built by PhD student Abhinav Kumar. Abhinav and I conducted a pre-design study to observe how people might interact with visualizations on a large display using speech and gestures, and I am leading the analysis of the study results from a human-computer interaction perspective. I also mentored five undergraduate students on research projects involving components of this input system.

Collaborators

Natural Language Processing: Abhinav Kumar, Barbara Di Eugenio

Electronic Visualization Laboratory: Abeer Alsaiari, Andrew Johnson

Undergraduate researchers: Vasanna Nguyen, Krupa Patel, Ryan Fogarty, Joseph Borowicz, Vijay Mahida

Past collaborators: Jason Leigh (Hawaii), Alberto Gonzales (Hawaii), Khairi Reda (IUPUI)

Papers and Posters

Evaluation of Scalable Interactions over Multiple Views in Large Display Environments

Jillian Aurisano, Abhinav Kumar, Barbara Di Eugenio, Andrew Johnson. Poster to be presented at the Information Visualization conference at IEEE VisWeek in Vancouver, Canada, on October 22, 2019.

Multimodal Coreference Resolution for Exploratory Data Visualization Dialogue: Context-Based Annotation and Gesture Identification

Abhinav Kumar, Barbara Di Eugenio, Jillian Aurisano, Andrew Johnson, Abeer Alsaiari, Nigel Flowers, Alberto Gonzales, Jason Leigh. The 21st Workshop on the Semantics and Pragmatics of Dialogue (SemDial) in Saarbrücken, Germany, August 2017.

Towards a dialogue system that supports rich visualizations of data

Abhinav Kumar, Jillian Aurisano, Alberto Gonzales, Jason Leigh, Andrew Johnson, Barbara Di Eugenio. Meeting of the Special Interest Group on Discourse and Dialogue (SIGDIAL), co-located with Interspeech, in Los Angeles, CA, in September 2016.

Articulate2: Toward a Conversational Interface for Visual Data Exploration

Jillian Aurisano, Abhinav Kumar, Alberto Gonzales, Jason Leigh, Barbara Di Eugenio, Andrew Johnson. Poster presented at the Information Visualization conference at IEEE VisWeek in Baltimore, MD, in October 2016.

"Show Me Data.” Observational Study of a Conversational Interface in Visual Data Exploration

Jillian Aurisano, Abhinav Kumar, Alberto Gonzales, Khairi Reda, Jason Leigh, Barbara Di Eugenio, Andrew Johnson. Poster presented at the Information Visualization conference at IEEE VisWeek in Chicago, IL, in October 2015.

Honorable Mention