University of Michigan School of Information
Beyond the sticky note: Using augmented reality to enhance the affinity diagram
Tuesday, 04/30/2019
A paper authored by UMSI PhD student Hariharan Subramonyam and associate professor Eytan Adar has received a Best Paper Award from the 2019 ACM CHI conference in Glasgow.
Their paper, “Affinity Lens: Data-Assisted Affinity Diagramming with Augmented Reality,” presents a mobile-based augmented reality (AR) application that bridges the gap between the paper notes favored by many researchers and the benefits offered by on-demand online data analysis.
Affinity diagramming is the long-standing practice of placing pieces of paper, often sticky notes, on walls or large surfaces – called affinity walls – during brainstorming sessions. Collaborators can easily identify themes and cluster data points by moving the notes around as they find connections.
Software has been designed to replicate and replace this practice, but people have stubbornly held on to the decades-old sticky note tradition. Physical notes can be set up anywhere, easily rearranged, and can take advantage of large physical spaces like walls and whiteboards in ways that are hard to achieve even on large screens.
Affinity diagrams come up short, however, when researchers draw data from complex and varied sources such as surveys, sensor data and interaction logs.
In interviews, affinity diagram practitioners reported that they would bring their laptops to sessions to access quantitative insights from spreadsheets, and then make note of the findings on the wall.
“This approach is not only time consuming, but also problematic in that coherence between the analysis on the wall and the analysis on the screen is hard to maintain,” says Subramonyam.
To address this, Subramonyam and Adar collaborated with co-author Steven Drucker at Microsoft to develop Affinity Lens, a mobile augmented reality application that brings data to the physical space – in this case, sticky notes – to assist designers with affinity diagramming.
Affinity Lens combines computer vision technology with visual data analysis to detect notes and clusters, and overlay insights on top of the physical notes.
How it works
Designers can load data into the mobile app or a desktop utility. The app then creates an individual note for each data point and prints the notes, each carrying its own unique AR marker. Once the notes are printed and laid out on the affinity wall, the designer can scan them with a phone or tablet.
Using the app, users can more easily identify themes, cluster notes, make comparisons, create word clouds, and analyze results far faster than with traditional sticky note affinity diagrams.
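The workflow above can be sketched in miniature: each data point is paired with a unique marker ID at note-creation time, so that a later camera scan can map a detected marker back to its data. This is a hypothetical illustration, not code from the paper; names like `Note`, `build_notes` and `lookup` are invented for the sketch, and real marker detection would use a computer vision library rather than a dictionary lookup.

```python
# Hypothetical sketch of the note-generation step: each data point
# becomes a printable note carrying a unique AR marker ID.
from dataclasses import dataclass

@dataclass
class Note:
    marker_id: int   # unique AR marker printed on the note
    text: str        # the data point shown on the note

def build_notes(data_points):
    """Assign each data point a unique marker ID so the app can
    later recognize the physical note through the camera."""
    return [Note(marker_id=i, text=p) for i, p in enumerate(data_points)]

def lookup(notes, marker_id):
    """Simulate a scan: map a detected marker ID back to its data."""
    index = {n.marker_id: n for n in notes}
    return index.get(marker_id)

notes = build_notes(["prefers paper", "uses laptop", "likes AR overlay"])
print(lookup(notes, 1).text)  # prints "uses laptop"
```

The key design point mirrored here is that the marker, not the handwritten text, is the stable identifier: notes can be moved anywhere on the wall and the app still knows which data point each one represents.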
“When we looked at how people use the diagrams today, we came across many human-computer interaction research papers that conducted affinity diagramming but also separately did quantitative data analysis,” says Subramonyam.
“Our approach makes this kind of work much easier. We call this Data Assisted Affinity Diagramming (DAAD). Based on feedback from designers, they saw Affinity Lens as a way to bridge the communication gap between designers and data analysts. Insights generated using DAAD could serve as an initial set of hypotheses for further quantitative analysis and statistical testing.”
- Jessica Webster, UMSI PR Specialist