As a Human-Computer Interaction (HCI) researcher, I work at the intersection of design, AI, and cognitive science to solve HCI challenges in applied AI. AI is now prevalent in both everyday and high-stakes software applications, yet end-users regularly encounter undesirable experiences with AI; a recent example is the racial bias in Twitter's image-cropping algorithm. A fundamental challenge is that existing software practices fall short when creating AI for diverse human needs: AI development prioritizes automation and task efficiency over other values, and user experience (UX) design methods fail to support the designer's role in shaping key AI components such as training data, labels, and learning models. As a result, an "AI-first" approach will fail to effectively integrate AI features into diverse human experiences. My research addresses these challenges through a human-centered view of the AI-UX intersection.
My current research combines technical HCI work with qualitative studies of AI software development in practice. First, I design, build, and evaluate human-AI systems to solve domain-specific problems in supporting human learning, creativity, and sensemaking. Based on cognitive models, I develop ways to incorporate AI capabilities into end-user workflows without disrupting essential human tasks. Second, drawing from my own experiences designing across AI and UX components, I investigate how UX designers and AI engineers might co-design AI-powered applications for diverse human users. Third, based on insights from qualitative interviews and design inquiries, I develop new methods and tools to empower designers by making AI accessible to UX design practices.
My broader research agenda is to spearhead the ethical creation of AI-powered applications by centering people in the design and development processes. By applying qualitative and technical HCI research methods and undertaking domain-specific AI challenges, I incorporate multiple viewpoints to realize this goal. I believe this approach is needed to uncover the complex nature of the AI-UX interchange and to successfully design AI experiences (AIX). My work has been published at top venues, including CHI, UIST, and IEEE InfoVis, and has received two Best Paper Awards at CHI.
The Role of End-User Data in Designing Human-AI Experiences
My dissertation demonstrates novel methods and tools that make AI accessible to user experience (UX) designers and allow them to shape human-AI experiences.
Fields of interest
Human-Computer Interaction
Human-Centered AI
Interactive Intelligent Systems
MS Information (HCI Specialization), UMSI, 2015
PhD Information, UMSI, 2021
Hariharan Subramonyam, Colleen Seifert, and Eytan Adar. 2021. ProtoAI: Model-Informed Prototyping for AI-Powered Applications. To appear in Proceedings of the 2021 ACM Conference on Intelligent User Interfaces (IUI '21).
Hariharan Subramonyam, Colleen Seifert, Priti Shah, and Eytan Adar. 2020. texSketch: Active Diagramming through Pen-and-Ink Annotations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). ACM, New York, NY, USA. [Best Paper Award]
Maulishree Pandey, Hariharan Subramonyam, Brooke Sasia, Steve Oney, and Sile O'Modhrain. 2020. Explore, Create, Annotate: Designing Digital Drawing Tools with Visually Impaired People. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). ACM, New York, NY, USA.
Hariharan Subramonyam, Steven M. Drucker, and Eytan Adar. 2019. Affinity Lens: Data-Assisted Affinity Diagramming with Augmented Reality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). ACM, New York, NY, USA. [Best Paper Award]
Hariharan Subramonyam, Wilmot Li, Eytan Adar, and Mira Dontcheva. 2018. TakeToons: Script-driven Performance Animation. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (UIST '18). ACM, New York, NY, USA.
Joyojeet Pal, Anandhi Viswanathan, Priyank Chandra, Anisha Nazareth, Vaishnav Kameswaran, Hariharan Subramonyam, Aditya Johri, Mark S. Ackerman, and Sile O'Modhrain. 2017. Agency in Assistive Technology Adoption: Visual Impairment and Smartphone Use in Bangalore. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, New York, NY, USA.
Hariharan Subramonyam and Eytan Adar. 2019. SmartCues: A Multitouch Query Approach for Details-on-Demand through Dynamically Computed Overlays. IEEE Transactions on Visualization and Computer Graphics 25(1), 597-607, Jan. 2019.
David Brazel, Robin Corley, Chanda Phelan, Maia Frieser, Hariharan Subramonyam, Sally-Ann Rhea, Helen Vernier, John Hewitt, Paul Resnick, and Scott Vrieze. 2017. The Application of Ecological Momentary Assessment and Geolocation to a Longitudinal Twin Study of Substance Use. Behavior Genetics 47(6), 676-677. Springer.
Workshops and Posters
Hariharan Subramonyam, Bongshin Lee, Sile O'Modhrain, and Eytan Adar. 2017. Data dialog: facilitating collaborative decision making through data-driven conversations. In Proceedings of the 11th EAI International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth '17). ACM, New York, NY, USA.
Hariharan Subramonyam, Yuncheng Shen, and Samantha Lauren Jones. 2015. Enabling Context for Traditional Chinese Paintings with "Rice Paper". In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15). ACM, New York, NY, USA, 49-54.
Hariharan Subramonyam. 2015. Magic Mirror - Embodied Interactions for the Quantified Self. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15). ACM, New York, NY, USA, 1699-1704.