A Taxonomy of Human--MLLM Interaction in Early-Stage Sketch-Based Design Ideation
2026-02-25 • Human-Computer Interaction
AI summary
The authors studied how designers come up with ideas when working with AI tools that can understand sketches. They observed 12 people using a system in which either the human or the AI could create content or lead the design process. They found that designers often switch between leading themselves and letting the AI take the lead, rather than sticking to one approach. This shows that collaboration with AI in design is flexible and changes over the course of the creative process. The work gives future researchers a basis for building tools that better support this dynamic teamwork.
multimodal large language models • design ideation • sketch-based interaction • human-AI collaboration • interaction modes • creative responsibility • user study • interaction logs • post-task interviews • co-evolution
Authors
Weiyan Shi, Kenny Tsu Wei Choo
Abstract
As multimodal large language models (MLLMs) are increasingly integrated into early-stage design tools, it is important to understand how designers collaborate with AI during ideation. In a user study with 12 participants, we analysed sketch-based design interactions with an MLLM-powered system using automatically recorded interaction logs and post-task interviews. Based on how creative responsibility was allocated between humans and the AI, we predefined four interaction modes: Human-Only, Human-Lead, AI-Lead, and Co-Evolution, and analysed how these modes manifested during sketch-based design ideation. Our results show that designers rarely rely on a single mode; instead, human-led and AI-led roles are frequently interwoven and shift across ideation instances. These findings provide an empirical basis for future work to investigate why designers shift roles with AI and how interactive systems can better support such dynamic collaboration.
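To make the taxonomy concrete, the sketch below shows one hypothetical way an ideation instance's interaction log could be coded into the four modes based on who contributed generative (content-introducing) actions. The event schema, the generative/non-generative distinction, and all names are illustrative assumptions, not the authors' actual coding procedure.

```python
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    HUMAN_ONLY = "Human-Only"      # designer sketches without involving the AI
    HUMAN_LEAD = "Human-Lead"      # designer generates; AI plays a supporting role
    AI_LEAD = "AI-Lead"            # AI generates; designer selects or adjusts
    CO_EVOLUTION = "Co-Evolution"  # both parties contribute new design content

@dataclass
class IdeationEvent:
    """One logged action within an ideation instance (hypothetical schema)."""
    actor: str        # "human" or "ai"
    generative: bool  # True if the action introduces new design content

def classify_instance(events: list[IdeationEvent]) -> Mode:
    """Assign a mode to one ideation instance (toy heuristic).

    The mode follows from which actor(s) produced generative actions:
    both -> Co-Evolution; AI only -> AI-Lead; human only -> Human-Lead
    if the AI participated at all (e.g. critique), else Human-Only.
    """
    human_gen = any(e.actor == "human" and e.generative for e in events)
    ai_gen = any(e.actor == "ai" and e.generative for e in events)
    if human_gen and ai_gen:
        return Mode.CO_EVOLUTION
    if ai_gen:
        return Mode.AI_LEAD
    if human_gen and any(e.actor == "ai" for e in events):
        return Mode.HUMAN_LEAD
    return Mode.HUMAN_ONLY

# Example: the designer sketches, the AI critiques, the designer refines.
log = [
    IdeationEvent(actor="human", generative=True),
    IdeationEvent(actor="ai", generative=False),
    IdeationEvent(actor="human", generative=True),
]
print(classify_instance(log).value)  # Human-Lead
```

Coding each ideation instance separately, as above, is what would let an analysis surface the paper's central observation: the assigned mode shifts from instance to instance within a single session rather than staying fixed.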