Inclusive Design Insights from a Preliminary Image-Based Conversational Search Systems Evaluation

Yue Zheng, Lei Yu, Junmian Chen, Tianyu Xia, Yuanyuan Yin, Shan Wang, Haiming Liu

The digital realm has witnessed the rise of various search modalities, among which the Image-Based Conversational Search System stands out. This research examines the design, implementation, and evaluation of this system, comparing it with its text-based and mixed counterparts. A diverse participant cohort provides a broad basis for evaluation. Biometric tools support emotion analysis, capturing user sentiments during interactions, while structured feedback sessions offer qualitative insights. Results indicate that while the text-based system minimizes user confusion, the image-based system presents challenges in direct information interpretation. The mixed system, however, achieves the highest engagement, suggesting an optimal blend of visual and textual information. Notably, the potential of these systems, especially the image-based modality, to assist individuals with intellectual disabilities is highlighted. The study concludes that the Image-Based Conversational Search System, though challenging in some respects, holds promise, especially when integrated into a mixed system that offers both clarity and engagement.

This publication uses Eye Tracking, Eye Tracking Screen Based, Facial Expression Analysis and GSR, which are fully integrated into iMotions Lab.
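As context for the emotion-analysis step described in the abstract, the sketch below shows one way per-participant affect metrics could be aggregated per condition (text-based, image-based, mixed) for comparison. The file name, column names, and the choice of engagement/valence scores are illustrative assumptions, not the authors' actual pipeline or the iMotions export format.

```python
# Hypothetical sketch: summarizing per-participant emotion/engagement metrics
# for the three search conditions compared in the study. Data layout is assumed.
import csv
from collections import defaultdict
from statistics import mean, stdev

CONDITIONS = ("text", "image", "mixed")  # the three search modalities compared


def load_metrics(path):
    """Read rows like: participant,condition,engagement,valence (assumed CSV format)."""
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows.append({
                "participant": row["participant"],
                "condition": row["condition"],
                "engagement": float(row["engagement"]),  # e.g. facial-expression engagement score
                "valence": float(row["valence"]),        # e.g. positive/negative affect estimate
            })
    return rows


def summarize(rows):
    """Mean and standard deviation of each metric, grouped by condition."""
    by_condition = defaultdict(list)
    for r in rows:
        by_condition[r["condition"]].append(r)

    summary = {}
    for cond in CONDITIONS:
        group = by_condition.get(cond, [])
        if len(group) < 2:
            continue  # need at least two participants for a standard deviation
        summary[cond] = {
            "n": len(group),
            "engagement_mean": mean(r["engagement"] for r in group),
            "engagement_sd": stdev(r["engagement"] for r in group),
            "valence_mean": mean(r["valence"] for r in group),
            "valence_sd": stdev(r["valence"] for r in group),
        }
    return summary


if __name__ == "__main__":
    for cond, stats in summarize(load_metrics("session_metrics.csv")).items():
        print(f"{cond:>6}: n={stats['n']}, "
              f"engagement={stats['engagement_mean']:.2f}±{stats['engagement_sd']:.2f}, "
              f"valence={stats['valence_mean']:.2f}±{stats['valence_sd']:.2f}")
```

Such a summary only supports the kind of engagement comparison reported in the abstract; any statistical testing or sensor-signal processing would sit upstream of this step.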
