Sounds Healthy: Modelling sound-evoked consumer food choice through visual attention

Danni Peng-Li

Signe L. Mathiesen

Raymond C.K. Chan

Derek V. Byrne

Qian Janice Wang

Food choice is a multifaceted construct that is not solely guided by our internal incentives. In fact, sensory scientists, consumer psychologists, and marketers have demonstrated that external ambient cues, including background music, can influence a myriad of subconscious consumer behaviors, effectively leading to increased sales of food and beverages. However, the vast majority of the literature on this topic has thus far been confined to monocultural field studies in which the underlying mechanisms of food choice remain unexplored. We therefore studied the explicit and implicit effects of custom-composed soundtracks on food choices and eye movements in consumers from both East and West. Firstly, based on the results of a pre-study (N = 396), we composed a ‘healthy’ and an ‘unhealthy’ soundtrack. Subsequently, we recruited 215 participants from China (n = 114) and Denmark (n = 101) for an in-laboratory eye-tracking food choice paradigm. For each culture, half of the participants listened to the ‘healthy’ soundtrack and the other half to the ‘unhealthy’ soundtrack during the experiment. Chi-square tests of independence revealed that, across cultures, the healthy (vs. unhealthy) soundtrack led to more healthy food choices. Similarly, generalized linear mixed models showed that the healthy soundtrack induced more and longer fixations on healthy (vs. unhealthy) foods. Finally, a multiple mediation analysis indicated a partial mediation effect of sound on food choice through the mediators of fixation duration, fixation count, and revisit count. Our results suggest that, with strategically chosen soundscapes, it is possible to influence consumers’ decision-making processes and guide their attention towards healthier foods, providing valuable knowledge for local as well as global food businesses.
