The ability to efficiently assess the resolution of VR videos is critical for the development and marketing of VR products. Pupil responses and Galvanic Skin Response (GSR) are direct, objective reflections of human emotional activity. They are unaffected by subjective will and offer excellent real-time performance, making them well suited to VR quality assessment. However, little work to date has combined the two signals to evaluate VR resolution. Whether subjects' visual patterns change as VR resolution changes is also an interesting question that has not yet been studied. In this paper, a dataset of subjects' pupil responses and GSR under different VR resolutions was built. Based on it, Area of Interest (AOI) analysis was used to examine subjects' visual patterns, revealing both differences and similarities across VR video resolutions. To extract signal features at different VR resolutions more efficiently, a hybrid attention network was proposed. Experimental results demonstrate that the model can distinguish pupil responses and GSR signals under different VR video resolutions, verifying the feasibility of detecting VR video resolution from physiological signals.
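The abstract does not specify the architecture of the hybrid attention network, so the following is only a minimal illustrative sketch of one plausible design: per-modality self-attention over pupil and GSR feature sequences, followed by cross-modal attention and a classification head. All shapes, feature dimensions, and the four-level resolution output are assumptions for illustration, and the head is untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention: weights over k's time steps, applied to v
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

# hypothetical per-modality feature sequences (time steps x feature dim)
pupil = rng.standard_normal((50, 16))  # pupil-response features
gsr = rng.standard_normal((50, 16))    # GSR features

# self-attention within each modality, then cross-modal fusion
# (pupil queries attend over GSR context)
pupil_ctx = attention(pupil, pupil, pupil)
gsr_ctx = attention(gsr, gsr, gsr)
fused = attention(pupil_ctx, gsr_ctx, gsr_ctx)

# pool over time and score with a random (untrained) linear head,
# assuming four discrete VR resolution levels
logits = fused.mean(axis=0) @ rng.standard_normal((16, 4))
pred = int(np.argmax(logits))
print(fused.shape, pred)
```

In a trained model the random head and attention projections would be learned from the labeled dataset; the sketch only shows how the two physiological signals could be fused under one attention mechanism.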