1Honda Research Institute Japan Co. Ltd., Honcho 8-1, Wako, Saitama 351-0114, Japan.
2Graduate School of Information Science, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan.
3National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan.
This paper describes a new user–machine interaction scheme that uses crossmodal computation to estimate a user's interest. The scheme builds on our previous study, which extracted visual preference from users through eye gaze detection alone. That interaction scheme, however, was insufficient because it could not detect the intensity of a user's interest. The proposed scheme therefore suggests how different sensor modalities can be used to extract the emotional intensity of interest, segregated from the novelty of the given visual stimuli. In addition, we speculate that repeated estimation is needed to obtain accurate interest estimates; hence a user's habituation, deduced from the user's boredom and/or aversion, must also be detected and taken into account. Our computational results show that the proposed scheme efficiently achieves accurate interest estimation through crossmodal computation.
user–machine interactive scheme, crossmodal computation, interest estimation, eye gaze detection, arousal detection, habituation detection
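The fusion idea outlined in the abstract can be illustrated with a minimal sketch: a gaze-derived preference signal and an arousal-derived intensity signal are combined, and repeated exposures are discounted to stand in for habituation. All function names, weights, and the exponential decay model here are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch of crossmodal interest estimation: gaze dwell
# time supplies visual preference, an arousal reading supplies
# emotional intensity, and repeated exposures decay the score to
# approximate habituation. Weights and decay rate are assumptions.

def interest_score(gaze_dwell, arousal, exposures,
                   w_gaze=0.6, w_arousal=0.4, habituation_rate=0.2):
    """Fuse gaze and arousal into an interest estimate in [0, 1].

    gaze_dwell : dwell time on the stimulus, normalized to [0, 1]
    arousal    : arousal-sensor reading, normalized to [0, 1]
    exposures  : number of times this stimulus was already shown
    """
    raw = w_gaze * gaze_dwell + w_arousal * arousal
    # Exponential discounting of repeated stimuli stands in for
    # habituation: the same stimulus yields less interest over time.
    habituation = (1.0 - habituation_rate) ** exposures
    return raw * habituation

# First exposure with strong gaze and arousal signals:
print(round(interest_score(0.9, 0.8, 0), 3))  # → 0.86
# Same stimulus after five exposures: the score decays.
print(round(interest_score(0.9, 0.8, 5), 3))  # → 0.282
```

The decay over exposures is what lets the estimator separate a stimulus the user genuinely prefers from one that was merely novel on first viewing, which is the distinction the abstract draws between interest and novelty.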