Healey, Patrick
Serafi, Mohamad
2024-10-27
2024
https://hdl.handle.net/20.500.14154/73309
Description: A virtual reality project to study the confused ("puzzled") facial expression and what might be considered a confused facial expression.
Abstract: Advances in the quality of virtual reality hardware and software mean that it is now possible to hold real-time conversations using high-fidelity avatars driven by live capture of facial expressions. However, not all facial movements are captured or animated by current state-of-the-art systems. This raises two questions. First, which types of facial expression are most important for effective communication? Second, how effectively are they reproduced by avatars and perceived by users? One especially important class of facial expressions for natural human interaction are those associated with the communication of misunderstanding: "confused" or "puzzled" faces. To answer these questions, a corpus of 9 facial expressions of confusion plus 5 baseline expressions was constructed using facial movement data captured from natural conversation. Each expression was then reconstructed using high-quality avatars.
The set of expressions was presented to users as 3D animations, and participants judged whether each one showed confusion. Data on the type and speed of responses, together with live eye tracking, show that each facial expression was individually a significant predictor of a "yes" (confused) judgment compared to the fluent baseline expression, which we consider furthest from a confused facial expression. The direction of eye gaze and head roll made little difference.
Language: en
Keywords: virtual reality; facial expressions; human-human interaction
Title: THE AFFECT OF FACIAL EXPRESSION ANIMATIONS IN SYSTEMS THAT INCORPORATES A VR
Type: Thesis