Developing learners’ interpreting and public speaking skills by integrating virtual reality, augmented reality, and mobile technologies


Interpreting and public speaking play an important role in facilitating and enhancing communication. Authentic contexts, or real working conditions, are crucial for interpreting and public speaking training. Yet learners often struggle to obtain sufficient authentic, guided practice owing to limited training resources and the challenges of creating authentic contexts in traditional classrooms. Emerging technologies — such as virtual reality (VR), augmented reality (AR) and mobile technologies — have created new opportunities for self-study and scenario-based learning, which could benefit interpreting and public speaking training. Since there is currently a lack of VR/AR mobile apps for interpreting and public speaking practice on the market, it is worth exploring the integration of these technologies to provide an alternative way to develop talent in these areas.

To tackle the above challenges, Dr Venus Chan, Assistant Professor in the School of Arts and Social Sciences at Hong Kong Metropolitan University, investigated the use of VR/AR and mobile technologies for interpreting and public speaking training.

Screenshot of ‘VIP’


Dr Chan developed an open-access VR mobile app, 'Virtual Interpreting Practice' (VIP), for bi-directional English-Chinese and Chinese-English interpreting learning. VIP contains 13 learning modules covering two modes of interpreting — sight interpreting and consecutive interpreting — in both the English-Chinese and Chinese-English language directions. The app offers a fully immersive interpreting practice experience in various scenarios that simulate the real working environment of interpreters, allowing users to practise interpreting in a simulated virtual setting such as a conference room.

To evaluate the effectiveness of VIP, 31 native Chinese-speaking undergraduates with English as a second or foreign language used the app to practise interpreting on their own for two months. The evaluation showed that using the app enhanced the participating students' interpreting performance and bilingual competence, as well as their learning motivation. In a pretest-posttest evaluation, the participants achieved higher scores in all aspects of interpreting practice after using the app, across both modes and both language directions. Furthermore, the participants generally believed that using the app improved their independent learning and critical thinking skills, and enhanced their motivation and confidence in interpreting learning.

Screenshot of ‘XR MALL’


Based on the outcomes of this work, Dr Chan further developed a mobile app featuring extended reality (XR) — combining immersive technologies such as VR and AR — together with mobile technologies for interpreting and public speaking learning. Developed with support from the Research Grants Council's Faculty Development Scheme, this app, 'XR MALL', available on Google Play and the App Store, facilitates mobile-assisted language learning (MALL), enabling learners around the world to obtain a fully immersive experience and learn interpreting and public speaking anytime, anywhere. The app provides comprehensive learning resources, including video lectures, VR exercises and AR vocabulary lists, and learners can choose the learning content and difficulty level they want to practise.

Demo video regarding the interpreting practice of XR MALL

Demo video regarding the public speaking practice of XR MALL

Dr Chan's work contributes novel, unique self-study tools tailor-made for interpreting and public speaking learning, and demonstrates the benefits of integrating VR/AR and mobile technologies for interpreter training. Her findings also identify students' self-discipline and willingness to learn as key factors in the effectiveness of the apps, highlighting the importance of continuously updating the apps and adding new features to sustain students' interest in using them and retain their learning motivation. With proper utilisation of new technologies for learning, interpreting students and public speaking learners can better equip themselves for their future careers.

More details about the work can be found in the following publications:

Download XR MALL: