In Korea, occupational hazards are on the rise, particularly in the construction sector. According to a report on the ‘Occupational Safety Accident Status’ by Korea’s Ministry of Employment and Labor, the industry accounted for the highest number of accidents and fatalities among all sectors in 2021. To address this rise, the Korea Occupational Safety and Health Agency has been providing virtual reality (VR)-based construction safety content to daily workers as part of their educational training initiatives.
Nevertheless, current VR-based training methods grapple with two limitations. First, VR-based construction safety training is essentially passive: learners follow one-way instructions that do not adapt to their judgments and decisions. Second, VR-based safety training lacks an objective evaluation process during the training itself. To address these challenges, researchers have introduced immersive VR-based construction safety content to promote active worker engagement and have administered written tests after training. However, such post-training written tests lack immediacy and objectivity. Furthermore, among the individual characteristics that can affect learning performance (personal, academic, social, and cognitive), cognitive characteristics may change during VR-based safety training itself.
To address this, a team of researchers led by Associate Professor Choongwan Koo from the Division of Architecture & Urban Design at Incheon National University, Korea, has now proposed a groundbreaking machine learning approach for forecasting personal learning performance in VR-based construction safety training that uses real-time biometric responses. Their paper was made available online on October 7, 2023, and will be published in Volume 156 of the journal Automation in Construction in December 2023.
“While traditional methods of evaluating learning outcomes that use post-written tests may lack objectivity, real-time biometric responses, collected from eye-tracking and electroencephalogram (EEG) sensors, can be used to promptly and objectively evaluate personal learning performances during VR-based safety training,” explains Dr. Koo.
The study involved 30 construction workers undergoing VR-based construction safety training. Real-time biometric responses from eye-tracking and EEG sensors, the latter monitoring brain activity, were gathered during the training to assess the participants' psychological responses. Combining these data with pre-training surveys and post-training written tests, the researchers developed machine-learning-based forecasting models to evaluate each participant's overall learning performance during VR-based safety training.
The team developed two models: a full forecast model (FM), which uses both demographic factors and biometric responses as independent variables, and a simplified forecast model (SM), which relies solely on the identified principal features to reduce complexity. While the FM predicted personal learning performance more accurately than traditional models, it also showed a high level of overfitting. In contrast, the SM, with its smaller number of variables, achieved higher prediction accuracy than the FM and significantly reduced overfitting. The team thus concluded that the SM was best suited for practical use.
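The FM-versus-SM comparison above reflects a general pattern: a model trained on every available variable can fit the training data almost perfectly yet generalize poorly, while a model restricted to a few principal features often predicts new cases better. The following is a minimal pure-Python sketch of that idea, not the authors' code: the data are synthetic, with three hypothetical "principal" features driving the score and twelve pure-noise features standing in for the rest of a full feature set.

```python
import random

def solve(A, b):
    """Solve the linear system A w = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        w[i] = (M[i][n] - sum(M[i][j] * w[j] for j in range(i + 1, n))) / M[i][i]
    return w

def ols_fit(X, y, ridge=1e-8):
    """Least-squares fit with intercept; a tiny ridge term keeps the system solvable."""
    Xb = [[1.0] + row for row in X]
    d = len(Xb[0])
    XtX = [[sum(r[i] * r[j] for r in Xb) + (ridge if i == j else 0.0)
            for j in range(d)] for i in range(d)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xb, y)) for i in range(d)]
    return solve(XtX, Xty)

def mse(w, X, y):
    """Mean squared error of the fitted weights on (X, y)."""
    preds = [w[0] + sum(wi * xi for wi, xi in zip(w[1:], row)) for row in X]
    return sum((p - t) ** 2 for p, t in zip(preds, y)) / len(y)

random.seed(42)
# Synthetic stand-in data: the learning-performance score depends on 3 informative
# features; the other 12 features are noise (mimicking an over-full feature set).
n, informative, noisy = 40, 3, 12
X_full = [[random.gauss(0, 1) for _ in range(informative + noisy)] for _ in range(n)]
y = [2.0 * r[0] - 1.5 * r[1] + 0.8 * r[2] + random.gauss(0, 0.3) for r in X_full]
X_red = [r[:informative] for r in X_full]  # "principal features" only

train, test = slice(0, 25), slice(25, n)
w_full = ols_fit(X_full[train], y[train])  # analogue of the FM
w_red = ols_fit(X_red[train], y[train])    # analogue of the SM

print(f"FM-like train/test MSE: {mse(w_full, X_full[train], y[train]):.3f}"
      f" / {mse(w_full, X_full[test], y[test]):.3f}")
print(f"SM-like train/test MSE: {mse(w_red, X_red[train], y[train]):.3f}"
      f" / {mse(w_red, X_red[test], y[test]):.3f}")
```

The full model's training error is guaranteed to be at least as low as the reduced model's (it can fit everything the reduced model can), but with only 25 training rows and 16 parameters it typically shows a much larger gap between training and test error, which is the overfitting the SM is designed to avoid.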
Explaining these results, Dr. Koo emphasizes, “This approach can have a significant impact on improving personal learning performance during VR-based construction safety training, preventing safety incidents, and fostering a safe working environment.” Further, the team notes the need for future research to consider various accident types and hazard factors in VR-based safety training.
In conclusion, this study marks a significant stride in enhancing personalized safety in construction environments and improving the evaluation of learning performance!
***
Reference
DOI: https://doi.org/10.1016/j.autcon.2023.105115
Authors: Dajeong Choi1, Seungwon Seo1, Hyunsoo Park1, Taehoon Hong2, and Choongwan Koo1
Affiliations:
1Division of Architecture & Urban Design, Incheon National University
2Department of Architecture and Architectural Engineering, Yonsei University
About Incheon National University
Incheon National University (INU) is a comprehensive, student-focused university. It was founded in 1979 and given university status in 1988. One of the largest universities in South Korea, it houses nearly 14,000 students and 500 faculty members. In 2010, INU merged with Incheon City College to expand capacity and open more curricula. With its commitment to academic excellence and an unrelenting devotion to innovative research, INU offers its students real-world internship experiences. INU not only focuses on studying and learning but also strives to provide a supportive environment for students to follow their passion, grow, and, as their slogan says, be INspired.
Website: http://www.inu.ac.kr/mbshome/mbs/inuengl/index.html
About the author
Professor Choongwan Koo obtained his Ph.D. in Sustainable Construction Engineering and Management from Yonsei University in 2014 and has a good mix of academic and industrial experience. From 2016 to 2018, he worked as an Assistant Professor in the Department of Building Services Engineering at The Hong Kong Polytechnic University. His research focuses on smart construction management and intelligent facility management, pursuing transformative and innovative strategies for enhancing construction safety, for example, VR-based construction safety training, heat strain management, and vision-based safe working environments. He currently directs research projects in these areas funded by government agencies such as the National Research Foundation (NRF-2020R1C1C1004147).
END