Influence of environmental renewal on students’ mental health development under the background of long-term online teaching

Ligui Zhu, Boli Li

Article ID: 9288
Vol 9, Issue 1, 2025


Abstract


In the wake of the COVID-19 pandemic, online education has become increasingly prevalent in primary education and, relative to traditional learning environments, has evolved into a pivotal pedagogical modality for contemporary students. To understand how environmental changes affect students' psychological well-being under prolonged online education, this study adopts a multimodal approach. The model fuses three elementary feature sequences extracted from online learning data (images, acoustics, and text), combining information from these perceptual channels to capture students' psychological states more comprehensively and accurately. For emotional feature recognition, the model employs support vector machines (SVM), which handle emotional information effectively. To further improve the prediction of psychological well-being, the study incorporates an attention mechanism into a conventional convolutional neural network (CNN) architecture; this addition yields a significant improvement in accuracy when identifying six psychological features, demonstrating the effectiveness of attention mechanisms in deep learning models. Finally, beyond validating model performance, the study analyzes how environmental changes affect students' psychological well-being. This analysis offers insights for formulating instructional strategies in the context of long-term online education, helping educational institutions address the challenges that new learning environments pose to students' psychological well-being.
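The fusion step the abstract describes can be illustrated with a minimal numpy sketch: each modality's feature vector receives a softmax-normalised attention score, and the fused representation is the weighted sum. The modality names, feature dimension, and scoring function below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature vectors (dimension d is assumed):
# image, acoustic, and text embeddings extracted from online learning data.
d = 8
feats = {
    "image": rng.normal(size=d),
    "acoustic": rng.normal(size=d),
    "text": rng.normal(size=d),
}

def attention_fuse(features, w):
    """Fuse modality vectors with softmax attention.

    Each modality gets a scalar score w . f; the scores are
    softmax-normalised and weight the modality vectors before summing.
    Returns the fused vector and the per-modality attention weights.
    """
    names = list(features)
    scores = np.array([w @ features[n] for n in names])
    scores -= scores.max()                       # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()
    fused = sum(a * features[n] for a, n in zip(alpha, names))
    return fused, dict(zip(names, alpha))

w = rng.normal(size=d)                           # attention vector (learned in practice, random here)
fused, alpha = attention_fuse(feats, w)

print(fused.shape)                               # (8,)
print(round(sum(alpha.values()), 6))             # weights sum to 1.0
```

In a trained model the attention vector would be learned jointly with the downstream classifier (here, the SVM or attention-augmented CNN the abstract mentions), so that more informative modalities receive larger weights.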


Keywords


online teaching; mental health; CNN; SVM; multimodal features; psychological states; environmental renewal







DOI: https://doi.org/10.24294/jipd9288



Copyright (c) 2025 Ligui Zhu, Boli Li

License URL: https://creativecommons.org/licenses/by/4.0/

This site is licensed under a Creative Commons Attribution 4.0 International License.