
Abstract

There is a need for institutions to evaluate their e-Learning educational atmosphere in order to improve students’ learning experiences. The E-Learning Educational Atmosphere Measure (EEAM) is a comprehensive tool focusing on students’ perception of the e-Learning environment. To interpret the measure’s results verbally, and to use it more effectively and consistently, clear cut-off scores must be established. We aimed to determine optimal cut-off points for EEAM scores by plotting them as receiver operating characteristic (ROC) curves against a single global rating question. The findings showed that, while possible EEAM scores range from 40 to 200, scores equal to or below 127, between 127 and 152, and equal to or above 152 indicated students’ perception of the e-Learning atmosphere as “poor to weak”, “moderate”, and “good to excellent”, respectively. The Area Under the Curve for scores reflecting the “poor to weak” state was 0.875 (p < 0.001), with a sensitivity of 84.8% and a specificity of 70.0%; for the “good to excellent” state it was 0.947 (p < 0.001), with a sensitivity of 100% and a specificity of 82.1%. These findings are useful for studying, evaluating, and monitoring the e-Learning educational atmosphere of an institution and for comparing results across multiple settings.
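
For readers unfamiliar with the procedure, the sketch below illustrates how a cut-off for a questionnaire total score can be derived from a ROC curve against a dichotomised global rating. It uses synthetic scores and Youden’s J statistic as the selection criterion; it is only an assumed, minimal illustration of the general technique, not the study’s data or analysis code.

```python
# Minimal sketch: deriving a cut-off for a questionnaire total score from a
# ROC curve against a binary global rating, using Youden's J statistic.
# The data below are synthetic and purely illustrative (not the EEAM study data).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(seed=0)

# Hypothetical total scores on a 40-item, five-point measure (range 40-200),
# with a binary label from a single global rating question
# (1 = rated above the state of interest, 0 = otherwise).
scores_pos = rng.normal(loc=160, scale=15, size=60).clip(40, 200)
scores_neg = rng.normal(loc=120, scale=20, size=60).clip(40, 200)
scores = np.concatenate([scores_pos, scores_neg])
labels = np.concatenate([np.ones(60, dtype=int), np.zeros(60, dtype=int)])

# ROC curve: each candidate threshold yields a (sensitivity, 1 - specificity) pair.
fpr, tpr, thresholds = roc_curve(labels, scores)
auc = roc_auc_score(labels, scores)

# Youden's J = sensitivity + specificity - 1; the threshold maximising J is one
# common choice of "optimal" cut-off (other criteria exist in the literature).
j = tpr - fpr
best = int(np.argmax(j))
print(f"AUC = {auc:.3f}")
print(f"cut-off = {thresholds[best]:.1f}, "
      f"sensitivity = {tpr[best]:.1%}, specificity = {1 - fpr[best]:.1%}")
```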

Keywords

e-Learning, Educational Environment, Educational Atmosphere

Article Details

How to Cite
Mohammadi, A., Mojtahedzadeh, R., Mousavi, A., & Ahmadi Fariman, S. (2023). Interpreting students’ perception of the e-Learning environments: determining optimal Cut-off Points for the e-Learning Educational Atmosphere Measure (EEAM). Journal of E-Learning and Knowledge Society, 19(4), 26-33. https://doi.org/10.20368/1971-8829/1135812
