
Comparing the Learning Effectiveness of Three Virtual Simulation Tools With Nursing Students During the Coronavirus Disease (COVID-19) Pandemic

Published: March 13, 2022. DOI: https://doi.org/10.1016/j.ecns.2022.03.003

      Highlights

• Observing recorded simulations was an effective learning approach when used as a virtual simulation tool.
      • No advantage in learning outcomes was demonstrated between a commercial virtual simulation tool and non-commercial tools.
      • The role of debriefing in the application of virtual simulation tools might be of particular importance.

      Abstract

      Background

      We explored the learning effectiveness of three virtual simulation tools used in the Coronavirus Disease pandemic environment.

      Sample

      Study participants consisted of students from two nursing classes, a junior and a senior class.

      Method

      A mixed-methods approach compared three tools’ performance across five learning domains. Descriptive statistics and analysis of variance compared mean ratings for learning domains. Open-ended questions were included for qualitative evaluation.

      Results

Thirty-six respondents rated the Resource Simulation Center (RSC), based on observation of videos of students undergoing simulation exercises, as superior to the other two tools. There were no differences between the other two tools. Qualitative findings echoed students' preference for RSC.

      Conclusion

RSC was preferred over a commercial product based on computer-generated graphics and a free online product based on clinical scenarios acted out in short videos. Differences in debriefing practices may have influenced the results, emphasizing the role of debriefing with virtual simulation tools.

      Background

The first case of Coronavirus Disease in North America was reported in the state of Washington in January 2020 (Holshue et al., 2020), leading to social distancing rules that barred our nursing students from engaging in clinicals, closed residence halls, and precluded face-to-face contact for the remainder of the Spring 2020 semester. In response, and with the goal of maximizing teaching efforts, we developed various simulation initiatives, each addressing the specific didactic needs of two classes, to offset the loss of opportunities for clinical experience.
Besides clinical opportunities, public health regulations on social distancing also precluded face-to-face, manikin-based simulation methods, which are standard and routine in nursing curricula. Thus, we turned to virtual, computer-based simulations that could be delivered online and within public health rules. Specifically, we used three different screen-based simulations, defined as "A simulation presented on a computer screen using graphical images and text, similar to popular gaming format, where the operator interacts with the interface using keyboard, mouse, joystick, or other input device" (Lioce et al., 2020). The preponderance of evidence supports that these types of virtual simulations are effective in achieving learning outcomes (Foronda et al., 2020). The objective of this study was to compare student assessment of learning effectiveness across the three different simulation tools employed.

      Sample

      A convenience sample made up of students from two Bachelor of Science in Nursing (BSN) classes (one at the junior level and another at the senior level) comprised our study participants.

      Methods

We employed an online survey to evaluate respondent views on the learning effectiveness of three different virtual simulation tools. This mixed-methods approach combined Likert-scale questions and open-ended questions covering various domains of nursing simulation education based on the Simulation Learning Effectiveness Inventory (SLEI). The SLEI identifies seven reliable and valid domains measuring student perception of learning effectiveness in simulation (Chen, Huang, Liao, & Liu, 2015). The original content validation of the SLEI was performed on the Chinese version of the tool; Bates and Clark (2019) examined the psychometric properties of the English version using a sample of 132 undergraduate nursing students and demonstrated its validity and reliability. This study compared each of the three simulation tools' performance across five of the seven SLEI domains. Two SLEI domains were considered inapplicable to our objectives and were not explored: Appropriateness of Course Arrangement and Availability of Learning Resources. The remaining five domains compared tool performance based on respondent perception of how well each tool did the following: Facilitate Debriefing, Improve Clinical Ability, Help Problem Solving, Help Confidence, and Improve Collaboration (see Figure 1). The electronic survey was distributed once, after participants had completed all simulation modalities.

      Virtual Simulation Tools

      Simulation case studies were different for each of the two classes as content was chosen to meet specific course learning objectives. All students received an introduction and orientation to each tool. All simulations included debriefing activities. The tools evaluated and debriefing methods were as follows.

      I-Human

I-Human Patients (Kaplan, n.d.) is a cloud-based simulation product that uses interactive, case-based clinical scenarios. I-Human takes a serious-game approach, employing a virtual computer-graphic patient, physical assessment through a mouse-driven interface, and a simulated electronic health record (EHR). This commercially available product offers a wide variety and number of case scenarios across the clinical spectrum. Debriefing occurred during a weekly scheduled clinical post-conference via videotelephony.

      Ryerson

The Virtual Healthcare Experience (Ryerson, n.d.) is a cloud-based simulation using interactive, case-based scenarios and a simulated EHR through a video-based interface. Rather than computer graphics, the videos use actors to portray clinician-patient interactions. The simulations are housed within a virtual hospital that includes simulation content across the clinical spectrum. The Virtual Healthcare Experience portal was developed as a collaboration among Ryerson University, Centennial College, and George Brown College faculty. The tool is a freely available resource hosted under a Creative Commons license by Ryerson University. Debriefing occurred during a weekly scheduled clinical post-conference via videotelephony.

      Resource Simulation Center

The Resource Simulation Center (RSC) is our own computer-based video simulation, in which students observe prerecorded videos of students from previous semesters performing manikin-based simulations from our usual curriculum. The experience includes review and analysis of a simulated EHR, followed by observation of the recordings with interval pauses to answer questions related to clinical action (e.g., assessment, interventions). Debriefing occurred over the following 24 hours via the learning management system's online discussion board, with a concluding presentation highlighting the main takeaway points from the experience.

      Survey

An online survey was distributed to students in both classes. Respondents were asked to rate each tool using the same set of questions covering the five domains, for example: Did it facilitate effective debriefing? Did it improve your clinical ability? Did it help your problem-solving skills? Did it help your confidence? Did it improve your ability to collaborate? Responses consisted of five categories: not at all, slightly, somewhat, fairly well, and very well.
Descriptive statistics and an analysis of variance were used to compare mean ratings for each question among the tools. A Cumulative Performance Score (CPS) was also computed by summing the mean scores across all five domains for each simulation tool; this variable was meant to capture each tool's overall performance in learning effectiveness. Qualitative analysis was performed on responses to the question, "Do you have any other comments related to your [virtual tool] simulation experience this semester?" Institutional Review Board (IRB) approval was received for this study.
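To make the scoring concrete, the sketch below shows one way the rating comparison and the CPS could be computed. It is an illustration only: the 1-5 Likert coding, the long-format data layout, and the column names ("tool", "domain", "rating") are our assumptions, not the authors' actual analysis code.

```python
# Minimal sketch of the quantitative analysis (illustrative assumptions only).
import pandas as pd
from scipy import stats

# Assumed numeric coding of the five response categories.
LIKERT = {"not at all": 1, "slightly": 2, "somewhat": 3,
          "fairly well": 4, "very well": 5}

def anova_by_domain(df: pd.DataFrame, domain: str):
    """One-way ANOVA comparing mean ratings among the three tools
    for a single learning domain."""
    groups = [g["rating"].to_numpy()
              for _, g in df[df["domain"] == domain].groupby("tool")]
    return stats.f_oneway(*groups)  # (F statistic, p value)

def cumulative_performance_score(df: pd.DataFrame) -> pd.Series:
    """CPS per tool: the sum of the mean rating across all five domains."""
    domain_means = df.groupby(["tool", "domain"])["rating"].mean()
    return domain_means.groupby("tool").sum()
```

Under this coding, the CPS for a tool ranges from 5 (five domains, each with a mean of 1) to 25 (each with a mean of 5), which is the scale on which the CPS values reported below sit.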

      Results

      Quantitative

      The response rate for seniors was 50% (17/34), while that of the juniors was 39% (19/49), with a combined student sample size of 36. Scores across the five domains were consistent between the two classes.
When comparing simulation tools by domain, statistically significant differences were found in Effective Debriefing between RSC (mean 3.88) and Ryerson (mean 2.88), p < .05; in Help with Confidence between RSC (mean 2.81) and iHuman (mean 1.69), p < .05; and in Ability to Collaborate between RSC (mean 3.03) and iHuman (mean 2.11), and between RSC (mean 3.03) and Ryerson (mean 2.19), p < .05. No statistically significant differences were found among simulation tools for Help with Problem Solving and Improve Clinical Ability (Table 1).
Table 1. Effectiveness by Domain by Simulation Tool.

Learning Domain | (I) Simulation Tool | (J) Simulation Tool | Mean Difference (I-J) | Std. Error | Sig. | 95% CI Lower | 95% CI Upper
Did it facilitate effective debriefing? | I-Human | Ryerson | 0.554 | 0.293 | 0.185 | -0.16 | 1.27
 | I-Human | RSC | -0.446 | 0.293 | 0.392 | -1.16 | 0.27
 | Ryerson | I-Human | -0.554 | 0.293 | 0.185 | -1.27 | 0.16
 | Ryerson | RSC | -1.000* | 0.299 | 0.004 | -1.73 | -0.27
 | RSC | I-Human | 0.446 | 0.293 | 0.392 | -0.27 | 1.16
 | RSC | Ryerson | 1.000* | 0.299 | 0.004 | 0.27 | 1.73
Did it improve your clinical ability? | I-Human | Ryerson | -0.134 | 0.285 | 1.000 | -0.83 | 0.56
 | I-Human | RSC | -0.571 | 0.285 | 0.144 | -1.27 | 0.12
 | Ryerson | I-Human | 0.134 | 0.285 | 1.000 | -0.56 | 0.83
 | Ryerson | RSC | -0.438 | 0.291 | 0.410 | -1.15 | 0.27
 | RSC | I-Human | 0.571 | 0.285 | 0.144 | -0.12 | 1.27
 | RSC | Ryerson | 0.438 | 0.291 | 0.410 | -0.27 | 1.15
Did it help your problem solving skills? | I-Human | Ryerson | 0.102 | 0.294 | 1.000 | -0.61 | 0.82
 | I-Human | RSC | -0.429 | 0.294 | 0.443 | -1.15 | 0.29
 | Ryerson | I-Human | -0.102 | 0.294 | 1.000 | -0.82 | 0.61
 | Ryerson | RSC | -0.531 | 0.301 | 0.241 | -1.26 | 0.20
 | RSC | I-Human | 0.429 | 0.294 | 0.443 | -0.29 | 1.15
 | RSC | Ryerson | 0.531 | 0.301 | 0.241 | -0.20 | 1.26
Did it help your confidence? | I-Human | Ryerson | -0.564 | 0.300 | 0.189 | -1.30 | 0.17
 | I-Human | RSC | -1.127* | 0.300 | 0.001 | -1.86 | -0.40
 | Ryerson | I-Human | 0.564 | 0.300 | 0.189 | -0.17 | 1.30
 | Ryerson | RSC | -0.563 | 0.307 | 0.209 | -1.31 | 0.18
 | RSC | I-Human | 1.127* | 0.300 | 0.001 | 0.40 | 1.86
 | RSC | Ryerson | 0.563 | 0.307 | 0.209 | -0.18 | 1.31
Did it improve your ability to collaborate? | I-Human | Ryerson | -0.073 | 0.286 | 1.000 | -0.77 | 0.62
 | I-Human | RSC | -0.917* | 0.286 | 0.006 | -1.61 | -0.22
 | Ryerson | I-Human | 0.073 | 0.286 | 1.000 | -0.62 | 0.77
 | Ryerson | RSC | -0.844* | 0.292 | 0.015 | -1.56 | -0.13
 | RSC | I-Human | 0.917* | 0.286 | 0.006 | 0.22 | 1.61
 | RSC | Ryerson | 0.844* | 0.292 | 0.015 | 0.13 | 1.56

Note. RSC = Resource Simulation Center. Comparison of the three simulation tools, contrasting each instrument with the other two by domain; RSC was statistically significant in three of the five domains.
* The mean difference is significant at the 0.05 level.
When comparing the Cumulative Performance Score (CPS) by tool, RSC (mean 16.06) was statistically significantly higher than both iHuman (mean 12.57) and Ryerson (mean 12.69), p < .05; no statistically significant difference in CPS was found between iHuman and Ryerson (Table 2). Considering all three tools, the Facilitate Debriefing domain had the most favorable score (3.4/5; 68%), while Help your Confidence had the least favorable score (2.25/5; 45%).
Table 2. Cumulative Performance Score (CPS) by Simulation Tool.

Dependent variable: Cumulative Performance Score.

(I) Simulation Method | (J) Simulation Method | Mean Difference (I-J) | Std. Error | Sig. | 95% CI Lower | 95% CI Upper
I-Human | Ryerson | -0.116 | 1.258 | 1.000 | -3.18 | 2.95
I-Human | RSC | -3.491* | 1.258 | 0.020 | -6.56 | -0.43
Ryerson | I-Human | 0.116 | 1.258 | 1.000 | -2.95 | 3.18
Ryerson | RSC | -3.375* | 1.286 | 0.030 | -6.51 | -0.24
RSC | I-Human | 3.491* | 1.258 | 0.020 | 0.43 | 6.56
RSC | Ryerson | 3.375* | 1.286 | 0.030 | 0.24 | 6.51

Note. RSC = Resource Simulation Center. A cumulative score compared overall performance in learning effectiveness across all five domains for each instrument; the mean differences favoring RSC were statistically significant (p = .02-.03).
* The mean difference is significant at the 0.05 level.
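Tables 1 and 2 follow the layout of standard post hoc output: symmetric (I, J) pairs with a mean difference, standard error, adjusted significance, and a 95% confidence interval. As a hedged illustration of how pairwise comparisons of this kind can be produced, the sketch below uses Tukey's HSD from statsmodels; the published tables appear to use a Bonferroni-type adjustment instead (note the Sig. values capped at 1.000), and the data layout is again our assumption.

```python
# Illustrative sketch of pairwise post hoc comparisons in the style of
# Tables 1 and 2 (mean difference, adjusted p value, 95% CI per tool pair).
# Tukey's HSD is used here for simplicity; the article's tables appear to
# use a Bonferroni-type adjustment. Column names are assumed, as before.
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def pairwise_by_domain(df: pd.DataFrame, domain: str):
    """Pairwise tool comparisons of ratings within one learning domain."""
    sub = df[df["domain"] == domain]
    return pairwise_tukeyhsd(endog=sub["rating"], groups=sub["tool"])

# Example with a hypothetical data frame `ratings`:
# print(pairwise_by_domain(ratings, "Did it help your confidence?"))
```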

      Qualitative

Thematic analyses of student perceptions of the three tools were completed. Students felt that iHuman was too basic for their educational level; the overall student perspective was that iHuman would be better suited to first-semester students who need to focus on assessment skills. The product helped with head-to-toe assessment, prioritization, reviewing lab values, and talking to each patient. As one student described, "It helped to perform each step of the process."
Students were more positive about their experiences using the Ryerson simulations. According to the thematic analysis, the Ryerson simulation system was more realistic and a good learning resource. Students provided the most positive comments about the RSC simulation experience. They commented that they liked watching other students' performance in the simulation because it helped them see the teamwork and group problem-solving. Another positive noted in the thematic analyses was that the RSC simulations were augmented by additional teaching resources, including an online presentation, discussion board questions, and a critical thinking assignment.

      Discussion

When comparing these three tools under novel circumstances, RSC, the observation of previously recorded simulation scenarios, performed best overall in terms of learning effectiveness. This is supported by previous findings that participant reaction is similar between those performing simulation and those observing it (Delisle et al., 2019). Screen-based simulation, the approach taken by two of the tools used in this study, scored lowest in a study by Leighton, Kardong-Edgren, Schneidereith, Foisy-Doll, and Wuestney (2021) comparing clinical learning in undergraduate nursing students; our results are consistent with those findings. Although all virtual, the three tools differed in technologies, graphics, and content. The screen-based tools alone performed lower in terms of learning effectiveness but may offer an opportunity for improved clinical learning if combined with virtual reality components in future use; Brown, Swoboda, Gilbert, Horvath, and Sullivan (2021) reported that students' perceived experience with virtual reality simulation was similar to that of face-to-face simulation. In addition, debriefing in this study varied between RSC and the iHuman and Ryerson simulations. This variation may have influenced student learning, as characteristics of debriefing such as timing and structure are known to affect learning (Kim & Yoo, 2020). The International Nursing Association for Clinical Simulation and Learning (INACSL) best practices for debriefing also require all simulation-based experiences to include a planned debriefing session aimed at improving future performance (INACSL Standards Committee, 2016).
The specific SLEI domains in which students rated the RSC tool significantly better were Effective Debriefing, Help with Confidence, and Ability to Collaborate, which may be related to the screen-based delivery of the other two tools. The tools did not differ in learning effectiveness across the domains of Help with Problem Solving and Improve Clinical Ability, which suggests students may have found engaging in the problem-solving portions of any of the simulations helpful for improving their ability to care for patients. Across tools, the simulations' effect was highest on Facilitate Debriefing and lowest on Help your Confidence. There were no differences in learning effectiveness as reported by participants between the iHuman and Ryerson simulation tools, although participants expressed a preference for Ryerson.
Limitations of this work include a small sample size and the heterogeneity of conditions under study. For example, although our findings support that observing simulation is an effective way to learn, it is impossible to say which aspects of the debriefing activity contributed to its effectiveness. Future work can aim to better understand these dynamics through a more controlled design. Further, although we attempted to obtain instructor feedback in this study, a small response rate precluded analysis, so it is not reported. It would be interesting to see how instructors' perspectives compare to students' experiences.

      Conclusion

The Coronavirus Disease pandemic confronted nursing education with unique forces that challenge the way we teach and expose students to the clinical experience. In the context of limited resources, the role of simulation and the effectiveness of various techniques and products remain pressing questions for the pandemic and beyond. It is encouraging to see that, given specific course learning objectives, home-grown, school-based simulation efforts built on observing prerecorded simulations of peers can be as effective as, or more effective than, available resources, including commercial ones. Further research exploring best practices in this fast-evolving area must be a priority.

      References

Bates, T. A., & Clark, P. C. (2019). Reliability and validity of the Simulation Learning Effectiveness Inventory. Journal of Professional Nursing, 35, 461-466. https://doi.org/10.1016/j.profnurs.2019.04.007

Brown, K., Swoboda, S., Gilbert, G., Horvath, C., & Sullivan, N. (2021). Integrating virtual simulation into nursing education: A roadmap. Clinical Simulation in Nursing, 12, 1-9. https://doi.org/10.1016/j.ecns.2021.08.002

Chen, S., Huang, T., Liao, I., & Liu, C. (2015). Development and validation of the Simulation Learning Effectiveness Inventory. Journal of Advanced Nursing, 71, 2444-2453. https://doi.org/10.1111/jan.12707

Delisle, M., Ward, M. A. R., Pradarelli, J. C., Panda, N., Howard, J. D., & Hannenberg, A. A. (2019). Comparing the learning effectiveness of healthcare simulation in the observer versus active role: Systematic review and meta-analysis. Simulation in Healthcare: Journal of the Society for Simulation in Healthcare, 14, 318-332. https://doi.org/10.1097/SIH.0000000000000377

Foronda, C. L., Fernandez-Burgos, M., Nadeau, C., Kelley, C. N., & Henry, M. N. (2020). Virtual simulation in nursing education: A systematic review spanning 1996 to 2018. Simulation in Healthcare: Journal of the Society for Simulation in Healthcare, 15, 46-54. https://doi.org/10.1097/SIH.0000000000000411

Holshue, M. L., DeBolt, C., Lindquist, S., Lofy, K. H., Wiesman, J., Bruce, H., … Pillai, S. K. (2020). First case of 2019 novel coronavirus in the United States. New England Journal of Medicine, 382, 929. https://www.nejm.org/doi/full/10.1056/NEJMoa2001191

INACSL Standards Committee. (2016). INACSL standards of best practice: Simulation℠ Debriefing. Clinical Simulation in Nursing, 12, S21-S25. https://doi.org/10.1016/j.ecns.2016.09.008

Kaplan. (n.d.). i-Human Patients. Retrieved October 20, 2020, from https://ih2.i-human.com/

Kim, Y., & Yoo, J. (2020). The utilization of debriefing for simulation in healthcare: A literature review. Nurse Education in Practice, 43, Article 102698. https://doi.org/10.1016/j.nepr.2020.102698

Leighton, K., Kardong-Edgren, S., Schneidereith, T., Foisy-Doll, C., & Wuestney, K. (2021). Meeting undergraduate nursing students' clinical needs. Nurse Educator, 46, 349-354. https://doi.org/10.1097/NNE.0000000000001064

Lioce, L., Lopreiato, J., Downing, D., Chang, T. P., Robertson, J. M., Anderson, M., Diaz, D. A., Spain, A. E., & the Terminology and Concepts Working Group. (2020). Healthcare Simulation Dictionary (2nd ed., AHRQ Publication No. 20-0019). Agency for Healthcare Research and Quality.

Ryerson. (n.d.). Virtual Healthcare Experience. Retrieved October 20, 2020, from https://de.ryerson.ca/games/nursing/hospital/