CLINICAL NURSING EDUCATION IN A TIME OF COVID-19: A COMPARISON OF VIRTUAL AND IN-PERSON SIMULATION DEBRIEFING METHODS

by

Meghan Anne Kirk

A scholarly project submitted in partial fulfillment of the requirements for the degree of Doctor of Nursing Practice in Family and Individual Health

MONTANA STATE UNIVERSITY
Bozeman, Montana

April 2021

©COPYRIGHT by Meghan Anne Kirk 2021. All Rights Reserved.

DEDICATION

For my Nana, Marita G. Donahue, BSN, RN, M.Ed., MSN, NP-C; a nurse educator for more than 40 years, and one of the first and best nurse practitioners in Massachusetts. You are an incredible light and inspiration in my life as well as the lives of the many nurses you have trained and patients you have cared for. Thank you for your loving support and guidance.

ACKNOWLEDGEMENTS

I would like to acknowledge those without whom this project would not have been possible. I am grateful for the exceptional leadership, guidance, and support of my DNP committee members:
- Jennifer Sofie, BSN, MSN, DNP (Committee Chair)
- Stacy Stellflug, BSN, MSN, PhD
- Susan Luparell, BSN, MSN, PhD
- Paul Krogue, BSN, DNP

I would like to thank the Montana State University College of Nursing, and the nursing students who participated in this project. I would like to acknowledge the loving support and encouragement of my colleagues, preceptors, friends, and family members.

TABLE OF CONTENTS

1. INTRODUCTION
2. REVIEW OF THE LITERATURE
   Background
   Traditional Clinical Nursing Education
   Simulation in Healthcare
   Simulation Evolution and Significance
   Simulation Theory and Elements
      Theoretical Underpinnings
      National League for Nursing (NLN) Simulation Vision Statement
   Simulation Guidelines and Elements
      INACSL Guidelines for Best Practice in Simulation
   Efficacy of Simulation in Nursing Education
   Use of Simulation During the COVID-19 Pandemic
   Virtual Simulation
   Virtual Simulation in COVID-19
   Debriefing Definitions and Significance
      Debriefing Methods
      Virtual Simulation Debriefing
   Montana State University (MSU)
   Project Objective
3. METHODS
   Sample Demographics and Setting
   Study Design
   vSim® for Nursing (vSim)
   Objectives
   Outcome Measures: Debriefing Assessment for Simulation in Healthcare (DASH)
   Institutional Review Board (IRB) Exemption
   Ethical Issues
   Data Collection with Qualtrics Survey
   Statistical Analysis
4. RESULTS
   Data Quality
   Missing Data
   Quantitative Results
   Qualitative Results
      Face-to-Face Benefits
      Face-to-Face Challenges
      Virtual Benefits
      Virtual Challenges
5. DISCUSSION
   Limitations
6. CONCLUSION
   Implications for Future Work
REFERENCES CITED
APPENDICES
   APPENDIX A: Institutional Review Board Exemption Letter
   APPENDIX B: Debriefing Assessment for Simulation in Healthcare® (DASH) Student Version
   APPENDIX C: Copy of Qualtrics Survey DASH-SV®: Face-to-Face Debriefing
   APPENDIX D: Copy of Qualtrics Survey DASH-SV®: Virtual Debriefing

LIST OF TABLES

1. Descriptive Statistics and Paired T-test Results for Comparison of Face-to-Face (F2F) and Virtual (V) DASH®-SV Responses

LIST OF FIGURES

1. Histograms of Student DASH®-SV Scores for Each Survey Element
2. Histogram of Cumulative Student DASH®-SV Scores for Virtual and Face-to-Face Simulation Debriefing
3. Paired T-test Results for DASH®-SV Virtual and Face-to-Face Simulation Debriefing for Each Survey Element
4. Paired T-test Results for Cumulative Student DASH®-SV Scores for Virtual and Face-to-Face Simulation Debriefing

GLOSSARY

Healthcare Simulation: Educational practice in which real clinical scenarios are simulated for experiential student learning to achieve specific training outcomes

Fidelity: The level of realism present in a simulated experience

High-Quality Simulation: Evidence-based simulation experiences which meet the INACSL best practice guidelines

Face-to-Face Simulation (F2F): Simulation that occurs in-person, typically in a simulation lab

Virtual Simulation: Real people operate simulated systems which recreate real, interactive scenarios using artificial intelligence (AI) on a computer

INACSL: International Nursing Association for Clinical Simulation and Learning

Debriefing Assessment for Simulation in Healthcare (DASH): 6-item Likert scale for evaluation of simulation debriefing experiences. Available in Student (DASH-SV), Instructor, and Rater versions as well as multiple languages

National League for Nursing (NLN): A national organization for nursing education and faculty

Human Patient Simulation (HPS): Simulation with real humans rather than manikins or AI

vSim® for Nursing: A virtual simulation product for nursing education, developed collaboratively and utilized in this project

Verbal Debriefing (VD): A conversational, reflective learning discussion after simulation

Video-Assisted Debriefing (VAD): The use of video playback to augment simulation debriefing

WebEx®: An online conferencing platform capable of video and audio connection

Virtual Simulation Debriefing: Debriefing that occurs virtually within or following a virtual simulation experience (Agency for Healthcare Research and Quality [AHRQ], 2020)

ABSTRACT

A global COVID-19 pandemic was declared in March 2020, causing educators and students around the world to pivot toward virtual education when in-person education methods became impossible to safely deliver. This posed unique challenges within nursing education and other disciplines that historically required experiential or hands-on training. Barriers and disadvantages to traditional in-person clinical nursing education methods, such as limited clinical site availability, competition between schools for clinical sites, cost, increased risk potential, increased patient acuity, decreased average length of hospital stay, and faculty shortages, led to exploration of clinical education through in-person and virtual simulation methods predating the COVID-19 pandemic. One of the essential elements of simulation is debriefing, an interactive and reflective discussion of simulation events that aids in assimilation of new knowledge and the ability to apply what is learned in future clinical experiences. The project lead explored and compared third-year nursing student experiences with face-to-face and virtual simulation debriefing methods. Debriefing Assessment for Simulation in Healthcare survey scores for a small convenience sample of students (n=17) for each debriefing method were collected before and after the COVID-19-mediated educational shift. Qualitative responses were solicited only in the virtual debriefing survey, when students were asked to identify their preference for either face-to-face or virtual debriefing as well as any benefits or challenges associated with each method. Descriptive statistics along with a one-group, two-tailed, repeated measures Student's t-test were completed for analysis in the Statistical Package for the Social Sciences (SPSS). T-test results of student scores for each debriefing element were not statistically significant at a 95% confidence interval, aside from Element #5, which describes instructor identification of success or failure modes. Student ratings for debriefing quality and subsequent t-test findings suggest that student experience with virtual and face-to-face methods is comparable, except with regard to instructor feedback, but these findings are limited by diminished statistical power. However, qualitative results indicate students uniformly expressed a preference for traditional debriefing methods over virtual simulation debriefing. Virtual simulation debriefing, while not as familiar or easy in terms of communication, appears to be an effective alternative to traditional, face-to-face debriefing.
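The analysis described above can be illustrated with a minimal sketch. The project used SPSS; the example below substitutes Python with SciPy, and the paired DASH-SV scores shown are illustrative placeholders rather than project data (the cumulative totals assume six items rated 1 to 7, so each score falls between 6 and 42).

    # Minimal sketch of the one-group, two-tailed, repeated measures t-test
    # described in the abstract, using Python/SciPy in place of SPSS.
    # The scores below are illustrative placeholders, NOT project data.
    from scipy import stats

    # Hypothetical cumulative DASH-SV scores, one pair per student (n=17):
    # face-to-face debriefing (pre-shift) vs. virtual debriefing (post-shift).
    f2f_scores     = [38, 35, 41, 30, 39, 42, 36, 40, 33, 37, 41, 39, 34, 40, 38, 36, 42]
    virtual_scores = [36, 37, 40, 31, 38, 39, 35, 41, 30, 36, 40, 38, 35, 37, 36, 35, 41]

    # Paired (repeated measures) t-test; SciPy's ttest_rel is two-tailed by default.
    t_stat, p_value = stats.ttest_rel(f2f_scores, virtual_scores)

    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
    # Judged at the 95% confidence level used in the project:
    print("significant" if p_value < 0.05 else "not statistically significant")

Because each student rated both debriefing methods, the paired form of the test is the appropriate choice; an independent-samples test would ignore the within-student pairing.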
CHAPTER ONE

INTRODUCTION

Clinical nursing faculty are responsible for preparing students to competently provide evidence-based and patient-centered nursing care upon program completion. Professional nursing positions command a wide scope of knowledge and skills and carry a considerable level of responsibility. Undergraduate nursing program objectives are met by a combination of didactic and clinical training (Oermann et al., 2018). Opportunities to apply didactic content in clinical practice are essential for nursing student development and progression (Harper & Markham, 2011). Although clinical experience for nursing students is an integral element of professional training, this type of education presents unique challenges for the nursing programs and communities who support them (Jeffries, 2016). Due to a multitude of precipitating factors, simulation has become a staple in nursing programs (Hayden et al., 2014). High-quality simulation has replaced nearly half of the clinical experience hours required for program completion in some institutions (NLN, 2016; Jeffries, 2016). Currently, simulation serves as a vital element of clinical learning in nursing education and is employed on some level in a vast majority of institutions (Chronister & Brown, 2012).

Simulation is a pedagogy in which active clinical learning experiences in traditional environments are replaced with simulated clinical scenarios that replicate real patient experiences to varying degrees of realism (i.e., varying levels of fidelity) (Harper & Markham, 2011). Most simulations involve a pre-simulation briefing, simulation completion, and a subsequent simulation debriefing. Simulation offers "an active learning environment for students to experience clinical situations and use cognitive, affective, and psychomotor skills" (Dreifuerst, 2009, p. 109). Other authors have defined simulation as "a technique used to replace or amplify real experiences with guided experiences that evoke or replace substantial aspects of the real world in a fully interactive manner" (Levett-Jones & Lapkin, 2014, p. e58). Clinical skills, judgement, reasoning, and other important developmental milestones for professional nurses can be effectively taught through a simulated experience without concern for compromised patient safety or outcomes (Dreifuerst, 2009).

In March 2020, many universities closed in-person education entirely in response to the COVID-19 pandemic in an effort to limit viral transmission. For many nursing students who were engaged in in-person clinical training at that point in time, this involved a sudden shift to training delivered exclusively through virtual modalities and simulation. This also occurred at Montana State University. This upheaval of traditional clinical education methods demanded rapid adaptation and generated many important questions surrounding the uncertainties of a new paradigm. Faculty and students wondered whether clinical nursing could be effectively taught through a virtual platform, and what the related logistics might entail. Many clinical training outcomes, such as faculty and student experience, student future performance in clinical training, level of competency, future employment, job performance, and downstream patient safety, each have the potential to be significantly impacted by a dramatic shift of this nature (Oermann et al., 2018). Concerns were also raised about the absence of personal interaction in virtual environments, which contrasts starkly with the highly interactive and experiential clinical education methods that have been traditionally employed.
This project describes the experiences of 17 nursing students required to shift from in-person clinical and simulation-based training to exclusively virtual simulation educational methods, based on a comparison of their debriefing experiences pre- and post-COVID-19. Chapter Two provides a review of relevant literature describing traditional clinical nursing education; the history and application of simulation and debriefing methods in clinical nurse education; and changes in the use of simulation during the COVID-19 pandemic. This project compared the experience of students who underwent a shift between in-person and virtual clinical education during the spring semester of 2020. Methods of data collection are described in Chapter Three, with results presented in Chapter Four. A discussion of project findings is provided in Chapter Five, followed by conclusions and recommendations for future work in Chapter Six. This project offers a pilot comparison of student experiences and perceptions associated with virtual and face-to-face debriefing methods.

CHAPTER TWO

REVIEW OF THE LITERATURE

Background

In 2020, education shifted suddenly from traditional in-person paradigms to online and virtual modalities. The transition affected education in a wide variety of contexts, but especially the clinical education of nursing students (Mitchell, 2020). A review of the relevant nursing education literature is provided in the sections to follow. Beginning with a description of traditional clinical nursing education, this chapter reviews the history and application of simulation in healthcare, theoretical underpinnings for simulation, simulation elements and guidelines, and simulation efficacy. Next, an exploration of recent developments secondary to the COVID-19 pandemic and virtual simulation is provided. The latter portion of this chapter provides a discussion of debriefing significance, guidelines, and methods.

Traditional Clinical Nursing Education

In a traditional sense, "clinical learning opportunities provide real-life experiences and opportunities for transfer of knowledge to practical situations" (Oermann et al., 2018). Nursing students traditionally begin their training progression in the classroom learning didactic content (Oermann et al., 2018). Prior to participation in clinical practice, students are typically provided with various opportunities to test and practice what they have learned in a simulation or laboratory environment. This also provides an opportunity for faculty to observe student abilities and guide learning prior to practicing in actual patient care delivery settings. Following didactic and simulation-based education, clinical experiences may begin.

The clinical nursing student is not presumed to be functioning in a nursing role or scope of practice, but rather has a role and primary set of responsibilities centered around learning and achievement of specific course objectives (Oermann et al., 2018). Clinical students are expected to prepare for and attend shifts in various practice settings with a small group of peers, accompanied by a clinical instructor. The clinical instructor assigns each student a patient (or multiple patients, depending on the level of experience) the day before a clinical experience is scheduled. A staff nurse who assumes primary responsibility for patient care is assigned to every patient, including the patients whom students are helping to care for. The nursing student, staff nurse, and clinical instructor provide collaborative and team-based patient care.
The primary role of clinical instructors is to facilitate and guide student learning. Following experiential learning in clinical sites, the small group typically holds a post-clinical conference for questions, debriefing, and case discussion (Oermann et al., 2018).

Simulation in Healthcare

The evidence suggests that the gap between nursing theory and independent clinical practice can be bridged through traditional clinical experiences alone or in conjunction with high-quality simulation delivery (Rudolph et al., 2016). The use of simulation-based models in clinical healthcare education has become increasingly common, and this educational paradigm is rapidly evolving, with notably increased momentum over the course of the past few decades (Levett-Jones & Lapkin, 2014). Fidelity is used in the context of simulation to quantify the level of realism. Clinical scenarios are replicated in low-fidelity, moderate-fidelity, or high-fidelity simulated experiences. It has been suggested that about 87% of prelicensure nursing institutions are employing medium- or high-fidelity simulation as a teaching methodology (Dileone et al., 2020).

Sufficient traditional clinical experience hours can be difficult to achieve due to multidimensional barriers and complicating factors. Increased patient acuity, decreased length of hospital stay, multiple schools competing for community partnership resources, limited clinical site availability, faculty shortages, electronic charting system access challenges, and the likelihood of experiential disparities between nursing students are some of the key contributing factors that have led to increased simulation practice and research (Hayden et al., 2014; Rudolph et al., 2016). COVID-19 pandemic-related changes were added to this list of factors challenging the traditional approach to clinical preparation of nursing students in the spring semester of 2020.

A variety of styles, levels of fidelity, and design approaches are used in healthcare simulation training environments (Lavoie et al., 2020). Recent shifts toward online learning platforms coupled with incorporation of virtual simulation technology have revolutionized simulation opportunities, although classroom and paper-based models maintain an important role in prelicensure nursing education (Aebersold, 2016; Zhang et al., 2020). Goals of simulation often include discipline-specific transfer, acquisition and application of knowledge, interpersonal communication, and teamwork elements of learning (Aebersold, 2016). When compared to face-to-face clinical experiences, critiques of simulation cite failures to incorporate holistic nursing processes and challenges with teaching recognition of the dynamic totality of human beings in simulated environments (Cohen & Boni, 2018). Further research is needed to promote holistic nursing through simulation in an effort to facilitate student understanding and application of this unique facet of the professional nursing role (Cohen & Boni, 2018).

Simulation Evolution and Significance

Simulation-based education has a long history that spans multiple disciplines. The military, aviation, and healthcare sectors each have unique historical and current patterns of simulation use for education and training (Brett-Fleegler et al., 2012).
A growing body of evidence demonstrates a link between simulation-based training and improved safety, particularly in disciplines requiring high-risk decision-making or with a potential for high-consequence outcomes (Harper & Markham, 2011; Hayden et al., 2014). Clinical nursing practice and training certainly fit this description, although healthcare was not the first discipline to incorporate simulation-based education models.

Over a century ago, aviation experts identified a need for novel training approaches after experiencing significant loss of lives due to failures in previous training methodologies. This led to a transformation in professional training through innovative simulation-based teaching strategies (Harper & Markham, 2011). The aviation industry piloted simulation technologies and techniques with the use of flight simulators (Aebersold, 2016; Harper & Markham, 2011). Instructors were able to effectively simulate unique weather conditions, challenging technical maneuvers, and limited-visibility flying experiences for student pilots (Harper & Markham, 2011). World War II propelled further expansion of aviation and military simulation training with inclusion of interactive team-based procedures, continuous training improvements, and increased standardization of simulation within the discipline (Aebersold, 2016). The standardization of simulation practice allowed for consistency and reproducibility within training and evaluation procedures, enabling pilots to effectively learn technical and team skills for practical application through this methodology (Aebersold, 2016). Remarkably, many student pilots actually have their first experiences landing commercial planes while passengers are on board, with another pilot alongside for guidance (Aebersold, 2016). Of course, this occurs only after pilots have experienced high-quality and effective simulation-based training (Aebersold, 2016). This is analogous to the first clinical experiences of nursing students. Nursing students are expected to provide care for actual patients under the supervision of a clinical instructor, which also follows primary training with both didactic content and simulation-based modalities (Hayden et al., 2014).

Evolution of simulation-based education continued with the National Aeronautics and Space Administration's (NASA) development of a training method called Crew Resource Management (CRM) in 1979, intended for application in aviation (Aebersold, 2016). CRM offered aviation students a low-risk simulated practice training environment, with increased focus on optimization of interpersonal communication, stress management, situational awareness, fatigue management, team dynamics, and debriefings (Aebersold, 2016). CRM was heavily influenced by the human and organizational error-management work of James Reason (Reason, 1997). Anesthesiology specialists were the first to utilize CRM for medical training, specifically for crisis response and management training applications (Harper & Markham, 2011). The Veterans Health Administration later translated the CRM training methods into a training program implemented for frontline nursing staff (Sculli et al., 2013). The authors observed positive cultural and clinical safety impacts in the wake of program implementation (Sculli et al., 2013).

In healthcare, low-fidelity simulation began early in the 20th century using mannequins, which were primarily used for foundational nursing skill training (Harper & Markham, 2011; Levett-Jones & Lapkin, 2014).
There have been numerous evolutions within simulation-based education for healthcare since that time. Change in simulation practice is informed by the progressive evolution of guidelines for best-practice simulation by the International Nursing Association for Clinical Simulation and Learning (INACSL), which are maintained as a living document (INACSL, 2016). The INACSL Standards of Best Practice guidelines offer a framework for the many facets of simulation design, implementation, debriefing, evaluation, and research (INACSL, 2016). The Society for Simulation in Healthcare (SSH) provides certification to simulation educators and serves as the accrediting body for institutions offering simulation-based education (Jeffries, 2016).

Throughout the evolution of simulation in each of the disciplines described, "methods were driven by educational needs and were shaped by available technology" (Aebersold, 2016, p. 60). Simulation has become a standard of educational practice as an effective clinical training tool for prelicensure nursing students and practicing healthcare providers, similar to what has been observed in the aviation and military industries (Brett-Fleegler et al., 2012; Hayden et al., 2014). Continued integration of evidence-based and standardized approaches to simulation-based learning in the context of nursing education certainly has the potential to positively impact patient outcomes and safety at both individual and population health levels (Hayden et al., 2014).

Simulation Theory and Elements

Theoretical Underpinnings

A few key theoretical frameworks support the use of simulation in the context of experiential learning. The Experiential Learning Theory (ELT) was proposed by David Kolb, PhD, and suggests that knowledge gains are achieved through transformational experiences. The ELT incorporates the following cycle of effective learning: concrete experience, reflective observation, abstract conceptualization, and active experimentation (Lisko & O'Dell, 2010). Additionally, the ELT "presents four different learning styles or modes: accommodating, diverging, converging, and assimilating," which are based on the learning cycle described above (Lisko & O'Dell, 2010, p. 106). Although the ELT emerged originally from the organizational behavior discipline, this theory is applied effectively within clinical and simulation-based nursing education, where experiential learning is essential (Lisko & O'Dell, 2010).

The National League for Nursing (NLN) Jeffries Simulation Theory provides an additional framework to support the scope of this simulation debriefing project (Jeffries, 2016). Jeffries noted a transition occurring within healthcare education away from more passive traditional approaches toward increasingly active or engaged teaching and learning methods, with simulation theory and practice leading the way (Oermann et al., 2018). The INACSL team and other colleagues evaluated the state of simulation literature with a thorough review of available simulation evidence and research findings, which identified knowledge gaps and opportunities for future research (Jeffries, 2016). A need for greater empirical support to improve ongoing simulation theory progression became clear, as relatively few simulation publications, methodological limitations, and little terminology standardization were observed in the literature (Jeffries, 2016; NLN, 2021).
A foundational NLN simulation study demonstrated that effective use of simulation is crucial and ultimately led to the development of the original NLN Jeffries Simulation Framework (Jeffries & Rizzolo, 2006). A subsequent examination of the evidence base surrounding simulation then catalyzed the evolution of the NLN Jeffries Simulation Framework into a theory (Jeffries, 2016). Central aspects of the NLN Jeffries Simulation Framework, including student, teacher, educational practices, simulation design characteristics, and outcomes, were critically reviewed (Jeffries, 2016). Significant support for the original NLN Jeffries Simulation Framework was identified in the literature (Jeffries, 2016; NLN, 2021). Common themes, including active learning, feedback, student/faculty interaction, collaboration, high expectations, diverse learning, time on task, curricular integration, and learning theory as a foundation for simulation practice and research, were observed in the literature (Jeffries, 2016). Three principal literature themes uncovered were: simulation is effective, fidelity is an important consideration, and debriefing is an essential component (Jeffries, 2016). Student satisfaction, confidence, and performance outcomes suggest effectiveness of simulation in nursing education (Jeffries, 2016). Although simulation fidelity was not well-defined across the evidence base, high, medium, and low levels of fidelity are generally observed (Jeffries, 2016; NLN, 2021). Of primary importance is the concept of choosing an "appropriate level of fidelity to meet the objectives of specific simulation activities" (Jeffries, 2016, p. 16). In general, the evidence suggests that high-quality simulation leads to better outcomes for faculty and students, ultimately with the potential to improve downstream delivery of patient care (Hayden et al., 2014; Jeffries, 2016).

The NLN Jeffries Simulation Theory has evolved through years of systematic literature review (Jeffries, 2016). This theory has gleaned further support from the perspectives and experiences of nurses and educators who have participated in simulation-based learning (Jeffries, 2016). Key concepts within the NLN Jeffries Simulation Theory include contextual factors, background, design, simulation experience, facilitator and educational strategies, and participant variables and outcomes (Jeffries, 2016). The idea of simulation context relates to the general method of delivery and intention of a proposed simulation experience (Jeffries, 2016). The concept of background involves further delineation of specific simulation expectations and objectives, as well as their impact on simulation design (Jeffries, 2016). Background conceptualization also relates to the theoretical underpinnings, which may be simulation-specific (Jeffries, 2016). Curricular alignment, resource availability, and resource allocation are typically also assessed as background elements (Jeffries, 2016). A strong background concept informs both of the next steps: simulation design and implementation (Jeffries, 2016). Simulation design involves planning and development of simulation activities and scenarios by faculty, with a goal of appropriate content and complexity selection to match student abilities and learning needs (Jeffries, 2016). Fidelity, in both a physical and conceptual sense, is established within the simulation design as a sequence of planned events is generated (Jeffries, 2016).
During the simulation design phase, roles and responsibilities are delineated for each person involved in the experience (Jeffries, 2016). The concept of experience within simulation is multifaceted, characterized by a cornerstone of shared responsibility for the maintenance of a productive and safe learning space. It is critical to strive for buy-in from each participating team member to ensure optimization of the fidelity and authenticity of each simulation experience (Jeffries, 2016).

The literature supports theory-based debriefing as a critical simulation element, but findings are somewhat mixed with regard to which debriefing characteristics are most effective and how debriefing should be applied (Shinnick et al., 2011). Many authors have explored whether incorporation of video is valuable or effective as a supplementary debriefing tool (Jeffries, 2016; Wilbanks et al., 2020; Zhang et al., 2020). Participant and facilitator interaction and communication are essential for successful simulation and debriefing. A learner-centered experience is prioritized, with efforts to provide a collaborative, interactive, dynamic, and trusting simulation environment (Jeffries, 2016).

Simulation experiences are conducted in an effort to facilitate participant learning in the context of doing, also described as situated cognition (Jeffries, 2016). The facilitator role has been described as a "guide on the side" position, where students actively participate in the experience of simulation and drive the debriefing discussion as much as possible (Jeffries, 2016). Facilitators of simulation experiences are responsible for maintaining sufficient education and practical experience with current evidence-based simulation methods and educational strategies (Jeffries, 2016). Facilitators are called to respond with prompt adaptation to shifts in the group dynamic or participant needs, which may require timely redirection or feedback cues either during or after the experience (Jeffries, 2016). Faculty facilitators are to provide formative or summative evaluation of student performance in simulation experiences (Jeffries, 2016). Institutional administrators are encouraged to prioritize the provision of appropriate training and support resources for simulation faculty to optimize outcomes (Jeffries, 2016).

Academic and personal attributes of both the facilitator and participants are also likely to have an impact on simulation-based interactions and experiences (Oermann et al., 2018). Some of these attributes are modifiable, while others are not (Jeffries, 2016). Participant factors such as preparation level or attitude may be modified by an individual student. Some participant variables, including orientation, group size, role, or patient assignment, may be controlled by the simulation facilitator (Jeffries, 2016). Other participant factors, such as age, gender, or past experience, are not modifiable (Jeffries, 2016). Facilitator variables which may also impact simulation dynamics are demographics, personality, technical skills, attitude, nursing competence, self-awareness, teaching ability, roles, responsibilities, values, attributes, interpersonal relationships, and use of technology (Jeffries, 2016). Simulation outcomes are grouped into three fundamental categories that depict participant, patient, and system-level impacts (Jeffries, 2015). Reactions, learning, and behavior in response to simulated learning experiences are of particular interest in the literature (Jeffries, 2016).
Evidence suggests that simulation-based learning offers opportunities for increased learner satisfaction, knowledge acquisition, and clinical skill performance (Jeffries, 2016). Critical thinking abilities, confidence, self-efficacy, and cultural and self-awareness may also be achieved through simulation experiences (Jeffries, 2016). There remains a gap in the literature surrounding the mechanisms and extent to which simulation training affects bedside care (Jeffries, 2016). Large and longitudinal studies are needed to further evaluate whether improved simulation performance translates to elevated preparedness for independent practice. The evidence thus far suggests that "while simulation-based training can and does affect patient care, we cannot assume that upstream participant reactions and learning necessarily translate into downstream behaviors and results" (Jeffries, 2016, p. 27).

The aim of the NLN Jeffries Simulation Theory is to shape and guide simulation research and its effective translation into educational and clinical nursing practice (Jeffries, 2016). The central concepts and facets within the NLN Jeffries Simulation Theory are carefully woven together to achieve best-practice simulation in nursing education (Jeffries, 2016). This theory is versatile and has demonstrated application in a variety of diverse simulation environments (Jeffries, 2016). Collaborative efforts have resulted in the development of a virtual simulation product titled vSim® for Nursing, which was utilized in this project (Kivett et al., 2014). The NLN vision statement asserts that future research is needed to evaluate the effectiveness of various debriefing characteristics on specific outcomes, and to assess whether simulation and debriefing can be effectively delivered through virtual platforms (Jeffries, 2016).

National League for Nursing (NLN) Simulation Vision Statement

The NLN described a vision for transformation of learning in nursing education through simulation-based methods that are evidence-based and derived from diverse disciplines (Jeffries, 2016). The rapid increase in adoption of simulation is multifactorial. Healthcare and nursing education are rapidly and continually evolving, with technological advances spurring creative innovation in this realm (Jeffries, 2016). Decreases in the length of stay for patients who are hospitalized and in the number of available clinical placement opportunities have led to novel and creative theory-based approaches with greater emphasis on outpatient and community health topics (Jeffries, 2016). Simulation "provides a means for teaching that is situated in the context of practice" (Jeffries, 2016, p. 44). Patient safety is of utmost importance in nursing practice and education. Simulated patient scenarios which closely replicate clinical practice allow for experiential learning without compromising patient safety (Jeffries, 2016). Simulation experiences may improve and foster collaborative, interdisciplinary education and practice (Jeffries, 2016). Simulation is delivered in many diverse forms, which may include virtual simulation, computer-mediated scenarios, manikins, case studies, standardized patients, psychomotor skill training, role play, or a combination of methods (Jeffries, 2016). Diverse perspectives and previous experiences of faculty and participants allow for well-rounded experiential learning and debriefing discussions (Jeffries, 2016).
Simulation Guidelines and Elements

INACSL Guidelines for Best Practice in Simulation

The International Nursing Association for Clinical Simulation and Learning (INACSL) has published quality measures and guidelines for best practice in simulation, using standardized terminology (INACSL, 2016). The standards include evidence-based guidance on professional integrity, participant objectives and responsibilities, facilitator role and responsibilities, facilitation, debriefing, and evaluation as they relate to simulation for nursing education. Continuous reassessment and re-evaluation of simulation experiences, programs, facilitators, and outcomes with valid and reliable tools is warranted (Jeffries, 2016). INACSL asserts that specialized leadership and faculty training is important for simulation faculty and recommends creation of dedicated "simulation teams" as resources allow (INACSL, 2016; Jeffries, 2016). The nine specific INACSL standards include terminology, professional integrity of participants, participant objectives, facilitation, facilitator, debriefing process, participant assessment and evaluation, simulation-enhanced interprofessional education, and simulation design.

Evaluation of simulation quality in healthcare education and training is imperative, as an increased proportion of clinical hours are being achieved through simulation (Rudolph et al., 2016). The INACSL Standards for Best Practice: Simulation℠ provide a standard for simulation design, implementation, and evaluation (INACSL, 2016). The INACSL guidelines support pre-briefing and debriefing as integral simulation components, offering an evidence-based approach to integration of each essential simulation element (INACSL, 2016). Pre-briefing occurs at the beginning of simulation experiences, serving as an orientation to the simulation scenario, use of required equipment and technology, expectations, roles, timeframe, and logistical aspects (Dileone et al., 2020). The pre-briefing simulation component is less studied to date than debriefing, according to a recent review of the literature (Dileone et al., 2020). Rudolph and colleagues (2014) describe the pre-briefing phase as a time to "create a psychologically safe context for learning, a so-called safe container," and make recommendations for this practice (p. 339). Existing literature highlights inconsistencies in simulation pre-briefing terminology, components, and implementation (Dileone et al., 2020). Further research is clearly warranted to enable development of a pre-briefing framework to inform simulation pre-briefing best practice, in an effort to provide guidance for facilitators while improving experiential simulation outcomes for students (Dileone et al., 2020).

Efficacy of Simulation in Nursing Education

Due to the various challenges and barriers related to the provision of high-quality traditional clinical experiences for students, Hayden and colleagues (2014) completed a longitudinal, randomized, controlled study to evaluate outcomes associated with replacement of clinical hours with simulation-based learning experiences. A group of 666 prelicensure nursing students in 10 different programs was randomly assigned to three study groups. Clinical hours were replaced with simulation up to a different prescribed threshold for each group, followed by debriefing, in alignment with the INACSL simulation guidelines (Hayden et al., 2014; INACSL, 2016).
Outcomes were evaluated for a control group without simulation hours, a group where 25% of clinical hours were replaced with simulation, and a group where 50% of clinical hours were replaced with simulation (Hayden et al., 2014). No statistical difference in performance was demonstrated in the outcomes of clinical competency (according to preceptors or instructors) [p = .668], comprehensive nursing knowledge assessment scores [p = .478], or NCLEX® board examination pass rates [p = .737] (Hayden et al., 2014). The same group of students was further assessed at three intervals throughout the transition into their first clinical nursing positions (Hayden et al., 2014). Nursing managers were asked to rate study participants in areas of clinical competence and readiness for practice (Hayden et al., 2014). Differences between the three groups did not demonstrate statistical significance at 6-week [p = .737], 3-month [p = .51], or 6-month [p = .527] follow-up intervals (Hayden et al., 2014). These findings have been utilized to support the replacement of up to half of required prelicensure nursing program clinical hours with simulation without a significant compromise in clinical learning or future performance (Hayden et al., 2014).

Use of Simulation During the COVID-19 Pandemic

In March 2020, the onset of a global COVID-19 pandemic demanded enactment of specific public-health measures to prevent disease transmission. Due to social distancing requirements, a majority of institutions of higher education were forced to stop all face-to-face learning, effectively sending students, faculty, and clinical instructors into uncharted and unfamiliar terrain with a rapid shift to online learning in exclusively virtual environments (Carolan et al., 2020). A roller-coaster of pandemic-related transitions has led to significantly diverse and variable experiences within prelicensure clinical nursing educational settings, on both individual and institutional levels (Carolan et al., 2020). A vast majority of student nurses experienced abrupt cancellation of clinical rotations by universities and partnering clinical sites (Carolan et al., 2020; Mitchell, 2020). Without an ability to facilitate in-person clinical experiences, a number of states then rushed to advance legislation allowing virtual simulation experiences to count for a portion of the clinical hours required for undergraduate nursing program completion and eligibility for licensure (Carolan et al., 2020). Other nursing students were able to take certification examinations and enter into new-graduate clinical positions sooner than anticipated (Carolan et al., 2020). International students may have been affected by other variables such as travel restrictions, changes in immigration law, and time-zone variations (Carolan et al., 2020). In the United Kingdom, some students were even offered paid and extended clinical placements before completing their clinical education, due to severe staffing shortages (Carolan et al., 2020). Nevertheless, several common experiential themes have been observed (Carolan et al., 2020). Generalized fear, uncertainty, frustration, anxiety, loneliness, moral distress, and feelings of loss and grief were encountered and described by students and faculty alike during the tumultuous pandemic experiences of the spring 2020 semester (Carolan et al., 2020).
Efforts to combat isolation have been made through the creation of an interactive and supportive virtual nursing education community, coupled with careful maintenance of an appropriate balance between asynchronous and synchronous approaches to virtual learning (Carolan et al., 2020). Nursing students were unexpectedly asked to change their routines and adapt to new course-delivery methods mid-semester, while often simultaneously grappling with concerns about nursing program progression, completion, and potential COVID-related delays (Carolan et al., 2020). Many students appreciated the flexible location and scheduling of coursework associated with a shift to online learning (Carolan et al., 2020). Other students expressed challenges related to this abrupt change, including decreased social engagement or camaraderie, varying levels of digital literacy or technologic competency, and the stressful nature of an unanticipated transformation on this scale (Carolan et al., 2020).

Important barriers and challenges are reportedly associated with virtual simulation. Concern has centered around significant variation in attitudes toward e-learning, previous experience with remote learning, and digital literacy for both students and faculty (Carolan et al., 2020). Limited availability of information technology (IT) infrastructure and support technicians to guide transitions quickly became evident in many institutions, as a switch to entirely virtual modalities placed strain on these resources (Carolan et al., 2020). In some institutions, academic clinicians were redeployed to bedside care positions out of necessity, which further decreased the proportion of faculty available to aid in the facilitation of this pivotal transition (Carolan et al., 2020). The mental health effects of isolation are of particular concern, due to the potential for significant negative downstream effects if appropriate monitoring, intervention, and evaluation are not prioritized (Carolan et al., 2020). Disparities in available space for virtual learning and teaching, coupled with disparities in computer and internet access, were also observed (Carolan et al., 2020). Effective prioritization of equal access to adequate and consistent internet, appropriate hardware, IT technician support, and digital literacy education demanded extensive and creative infrastructure expansion under intense time pressure (Carolan et al., 2020). Protective factors were also identified within this transition toward e-learning. Strong leadership, transparency, open-mindedness, flexibility, collaboration, creativity, and resilience were demonstrated to be characteristics that positively affected student and faculty experiences within the COVID-19 transition to virtual learning and simulation (Carolan et al., 2020).

Some of the changes in our healthcare and education systems that have resulted from the COVID-19 pandemic are likely to be long-lasting, and may catalyze ongoing systemic reform (Thornton, 2020). Systemic solutions for adaptation and innovation within the context of pandemic-related constraints are vital, as these situations are multifaceted and dynamic (Thornton, 2020). Experience and evidence generated during the COVID-19 pandemic may be useful in versatile ways moving forward. Significant uncertainty remains regarding the impact of clinical nursing education changes over the long term.
For example, if novel approaches to care delivery such as telemedicine and virtual appointments continue, it will be important for universities to train nursing students for practice within these modalities as well.

Virtual Simulation

Face-to-face methods are the traditional and most thoroughly studied approach to simulation in healthcare education. However, human patient simulation (HPS) with traditional face-to-face simulation methods is noted to be costly in terms of both monetary and human resources (Betts et al., 2020). Challenges related to scheduling can also present barriers for coordination of face-to-face simulation experiences (Peddle et al., 2016). Virtual simulation has reduced many costs typically associated with face-to-face simulation, including laboratory space as well as equipment and staffing expenses (Betts et al., 2020). Many virtual simulation software programs exist and are promoted for use in this context. Although it remains a relatively novel methodology, virtual simulation already plays an important role in nursing education (Betts et al., 2020; Peddle et al., 2016; Verkuyl et al., 2020). However, further research is needed to evaluate the effectiveness of the various virtual simulation methods and approaches, with particular attention to outcomes of interest for nursing students and faculty.

Peddle and colleagues (2016) completed an integrative literature review of possible impacts of simulated virtual-patient use in undergraduate clinical education curricula on student acquisition of nontechnical skills. Nontechnical skills "are comprised of situational awareness, decision-making, communication, teamwork, leadership, managing stress, and coping with fatigue" (Peddle et al., 2016, p. 400). Virtual-patient interactions were associated with improved student communication, teamwork, and decision-making abilities (Peddle et al., 2016). In healthcare, failures of nontechnical skills are often linked to mistakes and adverse outcomes. Thus, virtual-patient simulation is an important tool for training health professionals to provide safe and patient-centered care.

Betts and colleagues (2020) explored how simulation quality and experience may be affected by the use of a virtual interactive digital simulator (IDS) for undergraduate nursing students when compared with a control group who experienced face-to-face HPS. Body Interact™ software was utilized for the IDS group in this study (Betts et al., 2020). The IDS and HPS simulation experiences were both designed to align with Kolb's theory of experiential learning as well as the nursing process (Betts et al., 2020). Although higher student satisfaction and confidence levels were observed in the HPS cohort, the ability to complete high-priority actions was 14.4% higher for those in the IDS group (Betts et al., 2020). This study was completed with a convenience sample (n=22), which is a limitation. A secondary benefit of the Body Interact™ platform is that students are able to select activities by specific parameters such as length in minutes or content focus (Betts et al., 2020). Virtual simulation demonstrates versatility and can either be utilized in a face-to-face classroom setting as a group activity or may be assigned for independent, asynchronous completion (Betts et al., 2020; Verkuyl et al., 2020).

Virtual Simulation in COVID-19

The unique pressures and risks associated with the COVID-19 pandemic led to rapid innovation and research throughout healthcare and education.
For example, the University of West London switched to a virtual simulation model in response to pandemic-related public-health guidance, with the addition of an innovative three-stage approach to program implementation (Mitchell, 2020). In order to increase fidelity through the provision of context, the university developed films that followed a set of virtual patients through three years of their lives, allowing students to be privy to various health-related and social changes (Mitchell, 2020). The associated virtual simulations were centered around teaching objectives including holistic assessment, communication skills, recognition of clinical deterioration, and continual reassessment of care (Mitchell, 2020). The three stages of this simulation approach included investigation, application, and consolidation. A lead lecturer taught key concepts and provided contextual orientation in the investigation phase. Application of theoretical knowledge was then brought to simulation practice through the use of the contextual videos and filmed scenarios coupled with "live" interactive virtual care sessions (Mitchell, 2020). The consolidation phase offered students a space for small-group reflection and debriefing (Mitchell, 2020). This innovative model was very well received by the University of West London's students and faculty alike, as an effective tool for guiding development of clinical assessment and decision-making (Mitchell, 2020). This model arguably offers faculty increased command over course material when compared to traditional in-hospital clinical experiences, as they are able to design or select specific content (Mitchell, 2020). Limitations of this novel approach included a lack of patient interaction, failure to provide immediate performance feedback, and inability to offer kinesthetic feedback (Mitchell, 2020).

Debriefing Definitions and Significance

Debriefing has been described as a facilitated critical reflection and discussion of simulation events and performance that improves cognitive and clinical skills, perhaps resulting in dual learning for both facilitators and students (Doherty-Restrepo et al., 2018).

Shinnick and colleagues (2011) were some of the first to examine the simulation process in an attempt to learn which components or steps facilitated greater knowledge gains for a sample of 162 prelicensure nursing students learning about heart failure. Student knowledge outcomes were measured at three intervals, using a two-group, repeated measures, experimental study design (Shinnick et al., 2011). A pre-test was administered at baseline prior to a hands-on simulation experience (Shinnick et al., 2011). Post-test #1 was administered immediately after this hands-on experience, and the authors observed a statistically significant decrease in knowledge test scores between pre-test and post-test #1 [M = -.563, SD = 3.89, p < .001] (Shinnick et al., 2011). Next, the students experienced a debriefing of the hands-on simulation experience, with post-test #2 administered thereafter (Shinnick et al., 2011). Students' heart-failure knowledge scores were significantly better after the debriefing experience [M = +6.75, SD = 4.32, p < .001]. These findings suggest that a majority of knowledge gains actually occur after a debriefing has taken place, supporting the unique value and importance of this simulation component (Shinnick et al., 2011).
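Those reported values can be translated into standardized effect sizes. The short sketch below computes Cohen's d for paired data (d_z = mean difference / SD of differences), under the assumption, not stated explicitly in the summary above, that the reported SDs describe the paired score differences.

    # Rough standardized effect sizes (Cohen's d for paired data, d_z = M / SD)
    # computed from the means and SDs reported above. Assumes the reported SDs
    # describe the paired score differences.
    def cohens_dz(mean_diff: float, sd_diff: float) -> float:
        return mean_diff / sd_diff

    # Pre-test to post-test #1 (after hands-on simulation, before debriefing):
    print(f"simulation only: d_z = {cohens_dz(-0.563, 3.89):.2f}")  # ~ -0.14, small

    # Post-test #1 to post-test #2 (after debriefing):
    print(f"after debriefing: d_z = {cohens_dz(6.75, 4.32):.2f}")   # ~ 1.56, large

On that reading, the post-debriefing gain (d_z ≈ 1.56) is roughly ten times the magnitude of the post-simulation decline (d_z ≈ -0.14), consistent with the authors' conclusion that most knowledge gains follow debriefing.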
A recent systematic review evaluated the evidence base surrounding the effectiveness of debriefing in healthcare simulation (Levett-Jones & Lapkin, 2014). This review included studies with varied simulation debriefing types, methodologies, and outcomes of interest, including after-simulation, during-simulation, instructor-facilitated, and video-assisted debriefing approaches (Levett-Jones & Lapkin, 2014). Regardless of the type of debriefing utilized, student performance of technical and nontechnical skills improved from pre-test to post-test in the studies reviewed. Improvements were observed in areas such as psychomotor and physical skill performance, time and task management, teamwork, and situational awareness (Levett-Jones & Lapkin, 2014). Based on this review, Levett-Jones and Lapkin (2014) were able to recommend continued incorporation of debriefing as an essential element of future simulation experiences.

Debriefing Methods

Although the evidence suggests that debriefing is one of the most important aspects of simulation-based learning, the specific structural, organizational, and methodologic approaches to debriefing remain somewhat varied (Dusaj, 2014). In addition, the impacts of these variables are less well understood (Dusaj, 2014). Debriefing may take place in virtual or in-person environments. Other relevant variables, such as debriefing group size, format, and choice of underlying theoretical framework, are important considerations. Many authors have attempted to compare the effectiveness of various debriefing approaches. One recent study divided a group of 95 undergraduate nursing students into four study groups to evaluate facilitated-group debriefing, feedback, self-debriefing, or a sequential mix of all three methods over one semester following several simulation experiences (Gantt et al., 2018). The group who received facilitated-group debriefing alone experienced comparatively greater performance score improvements on face-to-face simulation experiences throughout the semester (Gantt et al., 2018). Largely, the facilitated-group debriefing method was preferred by students and faculty alike (Gantt et al., 2018).

Doherty-Restrepo and colleagues (2018) compared peer-facilitated and faculty-facilitated debriefing methods on student perception of activity effectiveness and clinical confidence levels for a group of graduate-level athletic training students. The authors observed a statistically significant increase in student-confidence scores with regard to constructing differential diagnoses (F = 4.26, p = .03) and ability to openly share thoughts and emotions without fear of being shamed or humiliated (F = 2.08, p = .05) for both study groups (Doherty-Restrepo et al., 2018). Peer and faculty effectiveness were evaluated using the Debriefing Assessment for Simulation in Healthcare (DASH) tool, without observation of a significant difference between the peer- and faculty-facilitated debriefing groups in any of the six debriefing-element categories (Doherty-Restrepo et al., 2018). These study findings were limited by a small convenience sample population (Doherty-Restrepo et al., 2018). Several other recently published investigations have attempted to evaluate the incorporation of video playback in simulation debriefing as a potential adjunct to more traditional verbal methods (Chronister & Brown, 2012; Levett-Jones & Lapkin, 2014; Zhang et al., 2020).
One crossover study compared the effects of verbal debriefing with video-assisted verbal debriefing methods on various student-learning outcomes (Chronister & Brown, 2012). The authors obtained mixed results, but observed that learning was better following verbal debriefing alone than following video-assisted verbal debriefing (Chronister & Brown, 2012). Interestingly, response times and quality of skill performance were both significantly better in the video-assisted verbal debriefing participant group (Chronister & Brown, 2012). However, this study was significantly limited by a nonrandomized convenience sample of 37 student participants, yielding reduced statistical power (Chronister & Brown, 2012).

A prospective controlled trial compared a three-phase video-assisted debriefing (VAD) method with traditional verbal debriefing (VD) methods to evaluate effects on student experiences, stress levels, and DASH-SV scores (Zhang et al., 2020). The study population consisted of 145 third-year nursing students in Singapore (Zhang et al., 2020). The authors observed improved student experience, without additional stress, for the students who experienced VAD when compared with the control VD group (p < .001) (Zhang et al., 2020). Another study utilized videotaped simulation-based team training to work on interdisciplinary collaboration among staff, nurses, and providers on a surgical inpatient unit (Severson et al., 2014). The authors observed in their qualitative investigation that the team simulation training illuminated essential themes such as leadership, communication, and role clarification for participants (Severson et al., 2014). Uncertainty remains with regard to how this training impacts bedside interactions at the point of care, and which styles or facets of the debriefing are most valuable (Severson et al., 2014). However, simulation-based training appears to offer a helpful training adjunct for prelicensure students as well as practicing healthcare providers.

Cognitive dispositions to respond, or cognitive biases, are also frequently implicated in clinical reasoning errors (Bond et al., 2006). Clinicians and educators have explored various approaches to the introduction and awareness of these potential diagnostic pitfalls, with a focus on error prevention and improved patient safety (Bond et al., 2006). One comparative study evaluated differences in the learning and perceptions of emergency medicine (EM) residents during technical or cognitive forms of debriefing following two simulation experiences. The objective of each simulation experience was to draw attention to vertical line failure, a cognitive disposition to respond that arises when routine or repetitive tasks cause inflexible and "inside the box" thinking patterns (Bond et al., 2006). The technical debriefing arm focused on the medical aspects of the simulations, while the cognitive debriefing arm experienced a discussion focused more on the concept of vertical line failure (Bond et al., 2006). The authors found that the technical debriefing was better received by the EM residents (Bond et al., 2006). The technical and cognitive debriefing groups each demonstrated increased awareness of the potential risks associated with cognitive dispositions to respond (Bond et al., 2006). The authors concluded that simulation with subsequent debriefing offers an effective way to increase student familiarity with these topics (Bond et al., 2006).
A few prominent debriefing methodologies highlight the importance of thoughtful, reflective debriefing practice with observation of the underlying assumptions and thought processes that drive student actions in simulation-based learning (Dreifuerst, 2012; Rudolph et al., 2006). One of these approaches, called "debriefing with good judgment," involves reflective practice in which actions are explored in an effort to uncover the cognitive frames and emotional responses that influenced the decision-making (Rudolph et al., 2006). When using "debriefing with good judgment," instructors must carefully examine their own cognitive frames in order to contextualize and present their view of the situation for discussion (Rudolph et al., 2006). This is not to presume that the instructor viewpoint is inherently correct. Rather, this viewpoint is offered as a contribution or comparison within a robust discussion of knowledge, feelings, and assumptions as they relate to clinical decision-making (Rudolph et al., 2006). Another practice, called "debriefing for meaningful learning," utilizes Socratic questioning to facilitate reflection and assign meaning to the clinical simulation experience (Dreifuerst, 2012).

Virtual Simulation Debriefing

The best approaches to online debriefing of virtual simulation experiences have become a recent topic of interest due to pandemic-mediated shifts toward distance learning (Mitchell, 2020; Verkuyl et al., 2020). Self-debriefing has been considered and studied in this context due to the asynchronous and self-paced nature of virtual simulation (Lapum et al., 2019). Lapum and colleagues (2019) defined self-debriefing as an "individual, written activity in which a series of questions (designed based on a theoretical debriefing framework) facilitate learners' reflection on a simulation" (p. E6). Self-debriefing followed by group debriefing has demonstrated a potential to improve learning, promote open-mindedness, and challenge personal or experiential bias, while introducing alternative approaches and perspectives for participant discussion and consideration (Verkuyl et al., 2020).

Verkuyl et al. (2020) explored three different methods for debriefing virtual experiences specifically: a self-debrief alone, a self-debrief followed by a small-group debrief (≤12 students), and a self-debrief followed by a large-group debrief (≤30 students). A convenience sample of 19 first-year nursing students participated in this focus-group study (Verkuyl et al., 2020). A structured set of open-ended questions was assigned for individual completion immediately following each simulated experience (Verkuyl et al., 2020). Upon simulation completion, the virtual simulation program provided each participant with an evaluation summary report (Verkuyl et al., 2020). Students were encouraged to utilize this report as a reference while completing the self-debrief afterward (Verkuyl et al., 2020). Self-debriefing offered students an opportunity for immediate self-assessment while the content and experience remained fresh, which is consistent with best-practice guidelines (INACSL, 2016; Verkuyl et al., 2020). Study outcomes suggest that an immediate self-debrief may be advantageous following virtual simulation, with group debriefing sessions providing additional value through reflection optimization (Verkuyl et al., 2020).
Four distinct themes emerged from the study focus groups: developing a process, promoting safe debriefing spaces, fortifying knowledge, and engaging in a safe learning space (Verkuyl et al., 2020). The development of specific self-debrief processes, related to aspects such as timing, guided reflection, and incentivization for completion, was another highlight within the focus-group discussions (Verkuyl et al., 2020). High value was placed on the development and maintenance of trusting relationships among the facilitators and peer participants (Verkuyl et al., 2020). Humans yearn for affirmation and for opportunities to share experiences, both of which are achievable within a safe learning and debriefing space, whether in the face-to-face or virtual learning environment (Verkuyl et al., 2020). Engaging in reflection was another common thread, beginning with self-reflection and extending to incorporate the experiences and perspectives of others in the group (Verkuyl et al., 2020). Reflective group discussion allows students to learn from successes and challenges experienced by others, leading to a more holistic view and understanding of each virtual patient case (Verkuyl et al., 2020). Another group discussion point accentuated the fortification of knowledge that can be achieved through simulation debriefing, specifically through identification of knowledge gaps, deepened understanding, and improved knowledge application (Verkuyl et al., 2020).

Montana State University (MSU)

The pandemic experience of 2020 at Montana State University (MSU) was likely similar to that of other institutions. Clinical nursing education requirements are increasingly met through a mix of face-to-face clinical and simulation experiences, and the literature review supports this as an equally effective option if done in coordination with the INACSL guidelines (Hayden et al., 2014). Many important considerations and questions arose during that time as institutions scrambled to ensure innovative and adaptive compensatory strategies to prevent interruption of nursing students' clinical training progress. These included questions and considerations surrounding the ability of institutions to provide continued high-quality clinical and simulation experiences, with subsequent debriefing, in the context of social distancing and a limited PPE supply.

Prior to the onset of the COVID-19 pandemic, the Montana State University College of Nursing (MSU CON) was using the DocuCare platform for charting patient information during traditional face-to-face simulation and debriefing experiences. These simulation experiences were completed in addition to the clinical experiences. The MSU CON has adopted the INACSL guidelines for simulations that fall under the moderate- and high-fidelity categories as outlined in Policy C-2, which describes specific procedures for the use of simulation to enhance learning in the MSU CON (Montana State University College of Nursing, 2018). The MSU nursing students transitioned from in-person clinical and simulation-based experiences to an entirely online model. The online model included a synchronous introduction and pre-briefing on WebEx, independent completion of a vSim virtual patient case, a charting experience using DocuCare, and a subsequent synchronous debriefing on WebEx (please see the glossary for definitions). Pivoting from one simulation debriefing method to another under adverse circumstances is less studied than other aspects of debriefing.
This project aimed to assess how nursing students' simulation debriefing experiences may have been impacted by the shift from traditional face-to-face simulation debriefing methods to virtual methods, which method students preferred, and students' perceptions of the positive and negative aspects of each.

Project Objective

This project aimed to evaluate whether student experiences with virtual debriefing following virtual simulation experiences would be superior to, the same as, or less desirable than traditional in-person debriefing methods. The project lead hypothesized that the students would have higher DASH-SV scores for, and a preference for, face-to-face debriefing methods.

CHAPTER THREE

METHODS

Sample Demographics and Setting

MSU is a land-grant institution located in Bozeman, Montana, and was established in 1893. Montana residents comprise just over 60% of the MSU student body, with total enrollment exceeding 16,000 (Montana State University [MSU], 2021). In 2004, the MSU College of Nursing (CON) comprised 91% female and 9% male students, with a total enrollment of 804 nursing majors (MSU, 2021). In 2019, enrollment in the MSU CON was much higher at 1,149, with little change observed in the gender demographics (86% female, 13% male, and about 1% other) (MSU, 2021). The racial and ethnic demographic within the MSU CON in 2019 was about 84% white, 5% two or more races, 4.9% Hispanic or Latino, and 4.8% American Indian or Alaska Native (MSU, 2021). Those who identified as Asian or were listed as unknown ethnicity made up just over 1% each, while African Americans, non-US citizens, and Pacific Islanders represented 0.5% per group (MSU, 2021). The typical number of students accepted into the upper division with clinical rotations in Bozeman is 16 annually.

Seventeen traditional Bachelor of Science in Nursing (BSN) program students at Montana State University (MSU) on the Bozeman campus participated voluntarily in a sequence of two anonymous surveys for this project. The surveys were completed by one section of MSU third-year nursing students who were enrolled in the Medical-Surgical Nursing course, NRSG 352, in the spring semester of 2020. There were two male and 15 female students in this study population. Other demographic data were not collected due to the importance of safeguarding student anonymity. However, some variation in race, age, and socioeconomic status was observed in the sample population. Maintenance of participant anonymity was prioritized because the project lead was also serving in the clinical instructor role for this particular section of NRSG 352.

Study Design

This project was completed using a one-group repeated-measures experimental design, meaning that the same group of individuals was surveyed before and after the COVID-19-related change from face-to-face (F2F) to virtual simulation debriefing. The 17 students were divided into two groups at the beginning of the semester. Each group initially attended clinical training every other week on Thursday and Friday, caring for patients in the local hospital. The plan to shift mid-semester to virtual simulation as an alternative to in-person clinical experiences was introduced to the student group in March 2020, immediately following the return to campus from spring break. A subsequent opportunity for students and faculty to ask questions and express concerns was provided. Upon shifting to an online model, faculty agreed that the originally established groups should be maintained within the shift to vSim experiences and debriefing.
Faculty generated a schedule for the virtual simulation experiences. Attendance and participation in a synchronous pre-briefing, completion of an assigned vSim for Nursing (vSim) experience, and a subsequent synchronous debriefing via WebEx were required for students to obtain credit for the clinical hours necessary for course completion. The students were asked to view online vSim orientation videos prior to completing their first simulation online. On the morning of the first course vSim experience, time was set aside for further orientation and pre-briefing, with clarification of remaining student questions.

vSim® for Nursing (vSim)

MSU selected the vSim for Nursing (vSim) platform for use in place of traditional clinical training and simulation methods during the COVID-19 pandemic. An interdisciplinary partnership between the NLN, Wolters Kluwer Health, and Laerdal Medical resulted in the creation of vSim for Nursing. The vSim for Nursing platform pairs interactive, online, avatar-based clinical nursing experiences with integrated learning resources congruent with essential nursing curriculum content (Kivett et al., 2014). The NLN authored the patient scenarios during vSim platform development, and endorses its use for clinical training in this context (Kivett et al., 2014; The National League for Nursing, 2021). The vSim for Nursing experience is accessed online, meaning it can be utilized anytime from any location (NLN, 2021).

Experiences in vSim start with a pre-simulation quiz and links to recommended preparatory reading resources. Next, the student interacts with the virtual patient (NLN, 2021). To increase fidelity, students are able to ask the patient questions and elicit responses, conversing with the avatar to conduct an assessment and provide primary nursing care. Students are expected to review the available information in the chart and prioritize actions to provide appropriate patient care within the construct of the vSim scenario (NLN, 2021). Meanwhile, the vSim software continually tracks and evaluates student actions to construct an individualized performance feedback log, complete with decision-making rationale and suggestions for improvement (Kivett et al., 2014). The feedback log is presented for student review and reflection upon simulation completion. After the simulation is complete, a post-simulation quiz, documentation activities, and guided reflection questions are completed by the student (Kivett et al., 2014). For the students who participated in this project, a synchronous WebEx group debriefing meeting, facilitated by the project lead and one other clinical instructor, was held immediately following simulation completion. Virtual simulation sessions and subsequent virtual debriefing sessions were completed with half of the student group at a time. One group had eight students, while the other had nine. The groups were intentionally small in order to promote active discussion participation between students and facilitators during the virtual debriefing WebEx meetings.

Objectives

This project was completed utilizing the COVID-19-catalyzed shift to virtual simulation debriefing methods, comparing student perceptions of virtual simulation debriefing effectiveness with perceptions of traditional in-person clinical debriefing methods. The goal was to evaluate whether student DASH-SV ratings were significantly different between methods.
Specifically, this project compared student perceptions of debriefing in an online WebEx conference following a vSim for Nursing virtual patient experience (post-COVID) with the face-to-face simulation debriefing models experienced during the first half of the semester (pre-COVID). Thus, the pre-test is an assessment of the traditional methods, while the post-test represents an assessment of the virtual methods brought about by the pandemic.

Outcome Measures: Debriefing Assessment for Simulation in Healthcare (DASH)

The DASH tool was developed in 2010 in an effort to better assess the quality of simulation-based clinical learning experiences from three different angles (Brett-Fleegler et al., 2012). A student version of the DASH allows students to rate their instructor on the quality of a simulation debriefing experience. Another version is designed for instructor self-assessment, while a third version is specifically for trained faculty DASH Raters (Sawyer et al., 2016). Interestingly, the DASH tool has been translated into several different languages for international use (Simon et al., 2010).

The DASH involves completion of a six-item assessment of essential simulation debriefing elements, with each element rated from 1 to 7 on a behaviorally anchored rating scale (Brett-Fleegler et al., 2012). A score of 1 translates to "very ineffective," while a 7 indicates "extremely effective" on the rating scale (Brett-Fleegler et al., 2012). Internal consistency was evaluated with a Cronbach's alpha of 0.89, indicating good reliability (Brett-Fleegler et al., 2012). In addition, the intraclass correlation coefficient for the six elements combined was 0.74, which provides further evidence for the reliability of the DASH tool (Brett-Fleegler et al., 2012). The DASH-SV was incorporated into the surveys utilized for project data collection. The DASH-SV also appears to have good reliability, as its Cronbach's alpha coefficient was determined to be 0.82 in another recent simulation study (Dreifuerst, 2012). In addition, inter-rater reliability between the student and trained DASH Rater versions has been demonstrated to be high in some circumstances, and the scales are directly comparable (Brett-Fleegler et al., 2012). The DASH offers a reliable and conceptually valid approach for an evaluation of simulation debriefing, specifically (Rudolph et al., 2016).
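Because the DASH-SV is a six-item scale, the internal-consistency figures cited above (Cronbach's alpha values of 0.89 and 0.82) can in principle be recomputed on any exported matrix of element ratings. The following is a minimal, hypothetical sketch, not part of the project's actual workflow: the response matrix is fabricated for illustration, and Python with NumPy is assumed purely as a demonstration vehicle.

```python
# Hedged sketch: Cronbach's alpha for a six-item scale such as the DASH-SV.
# The "demo" matrix below is an illustrative placeholder, not project data.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: (n_respondents, n_items) matrix of 1-7 element ratings."""
    k = scores.shape[1]
    item_variance_sum = scores.var(axis=0, ddof=1).sum()
    total_score_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variance_sum / total_score_variance)

rng = np.random.default_rng(0)
student_tendency = rng.integers(5, 8, size=(17, 1))  # per-student rating tendency
item_noise = rng.integers(-1, 2, size=(17, 6))       # small item-level variation
demo = np.clip(student_tendency + item_noise, 1, 7).astype(float)

print(f"Cronbach's alpha = {cronbach_alpha(demo):.2f}")
```

Applied to an actual 17 x 6 export of element ratings, the same calculation would indicate how consistently the six elements measured a single underlying debriefing-quality construct in this cohort.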
Institutional Review Board (IRB) Exemption

An application for IRB exemption was submitted on March 28, 2020, and exemption was granted by Mark Quinn, who serves as the chairman of the Institutional Review Board for the Protection of Human Subjects at MSU. A copy of the IRB exemption letter can be found in Appendix A.

Ethical Issues

The project lead was also the instructor for the cohort of 17 student participants, which presented some ethical considerations for this project. Efforts were made to ensure complete anonymity in survey responses, and this was clearly communicated to all participants. However, there remains potential for bias in survey responses related to this instructor-student relationship. Each student was awarded 5 extra-credit points only if the entire class submitted a survey response. Demographic data were not collected in an effort to maximize anonymity, so the only demographic data reported are the percentages of male and female student survey participants.

Data Collection with Qualtrics Survey

Qualtrics was utilized as an online platform for survey data collection, storage, and protection of participant anonymity. The survey was distributed to students via an email that included instructions with a Qualtrics survey link. Copies of the pre-COVID (F2F) and post-COVID (virtual) DASH-SV surveys are included in Appendices C and D. The post-COVID (virtual) DASH surveys included five additional questions beyond the original DASH-SV elements. The first additional item asked the student to designate which of the two methods they preferred. In an effort to facilitate contextualization of the quantitative data, the final four questions provided an opportunity for students to list what they found beneficial and challenging about each of the methods. Prospective student participants were informed at the start that participation in this survey was completely voluntary and that all responses would remain anonymous. They were also informed that the data collected from this survey might be utilized for process improvement or publication. They were offered an incentive bonus of 5 points of credit if the entire class responded to the survey.

Statistical Analysis

The quantitative data were imported and analyzed using Statistical Package for the Social Sciences (SPSS) software. Data analysis in SPSS included descriptive statistics and a two-tailed, repeated-measures Student's t-test. The Student's t-test was utilized to compare and statistically quantify the differences in DASH-SV scores for the F2F and virtual debriefing modalities. The t-test facilitates a determination of whether two samples are likely to have different population means, and the statistical significance of that difference (Gravetter et al., 2018). The t-test is performed in an effort to either reject or fail to reject the null hypothesis, which states that DASH-SV scores will be no different for virtual simulation debriefing compared to traditional face-to-face simulation debriefing (Gravetter et al., 2018). The alternative hypothesis for this project is that there is a statistically significant difference in DASH-SV scores between the two groups, in either a positive or negative direction (Gravetter et al., 2018). Statistical significance at the 95% confidence level is indicated by a p value < .05 for the t-test results. The results of this data analysis are reported and represented graphically in Chapter 4.

CHAPTER FOUR

RESULTS

Data Quality

The project lead was able to maintain data quality through voluntary participation and maintenance of student anonymity. The DASH-SV was utilized to assess the same 17 students in the F2F and virtual surveys, with a few additional qualitative questions added to the latter.

Missing Data

One data point was missing on the F2F survey, with no response recorded for Element #1, which evaluated student perception of the introduction phase of debriefing. The DASH-SV recommends skipping Element #1 entirely if the student did not participate in the introduction portion, which could explain this missing data point. The missing data point was managed through the use of casewise deletion during SPSS analysis. The missing data point was omitted, and an n of 16 was used instead of 17 for the descriptive statistics and the paired, two-tailed t-test analysis of Element #1.
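For readers without SPSS access, the paired analysis just described (casewise deletion of the missing Element #1 response followed by a two-tailed, paired-samples t-test with a 95% confidence interval) can be illustrated in a few lines. This is a sketch under stated assumptions: the score arrays are placeholders rather than the project's data, and SciPy's `ttest_rel` stands in for the SPSS paired-samples procedure.

```python
# Hedged sketch: paired, two-tailed t-test on DASH-SV element scores with
# casewise deletion. Score arrays are illustrative placeholders, not project data.
import numpy as np
from scipy import stats

f2f     = np.array([7, 6, 6, 7, 5, 7, 6, 6, 7, 6, 7, 5, 6, 7, 6, 7, np.nan])
virtual = np.array([6, 6, 5, 7, 5, 6, 6, 5, 7, 6, 6, 5, 6, 6, 6, 7, 6.0])

# Casewise deletion: drop any pair in which either response is missing.
mask = ~np.isnan(f2f) & ~np.isnan(virtual)
f2f, virtual = f2f[mask], virtual[mask]

t_stat, p_value = stats.ttest_rel(f2f, virtual)  # two-tailed by default

# 95% confidence interval for the mean paired difference.
diff = f2f - virtual
ci_low, ci_high = stats.t.interval(0.95, df=len(diff) - 1,
                                   loc=diff.mean(), scale=stats.sem(diff))

print(f"n={len(diff)}, t={t_stat:.3f}, p={p_value:.3f}, "
      f"95% CI=({ci_low:.3f}, {ci_high:.3f})")
```

Run per element over the cohort's 17 response pairs (16 for Element 1 after deletion), this procedure would reproduce the n, t, p, and confidence-interval columns reported in Table 1.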
Quantitative Results

The number of responses to the DASH-SV obtained from the student cohort following face-to-face and virtual instruction is summarized with the full survey text for each element in Table 1. The distribution of response scores is compared graphically for the face-to-face and virtual methods in Figure 1. Statistics describing the face-to-face and virtual debriefing DASH score distributions are provided in Table 1. Responses to all elements were notably concentrated toward higher scores, with most ratings at 6 or higher on the scale of 1 to 7. Minor variation in minimum and mean values was observed between the two types of debriefing. Very little difference was observed for the instructor's ability to set the stage (Element 1), maintain an engaging context (Element 2), provide an organized debriefing structure (Element 3), and provide suggestions for improvement (Element 6). Somewhat larger differences were observed in scores for in-depth reflection (Element 4) and feedback on successes and failures (Element 5), with scores shifting downward from the face-to-face to the virtual responses. For Element 4, this resulted in only a small change in the mean, but for Element 5, the mean dropped from 6.47 to 5.88 with a greater associated standard deviation.

Cumulative scores were also strongly concentrated toward the high end of the scale for both debriefing styles, but the face-to-face responses were generally higher and had a smaller range than the virtual responses, which ranged more broadly and included a lower minimum value. This caused the distributions of the two sets of cumulative scores to have somewhat different means, as shown in Figure 2 and Table 1. There was a higher standard deviation about the mean in the composite virtual data than was observed for the face-to-face responses (Table 1).

Table 1. Descriptive Statistics and Paired T-test Results for Comparison of Face-to-Face (F2F) and Virtual (V) DASH-SV Scores

Outcome     F2F M   F2F SD   V M     V SD    n    95% CI for Mean Difference   p      t        df
Element 1   6.29    0.772    6.25    0.856   16   (-0.469, 0.594)              .806   0.251    15
Element 2   6.71    0.588    6.65    0.606   17   (-0.326, 0.443)              .750   0.323    16
Element 3   6.29    0.772    6.53    0.624   17   (-0.700, 0.229)              .299   -1.074   16
Element 4   6.41    0.795    6.47    0.624   17   (-0.484, 0.336)              .773   -0.293   16
Element 5   6.47    0.795    5.88    0.993   17   (0.141, 1.036)               .013*  2.787    16
Element 6   6.71    0.470    6.41    0.712   17   (-0.142, 0.731)              .172   1.429    16
Composite   38.88   2.913    37.82   3.678   17   (-0.632, 2.749)              .203   1.328    16

Note. F2F scores were collected in the pre-COVID (pre-test) survey; virtual (V) scores were collected in the post-COVID (post-test) survey. * p < .05

Figure 1. Histograms of Student DASH-SV Scores for Each Survey Element

Figure 2. Histogram of Cumulative Student DASH-SV Scores for Virtual and Face-to-Face Simulation Debriefing

Results of a Student's t-test used to compare paired responses between face-to-face and virtual debriefing methods are summarized graphically in Figure 3 and tabulated in Table 1. In Figure 3, the individual responses are plotted as dots, the mean is represented by the height of the column, and the deviation about the mean is represented by the bars shown for each element. The strong similarity in responses for the instructor's ability to set the stage (Element 1, p = .806) and maintain an engaging context (Element 2, p = .750) is reflected in the t-test statistics and reported significance levels, which indicate no significant difference between the debriefing methods. Despite the observed shifts in individual scores for in-depth reflection (Element 4, p = .773), the minimal change in the mean for these pairs yielded no significant difference in the t-test.
Interestingly, although there was little apparent difference in the responses for the instructor's capacity to provide an organized debriefing structure (Element 3, p = .299) or provide suggestions for improvement (Element 6, p = .172), Figure 3 and Table 1 suggest that there was a slightly higher mean score in the virtual data for debriefing structure organization and a slightly lower mean score in the virtual data for the instructor's capacity to provide suggestions for improvement, though neither difference was statistically significant. What was statistically significant was feedback on successes and failures (Element 5), with p = .013. Students found that the instructor's capacity to give meaningful feedback on successes and failures was significantly better in face-to-face debriefing.

The paired comparison of composite scores is described in Table 1 and Figure 4. Despite the increased variance and range of responses for the virtual composite scores, and some difference between the means of the composite scores, the t-test results shown in Figure 4 illustrate little overall difference between the face-to-face and virtual debriefing methods, at a significance of p = .203.

Although qualitative responses were not solicited at the end of the face-to-face training, five questions were posed to students at the end of virtual simulation debriefing. Student responses indicated a universal preference for face-to-face training, which differs substantially from the statistical results described above. This may reflect a student preference for, or familiarity with, in-person learning, potentially driven by metrics other than debriefing.

Figure 3. Paired T-test Results for DASH-SV Virtual and Face-to-Face Simulation Debriefing for Each Survey Element

Figure 4. Paired T-test Results for Cumulative Student DASH-SV Scores for Virtual and Face-to-Face Simulation Debriefing

Qualitative Results

Students were asked to describe the benefits and challenges associated with each of the debriefing methods. The project lead placed individual responses into groups to evaluate for predominant themes within the qualitative response data.

Face-to-Face Benefits

Uniformly, the student participants preferred F2F debriefing (n=17). A majority cited the ease of communicating in person (n=10) and in a familiar setting (n=6). The ability to share real experiences with one another and identify common issues was also cited (n=5). One student summarized this as being a "way better experience. Hands on learning." Another student explained that they learned more about different patient experiences from their peers during in-person debriefing.

Face-to-Face Challenges

A majority of students found nothing negative to say about face-to-face debriefing experiences (n=7). Two cited fatigue or distraction. Two cited the inconvenience of having to show up in person. Two raised HIPAA concerns about sharing too much detail in F2F debriefings. One student further noted the "awkward pauses" when instructors waited for responses. One participant found it more challenging to "share more difficult information in the group setting."

Virtual Benefits

Aspects of virtual debriefing that students found to be beneficial were more varied. The ability to share similar experiences from the virtual simulations was cited by five students. Four students liked the fact that the virtual debriefing focused in depth on a more limited range of topics.
Four cited the vSim technology as beneficial, specifically with regard to the guided questions, outlined plan, and ability to go back and review content. Three students cited the convenience of the virtual teaching method, and two were more comfortable with the analysis of their mistakes during debriefing. One student commented that "it was good to still talk about the sim even though it was virtual. I think I got more out of the experience."

Virtual Challenges

A majority of students described challenges with communication (n=7) and the impersonal nature of the method (n=5) in the virtual setting. Some cited problems with the vSim or WebEx technology itself (n=4). Two expressed confusion with the program expectations or process. The student descriptions of virtual debriefing challenges were more varied than those reported for face-to-face debriefing. One commented that "because we all had the same patients on sim there is only so much to talk about." Another mentioned that "it just does not feel relatable or real." One student described an inability to "have control over what exactly I wanted to say to the patient. It was also a little more difficult to create a care plan because we were only with the patient for a short period of time and did not know much about them on a personal level." Finally, one student asserted that virtual debriefing was "worse than in-person in almost every way," while another stated that it was "nothing like face-to-face."

CHAPTER FIVE

DISCUSSION

Statistical comparison of DASH score results for F2F and virtual simulation debriefing showed little difference between methods except with regard to instructor feedback in debriefing (Element 5). Lower student DASH scores for Element 5 indicated that virtual simulation debriefing was less effective for delivery of instructor feedback on student successes and failures (p = .013). Interestingly, student responses to questions following the virtual simulation indicated a universal preference for in-person debriefing with clinical instructors. This may seem inconsistent with the quantitative survey, but the student preference for face-to-face training was not directly addressed in the debriefing questions included in the DASH-SV survey. These quantitative results were surprising in that the project lead anticipated a strong preference for in-person simulation debriefing methods overall.

The disruption of in-person training caused by COVID-19 provided the opportunity for this pilot project, which is based on a small convenience sample of students whose training pivoted suddenly mid-semester. IRB exemption was granted for the scope of this project. Out of concern for potential bias related to student efforts to please the instructor/project lead, responses were entirely anonymous and demographic data were not collected. These results support literature suggesting that virtual simulation and debriefing provide effective nurse training in support of patient safety (Mitchell, 2020; Verkuyl et al., 2020). Due to the small size of the convenience sample, it was not possible to determine effect size for statistical analysis. The short study duration prevented analysis of student performance in subsequent clinical training or practice settings. For these reasons, these results should not be used to generalize about the success of these methods in other settings.
However, they may be translatable to other sections of clinical nursing education at Montana State University where similar programmatic elements and demographics exist. This may help to inform decisions about future training in light of the lower overhead cost and greater convenience of this training method, in which clinical partnerships and simulation lab space are not required.

The qualitative student responses may provide context for the statistical results. Students universally preferred face-to-face debriefing, but were able to identify valuable aspects of the virtual debriefing. Students generally agreed that communication in person was more familiar and allowed them to share real simulation experiences. Some students enjoyed the narrower focus of topics and similar experiences of the virtual learning environment and found that the vSim technology and WebEx were more convenient. They also appreciated a less personal environment for analysis of mistakes. Many found the communication challenges and the impersonal environment of virtual debriefing to be challenging. Some expressed confusion or challenges with the vSim technology or WebEx. In contrast, many students had no problems with face-to-face debriefing. A few students expressed concerns about HIPAA disclosure, fatigue or distraction by the end of experiences, and the inconvenience of attending in person. In general, a broader range of responses was observed for virtual than for face-to-face debriefing, which is reflected both statistically and in the qualitative comments received. Numerically, the only statistically significant difference involved instructor feedback. Interestingly, only a few students mentioned this aspect at all in their qualitative responses.

Isolation during the COVID-19 pandemic has a tendency to amplify the joys of socialization, which could have impacted the DASH-SV ratings and qualitative feedback obtained in this project. Personality type or learning style may impact a student's ability to connect with patients and colleagues in either virtual or face-to-face patient-care interactions. Debriefing online may be better perceived and accepted by students who initially signed up to be in an online program compared to students who signed up for a face-to-face course with in-person clinical experiences (Gordon, 2017). Another possibility is that the ratings and comments expressed were not exclusively related to the debriefing experience.

Lower ratings for Element 5 (instructor feedback) suggest that students felt they received less concrete feedback and assistance from the clinical instructors in the exploration of strengths and weaknesses at key moments within the simulation experience. This may represent a genuine limitation of the DASH-SV tool for assessment of virtual experiences. The DASH-SV does not account specifically for variability in student learning styles or debriefing modalities. vSim for Nursing provides instant feedback for students without the potential for compromised patient or nursing student safety, which can be associated with face-to-face clinical experiences. Timely feedback provision is essential to help solidify knowledge gains and promote long-term memory storage. The software also delineates areas for improvement in future performance.
There is arguably a higher probability that incomplete or missed opportunities for instructor feedback could occur in a face-to-face simulation debriefing, secondary to the potential for human error, while the vSim platform keeps a log of actions and rationale for decision-making with accurate prioritization within the case study.

Students may have somewhat less accountability in the virtual simulation environment, but there are ways to increase this. For example, using a synchronous WebEx meeting platform requires that students arrive at a specified time, having prepared for the simulation. Participation can be easily monitored this way, and instructors are then available to help address any questions or challenges that may arise during a virtual simulation experience. The author observed that this was particularly helpful for student guidance during the early transitional period from F2F to virtual methods.

Clinical instructors are not able to be present in the virtual simulation environment to observe performance and provide feedback or suggestions related to individual student actions during the virtual visit. However, vSim technology is actually well suited for the provision of consistent and accurate student performance evaluation (NLN, 2021). vSim presents post-simulation student feedback as a checklist-formatted log. Perhaps the vSim software could provide the student with a video playback of their performance for review of feedback as a part of the debriefing process. Incorporation of virtual simulation and video-assisted debriefing has demonstrated benefit in some contexts (Wilbanks et al., 2020; Zhang et al., 2020).

Institutional history and culture have shaped common practice in the lower and upper divisions within nursing programs. A potential benefit of shifts toward virtual simulation and virtual education platforms might be the ability to deliver didactic and clinical application content through a more integrated or parallel approach (Carolan et al., 2020).

Disparities between students became evident through the virtual platform during virtual simulation and debriefing sessions. Students may encounter challenges associated with WebEx, poor internet connection, computer access, rural communities, family commitments and childcare, quiet space, or pets. However, the shared virtual experience can also be viewed in a positive light, creating a sense of virtual community and camaraderie and possibly combating the isolation associated with COVID-19.

Students can learn effectively from watching each other perform F2F simulation or from interacting to complete a simulation as a team. Experiencing a variety of in-person clinical settings offers valuable diversity in experience and practice with interpersonal, inter-professional, technical, and nontechnical skills (Peddle et al., 2016). An important question will be whether vSim is capable of producing similar results. One concern is that students might miss out on these opportunities for learning from the instincts and actions of others. However, following virtual simulations with an evidence-based virtual debriefing may help students to learn from the experiences of their peers from a distance. One clear benefit of vSim is that it provides a safe environment for experimentation with choice of care interventions and the prioritization of these decisions in a low-stakes situation, with minimal consequence compared to traditional clinical experiences in the hospital.
Virtual simulation platforms that are grounded in an evidence-based theoretical basis, such as Kolb's experiential learning theory and the NLN Jeffries Simulation Theory, are preferred (INACSL, 2016; Jeffries, 2016; Lisko & O'Dell, 2010; Mitchell, 2020; The National League for Nursing, 2021). Although the existing literature on this topic remains quite limited, these findings are in alignment with other evidence supporting the use of vSim for Nursing as an effective supplementary teaching strategy (Gu et al., 2017). This provides further support for the importance of immediate debriefing after simulation experiences and the value that can be added through guided self-reflection appended to the group debriefing experience (Lapum et al., 2019; Verkuyl et al., 2020). Betts and colleagues (2020) observed that, although human patient simulation (an in-person method) was associated with higher satisfaction and confidence, completion of high-priority actions was about 15% higher for those who experienced simulation with a virtual interactive simulator called Body Interact™.

Ongoing attention to the risk for disparities related to technology-facilitated virtual learning is essential, and systems-level efforts to mitigate these risks should be employed (Carolan et al., 2020). The NLN vision statement identified distance simulation and debriefing delivery and efficacy as a relevant future research question (Jeffries, 2016). This project offers preliminary information on this topic and may help to inform future research directions. Clinical teaching and learning are dynamic and continuously evolving, and nursing faculty must remain adaptable to ensure consistent delivery of high-quality and evidence-based clinical education to students (Aebersold, 2016).

Limitations

This project compared vSim coupled with a virtual debriefing to debriefing with traditional methods, and solicited qualitative data surrounding the benefits and challenges associated with each method. As a result, the results discussed here do not offer information about faculty experiences, subsequent student clinical performance, or professional practice trajectory. This project is statistically underpowered, but it may provide a foundation and possible directions for future work on this topic. Participant bias is a possibility within this project, as the students may have provided responses that they considered to be consistent with instructor expectations in order to please the instructor. Due to the small convenience sample population evaluated in this study, there is also a risk of sample bias. Effect size was not determined, and thus the t-test results may not be representative of, or generalizable to, the general undergraduate nursing student population. Without calculation of effect size, evaluation of the absolute magnitude of the treatment effect is not possible (Gravetter et al., 2018). Although the paired data were sufficient to apply a Student's t-test to evaluate differences between the debriefing-method datasets, the size of the datasets (n = 16, n = 17) may limit the ability to resolve statistical significance. For example, a more comprehensive study across several class sections might allow use of more powerful statistical analytical methods, such as ANOVA (Gravetter et al., 2018).
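One way to address the effect-size gap noted above, at least retrospectively, is the standard paired-samples conversion d_z = t / sqrt(n), which requires only values already reported in Table 1. The sketch below simply applies this textbook formula to the Element 5 statistics; it is an illustration, not part of the project's analysis.

```python
# Hedged sketch: paired-samples effect size (Cohen's d_z) recovered from a
# reported t statistic via d_z = t / sqrt(n). Inputs are taken from Table 1.
import math

def cohens_dz(t_stat: float, n_pairs: int) -> float:
    """Effect size for a paired-samples t-test: d_z = t / sqrt(n)."""
    return t_stat / math.sqrt(n_pairs)

d_z = cohens_dz(t_stat=2.787, n_pairs=17)  # Element 5: t = 2.787, 17 pairs
print(f"Element 5 effect size d_z = {d_z:.2f}")  # prints approximately 0.68
```

Under this convention, the Element 5 difference corresponds to d_z of roughly 0.68, a figure that a larger follow-up study could use when estimating required sample sizes.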
CHAPTER SIX

CONCLUSION

This project evaluated the experiences of third-year nursing students at MSU before and after in-person clinical, simulation, and debriefing experiences were forced to stop secondary to the public-health restrictions associated with the COVID-19 pandemic. A cohort of 17 students who experienced pre-COVID simulation debriefing methods was surveyed initially about those experiences. A second survey was distributed to evaluate nursing education methods with vSim for Nursing and subsequent virtual debriefing over WebEx.

Although virtual simulation and debriefing appear to be appropriate strategies for clinical nursing education in the context of a global pandemic, these methodologies are certainly not without limitations. Virtual simulation debriefing, while not as familiar or easy in terms of communication, appears to be an effective alternative to F2F debriefing. This is particularly true in the context of this year's COVID-19 pandemic, when F2F clinical experiences and simulations became impossible for a period of time. Interestingly, the students uniformly expressed a preference for traditional F2F debriefing methods over virtual simulation debriefing. This could represent an appreciation for the familiar, a limitation of the vSim software, or genuine limitations of the virtual space and method in comparison. The only statistically significant difference was demonstrated in the Element 5 t-test results; the statistical rankings did not suggest any other significant differences between F2F and virtual debriefing (see Table 1). The qualitative responses tell a different story. Student qualitative responses indicated some challenges relating primarily to communication, but increased convenience and shared experience with focused topics were noted to be positive aspects of students' experiences with virtual methods. A difference may exist between what is preferred by students and the processes that are most effective in achieving desired nursing-education outcomes, particularly under the external constraints associated with the COVID-19 pandemic. Remaining questions, such as how this may impact employers, patient safety, student training, competence, and future employment opportunities, require further investigation.

Continued use of the vSim for Nursing platform may produce additional benefits for undergraduate nursing education in the post-pandemic era, when F2F clinical and simulation experiences are possible. Perhaps virtual simulations could be assigned in conjunction with traditional methods, either as a pre-clinical preparation exercise or as an adjunctive learning experience. For example, a student might complete a vSim experience with a virtual patient who is being treated for heart failure as a way to prepare for a face-to-face clinical experience caring for a real patient who also has heart failure. Another possibility would be for sophomores to use vSim to prepare for clinical rotations. Perhaps vSim for Nursing, if used earlier in the program of study, could improve preparedness and performance for clinical experiences in healthcare settings.

Implications for Future Work

It was beyond the scope of this project to collect instructor DASH self-assessment scores or trained faculty DASH Rater scores in addition to the DASH-SV data, which were gathered from the student perspective.
Perhaps a more well-rounded and broad data collection approach could invite a more comprehensive understanding of what constitutes effective virtual simulation debriefing and how to optimize its application in clinical nursing education. Simulation and debriefing experiences are inherently interactive and collaborative, which mirrors the nature of professional nursing practice.

Clearly there are many lingering questions surrounding the impacts of virtual simulation use. The best way to assess the long-term impacts would be to continue asking these questions and studying outcomes of interest as they may relate to virtual simulation-based experiences. It may be helpful to watch for innovation and findings relevant to other tactile learning disciplines, such as engineering. Public education may also be relevant, although perhaps less so than professional education.

The work of this project could easily be continued with a shift toward longitudinal methods, following the participants forward to look for possible impacts and outcomes of this experience over time. For example, one could re-poll the same student group to look at downstream effects, with evaluation of specific outcomes at isolated points in time. Possible outcomes of interest might include performance in the remainder of the nursing program, final GPA, NCLEX pass rates, first positions out of school, and performance in clinical practice. Another potential future research direction could involve comparison between the five MSU campuses and various sections or courses to obtain a more robust sample with more statistical power. A project with clustered groups on different MSU campuses may provide the ability to assign stronger confidence to findings, with years of data collection over a longer evaluation period and a much larger sample population. In the future, it will also be helpful to compare vSim to other available platforms designed for virtual patient interaction.

In summary, the quantitative data suggest that the only statistically significant difference in DASH-SV scores between F2F and virtual debriefing methods was for Element 5, instructor identification and communication of successes and failures. No statistically significant difference was observed in the ratings for the other elements or in the cumulative DASH-SV ratings. The qualitative data indicated a universal preference for F2F debriefing. The observed outcome was similar to the expected outcome for this project. Clinical nursing education is expensive in general (Oermann et al., 2018). In many circumstances, vSim and subsequent virtual debriefing could offer a cheaper and more flexible alternative to traditional simulation methods. These project findings may be translatable to other MSU sections or possibly campuses, but are not generalizable due to the very small sample size and scope.

REFERENCES CITED

Aebersold, M. (2016). The history of simulation and its impact on the future. AACN Advanced Critical Care, 27(1), 56–61. https://doi.org/10.4037/aacnacc2016436

Agency for Healthcare Research and Quality. (2020). Healthcare simulation dictionary. Retrieved from https://www.ahrq.gov/patient-safety/resources/simulation/terms.html

Betts, L., Schmid, J., Sivaramalingam, S., & Verkuyl, M. (2020). Using virtual interactive digital simulator to enhance simulation experiences for undergraduate nursing students. Nursing Education Perspectives, 41(3), 193–194. https://doi.org/10.1097/01.NEP.0000000000000472
Bond, W. F., Deitrick, L. M., Eberhardt, M., Barr, G. C., Kane, B. G., Worrilow, C. C., . . . Croskerry, P. (2006). Cognitive versus technical debriefing after simulation training. Academic Emergency Medicine, 13(3), 276–283. http://search.ebscohost.com.proxybz.lib.montana.edu/login.aspx?direct=true&db=ccm&AN=106312542&login.asp&site=ehost-live

Brett-Fleegler, M., Rudolph, J., Eppich, W., Monuteaux, M., Fleegler, E., Cheng, A., & Simon, R. (2012). Debriefing assessment for simulation in healthcare: Development and psychometric properties. Simulation in Healthcare, 7(5), 288–294. https://doi.org/10.1097/SIH.0b013e3182620228

Carolan, C., Davies, C. L., Crookes, P., McGhee, S., & Roxburgh, M. (2020). COVID 19: Disruptive impacts and transformative opportunities in undergraduate nurse education. Nurse Education in Practice, 46, Article 102807. https://doi.org/10.1016/j.nepr.2020.102807

Chronister, C., & Brown, D. (2012). Comparison of simulation debriefing methods. Clinical Simulation in Nursing, 8(7), e281–e288. https://doi.org/10.1016/j.ecns.2010.12.005

Cohen, B. S., & Boni, R. (2018). Holistic nursing simulation: A concept analysis. Journal of Holistic Nursing, 36(1), 68–78. https://doi.org/10.1177/0898010116678325

Dileone, C., Chyun, D., Diaz, D. A., & Maruca, A. T. (2020). An examination of simulation prebriefing in nursing education: An integrative review. Nursing Education Perspectives, 41(6), 345–348. https://doi.org/10.1097/01.NEP.0000000000000689

Doherty-Restrepo, J., Odai, M., Harris, M., Yam, T., Potteiger, K., & Montalvo, A. (2018). Students' perception of peer and faculty debriefing facilitators following simulation-based education. Journal of Allied Health, 47(2), 107–112. http://search.ebscohost.com.proxybz.lib.montana.edu/login.aspx?direct=true&db=ccm&AN=130371516&login.asp&site=ehost-live

Dreifuerst, K. T. (2009). The essentials of debriefing in simulation learning: A concept analysis. Nursing Education Perspectives, 30(2), 109–114.

Dreifuerst, K. T. (2012). Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. Journal of Nursing Education, 51(6), 326–333. https://doi.org/10.3928/01484834-20120409-02

Dusaj, T. K. (2014). Five fast fixes: Debriefing. Clinical Simulation in Nursing, 10(9), 485–486. https://doi.org/10.1016/j.ecns.2014.06.002

Gantt, L. T., Overton, S. H., Avery, J., Swanson, M., & Elhammoumi, C. V. (2018). Comparison of debriefing methods and learning outcomes in human patient simulation. Clinical Simulation in Nursing, 17, 7–13. https://doi.org/10.1016/j.ecns.2017.11.012

Gordon, R. M. (2017). Debriefing virtual simulation using an online conferencing platform: Lessons learned. Clinical Simulation in Nursing, 13(12), 668–674. https://doi.org/10.1016/j.ecns.2017.08.003

Gravetter, F. J., Wallnau, L. B., & Forzano, L.-A. B. (2018). Essentials of statistics for the behavioral sciences (9th ed.). Cengage Learning.

Gu, Y., Zou, Z., & Chen, X. (2017). The effects of vSIM for Nursing™ as a teaching strategy on fundamentals of nursing education in undergraduates. Clinical Simulation in Nursing, 13(4), 194–197. https://doi.org/10.1016/j.ecns.2017.01.005

Harper, M., & Markham, C. (2011). Clinical simulation: Exploring its history and evolution. Technic: The Journal of Operating Department Practice, 2(2), 11–14. http://search.ebscohost.com.proxybz.lib.montana.edu/login.aspx?direct=true&db=ccm&AN=104849261&login.asp&site=ehost-live
Hayden, J. K., Smiley, R. A., Alexander, M., Kardong-Edgren, S., & Jeffries, P. R. (2014). The NCSBN National Simulation Study: A Longitudinal, Randomized, Controlled Study Replacing Clinical Hours with Simulation in Prelicensure Nursing Education. Journal of Nursing Regulation, 5(2), S3–S40. https://doi.org/10.1016/S2155-8256(15)30062-4

INACSL. (2016). INACSL Standards of Best Practice: Simulation Design. Clinical Simulation in Nursing, 12(S), S5–S12. https://doi.org/10.1016/j.ecns.2016.09.005

Jeffries, P., & Rizzolo, M. (2006). Summary report: Designing and implementing models for the innovative use of simulation to teach nursing care of ill adults and children: A national, multi-site, multi-method study. http://www.nln.org/docs/default-source/professional-development-programs/read-the-nln-laerdal-project-summary-report-pdf.pdf?sfvrsn=0

Jeffries, P. R. (2015). The NLN Jeffries Simulation Theory: Brief narrative description. Nursing Education Perspectives, 36, 292–293. https://doi.org/10.1097/00024776-201509000-00004

Jeffries, P. R. (Ed.). (2016). The NLN Jeffries Simulation Theory [Monograph]. Wolters Kluwer.

Kivett, T., Dekker, R., & Pearse, C. (2014). Laerdal Medical and Wolters Kluwer Health introduce virtual simulation learning tool for nursing students. http://download.lww.com/vsim/Vsim_PressRelease_0127144final.pdf

Lapum, J. L., Verkuyl, M., Hughes, M., Romaniuk, D., McCulloch, T., & Mastrilli, P. (2019). Self-Debriefing in Virtual Simulation. Nurse Educator, 44(6), E6–E8. https://doi.org/10.1097/NNE.0000000000000639

Lavoie, P., Deschênes, M.-F., Nolin, R., Bélisle, M., Blanchet Garneau, A., Boyer, L., . . . Fernandez, N. (2020). Beyond Technology: A Scoping Review of Features that Promote Fidelity and Authenticity in Simulation-Based Health Professional Education. Clinical Simulation in Nursing, 42, 22–41. https://doi.org/10.1016/j.ecns.2020.02.001

Levett-Jones, T., & Lapkin, S. (2014). A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Education Today, 34(6), e58–e63. https://doi.org/10.1016/j.nedt.2013.09.020

Lisko, S. A., & O'Dell, V. (2010). Integration of theory and practice: Experiential learning theory and nursing education. Nursing Education Perspectives, 31(2), 106–108. http://search.ebscohost.com.proxybz.lib.montana.edu/login.aspx?direct=true&db=ccm&AN=105182444&login.asp&site=ehost-live

Mitchell, A. (2020). Pandemic inspires innovative use of virtual simulation to teach practical skills. British Journal of Nursing, 29(20), 1214. https://doi.org/10.12968/bjon.2020.29.20.1214

Montana State University. (2021). Office of planning and analysis: Student data. Retrieved February 27, 2021, from https://www.montana.edu/opa/students/

National League for Nursing. (2021). Simulation innovation resource center. Retrieved February 20, 2021, from http://www.nln.org/sirc

Oermann, M. H., Schellenbarger, T., & Gaberson, K. B. (2018). Clinical Teaching Strategies in Nursing (5th ed.). Springer Publishing.

Peddle, M., Bearman, M., & Nestel, D. (2016). Virtual Patients and Nontechnical Skills in Undergraduate Health Professional Education: An Integrative Review. Clinical Simulation in Nursing, 12(9), 400–410. https://doi.org/10.1016/j.ecns.2016.04.004

Reason, J. (1997). Managing the Risks of Organizational Accidents (Vol. 6). Ashgate.
Rudolph, J. W., Palaganas, J., Fey, M. K., Morse, C. J., Onello, R., Dreifuerst, K. T., & Simon, R. (2016). A DASH to the Top: Educator Debriefing Standards as a Path to Practice Readiness for Nursing Students. Clinical Simulation in Nursing, 12(9), 412–417. https://doi.org/10.1016/j.ecns.2016.05.003

Rudolph, J. W., Simon, R., Dufresne, R. L., & Raemer, D. B. (2006). There's no such thing as "nonjudgmental" debriefing: A theory and method for debriefing with good judgment. Simulation in Healthcare, 1(1), 49–55. https://doi.org/10.1097/01266021-200600110-00006

Sawyer, T., Eppich, W., Brett-Fleegler, M., Grant, V., & Cheng, A. (2016). More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods. Simulation in Healthcare, 11(3), 209–217. https://doi.org/10.1097/sih.0000000000000148

Sculli, G. L., Fore, A. M., West, P., Neily, J., Mills, P. D., & Paull, D. E. (2013). Nursing crew resource management: A follow-up report from the Veterans Health Administration. Journal of Nursing Administration, 43(3), 122–126. https://doi.org/10.1097/NNA.0b013e318283dafa

Severson, M. A., Maxson, P. M., Wrobleski, D. S., & Dozois, E. J. (2014). Simulation-Based Team Training and Debriefing to Enhance Nursing and Physician Collaboration. Journal of Continuing Education in Nursing, 45(7), 297–305. https://doi.org/10.3928/00220124-20140620-03

Shinnick, M. A., Woo, M., Horwich, T. B., & Steadman, R. (2011). Debriefing: The Most Important Component in Simulation? Clinical Simulation in Nursing, 7(3), e105–e111. https://doi.org/10.1016/j.ecns.2010.11.005

Simon, R., Raemer, D. B., & Rudolph, J. (2010). Debriefing assessment for simulation in healthcare (DASH): Student version, long form. Center for Medical Simulation.

The National League for Nursing. (2021). vSim for Nursing. Retrieved February 2021, from http://www.nln.org/enterprise-development/nln-center-for-innovation-in-education-excellence/institute-for-simulation-and-technology/vsim-for-nursing-medical-surgical

Thornton, J. (2020). Covid-19: How coronavirus will change the face of general practice forever. BMJ, 368, m1279. https://doi.org/10.1136/bmj.m1279

Use of simulation to enhance learning procedure. (2018). https://www.montana.edu/nursing/archive/pdf/C-2%20PROCEDURE%202018.pdf

Verkuyl, M., Lapum, J. L., St-Amant, O., Hughes, M., Romaniuk, D., & McCulloch, T. (2020). Exploring Debriefing Combinations After a Virtual Simulation. Clinical Simulation in Nursing, 40, 36–42. https://doi.org/10.1016/j.ecns.2019.12.002

Wilbanks, B. A., McMullan, S., Watts, P. I., White, T., & Moss, J. (2020). Comparison of Video-Facilitated Reflective Practice and Faculty-Led Debriefings. Clinical Simulation in Nursing, 42, 1–7. https://doi.org/10.1016/j.ecns.2019.12.007

Zhang, H., Wang, W., Goh, S. H. L., Wu, X. V., & Mörelius, E. (2020). The impact of a three-phase video-assisted debriefing on nursing students' debriefing experiences, perceived stress and facilitators' practices: A mixed methods study. Nurse Education Today, 90, 104460. https://doi.org/10.1016/j.nedt.2020.104460

APPENDICES

APPENDIX A

INSTITUTIONAL REVIEW BOARD EXEMPTION LETTER

APPENDIX B

DEBRIEFING ASSESSMENT FOR SIMULATION IN HEALTHCARE (DASH®) STUDENT VERSION

APPENDIX C

COPY OF QUALTRICS SURVEY DASH-SV®: FACE-TO-FACE DEBRIEFING

Simulation Debriefing Survey: Face-to-Face (Pre-COVID-19)

Start of Block: Default Question Block

Q1 Debriefing Assessment for Simulation in Healthcare Survey

Directions: Please think back to one of the recent in-person, face-to-face simulation debriefing experiences which you were a part of on the MSU Bozeman Campus this semester.
Then, rate that debriefing experience using this six-question survey.

*Please summarize your impression of the introduction and debriefing in this simulation-based exercise. Use the following scale to rate each of six "Elements". Each Element comprises specific instructor behaviors, described below. If a listed behavior is impossible to assess (e.g., how the instructor(s) handled upset people if no one got upset), don't let that influence your evaluation. The instructor(s) may do some things well and some things not so well within each Element. Do your best to rate the overall effectiveness of the whole Element, guided by your observation of the individual behaviors that define it.

Participation in this survey is voluntary, and all responses will be anonymous. The data collected from this survey may be utilized for process improvement or publication.

Page Break

Q2 Element #1 assesses the introduction at the beginning of a simulation-based exercise.
- *Skip this element if you did not participate in the introduction.
- If there was no introduction and you felt one was needed to orient you, your rating should reflect this.

Q3 Element #1: The instructor set the stage for an engaging learning experience.
- The instructor introduced him/herself, described the simulation environment, what would be expected during the activity, and introduced the learning objectives.
- The instructor explained the strengths and weaknesses of the simulation and what I could do to get the most out of simulated clinical experiences.
- The instructor attended to logistical details as necessary, such as toilet location, food availability, and schedule.
- The instructor made me feel stimulated to share my thoughts and questions about the upcoming simulation and debriefing, and reassured me that I wouldn't be shamed or humiliated in the process.

Q4 Overall Rating Element #1: The instructor set the stage for an engaging learning experience.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q5 Element #2: The instructor maintained an engaging context for learning.
- The instructor clarified the purpose of the debriefing, what was expected of me, and the instructor's role in the debriefing.
- The instructor acknowledged concerns about realism and helped me learn even though the case(s) were simulated.
- I felt that the instructor respected participants.
- The focus was on learning and not on making people feel bad about making mistakes.
- Participants could share thoughts and emotions without fear of being shamed or humiliated.

Q6 Overall Rating Element #2: The instructor maintained an engaging context for learning.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q7 Element #3: The instructor structured the debriefing in an organized way.
- The conversation progressed logically rather than jumping around from point to point.
- Near the beginning of the debriefing, I was encouraged to share my genuine reactions to the case(s), and the instructor seemed to take my remarks seriously.
- In the middle, the instructor helped me analyze actions and thought processes as we reviewed the case(s).
- At the end of the debriefing, there was a summary phase where the instructor helped tie observations together and relate the case(s) to ways I can improve my future clinical practice.

Q8 Overall Rating Element #3: The instructor structured the debriefing in an organized way.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q9 Element #4: The instructor provoked in-depth discussions that led me to reflect on my performance.
- The instructor used concrete examples - not just abstract or generalized comments - to get me to think about my performance.
- The instructor's point of view was clear; I didn't have to guess what the instructor was thinking.
- The instructor listened and made people feel heard by trying to include everyone, paraphrasing, and using nonverbal actions like eye contact and nodding.
- The instructor used video or recorded data to support analysis and learning.
- If someone got upset during the debriefing, the instructor was respectful and constructive in trying to help them deal with it.

Q10 Overall Rating Element #4: The instructor provoked in-depth discussions that led me to reflect on my performance.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q11 Element #5: The instructor identified what I did well or poorly - and why.
- I received concrete feedback on my performance or that of my team, based on the instructor's honest and accurate view.
- The instructor helped me explore what I was thinking or trying to accomplish at key moments.

Q12 Overall Rating Element #5: The instructor identified what I did well or poorly - and why.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q13 Element #6: The instructor helped me see how to improve or how to sustain good performance.
- The instructor helped me learn how to improve weak areas or how to repeat good performance.
- The instructor was knowledgeable and used that knowledge to help me see how to perform well in the future.
- The instructor made sure we covered important topics.

Q14 Overall Rating Element #6: The instructor helped me see how to improve or how to sustain good performance.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

End of Block: Default Question Block

APPENDIX D

COPY OF QUALTRICS SURVEY DASH-SV®: VIRTUAL DEBRIEFING

Simulation Debriefing Survey: Virtual (Post-COVID-19)

Start of Block: Default Question Block

Q1 Debriefing Assessment for Simulation in Healthcare Survey

Directions: Please think back to one of the recent virtual simulation debriefing experiences which you took part in via WebEx this month. Then, rate that debriefing experience using this brief survey.
*Please summarize your impression of the introduction and debriefing in this simulation-based exercise. Use the following scale to rate each of six "Elements". Each Element comprises specific instructor behaviors, described below. If a listed behavior is impossible to assess (e.g., how the instructor(s) handled upset people if no one got upset), don't let that influence your evaluation. The instructor(s) may do some things well and some things not so well within each Element. Do your best to rate the overall effectiveness of the whole Element, guided by your observation of the individual behaviors that define it.

Participation in this survey is voluntary, and all responses will be anonymous. The data collected from this survey may be utilized for process improvement or publication.

Page Break

Q2 Element #1 assesses the introduction at the beginning of a simulation-based exercise.
- *Skip this element if you did not participate in the introduction.
- If there was no introduction and you felt one was needed to orient you, your rating should reflect this.

Q3 Element #1: The instructor set the stage for an engaging learning experience.
- The instructor introduced him/herself, described the simulation environment, what would be expected during the activity, and introduced the learning objectives.
- The instructor explained the strengths and weaknesses of the simulation and what I could do to get the most out of simulated clinical experiences.
- The instructor attended to logistical details as necessary, such as toilet location, food availability, and schedule.
- The instructor made me feel stimulated to share my thoughts and questions about the upcoming simulation and debriefing, and reassured me that I wouldn't be shamed or humiliated in the process.

Q4 Overall Rating Element #1: The instructor set the stage for an engaging learning experience.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q5 Element #2: The instructor maintained an engaging context for learning.
- The instructor clarified the purpose of the debriefing, what was expected of me, and the instructor's role in the debriefing.
- The instructor acknowledged concerns about realism and helped me learn even though the case(s) were simulated.
- I felt that the instructor respected participants.
- The focus was on learning and not on making people feel bad about making mistakes.
- Participants could share thoughts and emotions without fear of being shamed or humiliated.

Q6 Overall Rating Element #2: The instructor maintained an engaging context for learning.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q7 Element #3: The instructor structured the debriefing in an organized way.
- The conversation progressed logically rather than jumping around from point to point.
- Near the beginning of the debriefing, I was encouraged to share my genuine reactions to the case(s), and the instructor seemed to take my remarks seriously.
- In the middle, the instructor helped me analyze actions and thought processes as we reviewed the case(s).
- At the end of the debriefing, there was a summary phase where the instructor helped tie observations together and relate the case(s) to ways I can improve my future clinical practice.

Q8 Overall Rating Element #3: The instructor structured the debriefing in an organized way.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q9 Element #4: The instructor provoked in-depth discussions that led me to reflect on my performance.
- The instructor used concrete examples - not just abstract or generalized comments - to get me to think about my performance.
- The instructor's point of view was clear; I didn't have to guess what the instructor was thinking.
- The instructor listened and made people feel heard by trying to include everyone, paraphrasing, and using nonverbal actions like eye contact and nodding.
- The instructor used video or recorded data to support analysis and learning.
- If someone got upset during the debriefing, the instructor was respectful and constructive in trying to help them deal with it.

Q10 Overall Rating Element #4: The instructor provoked in-depth discussions that led me to reflect on my performance.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q11 Element #5: The instructor identified what I did well or poorly - and why.
- I received concrete feedback on my performance or that of my team, based on the instructor's honest and accurate view.
- The instructor helped me explore what I was thinking or trying to accomplish at key moments.

Q12 Overall Rating Element #5: The instructor identified what I did well or poorly - and why.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q13 Element #6: The instructor helped me see how to improve or how to sustain good performance.
- The instructor helped me learn how to improve weak areas or how to repeat good performance.
- The instructor was knowledgeable and used that knowledge to help me see how to perform well in the future.
- The instructor made sure we covered important topics.

Q14 Overall Rating Element #6: The instructor helped me see how to improve or how to sustain good performance.
Rating scale: (1) Extremely Ineffective/Detrimental; (2) Consistently Ineffective/Very Poor; (3) Mostly Ineffective/Poor; (4) Somewhat Effective/Average; (5) Mostly Effective/Good; (6) Consistently Effective/Very Good; (7) Extremely Effective/Outstanding.

Page Break

Q15 After having experienced both in-person and virtual simulation debriefing, which method do you prefer?
o In-person or face-to-face debriefing (1)
o Virtual debriefing (2)

Page Break

Q21 Based on the in-person/face-to-face experiences this semester, please share your thoughts in response to the following 2 questions:

Q16 What did you like or find beneficial about the in-person, face-to-face simulation debriefing experiences?
________________________________________________________________

Q18 What did you dislike or find most challenging about the in-person, face-to-face simulation debriefing experiences?

________________________________________________________________

Page Break

Q22 Based on the virtual experiences this semester, please share your thoughts in response to the following 2 questions:

Q17 What did you like or find beneficial about the virtual simulation debriefing experiences?

________________________________________________________________

Q19 What did you dislike or find most challenging about the virtual simulation debriefing experiences?

________________________________________________________________

End of Block: Default Question Block