
Interactive Narrative Simulation as a Method for Preceptor Development

The Eshelman School of Pharmacy, The University of North Carolina Chapel Hill, Asheville, NC 28804, USA
The Eshelman School of Pharmacy, The University of North Carolina Chapel Hill, Chapel Hill, NC 27599, USA
Author to whom correspondence should be addressed.
Pharmacy 2022, 10(1), 5
Submission received: 9 November 2021 / Revised: 20 December 2021 / Accepted: 24 December 2021 / Published: 28 December 2021
(This article belongs to the Section Pharmacy Education and Student/Practitioner Training)


Abstract

(1) Background: This proof-of-concept study assessed an interactive web-based tool simulating three challenging non-academic learning situations (student professionalism, cross-cultural interactions, and student well-being) as a means of preceptor development. (2) Methods: Three scripts focused on professionalism, cross-cultural interactions, and student well-being were developed and implemented using a commercial narrative tool with branching dialog. Delivered online, this tool presented each challenge to participants. Participants had up to four response options at each turn of the conversation; the choice of response influenced the subsequent conversation, including the coaching provided at the resolution of the situation. Participants were invited to complete pre-activity, immediate post-activity, and one-month follow-up questionnaires to assess satisfaction and engagement with the tool as well as changes in self-efficacy and knowledge. Knowledge was assessed through situational judgment tests (SJTs). (3) Results: Thirty-two pharmacist preceptors participated. The frequency with which participants reflected on challenging learning situations increased significantly one month post-simulation. Participants affirmatively responded that the tool was time-efficient, represented challenges similar to those they encountered while precepting, was easily navigable, and resulted in learning. Self-efficacy with skills in managing challenging learning situations increased significantly immediately post-simulation and at the one-month follow-up. Knowledge as measured through SJTs was not significantly changed. (4) Conclusions: Preceptors found an interactive narrative simulation a relevant, time-efficient approach to preceptor development for challenging non-academic learning situations. Post-simulation, preceptors more frequently reflected on challenging learning situations, implying behavior change. Self-efficacy and self-reported knowledge increased. Future research is needed regarding knowledge assessments.

1. Introduction

Health professions educators continually seek new and effective methods to engage preceptors. Preceptors, when surveyed, show greater preference for online versus live training due to flexibility, while self-study is valued due to accessibility [1,2]. Consistency in presentation is important to ensure a base level of competency, yet there are benefits to providing training in varied formats since not all preceptors respond similarly to the same media [2,3]. However, more data on effective methods for preceptor development are needed, as a preferred delivery method based on learning outcomes has not yet been identified [2,4,5,6,7].
Simulation through virtual role-playing represents an attractive research area for preceptor development due to its accessibility, interactivity, adaptability, and availability in self-study formats [8]. Simulated environments have been used extensively within the health professions to allow students to practice professionalism skills with simulated patients [9,10,11,12]. Though the type of technology varies considerably among these studies, findings generally indicate that students engage in interactions with simulated patients and apply learned skills more broadly [13,14,15,16]. Simulation may also help students attain and retain educational goals due to its active nature and self-directed format [17].
Virtual or screen-based simulation has also been studied for other uses—such as for social interaction, change management, stress management, and job interview skills—due to its abilities to offer repetitive practice, individualization, immediate feedback, and an interpersonal safe environment [18,19,20,21,22]. It also addresses some limitations of accessibility and resource requirements with in-person simulation [21,22]. Furthermore, serious games are a type of simulation-based education to develop nontechnical abilities such as decision-making skills [23].
Screen-based simulation also allows for recreating situations that may be difficult or infeasible in person due to discomfort with participating in role play, inability to attend live training, or availability of facilitators. This capability could potentially be used to train preceptors for challenging non-academic learning situations such as professionalism concerns. Given preceptors’ level of involvement in pharmacy curricula, it is expected that preceptors may frequently encounter such situations [24,25,26]. Actor-based in-person role-plays have been used effectively as a type of simulation for nurse preceptor development in difficult situations [27]. The scenarios were included as part of a preceptor training program for new preceptors that also had a didactic component. The scenarios included developing a precepting plan, implementing the plan, and delivering feedback. Half of the participants had the opportunity to play the role of the preceptor while the rest of the participants observed. Learning outcomes were assessed with pre-post knowledge questions and were significantly improved after the program, and preceptors positively received the program. A limitation was the time commitment of faculty. Since the training included didactic content in addition to the simulation, it is unknown what contribution simulation alone made to knowledge change [27]. A web-based simulation may increase preceptors’ accessibility to training that includes application with real-world scenarios and would not require faculty oversight. While computer-based simulation has been used as a viable training modality in health education curricula [28,29], it is unknown if this translates to preceptors.
This proof-of-concept study evaluated narrative-based simulation using a web-based tool designed to prepare preceptors for managing challenging non-academic learning situations. The simulation included three situations that preceptors may encounter in practice: professionalism, cross-cultural interactions, and student well-being. The research aims were to determine if simulation is an effective method for increasing preceptor knowledge, behavior, and self-efficacy, and to gauge level of preceptor engagement and satisfaction with simulation.

2. Materials and Methods

Three challenging situations were created based on topics frequently offered in preceptor development programs, situations reported to the authors’ School, and topic requests from School preceptors [2]. Challenging non-academic learning situations were defined as situations demanding preceptor intervention; intervention is prompted due to concern about the student’s performance or well-being and requires applying skills to facilitate interaction beyond the traditional role of preceptor as educator only [25]. The situations involved professionalism (tardiness), cross-cultural interactions (patient refusing care), and student well-being (noticeable changes in emotional state). Literature on challenging learning situations, remediation, feedback, well-being, and cross-cultural interactions informed script development, coaching, and creation of supplemental (reference) resources [25,26,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44].
Scripts were implemented using the ink editor from Inkle Studios (Cambridge, UK). This narrative tool enables rapid writing and testing of branching dialog. In prior studies [45,46], different methods of organizing dialog were used, including a dynamic main menu and simple linear flow. The structure used here was mixed: dialog had a set flow from start (introduction of the situation with the student) to finish (resolution of the situation); in between, at each turn in the conversation, participants could choose from up to four options for how to address or respond to the student. Response options were created by three subject matter experts. The presentation was primarily text-based and required menu selections to progress. The program provided textual feedback at the resolution of each challenge based on participant choices. This individualized coaching summarized what participants did well and what could be improved instead of indicating whether their responses were correct or incorrect. The simulation allowed participants to explore the merits and challenges of potential solutions. General written strategies from the literature and topic experts were provided to all participants regardless of simulation performance.
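The structure described above (a fixed start and resolution, with branching choices in between and choice-dependent coaching at the end) can be sketched as a small node graph. The study's tool was built in ink, so this Python sketch is illustrative only; the node names, dialog text, and coaching tags are invented, not the authors' scripts.

```python
# A minimal sketch of a branching dialog with end-of-scenario coaching.
# Each node has text and up to four options; each option carries a
# coaching tag and the next node ("end" resolves the scenario).
# All content here is hypothetical, for illustration only.

DIALOG = {
    "start": {
        "text": "Your student arrives 20 minutes late for the third time.",
        "options": [
            ("Ask privately what is going on", "empathic", "ask_private"),
            ("Remind them of site expectations", "direct", "expectations"),
            ("Ignore it for now", "avoidant", "end"),
        ],
    },
    "ask_private": {
        "text": "The student admits to struggling with a family issue.",
        "options": [
            ("Adjust the schedule and set a follow-up", "empathic", "end"),
            ("Refer to the school's well-being resources", "empathic", "end"),
        ],
    },
    "expectations": {
        "text": "The student apologizes but seems defensive.",
        "options": [
            ("Explore the reason for the tardiness", "empathic", "end"),
            ("Document the incident formally", "direct", "end"),
        ],
    },
}

COACHING = {
    "empathic": "You sought the student's perspective before acting.",
    "direct": "Clear expectations help, but pair them with support.",
    "avoidant": "Unaddressed concerns tend to escalate; intervene early.",
}

def run_dialog(choices):
    """Walk the dialog taking the given option indices; return the
    coaching lines earned by the choices actually made, in order,
    without repeats."""
    node, tags = "start", []
    for idx in choices:
        _, tag, nxt = DIALOG[node]["options"][idx]
        tags.append(tag)
        if nxt == "end":
            break
        node = nxt
    return [COACHING[t] for t in dict.fromkeys(tags)]
```

Summarizing coaching from tags at resolution, rather than grading each turn, mirrors the paper's design choice of describing what went well and what could improve instead of marking responses right or wrong.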
A convenience sample of participants was obtained by sending an email request for voluntary participation to 1275 preceptors at one school of pharmacy. Each participant had an individual session with the tool and a research team member, either in person or via Zoom (San Jose, CA, USA). Sessions were audio-recorded. Participants were asked to think aloud as they read descriptions and selected among options, providing a steady flow of detail about their observations and reasoning [47]. The three situations could be completed in any order, and participants could replay or review challenges as often as they liked. Sessions were untimed, generally requiring 45 min to one hour in total.
Three Kirkpatrick levels of training evaluation were considered in the outcome assessments: reaction, learning, and behavior [48]. All participants were asked to complete three questionnaires: pre-activity (prior to running through the challenges), post-activity (immediately after), and one-month follow-up. All questionnaires were accessed online using Qualtrics (Provo, UT, USA). The pre-activity questionnaire included demographics, the frequency with which participants encountered challenging learning situations, and their perceived behaviors, self-efficacy, and knowledge of managing these situations. Situational judgment tests (SJTs) informed the design of the knowledge questions, which presented three additional situations that participants might encounter. For the eight-item knowledge test, participants were instructed to prioritize potential responses in order of most to least favored [49].
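One plausible way to score such a ranked SJT item against an expert key is sketched below. The paper states that one point was awarded per correct answer based on an expert key; the exact-position matching rule shown here is an assumption for illustration, and the response labels are invented.

```python
# Hedged sketch of scoring a ranked SJT item against an expert key.
# Assumption (not stated in the paper): one point is awarded for each
# response placed in the same position as in the expert ranking.

def score_sjt_item(participant_ranking, expert_ranking):
    """Count positions where the participant's ranking matches the
    expert key (both ordered most to least favored)."""
    return sum(p == e for p, e in zip(participant_ranking, expert_ranking))

expert = ["A", "C", "B", "D"]       # hypothetical expert key
participant = ["A", "B", "C", "D"]  # hypothetical participant ranking
print(score_sjt_item(participant, expert))  # 2 (A and D are in place)
```

Other rules (e.g. rank-correlation or partial credit for near misses) are possible; which was used would affect sensitivity to knowledge change.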
The immediate post-activity questionnaire asked participants to rate satisfaction with the simulation, then through open-ended questions sought the features liked most and least about the tool, suggestions for improvement, and desired additional resources. Participants were given the same self-efficacy, perceived-behavior, and knowledge questions. The one-month follow-up questionnaire included questions on perceived precepting behavior with challenging learning situations, whether coaching from the simulation had been used, and resources still needed. Participants also rated their self-efficacy, described behaviors, and completed the knowledge questions. The self-efficacy instrument used a five-point scale anchored by ‘strongly disagree’ and ‘strongly agree’; it was designed in accordance with recognized design principles and reviewed by education research and pharmacy practice experts before administration [50].

Descriptive statistics were used to summarize demographics. For the knowledge questions, participants ranked solutions for the presented scenarios from a list of four responses, most to least favored; one point was awarded for each correct answer based on a key from subject-matter experts (three investigators with precepting and education expertise and eight preceptor faculty with five-plus years of precepting experience). Paired comparisons were performed using t-tests when sample size was sufficient and distributions were normal; otherwise, Wilcoxon signed-rank tests were used. Significance was set at 0.05, entries with missing values were removed, and a Bonferroni correction was applied for multiple tests. Qualitative data were analyzed using informal methods, with patterns sought among participant responses to open-ended questions. The study was reviewed and deemed exempt by the School’s Institutional Review Board.
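The Bonferroni correction mentioned above simply divides the significance level by the number of comparisons. A minimal sketch, with invented p-values, shows how the adjusted per-test threshold is applied:

```python
# Sketch of Bonferroni adjustment: with m paired comparisons, each
# test is judged against alpha / m. The p-values below are invented
# for illustration, not taken from the study.

def bonferroni_threshold(alpha, m):
    """Per-test significance threshold for m comparisons."""
    return alpha / m

def significant(p_values, alpha=0.05):
    """Flag which p-values survive Bonferroni correction."""
    threshold = bonferroni_threshold(alpha, len(p_values))
    return [p < threshold for p in p_values]

# e.g. three paired pre/post comparisons:
print(significant([0.004, 0.020, 0.049]))  # only 0.004 < 0.05/3 ≈ 0.0167
```

Note that a p-value below 0.05 but above the adjusted threshold (such as 0.020 here) would not be declared significant under this correction.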

3. Results

Thirty-two preceptors were recruited to test the simulation tool (Table 1). Of these participants, 75% (N = 24) and 78% (N = 25) completed the immediate and one-month follow-up questionnaires, respectively.
Percentages of participants responding that they addressed tardiness, cross-cultural interactions, or mental health with students at least monthly prior to the simulation were 53% (N = 17), 50% (N = 16), and 38% (N = 12), respectively. Over their careers, 66% (N = 21), 53% (N = 17), and 53% (N = 17) indicated they had addressed these concerns. On a question about preceptor behavior related to regularly reflecting on challenging situations of all kinds with learners, 66% (N = 21) answered in the affirmative (agreeing or strongly agreeing) prior to the simulation experience, while 80% (N = 20) did so one month after participation; a signed-rank test comparing paired pre-activity and post-activity responses indicated that post-activity responses were significantly higher (p < 0.04). Participants’ beliefs in their ability to successfully address challenging situations also increased, by the same test, from pre-activity to one month post (75%, N = 24 vs. 80%, N = 20, p < 0.05). Forty-two percent (N = 10) in the one-month post-activity survey indicated they had used strategies from the tool.
Regarding usability, 92% (N = 22) of participants answered positively about time-efficiency and 92% (N = 22) felt navigation was easy. All agreed or strongly agreed that the experience was applicable to their practice, and 79% (N = 19) noted the challenges were similar to situations they had experienced. Nearly all (96%, N = 23) reported having learned from using the tool. Open-ended positive comments included the realism of the challenges, the ability to replay and explore dialog progression based on different response options, reflection on reactions, the coaching provided at each challenge’s resolution, and engagement and ease of use. Constructive comments included the need for a back button (as opposed to replaying the whole challenge), the desire to pinpoint which response selection(s) drove feedback (since coaching was provided only at the end of a dialog and may have referred to one or several choices made by the participant), and some vagueness in response options. Forty-two percent (N = 10) of participants disagreed or strongly disagreed that there were right answers to the scenarios in the simulation, debating whether just one answer fit, noting that response options were vague, or observing that certain answers were not considered “ideal”. Several participants requested additional challenging situations focused on good communication skills and social cues. One participant suggested using the tool to create virtual “rounds” for preceptors to share interactions within the situations and learn from each other how to manage the student. Additional resources requested included discussion with a School mentor, discussion with colleagues, continuing education programs on challenging situations, and handouts outlining resources.
Participant self-efficacy significantly increased from pre-activity to immediate post-activity (Table 2) and was maintained from the immediate assessment to one month later. Results from the SJT-structured knowledge test administered before, immediately after, and at the one-month follow-up suggested no change in participant knowledge (p > 0.12; see Table 3). There were no systematic differences in how participants ranked response options across the three questionnaires, reflecting that participants differed among themselves and from the experts regarding the “right” answers to the challenges.

4. Discussion

This is one of the first studies to assess the effectiveness of web-based narrative simulation for preceptor development. Its strengths are in measuring preceptor satisfaction, self-reported behaviors, self-efficacy, and knowledge pre-post simulation. The simulation had a positive effect on preceptor self-efficacy and self-perception of knowledge but not on knowledge as measured by SJTs. Overall, most participants found the tool useful and the scenarios realistic; issues seemed to surround the sometimes-vague response options.
Preceptors’ responses to their frequency of addressing challenging non-academic learning situations (particularly tardiness, cross-cultural issues, and mental health concerns) indicated that the challenges selected for the simulation were relevant. Preceptors agreed these challenges were similar to situations they had experienced and applicable to their practice, indicating a higher prevalence of students requiring intervention than has been cited in the literature [24,25,26]. A potential explanation for this finding is preceptors’ possible historical underreporting [25]. Challenging learning situations and feedback are frequently offered or recommended as topics of preceptor development, and this pilot study validates their inclusion in simulation training [2,51,52].
One month after the simulation, preceptors more frequently reflected on challenging learning situations, implying behavior change. Preceptors’ belief in their ability to successfully manage challenging learning situations increased after the simulation. Self-efficacy increased for challenging situations involving learners with general professionalism, cultural awareness, and mental health concerns. Preceptors also believed the tool was time-efficient and easily navigable, an important consideration given the time constraints that preceptors face [52,53].
Literature is presently lacking in assessment of preceptor self-efficacy with web-based simulation. Satisfaction data are also relatively lacking. A comparison of three simulation modalities (paper, actor, and computer-based) in Master of Pharmacy students found similar satisfaction with all modalities [29]. Though that study was with a student population, similarities in positive aspects of computer-based simulation with the present study include ability to replay or repeat situations, high engagement, and encouragement of reflection. No overlap was found in negative feedback, which could be related to differences in design or in needs between students and preceptors.
Most preceptors reported learning from the web-based simulation tool. However, learning was not demonstrated through any systematic changes in their response prioritizations relative to expert prioritizations. Participants and experts did not always agree on the correct responses to the simulation scenarios or to the knowledge situations in the SJTs. Problem solving may be idiosyncratic; preceptors may learn strategies for addressing a challenge, but each individual applies them in their own way based on situational constraints. While the dynamic simulation allowed for variability in responses to complex situations, SJTs with a pre-selected best response may not have, and therefore may not be optimal for assessing knowledge change in these types of situations. It is also possible that these preceptors were a self-selected group already knowledgeable in managing challenging situations, or that it was unclear how response options should be prioritized. Further investigation of the knowledge testing is warranted. There is evidence of learning using role-play simulation for nurse educator development when comparing open-ended knowledge questions related to course objectives pre- and post-simulation [27]. Participants in that study believed role-play simulation was more effective than lecturing alone [27]. Other literature describing knowledge change using simulation for preceptors is not presently available; additional study of best practices for knowledge-change assessments in simulation is needed.
This pilot study has several limitations. First, it was conducted with a small number of preceptors at one school of pharmacy to assess proof of concept; therefore, caution should be used when generalizing to other populations of preceptors, who may have different experiences. Second, preceptors agreeing to participate may have opted in because they were open to using computer-based technology for preceptor development, perhaps biasing results in favor of simulation. Third, knowledge change assessed through SJTs showed no systematic effect; this form of assessment was also static, in contrast to the dynamic simulation. In addition, it is unclear whether preceptors’ access to other training and resources after the simulation could have affected their self-efficacy. Further study is needed to confirm these findings in larger populations of preceptors and to determine best practices for using simulation to train preceptors in handling challenging situations and for testing knowledge change.

5. Conclusions

Preceptors responded favorably to an interactive narrative simulation and demonstrated a sustained increase in self-efficacy with skills related to managing challenging non-academic learning situations from before the simulation to after. Preceptors reported an increase in reflecting on challenging situations. A test of knowledge change from before to after the simulation showed no systematic effects when comparing participants’ prioritized SJT response options with the prioritizations of a separate group of experts. Future work with additional situations, preceptors, and formal knowledge assessments is warranted.

Author Contributions

Conceptualization, C.R.W. and R.H.; methodology, C.R.W., R.H. and M.D.W.; software, R.H.; validation, C.R.W. and A.K.; formal analysis, R.H.; writing—original draft preparation, C.R.W. and R.H.; writing—reviewing and editing, C.R.W., R.H. and M.D.W.; project administration, C.R.W. All authors have read and agreed to the published version of the manuscript.


Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Biomedical Institutional Review Board of the University of North Carolina (protocol code 19-2604, 12 May 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.


Acknowledgments

The authors thank Ethan Burch for implementing and delivering the dynamic situations.

Conflicts of Interest

The authors declare no conflict of interest.


References

1. American Association of Colleges of Pharmacy. AACP Council of Sections Report, June 2013. Am. J. Pharm. Educ. 2013, 77, S23.
2. Mulherin, K.; Walter, S.; Cox, C. National preceptor development program (PDP): Influential evidence and theory. The first part of 3-part series. Curr. Pharm. Teach. Learn. 2018, 10, 255–266.
3. Engle, J.P.; Burke, J.M.; Ashjian, E.J.; Avery, L.; Borchert, J.S.; Faro, S.J.E.; Harris, C.S.; Herink, M.C.; Jain, B.; MacLaughlin, E.J.; et al. ACCP clinical pharmacist competencies: Advocating alignment between student, resident, and practitioner competencies. J. Am. Coll. Clin. Pharm. 2020, 3, 124–132.
4. Larsen, R.; Zahner, S.J. The impact of web-delivered education on preceptor role self-efficacy and knowledge in public health nurses. Public Health Nurs. 2011, 28, 349–356.
5. Bardella, I.; Janosky, J.; Elnicki, D.; Ploof, D.; Kolarik, R. Observed versus reported precepting skills: Teaching behaviours in a community ambulatory clerkship. Med. Educ. 2005, 39, 1036–1044.
6. Vos, S.S.; Trewet, C.B. A comprehensive approach to preceptor development. Am. J. Pharm. Educ. 2012, 76, 47.
7. Salerno, S.M.; Jackson, J.L.; O’Malley, P.G. Interactive faculty development seminars improve the quality of written feedback in ambulatory teaching. J. Gen. Intern. Med. 2003, 18, 831–834.
8. Hubal, R.C.; Kizakevich, P.N.; Guinn, C.I.; Merino, K.D.; West, S.L. The virtual standardized patient. Simulated patient-practitioner dialog for patient interview training. Stud. Health Technol. Inform. 2000, 70, 133–138.
9. Foster, A.; Chaudhary, N.; Kim, T.; Waller, J.L.; Wong, J.; Borish, M.; Cordar, A.; Lok, B.; Buckley, P.F. Using virtual patients to teach empathy: A randomized controlled study to enhance medical students’ empathic communication. Simul. Healthc. 2016, 11, 181–189.
10. Hubal, R.; Kizakevich, P.; Furberg, R. Synthetic Characters in health-related applications. Stud. Comput. Intell. 2007, 65, 5–26.
11. Kononowicz, A.A.; Zary, N.; Edelbring, S.; Corral, J.; Hege, I. Virtual patients—What are we talking about? A framework to classify the meanings of the term in healthcare education. BMC Med. Educ. 2015, 15, 11.
12. Williams, K.; Wryobeck, J.; Edinger, W.; McGrady, A.; Fors, U.; Zary, N. Assessment of competencies by use of virtual patient technology. Acad. Psychiatry 2011, 35, 328–330.
13. Berman, N.B.; Durning, S.J.; Fischer, M.R.; Huwendiek, S.; Triola, M.M. The role for virtual patients in the future of medical education. Acad. Med. 2016, 91, 1217–1222.
14. Cook, D.A.; Erwin, P.J.; Triola, M.M. Computerized virtual patients in health professions education: A systematic review and meta-analysis. Acad. Med. 2010, 85, 1589–1602.
15. Friedl, K.E.; O’Neil, H.F. Designing and using computer simulations in medical education and training: An introduction. Mil. Med. 2013, 178 (Suppl. S10), 1–6.
16. Rizzo, A.; Shilling, R.; Forbell, E.; Scherer, S.; Gratch, J.; Morency, L.-P. Autonomous virtual human agents for healthcare information support and clinical interviewing. In Artificial Intelligence in Behavioral and Mental Health Care; Luxton, D.D., Ed.; Elsevier Academic Press: San Diego, CA, USA, 2016; pp. 53–79.
17. French, H.M.; Hales, R.L. Neonatology faculty development using simulation. Semin. Perinatol. 2016, 40, 455–465.
18. Won, A.S.; Pandita, S.; Kruzan, K.P. Social Interaction and Pain Threshold in Virtual Reality. Cyberpsychol. Behav. Soc. Netw. 2020, 23, 829–845.
19. Guérin, A.; Lebel, D.; Hall, K.; Bussières, J.-F. Change Management in Pharmacy: A Simulation Game and Pharmacy Leaders’ Rating of 35 Barriers to Change. Int. J. Pharm. Pract. 2015, 23, 439–446.
20. Sigwalt, F.; Petit, G.; Evain, J.-N.; Claverie, D.; Bui, M.; Guinet-Lebreton, A.; Trousselard, M.; Canini, F.; Chassard, D.; Duclos, A.; et al. Stress Management Training Improves Overall Performance during Critical Simulated Situations: A Prospective Randomized Controlled Trial. Anesthesiology 2020, 133, 198–211.
21. Meese, M.M.; O’Hagan, E.C.; Chang, T.P. Healthcare Provider Stress and Virtual Reality Simulation: A Scoping Review. Simul. Healthc. J. Soc. Simul. Healthc. 2021, 16, 268–274.
22. Smith, M.J.; Humm, L.B.; Fleming, M.F.; Jordan, N.; Wright, M.A.; Ginger, E.J.; Wright, K.; Olsen, D.; Bell, M.D. Virtual Reality Job Interview Training for Veterans with Posttraumatic Stress Disorder. J. Vocat. Rehabil. 2015, 42, 271–279.
23. Ghoman, S.K.; Patel, S.D.; Cutumisu, M.; von Hauff, P.; Jeffery, T.; Brown, M.R.G.; Schmölzer, G.M. Serious Games, a Game Changer in Teaching Neonatal Resuscitation? A Review. Arch. Dis. Child. Fetal Neonatal Ed. 2020, 105, 98–107.
24. Guerrasio, J.; Garrity, M.J.; Aagaard, E.M. Learner deficits and academic outcomes of medical students, residents, fellows, and attending physicians referred to a remediation program, 2006–2012. Acad. Med. 2014, 89, 352–358.
25. Davis, L.E.; Miller, M.L.; Raub, J.N.; Gortney, J.S. Constructive ways to prevent, identify, and remediate deficiencies of “challenging trainees” in experiential education. Am. J. Health-Syst. Pharm. 2016, 73, 996–1009.
26. Yao, D.C.; Wright, S.M. The challenge of problem residents. J. Gen. Intern. Med. 2001, 16, 486–492.
27. Wilson, R.; Acuna, M.; Ast, M.; Bodas, E. Evaluation of the effectiveness of simulation for preceptor preparation. J. Nurses Prof. Dev. 2013, 29, 186–190.
28. Bindoff, I.; Ling, T.; Bereznicki, L.; Westbury, J.; Chalmers, L.; Peterson, G.; Ollington, R. A computer simulation of community pharmacy practice for educational use. Am. J. Pharm. Educ. 2014, 78, 168.
29. Tait, L.; Lee, K.; Rasiah, R.; Cooper, J.M.; Ling, T.; Geelan, B.; Bindoff, I. Simulation and feedback in health education: A mixed methods study comparing three simulation modalities. Pharmacy 2018, 6, 41.
30. Maize, D.F.; Fuller, S.H.; Hritcko, P.M.; Matsumoto, R.R.; Soltis, D.A.; Taheri, R.R.; Duncan, W. A review of remediation programs in pharmacy and other health professions. Am. J. Pharm. Educ. 2010, 74, 25.
31. Roze des Ordons, A.; Cheng, A.; Gaudet, J.; Downar, J.; Lockyer, J. Adapting feedback to individual residents: An examination of preceptor challenges and approaches. J. Grad. Med. Educ. 2018, 10, 168–175.
32. Bing-You, R.; Hayes, V.; Varaklis, K.; Trowbridge, R.; Kemp, H.; McKelvy, D. Feedback for learners in medical education: What is known? A scoping review. Acad. Med. 2017, 92, 1346–1354.
33. Katz, E.D.; Dahms, R.; Sadosty, A.T.; Stahmer, S.A.; Goyal, D.; CORD-EM Remediation Task Force. Guiding principles for resident remediation: Recommendations of the CORD remediation task force. Acad. Emerg. Med. 2010, 17 (Suppl. S2), S95–S103.
34. Teply, R.; Spangler, M.; Klug, L.; Tilleman, J.; Coover, K. Impact of instruction and feedback on reflective responses during an ambulatory care advanced pharmacy practice experience. Am. J. Pharm. Educ. 2016, 80, 81.
35. Lacasse, M.; Audétat, M.-C.; Boileau, É.; Fon, N.C.; Dufour, M.-H.; Laferriere, M.-C.; Lafleur, A.; La Rue, È.; Lee, S.; Nendaz, M.; et al. Interventions for undergraduate and postgraduate medical learners with academic difficulties: A BEME systematic review: BEME guide no. 56. Med. Teach. 2019, 41, 981–1001.
36. Wu, J.S.; Siewert, B.; Boiselle, P.M. Resident evaluation and remediation: A comprehensive approach. J. Grad. Med. Educ. 2010, 2, 242–245.
37. Domen, R.E. Resident remediation, probation, and dismissal: Basic considerations for program directors. Am. J. Clin. Pathol. 2014, 141, 784–790.
38. Ramani, S.; Könings, K.D.; Ginsburg, S.; van der Vleuten, C.P.M. Twelve tips to promote a feedback culture with a growth mind-set: Swinging the feedback pendulum from recipes to relationships. Med. Teach. 2019, 41, 625–631.
39. Grant, V.J.; Robinson, T.; Catena, H.; Eppich, W.; Cheng, A. Difficult debriefing situations: A toolbox for simulation educators. Med. Teach. 2018, 40, 703–712.
40. Acosta, D.; Ackerman-Barger, K. Breaking the silence: Time to talk about race and racism. Acad. Med. 2017, 92, 285–288.
41. Shen, Z. Cultural competence models and cultural competence assessment instruments in nursing: A literature review. J. Transcult. Nurs. 2015, 26, 308–321.
42. Carr, S. The Foundation Programme assessment tools: An opportunity to enhance feedback to trainees? Postgrad. Med. J. 2006, 82, 576–579.
43. Gable, K.N. Starting the conversation about depression and suicide prevention. Pharm. Today 2019, 25, 44–53.
44. Fischbein, R.; Bonfine, N. Pharmacy and medical students’ mental health symptoms, experiences, attitudes and help-seeking behaviors. Am. J. Pharm. Educ. 2019, 83, 7558.
45. Hubal, R.; Heneghan, J. Carolina virtual patient initiative [abstract]. Pharm. Educ. 2017, 17, 292.
46. Chen, F.; Lee, Y.; Hubal, R. Testing of a virtual patient: Linguistic and display engagement findings. In Immersive Learning Research Network, Proceedings of the 6th International Conference, Online, 21–25 June 2020; Economou, D., Klippel, A., Dodds, H., Peña-Rios, A., Lee, M.J.W., Beck, D., Pirker, J., Dengel, A., Peres, T.M., Richter, J., Eds.; iLRN (Immersive Learning Research Network): Vienna, Austria, 2020; pp. 348–350.
47. Wolcott, M.D.; Lobczowski, N.G. Using cognitive interviews and think-aloud protocols to understand thought processes. Curr. Pharm. Teach. Learn. 2021, 13, 181–188.
  48. Kirkpatrick, J.D.; Kirkpatrick, W.K. Kirkpatrick’s Four Levels of Training Evaluation, 1st ed.; Association for Talent Development: Alexandria, VA, USA, 2016. [Google Scholar]
  49. Patterson, F.; Zibarras, L.; Ashworth, V. Situational judgement tests in medical education and training: Research, theory and practice: AMEE Guide No. 100. Med. Teach. 2016, 38, 3–17. [Google Scholar] [CrossRef]
  50. Bandura, A. Guide for Constructing Self-Efficacy Scales. In Self-Efficacy Beliefs of Adolescents; Pajares, F., Urdan, T., Eds.; Information Age Publishing: Greenwich, CT, USA, 2006; pp. 307–337. [Google Scholar]
  51. Worrall, C.L.; Chair; Aistrope, D.S.; Cardello, E.A.; Fulginiti, K.S.; Jordan, R.P.; Martin, S.J.; McGrath, K.; Park, S.; Shepler, B.; et al. Priming the preceptor pipeline: Collaboration, resources, and recognition. The report of the 2015–2016 Professional Affairs Standing Committee. Am. J. Pharm. Educ. 2016, 80, S19. [Google Scholar] [CrossRef] [PubMed]
  52. Hartzler, M.L.; Ballentine, J.E.; Kauflin, M.J. Results of a survey to assess residency preceptor development methods and precepting challenges. Am. J. Health Syst. Pharm. 2015, 72, 1305–1314. [Google Scholar] [CrossRef]
  53. O’Sullivan, T.A.; Cox, C.D.; Darbishire, P.; Dinkins, M.M.; Johanson, E.L.; Joseph, A.; Vos, S. The status and adequacy of preceptor orientation and development programs in US pharmacy schools. Am. J. Pharm. Educ. 2020, 84, 7540. [Google Scholar] [CrossRef] [PubMed]
Table 1. Demographics of preceptors participating in a simulation for challenging student learning situations.
Characteristic | Participants (N = 32)
Age (Years)
26–35 | 44% (N = 14)
36–45 | 31% (N = 10)
46–55 | 19% (N = 6)
No response | 6% (N = 2)
Education and Training
Bachelor’s degree (e.g., BSPharm, BS, BA) | 53% (N = 17)
Master’s degree (e.g., MS, MEd) | 16% (N = 5)
Professional degree (e.g., PharmD, MD, JD) | 88% (N = 28)
Doctoral degree (e.g., PhD, EdD) | 3% (N = 1)
Post-graduate year 1 (PGY1) residency | 38% (N = 12)
Post-graduate year 2 (PGY2) residency | 38% (N = 12)
Board certification (e.g., BCPS) | 53% (N = 17)
Years Precepting Experience
<1 year | 9% (N = 3)
1–5 years | 53% (N = 17)
6–10 years | 6% (N = 2)
11–15 years | 6% (N = 2)
16–20 years | 13% (N = 4)
>20 years | 13% (N = 4)
Number of Students Precepted Per Year
1–5 | 59% (N = 19)
6–10 | 13% (N = 4)
11–15 | 6% (N = 2)
No response | 22% (N = 7)
Number of Residents Precepted Per Year
0 | 47% (N = 15)
1–5 | 41% (N = 13)
6–10 | 3% (N = 1)
11–15 | 3% (N = 1)
No response | 6% (N = 2)
Type of Student Precepted 1
First-year student pharmacists | 34% (N = 11)
Second-year student pharmacists | 50% (N = 16)
Third-year student pharmacists | 59% (N = 19)
Fourth-year student pharmacists | 94% (N = 30)
First-year pharmacy residents | 50% (N = 16)
Second-year pharmacy residents | 41% (N = 13)
Other health professions students (physician, physician assistant, etc.) | 19% (N = 6)
Previous Preceptor Training and Continuing Education Utilized 1
School-based preceptor training and development programming | 91% (N = 29)
Preceptor development programs for resident preceptors at the organization | 56% (N = 18)
Attendance at national pharmacy preceptor’s conference | 9% (N = 3)
Continuing education seminars/webinars/workshops | 78% (N = 25)
Pharmacist’s Letter resources | 31% (N = 10)
Professional organization resources | 34% (N = 11)
Preceptor development books | 6% (N = 2)
Other national training program | 6% (N = 2)
1 Respondents could select all that apply.
Table 2. Preceptor degree of confidence (self-efficacy) responding to challenging situations.
Confidence (i.e., “____ Percent of the Time I Am Confident I Can...”; 0—Not Confident, 100—Highly Confident) | Pre-Activity Mean (SD) (N = 32) | Immediate Post Mean (SD) (N = 24) | One-Month Follow-Up Mean (SD) (N = 25)
General activities
Recognize challenging learning situations | 79.4 (13.7) | 85.0 (12.5) | 87.6 (8.8)
Discuss challenging situations with students | 70.0 (15.9) | 80.0 (16.7) | 81.2 (10.5)
Understand strategies to address challenging situations with students | 62.9 (15.1) | 73.3 (15.8) | 79.2 (10.4)
Identify resources to help address challenging situations with students | 55.0 (21.1) | 73.3 (18.1) | 72.8 (15.9)
Reflect on opportunities to improve how I address challenging situations with students | 68.8 (21.8) | 82.5 (16.2) | 86.0 (12.2)
General activities average | 67.2 (19.4) | 78.8 (16.4) 1 | 81.4 (12.8) 2,4
Professionalism
Identify issues with professionalism among students | 86.3 (12.4) | 90.0 (9.3) | 93.6 (7.0)
Discuss professionalism issues with students | 80.0 (15.9) | 85.4 (13.2) | 89.2 (10.8)
Understand strategies to address professionalism issues with students | 71.6 (17.6) | 82.5 (13.9) | 87.2 (13.1)
Identify resources to help address professionalism issues with students | 59.7 (21.8) | 77.9 (20.0) | 83.6 (16.6)
Reflect on opportunities to improve how I address challenging professionalism situations with students | 76.3 (16.2) | 87.9 (13.5) | 89.6 (12.1)
Professionalism average | 74.8 (19.1) | 84.8 (14.8) 1 | 88.6 (12.5) 2,3
Cross-cultural interactions
Define cultural awareness | 71.3 (16.0) | 79.6 (15.2) | 83.6 (13.2)
Discuss cross-cultural issues with students | 63.8 (20.4) | 69.6 (19.7) | 76.0 (20.0)
Understand strategies to address cross-cultural issues with students | 51.9 (22.9) | 68.8 (20.7) | 74.0 (18.9)
Identify resources to help address cross-cultural issues with students | 39.7 (21.5) | 68.3 (24.3) | 68.0 (21.4)
Reflect on opportunities to improve how I address challenging cross-cultural situations with students | 60.9 (26.9) | 80.0 (19.6) | 80.8 (15.8)
Cross-cultural interactions average | 57.5 (24.2) | 72.3 (20.8) 1 | 76.5 (18.6) 2,3
Mental health concerns
Identify mental health concerns with students | 67.5 (21.1) | 78.8 (18.7) | 79.6 (17.2)
Discuss mental health concerns with students | 57.8 (25.5) | 69.6 (24.0) | 73.2 (22.5)
Understand strategies to address mental health concerns with students | 58.8 (27.0) | 70.4 (22.7) | 73.2 (21.9)
Identify resources to help address mental health concerns with students | 53.4 (27.1) | 71.3 (23.6) | 76.8 (20.1)
Reflect on opportunities to improve how I address mental health concerns with students | 65.9 (23.9) | 79.6 (24.0) | 83.2 (19.3)
Mental health concerns average | 60.7 (25.3) | 73.9 (22.8) 1 | 77.2 (20.3) 2,3
1 Difference between immediate post and pre-activity: all p < 0.02. 2 Difference between follow-up and pre-activity: all p < 0.01. 3 Difference between follow-up and immediate post: all p < 0.01. 4 No difference between follow-up and immediate post; all p > 0.05.
Table 3. Percent of respondents matching their first prioritized response to SJT questions compared to expert rankings.
SJT questions across Scenario 1, Scenario 2, and Scenario 3
Immediate post | 58 53 74 90 79 53 84 74
One-month follow-up | 64 64 68 91 82 73 91 77
Williams, C.R.; Hubal, R.; Wolcott, M.D.; Kruse, A. Interactive Narrative Simulation as a Method for Preceptor Development. Pharmacy 2022, 10, 5.
