ORIGINAL ARTICLE
Year : 2015  |  Volume : 3  |  Issue : 2  |  Page : 146-150

Assessment of dental students' psychomotor skills using oral surgery simulation models


1 Department of Biomedical Dental Sciences, College of Dentistry, University of Dammam, Dammam, Kingdom of Saudi Arabia
2 Department of Preventive Dental Sciences, College of Dentistry, University of Dammam, Dammam, Kingdom of Saudi Arabia
3 Department of Medical Education, University of Dammam, Dammam, Kingdom of Saudi Arabia

Date of Web Publication: 6-May-2015

Correspondence Address:
Hesham F Marei
College of Dentistry, University of Dammam, P. O. Box 1982, Dammam 31441
Kingdom of Saudi Arabia

DOI: 10.4103/1658-631X.156428

  Abstract 

Aim: The aim of this study was to determine the validity of using oral surgery simulation models as a tool to assess the psychomotor skills of dental students.
Materials and Methods: All students in the 4th year of a 6-year dental program were enrolled in the study. Twenty-three dental students were asked to display their competency in the injection of local anesthesia and dental extraction in two summative testing environments, namely in simulation and in the outpatient clinic. A panel of four experts assessed the students' performance during the injection of local anesthesia and tooth extraction on patients and on simulation models using a pre-validated checklist. Students' scores were compared in both settings.
Results: The results showed no significant correlation between the scores on patients and on simulation models (P = 0.759).
Conclusion: The study revealed that the real patient remains the gold standard in the summative assessment of dental students' psychomotor skills.

  Abstract in Arabic 


Research summary:

This study examined the validity of using oral surgery simulation models as a tool for assessing the psychomotor skills of dental students. All students in the fourth year of the dental program were enrolled and demonstrated their competence in injecting local anesthesia and extracting teeth in two summative examinations, one in the simulation laboratory and one in the oral surgery outpatient clinics. A panel of four experts assessed the students' performance in both examinations using a pre-validated checklist, and the students' scores in the two examinations were compared. The results showed no statistical correlation between the students' scores on patients and on simulation. The study concluded that the real patient remains the gold standard in assessing the psychomotor skills of dental students.






Keywords: Assessment, oral surgery, psychomotor skills, simulation, teeth extraction, validity


How to cite this article:
Marei HF, Al-Jandan BA, Al-Khalifa KS, Al-Masoud NN, Al-Eraky MM, Wajid G. Assessment of dental students' psychomotor skills using oral surgery simulation models. Saudi J Med Med Sci 2015;3:146-50

How to cite this URL:
Marei HF, Al-Jandan BA, Al-Khalifa KS, Al-Masoud NN, Al-Eraky MM, Wajid G. Assessment of dental students' psychomotor skills using oral surgery simulation models. Saudi J Med Med Sci [serial online] 2015 [cited 2022 Jan 20];3:146-50. Available from: https://www.sjmms.net/text.asp?2015/3/2/146/156428


  Introduction


Simulators are educational tools that fall into the broad context of simulation-based medical education. [1] Simulation can be defined as "a device or exercise that enables participants to reproduce under test conditions, phenomena that are likely to occur in actual performance". [2]

Simulation provides a reproducible, standardized, objective setting for both formative and summative assessment, [1] and allows students to be tested in a safe environment. [3] Issenberg et al. [4] stated that simulation, as an example of non-work-based forms of assessment, will continue to grow because it provides a venue for assessment with immediate feedback during the early stages of learning, while protecting patients from potential harm. In competency assessments, simulation sits on the third level of Miller's pyramid, as it can provide an environment for testing the "shows how" of clinical ability. [5]

For a simulation model to be considered an assessment tool, it must possess reliability, validity, educational impact, acceptability and feasibility. [6] Concurrent validity is one method for establishing the validity of simulation-based assessment. To establish concurrent validity, subject performance on a new assessment is compared with performance on a previously validated gold-standard assessment, and the degree of correlation between the subjects' performances establishes the degree of concurrent validity of the new tool. [7]
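Concretely, if x_i denotes a subject's score on the new (simulation-based) assessment and y_i the same subject's score on the gold-standard assessment (our notation, not the authors'), the degree of correlation is usually quantified with a coefficient such as Pearson's r, the statistic used later in this study:

\[
r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})\,(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2}\;\sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^2}}
\]

Values of r close to 1 indicate that the new tool reproduces the gold-standard ranking of subjects, whereas values close to 0 indicate that it does not.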

Several studies in medical and dental education have evaluated the validity of various simulation models as tools for assessing technical skills. In dental education, faculty perception of the content validity of a haptic three-dimensional virtual reality dental training simulator was assessed; the simulator proved successful in developing the necessary dental tactile skills. [8] In medical education, a significant correlation was found between operative performance of laparoscopic technical skills in the operating room and psychomotor performance in a virtual environment assessed by a computer simulator, providing strong evidence for the validity of the simulator system as an objective tool for assessing laparoscopic skills. [9]

Since an objective, standardized assessment is essential for making a judgment before moving a learner from the third level of Miller's pyramid to the fourth, which is clinical practice in the workplace, [10] it was crucial to evaluate the validity of using oral surgery simulation models as a tool for assessing students' psychomotor skills. Dunkley asked, "How well does the assessment of performance under artificial conditions match that in the real world?" [11]

Van Nortwick et al. [7] stated that training institutions should adopt reliable, valid assessments to fill the gaps in clinical experience and ensure patient safety. The purpose of this study was to investigate the validity of using oral surgery simulation models as a tool to assess dental students' psychomotor skills.


  Materials and Methods


All students in the 4th year of a 6-year dental program were enrolled in the study (n = 23). The students were asked to display their competency in the injection of local anesthesia (LA) and dental extraction in two summative testing environments: first in simulation and then on real patients in the outpatient clinic, and their performances in the two settings were compared. The students' summative assessment sessions took place at the end of the 4th year of their dental training program. During this year, as part of the course requirements, all students performed an average of 20 cases of LA injection and closed tooth extraction on real patients, and they were able to perform the procedure safely without supervision in at least four of the 20 cases.

In the oral surgery simulation laboratory, the students displayed their psychomotor skills by injecting a local anesthetic into simulation models [Figure 1]. These models were fitted in bench-mounted phantom heads.
Figure 1: Bench-mounted phantom heads (AG-3, Frasaco, Tettnang, Germany)



With regard to tooth extraction, the students displayed their psychomotor skills on dental extraction models. The extraction models simulated the upper and lower jaws. Each model consisted of 32 metal teeth in Cr-Ni steel embedded in a resilient plastic resin, with a back plate that fitted into the bench-mounted phantom heads. Removing teeth from the models offered resistance similar to that of real teeth during extraction in a patient [Figure 2].
Figure 2: Extraction model consisting of 32 metal teeth in Cr-Ni steel (A-EM, Frasaco, Tettnang, Germany)



In the oral surgery outpatient clinics, the students displayed their psychomotor skills in the injection of LA and tooth extraction on real patients who required closed extractions, as diagnosed before the assessment session.

A panel of four experts from the Oral and Maxillofacial Surgery Department assessed the students' performance during the injection of LA and tooth extraction on patients and on simulators. The examiners used the validated checklist of Macluskey et al. [12] The checklist is composed of 20 steps with a maximum score of 20 marks [Figure 3].
Figure 3: Student assessment performance sheet (checklist)



The examiners were randomly assigned to the two settings. Examiners E1 and E2 assessed students' performance on patients, while E3 and E4 assessed students' performance on simulation models. Each student's final score in each setting was the average of the scores given by the two examiners in that encounter.

The Pearson correlation coefficient was calculated to assess the association between scores on simulation and on real patients. The same statistic was used to estimate inter-rater reliability between E1 and E2, and between E3 and E4.

The Wilcoxon signed-rank test was used to test for any significant difference between the average scores on patients and on simulation.

Cronbach's alpha was used to estimate the reliability of the average scores given by E1 and E2, and by E3 and E4.
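As an illustration of this workflow, the following is a minimal sketch in Python, not the authors' analysis code; the score arrays below are hypothetical stand-ins for the study data. It averages the two examiners' checklist totals in each setting and then applies the same three statistics: the Pearson correlation, the Wilcoxon signed-rank test and Cronbach's alpha.

```python
# Minimal sketch of the analysis described above, using hypothetical data.
import numpy as np
from scipy.stats import pearsonr, wilcoxon

rng = np.random.default_rng(0)
n_students = 23

# Hypothetical 20-point checklist totals for each examiner (not real data).
e1 = rng.integers(11, 20, n_students).astype(float)  # patients, examiner 1
e2 = rng.integers(11, 20, n_students).astype(float)  # patients, examiner 2
e3 = rng.integers(12, 21, n_students).astype(float)  # simulation, examiner 3
e4 = rng.integers(12, 21, n_students).astype(float)  # simulation, examiner 4

# Each student's final score per setting = average of the two examiners.
patients = (e1 + e2) / 2
simulation = (e3 + e4) / 2

# Association between settings and inter-rater agreement within each setting.
r_settings, p_settings = pearsonr(patients, simulation)
r_raters_patients, _ = pearsonr(e1, e2)
r_raters_simulation, _ = pearsonr(e3, e4)

# Paired comparison of the average scores in the two settings.
_, p_wilcoxon = wilcoxon(patients, simulation)

def cronbach_alpha(scores):
    """Cronbach's alpha for a (students x raters) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

alpha_patients = cronbach_alpha(np.column_stack([e1, e2]))
alpha_simulation = cronbach_alpha(np.column_stack([e3, e4]))

print(f"Between-setting r = {r_settings:.3f} (P = {p_settings:.3f})")
print(f"Inter-rater r: patients {r_raters_patients:.2f}, simulation {r_raters_simulation:.2f}")
print(f"Wilcoxon signed-rank P = {p_wilcoxon:.3f}")
print(f"Cronbach's alpha: patients {alpha_patients:.3f}, simulation {alpha_simulation:.3f}")
```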

The Dental College Research Committee approved the study, which followed the research principles in the Declaration of Helsinki.


  Results


The Pearson correlation coefficient between the scores on patients and on simulators was 0.068, which indicated no significant correlation between the performances in the two settings (P = 0.759) [Figure 4].
Figure 4: Scores on patients and simulation showing no significant correlation



The mean student scores on patients as rated by E1 and E2 were 15.39 ± 2.27 and 15.70 ± 2.91, respectively, while the mean student scores on simulation as rated by E3 and E4 were 16.57 ± 2.85 and 16.78 ± 2.21, respectively. The correlation between the scores of E3 and E4 on simulation (r = 0.88, P < 0.0001) was stronger than that between the scores of E1 and E2 on patients (r = 0.63, P < 0.001).

The reliability of the average scores given on patients and on simulation models was also calculated; Cronbach's alpha was greater on simulation models (0.937) than on real patients (0.773).

All students were able to score on steps 3, 4, 6, 7, 8, 10, 11, 12, 15 and 16 of the checklist, while performance on the other steps was variable in both encounters.


  Discussion


The movement towards competency-based education in dentistry challenges traditional testing techniques because careful measurement of students' skills is required to move from one stage of training to another. [13] This new educational approach highlights the need for valid tools to assess students' competencies without compromising the safety of patients or the quality of the service delivered.

Evidence suggests that a large number of patients receive suboptimal care as a result of adverse events and medical errors. [14] In oral surgery, students gain the technical skills of LA injection and tooth extraction through theoretical training involving photographs and videos of different techniques, then practice on simulation models, if available, and finally on humans. [10],[15] Brand et al. [16] stated that only a minority of dental schools around the world use preclinical simulation models for LA and dental extraction, which indicates that students may give their first injection directly on real patients. In light of this practice, a complication-free learning environment requires continuous assessment and feedback so that students achieve specific benchmarks in simulated situations before moving to the real workplace.

This study compared students' performance on simulation models with their performance on real patients. Operating on patients has been considered the gold standard for establishing concurrent validity in many studies. [9]

The study was designed to control the confounding factors that can lead to misinterpretation of assessment results and, therefore, threaten validity. Some of these factors are the reliability of ratings, flawed cases, inappropriate case difficulty, rater bias and flawed checklists. [17]

Downing described five sources of validity evidence: content, relations with other variables, internal structure, response process, and consequences. [18]

Content evidence represents the steps taken to ensure that the assessment content reflects the construct. One of these steps is basing the assessment on prior instruments. [19] Our study fulfilled this criterion by using a pre-validated checklist revised and approved by a panel of experts.

Relations with other variables is another type of evidence, which depends on the statistical correlation between assessment scores from two measures of the same construct. [19] Our study was unable to show any statistical association between students' scores on simulation models and on real patients, which casts doubt on the concurrent validity of using these simulation models in summative assessment.

The lack of correlation in our study could be due to variability in patients' responses towards students and to the degree of fidelity of the simulators used. The study limited the assessment to the relatively straightforward task of a simple closed extraction. Dunkley stated that the conditions under which an assessment occurs affect students' performance and that the simulation setting used must try to recreate all the confounding aspects of everyday reality. [11]

Internal structure evidence represents the reliability measures of reproducibility across items, stations or raters. [19] The study showed scores with high reliability, as Cronbach's alpha was 0.773 on patients and 0.937 on simulations.

The study involved the use of a 20-point checklist as the assessment instrument for students' performance. Checklists have been used in various studies to ensure different types of validity. [20],[21] They are perceived to be more objective and to produce slightly more reliable scores than global rating scales. [22] The main problem documented by all the examiners was the inability to reflect the quality of each step of the students' performance. This was noted by Norcini and Burch as one of the disadvantages of using checklists. [22]

Although four examiners assessed students' performance, the use of the checklist reduced subjectivity to a minimum, which was reflected in high inter-rater reliability.

All the students were able to score on the specific steps in the checklist. Some of these steps were: choice of the correct LA technique, assembly of the syringe, safe handling of sharps, and selection of the correct forceps. It was important to include these steps in the checklist in order to simulate what occurs in the real environment. In 2003, Hodges stated that one of the requirements for examining validity is the contextual fidelity of the test. [23]

Our study was limited to 23 dental students, representing all students enrolled in the 4th year of the college's dental program. For a Pearson correlation coefficient of 0.068 to reach statistical significance, approximately 850 students would have been required, which would not have been practical in our environment. We recommend multi-center studies on larger samples with different levels of experience to examine the construct validity of the oral surgery simulation tool.
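As a rough check on this figure (our own back-of-the-envelope sketch, not the authors' calculation), the snippet below solves for the sample size at which an observed Pearson r of 0.068 would reach two-sided significance at alpha = 0.05; it gives a value on the order of 800-850 students, with the exact number depending on the significance level and power assumed.

```python
# Back-of-the-envelope sample size needed for r = 0.068 to reach P < 0.05,
# using the t-statistic for a Pearson correlation: t = r*sqrt(n-2)/sqrt(1-r^2).
import math

r = 0.068
z_crit = 1.96  # normal approximation to the two-sided 5% critical value

# Setting t = z_crit and solving for n gives n = 2 + (z_crit*sqrt(1-r^2)/r)^2.
n_required = 2 + (z_crit * math.sqrt(1 - r ** 2) / r) ** 2
print(f"Approximate n for r = {r} to reach P < 0.05: {math.ceil(n_required)}")
# Prints roughly 830, the same order of magnitude as the ~850 quoted above.
```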

The use of oral surgery simulation has been shown to enhance cognitive and psychomotor skills in individuals and teams, [15],[16] but for assessment purposes we recommend that its use be kept within the boundaries of formative assessment.


  Conclusion


The real patient remains the gold standard in summative assessment of dental students' psychomotor skills.


  Acknowledgments


The authors would like to thank all the undergraduate students of the College of Dentistry, University of Dammam, who participated in this study. The authors would also like to express their gratitude to Dr. Faiyaz Ahmed Syed, Dr. Adel Ibrahim Abdelhadi and Dr. Imran Farooq for their support and assistance in completing this project.

 
  References

1. Ziv A. Simulators and simulation based medical education. In: Dent JA, Harden RM, editors. A Practical Guide for Medical Teachers. 3rd ed. London: Churchill Livingstone; 2009. p. 218-27.
2. Krummel TM. Surgical simulation and virtual reality: The coming revolution. Ann Surg 1998;228:635-7.
3. Acton RD, Chipman JG, Gilkeson J, Schmitz CC. Synthesis versus imitation: Evaluation of a medical student simulation curriculum via Objective Structured Assessment of Technical Skill. J Surg Educ 2010;67:173-8.
4. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005;27:10-28.
5. Ker J, Bradley P. Simulation in medical education. In: Swanwick T, editor. Understanding Medical Education: Evidence, Theory and Practice. 1st ed. Chichester: John Wiley & Sons; 2010. p. 164-80.
6. Schuwirth LW. Assessing medical competence: Finding the right answers. Clin Teach 2004;1:14-8.
7. Van Nortwick SS, Lendvay TS, Jensen AR, Wright AS, Horvath KD, Kim S. Methodologies for establishing validity in surgical simulation studies. Surgery 2010;147:622-30.
8. Steinberg AD, Bashook PG, Drummond J, Ashrafi S, Zefran M. Assessment of faculty perception of content validity of PerioSim, a haptic-3D virtual reality dental training simulator. J Dent Educ 2007;71:1574-82.
9. Kundhal PS, Grantcharov TP. Psychomotor performance measured in a virtual environment correlates with technical skills in the operating room. Surg Endosc 2009;23:645-9.
10. Kneebone RL. Twelve tips on teaching basic surgical skills using simulation and multimedia. Med Teach 1999;21:571-5.
11. Dunkley MP. Competence assessment using simulation. Minim Invasive Ther Allied Technol 2000;9:341-5.
12. Macluskey M, Hanson C, Kershaw A, Wight AJ, Ogden GR. Development of a structured clinical operative test (SCOT) in the assessment of practical ability in the oral surgery undergraduate curriculum. Br Dent J 2004;196:225-8.
13. Boone WJ, McWhorter AG, Seale NS. Purposeful assessment techniques (PAT) applied to an OSCE-based measurement of competencies in a pediatric dentistry curriculum. J Dent Educ 2001;65:1232-7.
14. Aggarwal R, Darzi A. Simulation to enhance patient safety: Why aren't we there yet? Chest 2011;140:854-8.
15. Marei HF, Al-Jandan BA. Simulation-based local anaesthesia teaching enhances learning outcomes. Eur J Dent Educ 2013;17:e44-8.
16. Brand HS, Kuin D, Baart JA. A survey of local anaesthesia education in European dental schools. Eur J Dent Educ 2008;12:85-8.
17. Downing SM, Haladyna TM. Validity threats: Overcoming interference with proposed interpretations of assessment data. Med Educ 2004;38:327-33.
18. Downing SM. Validity: On meaningful interpretation of assessment data. Med Educ 2003;37:830-7.
19. Cook DA, Zendejas B, Hamstra SJ, Hatala R, Brydges R. What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract 2014;19:233-50.
20. Datta V, Bann S, Beard J, Mandalia M, Darzi A. Comparison of bench test evaluations of surgical skill with live operating performance assessments. J Am Coll Surg 2004;199:603-6.
21. Girzadas DV Jr, Clay L, Caris J, Rzechula K, Harwood R. High fidelity simulation can discriminate between novice and experienced residents when assessing competency in patient care. Med Teach 2007;29:472-6.
22. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007;29:855-71.
23. Hodges B. Validity and the OSCE. Med Teach 2003;25:250-4.




 
