Received: 19-III-2021

Accepted: 24-IV-2021

Published Online: 23-VI-2021

WALI O., VANKA A., VANKA S., 2022: Faculty Perceptions on Objective Structured Clinical Exam in Dental Education.-ODOVTOS-Int. J. Dental Sc., 24-2 (May-August): 145-156.

Faculty Perceptions on Objective Structured Clinical Exam

in Dental Education

Percepción de los profesores sobre el examen clínico objetivo estructurado en la educación dental

Othman Wali¹; Amit Vanka PhD, MDS, BDS²; Shanthi Vanka MDS, BDS³

1. Vice Dean, Dentistry Program, Ibn Sina National College for Medical Studies, Jeddah, Kingdom of Saudi Arabia. https://orcid.org/0000-0001-6250-4376

2. Faculty of Dentistry, Department of Preventive Dental Sciences, Ibn Sina National College for Medical Studies, Jeddah, Kingdom of Saudi Arabia. https://orcid.org/0000-0002-0890-0471

3. Faculty of Dentistry, Department of Preventive Dental Sciences, Ibn Sina National College for Medical Studies, Jeddah, Kingdom of Saudi Arabia. https://orcid.org/0000-0001-7712-8756

Correspondence to: Dr. Othman Wali - oaakwali@yahoo.com

ABSTRACT: The Objective Structured Clinical Exam (OSCE) uses standardized content and procedures to assess students across multiple domains of learning. This study aimed to assess the knowledge, attitudes, practices and observations of dental faculty on the OSCE. The survey was distributed to dental faculty members in randomly selected government and private institutions in Saudi Arabia. The questionnaire was pre-tested and consisted of 4 categories: general characteristics of respondents; knowledge on the utility of the OSCE in the curriculum and its reliability; attitudes regarding the OSCE on a 5-point Likert scale; and practices and observations on the OSCE, assessed through multiple-choice questions (both single-answer and multiple-answer) and responses on a 5-point Likert scale. The required sample size was determined to be 93, and the survey was sent electronically to 10 institutes. Of the 122 responses received, 101 complete responses from 7 institutions were included. Faculty participation in the OSCE was high, both as evaluators (94%, n=94) and as administrators (61%, n=61). The majority of respondents believed that the OSCE is best suited to competency-based education (62%) and to assessing cognitive skills (73%) and diagnostic interpretation (79%). Respondents felt that the reliability of an OSCE can be increased by standardization of evaluators (77%), and the largest group (42%) considered 6-8 stations the minimum required in an OSCE. Institutional guidelines (49%) coupled with workshops (47%) were the preferred methods of preparation for an OSCE. The majority felt that the OSCE is most suitable for high-stakes exams (mean=3.37) and that it is an indispensable part of dental assessment (mean=3.78). The minimum number of stations reported for adequate reliability was lower than that in the published literature, especially for high-stakes assessments. The logistics required for arranging an OSCE, and the difficulty of obtaining standardized patients, suggest that the OSCE should be used in select situations.

KEYWORDS: OSCE; Faculty; Knowledge; Practices; Reliability.

RESUMEN: El examen clínico objetivo estructurado (ECOE) utiliza contenido y procedimientos estandarizados para evaluar a los estudiantes en múltiples dominios de aprendizaje. Este estudio tiene como objetivo evaluar los conocimientos, las actitudes, las prácticas y las observaciones de los profesores de odontología sobre la ECOE. La encuesta se distribuyó a los miembros de la facultad de odontología en instituciones gubernamentales y privadas seleccionadas al azar en Arabia Saudita. El cuestionario fue probado previamente y constaba de 4 categorías que incluían las características generales de los encuestados, el conocimiento sobre la utilidad de la ECOE en el plan de estudios y su confiabilidad, las actitudes con respecto a la ECOE en una escala Likert de 5 puntos, y las prácticas y observaciones sobre la ECOE mediante preguntas de opción múltiple (tanto de respuesta única como de respuesta múltiple) y respuestas en una escala Likert de 5 puntos. Se determinó el tamaño de la muestra en 93 y la encuesta se envió electrónicamente a 10 institutos. Se consideraron 101 respuestas completas de 7 instituciones, de las 122 recibidas. La participación del profesorado en la ECOE fue alta entre los evaluadores (94%, n=94) y los administradores (61%, n=61). La mayoría de los encuestados (62%) cree que la ECOE es más adecuada para la educación basada en competencias y para evaluar las habilidades cognitivas (73%) y la interpretación diagnóstica (79%). La confiabilidad de la ECOE puede aumentarse mediante la estandarización de los evaluadores (77%), y el grupo más numeroso (42%) cree que 6-8 estaciones son el mínimo requerido en una ECOE. Las directrices institucionales (49%) junto con los talleres (47%) fueron los métodos preferidos de preparación para la ECOE. La mayoría consideró que la ECOE es más adecuada para exámenes de alto riesgo (media=3,37) y que es una parte indispensable de la evaluación dental (media=3,78). Se informó que el número mínimo de estaciones para una confiabilidad adecuada es menor que el reportado en la literatura, especialmente para evaluaciones de alto riesgo. La logística necesaria para organizar una ECOE y la dificultad de conseguir pacientes estandarizados sugieren que la ECOE se debe utilizar en situaciones seleccionadas.

PALABRAS CLAVE: ECOE; Docencia; Conocimiento; Prácticas; Fiabilidad.

INTRODUCTION

Assessment is important to determine whether students are achieving the outcomes designed for their respective courses and programs. Student assessment methods and strategies have witnessed significant changes over the years and remain in a state of continuous flux. As the need grows to include innovative educational strategies and to focus on skill development, particularly in the dental field, methods ranging from a simple pen-and-paper test to more complex assessment strategies such as problem-based learning and standardized patients have evolved (1). While written exams are designed to assess cognitive knowledge, clinical exams may be confined to technical skills, with concerns about examiner variability and patient safety (2). The Objective Structured Clinical Exam (OSCE) is an assessment tool conceptualized by Harden in 1975 (3). From its nascent stage, the OSCE has evolved into a tool capable of assessing students across domains and skill sets.

In the broad sense, the OSCE encompasses a group of examinations that use multiple standardized stations, each of which requires candidates to use their clinical skills to complete one or more problem-solving tasks. The OSCE format often includes physical materials such as radiographs, photographs, models, and order/prescription writing (4). Variants included under the broad umbrella of the OSCE include the Objective Structured Practical Examination (OSPE), the Objective Structured Long Examination Record (OSLER), and the Group Objective Structured Clinical Examination (GOSCE). Though their objectives may vary, these examinations retain all the characteristics of the original OSCE (5).

The implementation of the OSCE and its variants continues to increase in dental schools. National boards have adopted the OSCE for licensure exams, or are seriously contemplating doing so in the near future. Several guidelines have been developed on the basis of which the OSCE may be incorporated into different curricula. The design and implementation steps, however, vary, and standard procedures are often modified to suit prevailing conditions (6). The advent of the COVID-19 pandemic has affected the assessment of clinical competencies involving procedures on patients (7). Dependence on the OSCE as a competency assessment is, in all probability, set to increase under these conditions.

The cornerstone of a successful OSCE is undoubtedly the faculty involved in its development and implementation. Faculty roles may vary from administering the exam and setting questions to evaluating students and providing feedback. Steps taken by faculty, based on their beliefs and knowledge, may play a crucial role in the introduction and conduct of an OSCE. The aim of the present study is to assess the knowledge, attitudes, practices and observations of dental faculty members on the OSCE.

Aim

To assess knowledge, attitudes, practices and observations of dental faculty members on OSCE.

MATERIALS AND METHODS

The study is descriptive and cross-sectional, and was reviewed by the ethical committee of Ibn Sina National College, Jeddah; the survey, conducted by means of a pre-tested questionnaire, was determined to be exempt by the committee. The required sample size was calculated to be 93, with a margin of error of 10% at a 95% confidence interval. A web link to an anonymous web-based survey was created and distributed to 10 randomly selected government and private institutes (out of a total of 26) in Saudi Arabia. A reminder was sent 6 weeks after the initial round. The data gathered were kept confidential.
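
The paper does not state the formula behind this calculation; a minimal sketch, assuming the standard Cochran formula with maximum variability ($p=0.5$), $z=1.96$ for a 95% confidence interval and a margin of error $e=0.10$, gives

$$n = \frac{z^{2}\,p(1-p)}{e^{2}} = \frac{(1.96)^{2}(0.5)(0.5)}{(0.10)^{2}} \approx 96,$$

and a finite-population correction, $n' = n/\left(1+(n-1)/N\right)$, applied to the (unstated) total faculty population $N$ would reduce this figure toward the reported 93.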

The survey consisted of 4 categories. Part 1 gathered information on the general characteristics of respondents and their experience in participating in or conducting an OSCE. Part 2 (6 questions) addressed knowledge on the utility of the OSCE in the curriculum and its reliability. Part 3 (5 questions) evaluated attitudes regarding the OSCE on a 5-point Likert scale (strongly agree to strongly disagree), and Part 4 (8 questions) assessed practices and observations on the OSCE through multiple-choice questions (both single-answer and multiple-answer) and responses on a 5-point Likert scale. The first draft of the questionnaire was pilot-tested with ten faculty members (who were excluded from the final sample). Modifications were made to the content and wording based on their suggestions to enhance content validity. Response to the survey was considered consent from the respondents. The responses were collected electronically.

Statistical analysis: The responses are represented as numbers, percentages, means and standard deviations. Statistical analysis was performed using IBM SPSS version 22. The chi-square test was applied to selected questions from the questionnaire; p<0.05 was considered statistically significant.
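
For reference, the statistic applied here is, under the usual SPSS defaults (an assumption, since the paper does not detail it), the Pearson chi-square for an $r \times c$ contingency table of responses:

$$\chi^{2} = \sum_{i=1}^{r}\sum_{j=1}^{c}\frac{(O_{ij}-E_{ij})^{2}}{E_{ij}}, \qquad E_{ij} = \frac{(\text{row } i \text{ total})\,(\text{column } j \text{ total})}{n},$$

where $O_{ij}$ and $E_{ij}$ are the observed and expected counts in cell $(i,j)$.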

RESULTS

A total of 101 complete responses (out of 122 received) from 7 dental institutions were included in the study. Only questionnaires completed in all aspects were included in the final sample for analysis, and responses from an institution were included only if they numbered more than 10. There were more male respondents than female, and the largest group of respondents was aged 36-40 years (32%). Table 1 shows that a Master's degree was the most common qualification (71%), assistant professor the most common designation (38%) and 6-10 years the most common length of experience (32%). The responses were evenly distributed between government (48%) and private (52%) institutes. Most faculty had participated in 1-4 OSCEs (37%), but the largest group (39%) had never administered an OSCE. Table 2 shows the faculty responses regarding knowledge, practices and observations (6 questions) with multiple options. Tables 3A and 3B detail faculty practices and observations on the OSCE, evaluated using multiple-choice questions and a 5-point Likert scale and expressed as numbers and percentages of responses. Faculty attitudes towards the OSCE, evaluated on a 5-point Likert scale, are detailed in Table 4. Table 5 shows the associations between selected responses: OSCE experience was significantly associated with the perceived utility of the OSCE in dental assessment and with reliability testing being conducted after every exam.

DISCUSSION

The current study collected responses from private and government dental colleges in the Kingdom. The results may thus be considered fairly representative of the opinions of a diverse population. With more than 60% of the faculty having 6-15 years of experience, their opinions may be considered to represent a contemporary thought process regarding the OSCE. Only 6 respondents (approx. 6%) had never participated in an OSCE, and a significant number (60%) had served as OSCE administrators. Together, these data suggest that the OSCE is an assessment tool utilized frequently in dental education, with robust faculty participation.

Responses indicate that faculty believed the OSCE to be a flexible tool, capable of assessing students in diverse curriculum models, whether objective-, outcome- or competency-based. Moreover, the highest percentage of responses indicated suitability for a competency-based curriculum. A comparison with a written exam showed that the OSCE resulted in increased proficiency in the tested clinical competence (8). Indeed, the OSCE has been deemed "a valuable mechanism to assess curriculum, not only for the content but also its effectiveness" (9). A well-designed OSCE is believed to be a precursor to the development of a competency-based curriculum, since it plays an important role in the evaluation process (10). Responses from faculty appear to corroborate the findings of previous studies.

The national qualifications framework in Saudi Arabia had previously designated 5 domains of learning, namely: knowledge; cognitive skills; interpersonal skills; IT and communication; and psychomotor skills (11). Responses indicated that while all domains can be assessed by the OSCE, the highest responses were for the knowledge and cognitive domains. The OSCE is regarded as a versatile tool capable of assessing across domains. Nevertheless, it is important to appreciate that some domains are better assessed by methods other than the OSCE. Application of knowledge in a theoretical context is best measured by MCQs (12). Similarly, the "Does" level of Miller's pyramid (13) is better assessed by work-based assessments such as the Mini-CEX or DOPS, as opposed to the OSCE, which primarily assesses the "Shows how" level in a simulated environment (14).

We probed further to evaluate the specific skills or activities that may be assessed by the OSCE. The highest number of respondents believed that diagnostic interpretation is best assessed by the OSCE, while clinical/preclinical skills received the lowest responses. These findings are supported by previous studies, in which the authors concluded that while "OSCEs are a valuable and versatile method of assessment in clinical disciplines, it is apparent that they are best suited to the assessment of diagnostic, interpretation and treatment planning scenarios and have limitations in the assessment of clinical operative procedures" (15,16).

Several studies have been conducted on the reliability of the OSCE and have found it to be acceptable but not ideal (17). Amongst the measures employed to increase the reliability of an OSCE, most respondents agreed that standardization of evaluators and standardization of patients were essential. For example, one study reported that part-time faculty awarded higher scores than full-time faculty, emphasizing the need to calibrate evaluators (18). Indeed, the maximum drop in reliability score was accounted for by variation in student performance from station to station, probably reflecting the content and the evaluator in equal measure (19). Another major determinant of reliability is the test duration, including the time for each station and the number of stations (20). Poor reliability due to content specificity can be overcome by increasing the number of cases being tested (21) to achieve a Cronbach's alpha or generalisability value of 0.7 to 0.8 (22). The majority of faculty believed that the number of stations needed in an OSCE ranges from 6 to 10. Coupled with a large proportion of respondents agreeing that reliability testing is conducted after the OSCE, this number of stations would appear satisfactory, except for high-stakes examinations, where 14 to 20 stations have been recommended to achieve acceptable reliability (10).
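
For context, when each station is scored as a test item (a common convention, not one specified in the paper), Cronbach's alpha for a k-station OSCE is

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_{i}}}{\sigma^{2}_{X}}\right),$$

where $\sigma^{2}_{Y_{i}}$ is the variance of scores at station $i$ and $\sigma^{2}_{X}$ is the variance of candidates' total scores; adding stations of comparable content is the main lever for pushing $\alpha$ toward the 0.7-0.8 target.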

The overall attitude regarding the OSCE is positive, and the majority agree that it is an indispensable part of dental assessment. A large proportion of respondents agreed that the OSCE is the most suitable format for high-stakes exams such as licensure exams. The use of live patients for licensing examinations has been extensively debated (23), and arguments and resolutions for banning the practice have been made (24). Additionally, positive correlations have been reported between the written and OSCE components of a licensing exam (25) and in the transition between the preclinical and clinical parts of the curriculum (26). The literature on the OSCE, and its adoption by national bodies such as the National Dental Examining Board of Canada, appears to endorse the faculty's opinion that the OSCE should be incorporated more vigorously into high-stakes exams. It thus appears natural that faculty wanted multiple disciplines and domains to be assessed within a single exam, probably to emulate the high-stakes format.

An institute, besides ensuring achievement of the learning outcomes, bears responsibility for preparing students for licensure exams. Faculty overall expressed confidence that their students are well prepared for an OSCE, owing to adequate skills and knowledge and exposure to the OSCE format. A formative OSCE seems to be favoured by faculty, particularly by evaluators, as supported by previous research (27), in which case the OSCE can serve as a teaching tool as well (28,29).

While implementing an OSCE, faculty generally followed the guidelines laid down by their institution, supported by workshops. Faculty preparation is essential for calibrating assessment, providing feedback and planning logistics, and the resources developed by an institute can significantly affect faculty performance and, in turn, the OSCE itself. Student feedback, reliability scores and OSCE exam scores should be used by institutions to design faculty development programs.

Faculty-reported barriers to implementing an OSCE scored highest on the difficulty of obtaining standardized patients and on the logistical difficulty of the task. Additionally, OSCE administrators believed that the OSCE causes more stress among students, and no significant association was found between the number of OSCEs conducted and the suggestion that a formative OSCE may prepare students for a summative assessment. Together, these findings suggest that the OSCE must be used judiciously in the curriculum and should not be considered a panacea for all assessment-related problems.

CONCLUSION

Faculty participation in the OSCE is extensive, and faculty are well versed in its various aspects.

Faculty believe that the OSCE is best suited to competency-based education and to the assessment of cognitive skills and diagnostic interpretation; that its reliability is increased by standardization of evaluators; that it is most suitable for high-stakes exams; and that it is an indispensable part of dental assessment.

REFERENCES

  1. Turner J.L., Dankoski M.E. Objective structured clinical exams: a critical review. Fam Med. 2008 Sep 1; 40 (8): 574-8.
  2. Zayyan M. Objective structured clinical examination: the assessment of choice. Oman Med J. 2011; 26 (4): 219-22.
  3. Harden R.M. What is an OSCE? Med Teach. 1988; 10 (1): 19-22.
  4. National Dental Examining Board of Canada. OSCE Examination. Available at: https://ndeb-bned.ca/en/accredited/osce-examination
  5. Shumway J.M., Harden R.M. Association for Medical Education in Europe (AMEE) Education Guide No 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003; 25 (6): 569-84.
  6. Vanka A., Wali O., Akondi B.R., Vanka S., Ravindran S. OSCE-A new assessment method for pharmaceutical education. Indian J Pharm Educ. 2018; 52 (4): S1-6.
  7. Boursicot K., Kemp S., Ong T.H., Wijaya L., Goh S.H., Freeman K., Curran I. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish. 2020; 9.
  8. Schoonheim-Klein M., Walmsley A.D., Habets L.L., Van Der Velden U., Manogue M. An implementation strategy for introducing an OSCE into a dental school. Eur J Dent Educ. 2005; 9 (4): 143-9.
  9. Zartman R.R., McWhorter A.G., Seale N.S., Boone W.J. Using OSCE-based evaluation: curricular impact over time. J Dent Educ. 2002; 66 (12): 1323-30.
  10. Carraccio C., Englander R. The objective structured clinical examination: a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med. 2000; 154 (7): 736-41.
  11. National Qualifications Framework for Higher Education in the Kingdom of Saudi Arabia. National Commission for Academic Accreditation & Assessment. Available at https://www.mu.edu.sa/sites/default/files/National%20Qualifications%20Framework%20for%20HE%20in%20KSA.pdf
  12. Khan K.Z., Ramachandran S., Gaunt K., Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: an historical and theoretical perspective. Med Teach. 2013; 35 (9): e1437-46.
  13. Miller G.E. The assessment of clinical skills/competence/performance. Acad Med 1990; 65: S63-S67.
  14. Baharin S. Objective structured clinical examination (OSCE) in operative dentistry course: its implementation and improvement. Procedia Soc Behav Sci. 2012; 60: 259-65.
  15. Mossey P. Scope of the OSCE in the assessment of clinical skills in dentistry. Br Dent J. 2001; 190: 323-6.
  16. Turner J.L., Dankoski M.E. Objective structured clinical exams: a critical review. Fam Med. 2008; 40 (8): 574-8.
  17. Park S.E., Kim A., Kristiansen J., Karimbux N.Y. The influence of examiner type on dental students' OSCE scores. J Dent Educ. 2015; 79 (1): 89-94.
  18. Boulet J.R., McKinley D.W., Whelan G.P., Hambleton R.K. Quality assurance methods for performance-based assessments. Adv Health Sci Educ Theory Pract 2003; 8 (1): 27-47.
  19. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004; 38: 199-203.
  20. Roberts C., Newble D., Jolly B., Reed M., Hampton K. Assuring the quality of high-stakes undergraduate assessments of clinical competence. Med Teach 2006; 28: 535-543.
  21. Khan K.Z., Gaunt K., Ramachandran S., Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration. Med Teach. 2013; 35 (9): e1447-63.
  22. Gibbs A., Christ A. The ethics of using a live patient for dental board exams. Available at: https://scholarscompass.vcu.edu/denh_student/17/
  23. Formicola A.J., Shub J.L., Murphy F.J. Banning live patients as test subjects on licensing examinations. J Dent Educ. 2002; 66 (5): 605-9.
  24. Gerrow J.D., Murphy H.J., Boyd M.A., Scott D.A. Concurrent validity of written and OSCE components of the Canadian dental certification examinations. J Dent Educ. 2003; 67 (8): 896-901.
  25. Graham R., Bitzer L.A., Anderson O.R. Reliability and predictive validity of a comprehensive preclinical OSCE in dental education. J Dent Educ. 2013; 77 (2): 161-7.
  26. Lele S.M. A mini-OSCE for formative assessment of diagnostic and radiographic skills at a dental college in India. J Dent Educ. 2011; 75 (12): 1583-9.
  27. Brazeau C., Boyd L., Crosson J. Changing an existing OSCE to a teaching tool: the making of a teaching OSCE. Acad Med. 2002; 77 (9): 932.
  28. van der Vleuten C.P.M., Swanson D.B. Assessment of clinical skills with standardized patients: state of the art. Teach Learn Med. 1990; 2 (2): 58-76.
  29. Graham R., Bitzer L.A., Mensah F.M., Anderson O.R. Dental student perceptions of the educational value of a comprehensive, multidisciplinary OSCE. J Dent Educ. 2014; 78 (5): 694-702.