ORIGINAL ARTICLE
Year : 2014  |  Volume : 80  |  Issue : 2  |  Page : 134-136

Validation of a computer based objective structured clinical examination in the assessment of undergraduate dermatology courses


1 Department of Dermatology, King Faisal University, Hofuf, Saudi Arabia
2 Department of Family and Community Medicine, King Faisal University, Hofuf, Saudi Arabia
3 Department of Ophthalmology, King Faisal University, Hofuf, Saudi Arabia

Date of Web Publication: 26-Mar-2014

Correspondence Address:
Feroze Kaliyadan
Department of Dermatology, College of Medicine, King Faisal University
Saudi Arabia

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/0378-6323.129386


  Abstract 

Many teaching centers have now adopted the objective structured clinical examination (OSCE) as an assessment method for undergraduate dermatology courses. A modification of the standard OSCE in dermatology is the computer-based or electronic OSCE (eOSCE). We attempted to validate the use of a computer-based OSCE in dermatology in a group of fifth-year medical students. The scores of the students on the computer-based OSCE showed a strong positive correlation with the scores on the clinical presentation (Pearson's coefficient 0.923, P < 0.001, significant at the 0.01 level) and a good correlation with the overall scores of the students (Pearson's coefficient 0.728, P < 0.001, significant at the 0.01 level), indicating that this is a reliable method for assessment in dermatology. Overall, the students' feedback regarding the method was positive.


Keywords: Computer based OSCE, dermatology assessment, reliability, validity


How to cite this article:
Kaliyadan F, Khan AS, Kuruvilla J, Feroze K. Validation of a computer based objective structured clinical examination in the assessment of undergraduate dermatology courses. Indian J Dermatol Venereol Leprol 2014;80:134-6




Introduction


Assessment of undergraduate dermatology courses in most institutes continues to rely on the examination and presentation of clinical 'short cases' and 'long cases'. Unfortunately, these methods suffer from relatively low reliability because of factors such as a small sampling frame and lack of objectivity. The objective structured clinical examination (OSCE) would be a good solution to this problem, but it requires a larger faculty base and logistical support. [1] The main objectives of most undergraduate dermatology courses are the description of skin lesions and the diagnosis and treatment of common skin diseases. Computer-based images could be a good substitute for actual patients in achieving these objectives, and the objectivity of such an examination can be increased by conducting it in the form of a computerized OSCE. [2],[3] We have been conducting such a computer-based OSCE for the last two years in our institution. The questions are prepared by the faculty and have been modified after piloting on previous batches of students. In a batch of 129 fifth-year medical students, we aimed to correlate the scores obtained on a computer-based dermatology OSCE with the scores on the clinical case presentation and with the final scores, which included the written assessment and professional behavior assessment. We also evaluated the students' feedback regarding this mode of assessment. Overall, this study aimed to validate OSCEs as an assessment method in dermatology and to assess the relative ease of administering them in the form of a computer-based OSCE.


Background


Dermatology training is a three-week compulsory rotation conducted in the fifth year of the undergraduate course in our institution. The dermatology course has three credit hours and the final score is incorporated into the students' grade point average (GPA). Traditional assessment in dermatology consisted of a written examination in the form of application-based MCQs (40%), clinical case presentations (40%) and formative assessment (including professional behavior assessment with a log book) (20%). For the last two years, we have modified this assessment pattern: while retaining the percentages of the written examination and the formative assessment, the clinical component has been split to incorporate a computer-based OSCE in dermatology along with the case presentation.
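For illustration, the composite course score under this weighting can be expressed as a simple calculation. The sketch below is in Python; the equal split of the 40% clinical component between the computer-based OSCE and the case presentation is an assumption made only for this example, as the exact split is not stated above.

# Minimal sketch of the course-score weighting, assuming an equal split
# of the clinical component between OSCE and case presentation.
def course_score(written_pct, osce_pct, case_pct, formative_pct):
    """Each argument is the student's percentage score on that component."""
    clinical = 0.5 * osce_pct + 0.5 * case_pct   # assumed 50/50 split (illustrative)
    return 0.40 * written_pct + 0.40 * clinical + 0.20 * formative_pct

# Example: 75% written MCQs, 80% OSCE, 70% case presentation, 90% formative.
print(course_score(75, 80, 70, 90))  # 30.0 + 30.0 + 18.0 = 78.0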


Methods


One hundred and twenty-nine medical students in the fifth year of their undergraduate course were assessed in this study using a computer-based OSCE. Each student was shown 16 images of common clinical cases, and each image was followed by four questions. The questions typically covered the description of skin lesions, diagnosis and differential diagnosis, investigations, and treatment. Five minutes were allocated to each slide. The examination was conducted in small groups of not more than 10 students at a time, with two faculty supervisors. The images were displayed on a large screen using Microsoft PowerPoint®. The marks obtained on the OSCE were statistically correlated with the marks obtained in the clinical case presentation and with the total marks, including the written examination. Student feedback was evaluated using a five-point Likert scale questionnaire along with open-ended questions. The Likert scale questionnaire had seven items regarding the OSCE, each rated on a five-point scale (strongly agree, agree, undecided, disagree, strongly disagree). Correlation between scores was assessed using Pearson's correlation coefficient. Consensus analysis of the student questionnaire was based on percentage agreement, together with qualitative analysis of the open-ended questions.
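The correlation analysis described above can be reproduced with standard statistical tooling. The following is a minimal sketch in Python using SciPy; the score arrays are illustrative placeholders, not the study data.

# Minimal sketch of the Pearson correlation analysis between OSCE scores,
# clinical case presentation scores, and overall course scores.
import numpy as np
from scipy import stats

# Hypothetical per-student scores (one entry per student).
osce_scores = np.array([28.0, 31.5, 25.0, 34.0, 30.5])      # computer-based OSCE
case_scores = np.array([27.5, 32.0, 24.5, 35.0, 29.5])      # clinical case presentation
overall_scores = np.array([70.0, 82.5, 64.0, 88.0, 77.0])   # total course score

# Pearson's correlation coefficient and two-tailed P value.
r_case, p_case = stats.pearsonr(osce_scores, case_scores)
r_overall, p_overall = stats.pearsonr(osce_scores, overall_scores)

print(f"OSCE vs case presentation: r = {r_case:.3f}, P = {p_case:.4f}")
print(f"OSCE vs overall score:     r = {r_overall:.3f}, P = {p_overall:.4f}")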


Results


The scores of the students on the computer-based OSCE showed a strong positive correlation with the scores on the clinical presentation (Pearson's coefficient 0.923, P < 0.001, significant at the 0.01 level) and a good correlation with the overall scores (Pearson's coefficient 0.728, P < 0.001, significant at the 0.01 level), indicating that this is a reliable method for dermatology assessment [Table 1].
Table 1: Correlations between OSCE, clinical presentation and overall scores



The students' feedback regarding the method was positive in general. The questionnaire had good reliability (Cronbach's alpha 0.78). Of the total, 81 (62.7%) students agreed or strongly agreed that they would prefer this method to other traditional assessment methods in dermatology. Ninety-nine (76.7%) students agreed or strongly agreed that the topics covered in the OSCE were relevant, and 81 (62.7%) agreed or strongly agreed that the quality of the OSCE slides was good. Disagreement was seen on the time allotted for each slide, with 63 (48.8%) disagreeing or strongly disagreeing that the time per slide was sufficient. For the faculty, the method increased overall reliability because a wider sampling of the dermatology content could be done, and since each slide was attempted simultaneously by all students, the examination did not present any significant logistical difficulties either.
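Cronbach's alpha for a multi-item Likert questionnaire can be computed directly from the item-level responses. The sketch below implements the standard formula in Python; the response matrix is made up for illustration and does not reflect the actual questionnaire data.

# Minimal sketch of the questionnaire reliability check (Cronbach's alpha).
# Rows are students, columns are the 7 Likert items, coded 1-5.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array of shape (n_students, n_items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of each student's total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

responses = np.array([
    [4, 5, 4, 3, 4, 5, 4],
    [3, 4, 3, 2, 3, 4, 3],
    [5, 5, 4, 4, 5, 5, 4],
    [2, 3, 2, 2, 3, 3, 2],
    [4, 4, 5, 3, 4, 4, 5],
])

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")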


Discussion


Grover et al. [2] have previously documented the use of a computer-assisted OSCE (CA-OSCE) for undergraduate dermatology assessment, and their study showed good acceptance by the students. That study highlighted the importance of considering such a method where there is a shortage of faculty in dermatology teaching institutes; however, it did not document how well the OSCE correlates with the overall performance of the student. [2] While the OSCE is definitely more objective than other, more traditional methods such as case presentations, equal importance should be given to other aspects related to reliability and validity. One of the most important factors that increases reliability is increasing the number of slides. Ensuring a wider sampling covering all the 'must-know' topics in undergraduate dermatology would also be essential to ensure that the assessment has good psychometric properties. [4] One of the disadvantages mentioned in the previous study by Grover et al. is the inability of the CA-OSCE to test the psychomotor and affective domains. We suggest overcoming this disadvantage by adding at least one traditional station for each of these domains at the end: for example, demonstration of making a potassium hydroxide smear (psychomotor domain) and counseling a simulated patient acting as the parent of a child with an infantile hemangioma on the face (affective domain). The supervising faculty themselves can act as the simulated patient. Moreover, the conduct of a regular computer-assisted OSCE requires stringent quality controls to be effective, including constant review of the stations based on peer and student feedback and effective standard-setting methods to ensure that the final assessment score is fair and reliable.


Conclusion


Computer-assisted OSCE in dermatology is an effective, simple and reliable alternative to traditional methods of undergraduate dermatology assessment. It shows a good correlation with the overall academic performance of the student. Further longitudinal studies are required to evaluate how best to incorporate psychomotor and affective domain assessment into this format and to set the optimum time allotment for each slide. A hybrid OSCE combining a computer-based model with traditional stations might be an effective method for the assessment of undergraduate dermatology courses.

 
References

1. Kaliyadan F. Undergraduate dermatology teaching in India: Need for change. Indian J Dermatol Venereol Leprol 2010;76:455-7.
2. Grover C, Bhattacharya SN, Pandhi D, Singal A, Kumar P. Computer assisted objective structured clinical examination: A useful tool for dermatology undergraduate assessment. Indian J Dermatol Venereol Leprol 2012;78:519.
3. Kaliyadan F, Manoj J, Dharmaratnam AD, Sreekanth G. Self-learning digital modules in dermatology: A pilot study. J Eur Acad Dermatol Venereol 2010;24:655-60.
4. Davies E, Burge S. Audit of dermatological content of U.K. undergraduate curricula. Br J Dermatol 2009;160:999-1005.



 
 



 
