|Year : 2004 | Volume : 36 | Issue : 6 | Page : 386-388
A comparative study of two evaluation techniques in pharmacology practicals: Conventional practical examination versus objective structured practical examination
Vandana Roy, U Tekur, S Prabhu
Department of Pharmacology, Maulana Azad Medical College & Associated Hospitals, New Delhi - 110 002, India
|How to cite this article:|
Roy V, Tekur U, Prabhu S. A comparative study of two evaluation techniques in pharmacology practicals: Conventional practical examination versus objective structured practical examination.Indian J Pharmacol 2004;36:386-388
|How to cite this URL:|
Roy V, Tekur U, Prabhu S. A comparative study of two evaluation techniques in pharmacology practicals: Conventional practical examination versus objective structured practical examination. Indian J Pharmacol [serial online] 2004 [cited 2022 Aug 10 ];36:386-388
Available from: https://www.ijp-online.com/text.asp?2004/36/6/386/13516
Evaluation is a systematic process of finding out the extent to which educational objectives have been achieved by the students. Any evaluation process must therefore be directly related to the educational objectives. The present system of assessing students' performance in pharmacology practicals is not relevant, as many of the skills assessed are not required for training a basic medical doctor.
Practical examination is an important component of evaluation in the medical curriculum. However, evaluation of students is not easy if the criteria of objectivity, uniformity, validity, reliability and practicability are to be met. At present, practical exercises in pharmacology in most medical colleges in India are conducted and evaluated in the conventional way: a student is given an experiment to perform, a viva is conducted after completion of the practical exercise, and the candidate is evaluated subsequently.
Objective Structured Practical Examination (OSPE) has been suggested as an alternative instrument for the assessment of laboratory exercises. In this, the evaluation is structured in a way that all the educational objectives of an exercise can be assessed. Also, in OSPE, questions and skills relevant for medical undergraduate training can be framed and tested keeping the educational objectives in mind. The advantages of OSPE include objectivity and uniformity in the questions and marking of students.
The objective of this study was to compare the evaluation of students' performance in pharmacology practicals by conventional practical evaluation versus OSPE. The students' opinion of the evaluation methods was also assessed. Medical undergraduates in the 3rd, 4th and 5th semesters were included in the study. All students underwent both the conventional practical evaluation and the OSPE, in the same order.
There were three types of OSPE stations: response stations, consisting of short questions (including graphs and diagrams requiring explanation); procedural stations, where the candidate performed an assigned task while an examiner observed using a structured checklist; and stations for evaluating communication skills. The OSPE exercises were designed as follows: sets of eight stations each were made for the seven individual pharmacy preparations that students normally make (carminative mixture, calamine lotion, Mandl's throat paint, benzyl benzoate emulsion, turpentine liniment, ORS powder, Whitfield's ointment). The stations were ordered so that a student was assessed step by step in all the relevant aspects of theory, dispensing of the preparation, and communicating, both in writing and verbally, how a patient must take the preparation.
Similarly, sets of 14 stations each were made for the three pharmacodynamic exercises, i.e., effects of drugs on frog rectus, frog heart and rabbit ileum. Designing the OSPE stations in this way enabled us to assess a student in greater depth in all the relevant aspects of an exercise than is usually possible with conventional evaluation. Since we were comparing marks obtained by conventional evaluation and OSPE, it was essential that the exercise format be the same in both.
The OSPE exercises were pre-tested. Marks were distributed for cognitive, psychomotor and communication skills. The conventional practical evaluation was conducted wherein groups of students were evaluated by different examiners. An example of task stations along with the checklist for the examiner and mark allotment is given in Appendix 1. [Table:1]
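The checklist-based marking described above can be sketched in code. The following is a minimal illustration only: the item descriptions and mark values are hypothetical and not taken from the study's actual checklists (given in Appendix 1); it merely shows how an examiner's ticked checklist translates into a station score, with marks awarded per completed step rather than at the examiner's discretion.

```python
# Illustrative sketch of checklist-based OSPE station scoring.
# Item descriptions and mark values below are hypothetical examples.

from dataclasses import dataclass


@dataclass
class ChecklistItem:
    description: str
    marks: float  # marks awarded only if the step is performed correctly


def score_station(checklist, performed):
    """Sum marks for the checklist steps the examiner ticked as correct."""
    return sum(item.marks for item, ok in zip(checklist, performed) if ok)


# Example: a procedural station for dispensing a preparation.
station = [
    ChecklistItem("Selects correct ingredients", 1.0),
    ChecklistItem("Measures quantities accurately", 1.0),
    ChecklistItem("Labels the container correctly", 0.5),
    ChecklistItem("Explains dosage instructions to the patient", 1.5),
]

marks = score_station(station, [True, True, False, True])
print(marks)  # 3.5
```

Because every mark is tied to a specific checklist item, two examiners observing the same performance arrive at the same score, which is the source of the objectivity and uniformity claimed for OSPE.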
The results of the 162 students who participated in both evaluations are presented [Figure:1]. As is evident, scoring in the OSPE was significantly better than in the conventional system. The average score was significantly higher with OSPE (33.1 vs. 28.8), with 28% of students scoring more than 75% marks, whereas only 4% did so with the conventional evaluation. Students rarely score more than 70-75% in the conventional system, as marks are given subjectively at the teacher's discretion and teachers may set their own limits for evaluation. In OSPE, on the other hand, students may score full marks if all answers are correct, or no marks if all answers are wrong.
Two students scored zero marks despite being present for the evaluation, which is seldom seen in the conventional system of evaluation. The marking in OSPE is on the basis of specific answer checklists and marks are not given for the student's mere presence. Thus the marks were more widely distributed in OSPE, with the students getting marks at either extreme (0-44).
Of the students, 63.8% felt that OSPE was the better method of evaluation, the main reasons given being greater objectivity and more uniform evaluation; 27.1% did not respond to this question. Only 2.6% felt that OSPE could not adequately test either knowledge or practical skills.
The questions for the OSPE stations were framed keeping in mind the objectives of each exercise, after detailed discussion, and specific answer checklists were made for all the questions; hence the instrument is felt to be more objective and valid. OSPE required more planning, organization, pre-testing and manpower for assessment. Totaling marks in OSPE is also more time-consuming, as marks are specified item by item in the checklists. This may prove cumbersome if large batches of students have to be assessed. In contrast, it is relatively easy to conduct practical examinations in the conventional system.
The advantages of OSPE as an evaluation tool, however, outweigh its disadvantages. It provides objective assessment of cognitive, psychomotor and communication skills, with proportional distribution of marks and greater student satisfaction. The limitations of the present study were that inter-station reliability was not tested, and that a greater amount of lecture content had been covered by the time students were evaluated by OSPE.
The study shows that it is possible to evaluate a large number of students in an objective and uniform manner. OSPE should be introduced as an evaluation tool for undergraduate pharmacology practicals. However, we suggest that OSPE should not replace the existing system entirely but should complement it.
|1||Guilbert JJ. Educational Handbook for Health Personnel. 6th ed. Geneva: World Health Organization; 1987.|
|2||Batmanabane G, Raveendran R, Shashindran CH. Objective structured practical examination in pharmacology for medical laboratory technicians. Indian J Physiol Pharmacol 1999;43:242-6.|
|3||Nayar U, Malik SL, Bijlani RL. Objective structured practical examination: A new concept in assessment of laboratory exercises in pre-clinical sciences. Med Educ 1986;20:204-9.|
|4||Cohen R, Reznick RK, Taylor BR, Provan J, Rothman A. Reliability and validity of the objective structured clinical examination in assessing surgical residents. Am J Surg 1990;160:302-5.|