Volume 43 Issue 1, Spring 2016, pp. 13-20

Virtual microscopy using digitized slides has become more widespread in teaching in recent years, but there have been few direct comparisons of virtual microscopy with the use of microscopes and glass slides. Third-year veterinary students from two different schools completed a simple objective test, covering aspects of histology and histopathology, before and after a practical class covering relevant material presented either as glass slides viewed with a microscope or as digital slides. There was an overall improvement in performance by students at both veterinary schools using both practical formats. Neither format was consistently better than the other, and neither school consistently outperformed the other. In a comparison of student appraisals of digital slides and microscopes, the digital technology was identified as having many advantages.

Since the early 1990s, computer-aided learning of different types has become an increasing part of the veterinary medicine curriculum.1 In recent years, as computing power has increased and become less expensive, the use of virtual microscopy, using digital images, has become more realistic. An early report compared teaching histology to medical students with traditional microscopes and glass slides and teaching with digitized slides,2 while implementation of virtual microscopy was described in detail by Dee and Meyerholz in their description of teaching medical pathology.3

There have been several subsequent reports of the successful use of such methods in teaching veterinary students, including first-year histology,4 cytopathology,5 and histopathology.6 Each of these studies reported a generally positive response from students and highlighted various perceived advantages of virtual microscopy over traditional microscopy, including image clarity, accessibility, collaborative learning, and efficient time use.7 In addition to measuring positive student response, a comparison in the context of veterinary cytopathology5 considered the suitability of virtual microscopy for assessment. Details of the examination, graded by an instructor, were not given, but student performance was not significantly different between those using traditional microscopy and those using virtual microscopes.

An introduction to veterinary pathology is provided during the third year of a 5-year program at the veterinary schools of both Bristol University and the University of Nottingham. Teaching of histopathology at Bristol is based around practical classes using microscopes and glass slides; virtual histology is used in some teaching of normal histology in previous years of the program. The University of Nottingham School of Veterinary Medicine and Science provides its students with laptop computers and has adopted virtual histology (Digital SlideBox [DSB], Leica Biosystems) as the main mode of teaching histology and histopathology. In addition to basic navigation facilities, the software allows incorporation of different quiz formats; such an option has been recognized as a feature students appreciate in the context of some other studies of computer-aided learning.7,8

This report describes the results of a comparison of virtual microscopy and traditional microscopy in teaching veterinary pathology. It assesses both objective and subjective parameters and compares the results between two institutions in which the emphasis on virtual or traditional microscopy within the curriculum differs.

Participants

Third-year veterinary students from the Universities of Bristol and Nottingham were invited to take part in a study comparing the use of traditional microscopy and virtual microscopy. Histopathology at Bristol is based around practical classes using microscopes and glass slides; at the University of Nottingham, virtual histology is the main mode of teaching histopathology. The study was carried out on a separate occasion at each site.

Students were at a roughly equivalent stage of their program, having recently had lectures covering similar topics within an introduction to veterinary pathology; aspects of the normal histology of major body systems and organs had been taught in previous years. The rationale behind what they were asked to do was only explained to them at the end of the series of tasks. The relevant Ethical Review Committees approved the exercise at each site.

The Study
Pre-Practical Objective Evaluation

Students were asked to complete a series of eight simple questions covering aspects of histology and pathology relevant to the practical class in which they were about to participate. The questions included recognition in photomicrographs (printed four per side on a single sheet of A4 paper) of normal histological features of lung (tissue recognition [assessment Question 4]) and liver (tissue recognition, portal tract identification [assessment Question 1]); recognition of individual cell types: leucocytes (identification of neutrophils, location within a vessel [assessment Question 2]), identification of plasma cells (assessment Question 3), and alveolar macrophages (assessment Question 5); knowledge of pathological processes (recognition and pathogenesis of necrosis [assessment Question 6]); and degeneration (etiology and pathogenesis of fatty change [assessment Question 8], recognition of bronchiolectasis [assessment Question 4]). Printed images (rather than digitized images or microscope slides) were used as a unified format suitable for participants from both schools.

The students were informed at this time that the test was for the instructor's benefit, rather than a formative assessment. The context of the test in the overall study was not divulged. Some questions had two components; where only one component was correct, a half mark was allocated. Each question was scored 1 for a correct answer, ½ for a partially correct answer, and 0 for an incorrect answer or no answer.
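The scoring rule above can be sketched in code. The helper below is purely illustrative (it is not part of the study materials); it applies the 1 / ½ / 0 marking to per-question gradings and sums them into a test score out of 8:

```python
def score_question(components_correct: int, components_total: int) -> float:
    """Score one question: 1 for fully correct, 0.5 for a partially
    correct answer (e.g., one of two components right), 0 otherwise."""
    if components_total <= 0:
        raise ValueError("a question must have at least one component")
    if components_correct == components_total:
        return 1.0
    if components_correct > 0:
        return 0.5
    return 0.0

# Hypothetical gradings for four questions: (components correct, components total).
# A two-component question with one component right earns a half mark.
gradings = [(1, 1), (0, 1), (1, 2), (2, 2)]
total = sum(score_question(c, t) for c, t in gradings)
print(total)  # 2.5
```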

Practical Class

At each location, students assembled in an open space and were allocated to one of two groups by arbitrarily splitting those present down the middle. One group used microscopes, sharing the same “class sets” of three glass slides; the other group had access to a computer suite to examine identical, digitized slides viewed using DSB.

For the conventional microscopy class an instructor (DF) initially demonstrated features of interest during a brief presentation using a microscope, camera, and data projection system. The instructor was available during the class to assist students and discuss important relevant topics individually.

No introduction was provided for students using the virtual microscope system but the software allowed equivalent histopathological features of interest to be annotated and included some instructional text and multiple-choice questions to prompt students to think of relevant topics. An instructor (PB) was available to deal with any individual student's problems and questions.

Post-Practical Objective Evaluation

Immediately following the practical class, students were asked to complete the same test they had been presented with previously; they had not been told on that first occasion that they would be asked to complete this test a second time. The timescale for the different parts of the exercise was similar at each site. Fifteen minutes was allowed for completion of the pre-practical test; the microscopy or digital histology class lasted 30 minutes, followed immediately by the post-practical test lasting a further 15 minutes.

Post-Practical Qualitative Evaluation

Following completion of the post-practical test, the broader purpose of the exercise was explained to the students and they were asked to complete a paper-based questionnaire relating to their experience of microscopy and digital histology as described below:

Using microscopes and glass slides, how easy is it for you to:

  1. Obtain optimal illumination (focus/center condenser, adjust eyepieces, vary brightness)?

  2. Recognize where you are in the whole section when examining a small area at high power?

  3. Have access to a microscope whenever you want it (i.e., “24/7”)?

  4. Have access to a copy of the glass slide whenever you want it (i.e., “24/7”)?

  5. Explain to a student or teacher features that are of interest or concern?

  6. Recognize the cells and tissue changes that you are meant to identify?

Using virtual microscopy, how easy is it for you to:

  1. Obtain optimal illumination (vary brightness, contrast, etc.)?

  2. Recognize where you are in the whole section when examining a small area at high power?

  3. Have access to a computer whenever you want it (i.e., “24/7”)?

  4. Have access to a copy of a virtual slide whenever you want it (i.e., “24/7”)?

  5. Explain to a student or teacher features that are of interest or concern?

  6. Recognize the cells and tissue changes that you are meant to identify?

For each question, the options from which students could choose were very easy, easy, not too difficult, difficult, and very difficult. Since it was thought unlikely that the primary teaching method at each school was going to change significantly in the short term, participating students were not asked to choose a preference for teaching format.

Students were also invited to add any additional comments regarding the use of DSB as free text.

Statistical Analysis

Data from the content-oriented, pre- and post-practical assessments were analyzed using the Mann–Whitney U test (two-tailed) within PASW for Windows version 18.0 (SPSS Inc., Chicago, USA); p values <.05 were taken to be significant.
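The study ran its analysis in PASW/SPSS, but the underlying test is straightforward to reproduce. The sketch below implements a two-tailed Mann–Whitney U test with the normal approximation in plain Python; the score lists are made-up placeholders, not study data, and the p values are approximate (no tie correction is applied to the variance, whereas SPSS applies one):

```python
import math

def mann_whitney_u(x, y):
    """Two-tailed Mann-Whitney U test using the normal approximation.
    Ties receive average ranks; no tie correction is applied to the
    variance, so p values are approximate."""
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j + 1) / 2  # average of 1-based ranks i+1..j
        for k in range(i, j):
            ranks[combined[k][1]] = avg_rank
        i = j
    n1, n2 = len(x), len(y)
    r1 = sum(ranks[:n1])               # rank sum of the first sample
    u1 = r1 - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)          # smaller of the two U statistics
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma               # z <= 0 by construction
    p = 1 + math.erf(z / math.sqrt(2))  # = 2 * Phi(z), two-sided
    return u, min(p, 1.0)

# Illustrative pre- vs. post-practical scores (placeholders only):
pre = [2.0, 3.0, 2.5, 3.5, 2.0, 4.0]
post = [5.0, 5.5, 4.5, 6.0, 5.0, 4.0]
u, p = mann_whitney_u(pre, post)
print(f"U = {u}, p = {p:.3f}")  # p < .05 would be taken as significant
```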

The total numbers of participants varied between the two veterinary schools (designated A or B). The size of the student year groups was 92 at Nottingham and 100 at Bristol; the numbers of students who chose to participate at each site were 50 and 81, respectively. Data are presented as the relative proportions of answers that were correct, partially correct, and incorrect (or no answer given). As all test sheets were anonymous, it was not possible to compare pre- and post-practical answers for individual students.

Pre-Practical Objective Evaluation

Results for each of the eight questions, and for each veterinary school, in the pre-practical objective assessment are presented in Figure 1.

Figure 1: Pre-practical content-oriented assessment results: percentages of correct, partially correct, and incorrect answers for each of the eight questions, and for veterinary schools A and B, in the pre-practical objective assessment

* indicates significant difference in correct answers for assessment question between the schools

The proportions of correct answers varied between questions. Performance was poorest, for both schools, on assessment Question 4 (identification of the features of normal lung). For two questions (1 and 2), there was a significant difference (p<.001 and p=.005, respectively; * symbol) in the proportion of correct answers provided by students from the two veterinary schools; neither school's students performed consistently better than the other in the pre-practical test.

Combining scores for all questions, from both schools, the mean (±SD) pre-practical test score was 2.9 (±1.2) (the potential maximum score for all questions correctly answered was 8).

Post-Practical Objective Evaluation

Results for each of the eight questions, and for each veterinary school, in the post-practical objective assessment are presented, along with pre-practical test data, in Figure 2 (microscopes and glass slides) and Figure 3 (virtual microscopy).

Figure 2: Content-oriented assessment results following a practical class using microscopes and glass slides: percentages of correct, partially correct, and incorrect answers for each of the eight questions, and for veterinary schools A and B, compared with the pre-practical objective assessment

* indicates significant difference between the schools in improvement between pre- and post-practical assessment questions

† indicates significant difference in correct answers for assessment question between pre- and post-practical tests

Figure 3: Content-oriented assessment results following a virtual microscopy practical class: percentages of correct, partially correct, and incorrect answers for each of the eight questions, and for veterinary schools A and B, compared with the pre-practical objective assessment

* indicates significant difference between the schools in improvement between pre- and post-practical assessment questions

† indicates significant difference in correct answers for assessment question between pre- and post-practical tests

Following the microscope and glass slide practical (Figure 2), there was a significant improvement in performance by students from both universities on several assessment questions (1, 4, 5, and 6 for school A; 1, 2, 4, 5, 6, and 7 for school B; † symbol). There was a significant difference between schools in improvement in performance (B better than A, p=.017) for only one question (identification of leucocytes; * in Figure 2).

Following the DSB practical (Figure 3), there were statistically significant improvements in performance for some questions by students from both schools († symbol). For school A, in addition to the questions improved following the microscope and glass slide practical (assessment Questions 1, 4, 5, and 6), there were improvements for assessment Questions 7 and 8. For school B, improvements for four assessment questions (1, 5, 6, and 7) were similar to those found for the group using microscopes and glass slides; there was also an improvement in performance for assessment Question 8. For three questions (1, 2, and 4), there were differences in improvement in performance between the two schools (* in Figure 3).

There was, in general, an improvement in performance by students at both veterinary schools following the practical class using either format (Figure 4), regardless of the predominant mode of practical teaching used at each school. Combining scores from both schools for all questions, the mean scores (±SD) following microscopy with glass slides or virtual microscopy were 5.5 (±1.3) and 5.2 (±1.5), respectively (the potential maximum score for all questions correctly answered was 8). Both were highly significant improvements (p<.001) over the pre-practical test performance, which was 2.9 (±1.2).

Figure 4: Mean (±SD) total scores for all eight content-oriented questions before the practical class and following two alternative practical formats

Combining the results from both practical methods, performance by students from school A improved for all image questions, while performance by students from school B improved for all image questions except for assessment questions 2 and 7.

For assessment Questions 1 and 4, the improvement over pre-test performance was significantly greater for students from school A than for those from school B, rising from low pre-test scores; for assessment Question 3, the improvement favored school B.

Post-Practical Qualitative Evaluation

The total number of students completing the questionnaire following the objective post-practical test varied between the veterinary schools: 57 students from Nottingham and 88 from Bristol took part in the subjective evaluation exercise; not all students responded to all elements of the questionnaire. The number of responses varied from 31 to 54 for Nottingham students and from 82 to 88 for Bristol students.

The complete range of 120 possible data items from the qualitative evaluation (possible response options: very difficult, difficult, not too difficult, easy, or very easy for each of 12 questions from each school and each practical method) is not shown. Statistical analysis of individual items from the qualitative questionnaire was not carried out, since comparisons at each level of the Likert scale would be prone to Type I statistical error. Students' perception that virtual microscopy had advantages over the use of microscopes and glass slides is illustrated in Figure 5 (data combined from both schools), which shows the proportions of non-neutral responses (i.e., other than not too difficult). With the exception of Question 1 (relating to obtaining optimum illumination), responses relating to the use of virtual microscopy yielded a greater combined proportion of easy and very easy responses; conversely, the use of microscopes and glass slides was associated with a greater proportion of difficult and very difficult responses.
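As a concrete illustration, the non-neutral proportions of the kind plotted in Figure 5 can be derived from raw Likert responses along the following lines. The response list here is a made-up placeholder, not study data:

```python
from collections import Counter

# Five-point scale used in the questionnaire, with "not too difficult"
# treated as the neutral option (hypothetical responses for one item).
responses = ["easy", "very easy", "not too difficult", "difficult",
             "easy", "very easy", "easy", "not too difficult"]

counts = Counter(responses)
total = len(responses)
easy = counts["easy"] + counts["very easy"]            # positive, non-neutral
hard = counts["difficult"] + counts["very difficult"]  # negative, non-neutral
print(f"easy/very easy: {100 * easy / total:.1f}%, "
      f"difficult/very difficult: {100 * hard / total:.1f}%")
```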

Figure 5: Percentages of non-neutral responses (i.e., other than not too difficult) for each of the eight qualitative questionnaire items regarding the use of microscopes and slides or Digital SlideBox

Combined responses (percentage of total responses from both schools) from students from the two schools to all the prepared questionnaire items relating to the use of microscopes and glass slides or virtual microscopy are shown in Figure 6.

Figure 6: Combined responses (percentage of total responses from both schools) to all the prepared qualitative questionnaire items relating to students' perception of the use of microscopes and glass slides or virtual microscopy

Student anonymity at all stages of the exercise prevented comparison of student performance and their perceptions of the different practical class formats. The responses relating to use of DSB indicated a significantly greater ease of use and access over microscopy and glass slides (p<.001 for both schools). Furthermore, significantly more students from school B found the microscope and slides more difficult, and virtual microscopy easier, than students from school A (p<.001 in both cases).

Forty-three students (9 from Nottingham, 34 from Bristol) provided additional comments, which were reviewed by one of the authors (PB). There were no negative comments about practical teaching using either format; several comments highlighted advantages of virtual microscopy along the lines of those included in the questionnaire, as above. No students commented on the conduct of the exercise reported here.

There have been numerous reports of the use of virtual microscopy in teaching various subjects to medical or veterinary students. These reports have generally concentrated on the practicality and student acceptance of the alternative technology rather than the educational benefits of either scheme. In the present study, the advantages of virtual microscopy over traditional microscopy recognized by students from both schools were similar to those listed in a study of teaching histology to first-year veterinary students4; they included the provision of clearer images, the more effective use of time, the flexibility of online learning, and the ability to learn collaboratively. In the questionnaire part of the present study, students accustomed to using microscopes and glass slides (school B) indicated greater recognition of the benefits of virtual microscopy over microscopes and glass slides than students from school A. Contributions to learning from peers were an important factor in the benefits of team-based learning in pathology9; Kumar et al.10 deliberately allocated two students per workstation.

Despite general acceptance of the technology, and recognition of the many advantages of virtual microscopy in teaching veterinary cytopathology, students have expressed a desire to continue using traditional microscopes and glass slides for examination purposes.5 In the present study, recognition of the need to use a microscope was also mentioned by some students in their free-text comments but, as in many other reports, the use of DSB was generally well received. The software used allows integration of instructional text linked to selected image areas and annotated features; Schoenfeld-Tacher et al.6 emphasized the potential “dual coding” advantages that might be provided by the combination of visual and verbal information. Students' attitudes to computer-assisted learning were not determined by their learning preferences or attitudes to computers,11 while personality factors and visual perceptual ability may both be related to histopathology skill.12

In the objective comparison of pre- and post-practical class performance described in the present study, there was no consistent pattern of improvement in performance for either method of practical class teaching or between students from the different schools. While there were some differences between different questions, students acquired a similar base knowledge at the two schools using the different practical teaching formats. The relatively poorer pre-test performance for some questions is likely to reflect students having forgotten information (e.g., some features of normal histology of liver and lung presented in previous years of the program). Only a few other reports have compared the impact on test performance following virtual or traditional microscopy. No difference was found between the performance of students who were asked to “interpret major histopathological findings in the context of the supplied history, physical examination findings, and results of investigations” whether they used traditional microscopy or virtual microscopes.10 Similarly, no significantly different performance was found between those using traditional microscopy or virtual microscopes in veterinary cytopathology.5 In the context of veterinary diagnostic imaging, computer-based teaching has been shown to be as effective as or even superior to didactic lecture teaching or written text methods in terms of student exam results.12 In a study of learning anatomy, students' achievement scores did not differ significantly between computer-based and paper-based strategies but students perceived their assimilation of anatomic information to be better using computer-based strategies.13

While virtual microscopy has been recognized to have both advantages and disadvantages when compared with the use of microscopes and glass slides,9 there is evidence from other studies suggesting benefits of other forms of computer-assisted learning in medical and veterinary medical subjects, including veterinary pathology.14 Despite this, important questions that remain to be addressed were highlighted recently by Schifferdecker et al.15; these include support for staff to adopt and develop material, as well as considerations of the optimal design, use, and integration of computer-assisted learning. There was no marked difference in base knowledge between students who had undertaken practical teaching using either virtual microscopy or microscopes and glass slides, indicating that each can be effective. The infrastructure required to support the different modes of practical teaching (a dedicated laboratory equipped with microscopes, TV projection facilities, and class slide sets, or a computer suite) is likely to be a factor when considering any change in teaching method. Virtual microscopy provides a possible additional use (over and above word processing, email, data manipulation, Internet access, virtual learning, etc.) to justify support for student computer facilities. Within universities, decisions on resources for teaching are made at a high level, within departments, schools, and faculties, based on factors such as physical space and cost, rather than possible student preferences for different teaching styles (such as VARK [Visual, Auditory, Read/Write, Kinesthetic]16 or the Learning Style Inventory [LSI]17).

It is likely that within a year group of students there will be a range of effective learning styles, and using a single approach to teaching is not appropriate.18 In at least some medical students, learning styles may change over time.19 Although there are advocates of tailoring instructional techniques to accommodate student learning styles, a review of published material raises doubts about the value of style-based instruction.20

Performance on a simple test on aspects of veterinary histology and histopathology improved following a practical class in which students from two UK veterinary schools examined glass slides using a conventional microscope or digitized slides using a virtual microscope. Neither format was consistently better than the other. In a comparison of student appraisal of the use of digital slides and microscopes, the digital technology had many advantages.

ACKNOWLEDGMENTS

We would like to thank the participating veterinary students from the Universities of Bristol and Nottingham, who were not informed of the aims of this study until the end of their sessions.

1. Holmes MA, Nicholls PK. Computer-aided veterinary learning at the University of Cambridge. Vet Rec. 1996;138(9):199–203. http://dx.doi.org/10.1136/vr.138.9.199
2. Harris T, Leaven T, Heidger P, et al. Comparison of a virtual microscope laboratory to a regular microscope laboratory for teaching histology. Anat Rec. 2001;265(1):10–4. http://dx.doi.org/10.1002/ar.1036
3. Dee FR, Meyerholz DK. Teaching medical pathology in the twenty-first century: virtual microscopy applications. J Vet Med Educ. 2007;34(4):431–6. http://dx.doi.org/10.3138/jvme.34.4.431
4. Mills PC, Bradley AP, Woodall PF, et al. Teaching histology to first-year veterinary science students using virtual microscopy and traditional microscopy: a comparison of student responses. J Vet Med Educ. 2007;34(2):177–82. http://dx.doi.org/10.3138/jvme.34.2.177
5. Neel JA, Grindem CB, Bristol DG. Introduction and evaluation of virtual microscopy in teaching veterinary cytopathology. J Vet Med Educ. 2007;34(4):437–44. http://dx.doi.org/10.3138/jvme.34.4.437
6. Schoenfeld-Tacher RM, McConnell SL, Schultheiss T. Use of interactive online histopathology modules at different stages of a veterinary program. J Vet Med Educ. 2003;30(4):364–71. http://dx.doi.org/10.3138/jvme.30.4.364
7. Sims H, Mendis-Handagama C, Moore RN. Virtual microscopy in a veterinary curriculum. J Vet Med Educ. 2007;34(4):416–22. http://dx.doi.org/10.3138/jvme.34.4.416
8. Denwood M, Dale VH, Yam P. Development and evaluation of an online computer-aided learning (CAL) package to promote small-animal welfare. J Vet Med Educ. 2008;35(2):318–24. http://dx.doi.org/10.3138/jvme.35.2.318
9. Koles P, Nelson S, Stolfi A, et al. Active learning in a Year 2 pathology curriculum. Med Educ. 2005;39(10):1045–55. http://dx.doi.org/10.1111/j.1365-2929.2005.02248.x
10. Kumar RK, Velan GM, Korell SO, et al. Virtual microscopy for learning and assessment in pathology. J Pathol. 2004;204(5):613–8. http://dx.doi.org/10.1002/path.1658
11. Steele DJ, Johnson Palensky JE, Lynch TG, et al. Learning preferences, computer attitudes, and student evaluation of computerised instruction. Med Educ. 2002;36(3):225–32. http://dx.doi.org/10.1046/j.1365-2923.2002.01141.x
12. Dale VMH, Sullivan M, Irvine DR. Computer-assisted learning as an alternative to didactic lectures: a study of teaching the physics of diagnostic imaging. ALT-J. 1999;7(3):75–86
13. Khalil MK, Johnson TE, Lamar CH. Comparison of computer-based and paper-based imagery strategies in learning anatomy. Clin Anat. 2005;18(6):457–64. http://dx.doi.org/10.1002/ca.20158
14. Brown P. Objective and subjective evaluation of computer-based tutorial teaching in veterinary pathology. Br J Ed Tech. 2001;32(2):245–7. http://dx.doi.org/10.1111/1467-8535.00194
15. Schifferdecker KE, Berman NB, Fall LH, et al. Adoption of computer-assisted learning in medical education: the educators' perspective. Med Educ. 2012;46(11):1063–73. http://dx.doi.org/10.1111/j.1365-2923.2012.04350.x
16. Fleming N. The VARK questionnaire: how do I learn best? [Internet]. Christchurch, New Zealand: VARK Learn Limited; 2007 [cited 2015 Oct 22]. Available from: http://www.vark-learn.com/english/page.asp?p=questionnaire
17. Kolb DA. The Kolb Learning Style Inventory. Boston, MA: Hay Resources Direct; 1999
18. Kharb P, Samanta PP, Jindal M, et al. The learning styles and the preferred teaching–learning strategies of first year medical students. J Clin Diagn Res. 2013;7(6):1089–92. http://dx.doi.org/10.7860/JCDR/2013/5809.3090
19. Gurpinar E, Bati H, Tetik C. Learning styles of medical students change in relation to time. Adv Physiol Educ. 2011;35(3):307–11. http://dx.doi.org/10.1152/advan.00047.2011
20. Rohrer D, Pashler H. Learning styles: where's the evidence? Med Educ. 2012;46(7):634–5. http://dx.doi.org/10.1111/j.1365-2923.2012.04273.x