
Biomedical Optics & Medical Imaging

Teledermatologic views at 16-bit color depth yield accurate diagnoses

The use of 16-bit imaging for evaluating dermatologic conditions in remote populations is as reliable as using 32-bit in-hospital imaging.
16 May 2014, SPIE Newsroom. DOI: 10.1117/2.1201405.005474

The Veterans Affairs (VA) Veterans Integrated Service Network 20 (VISN 20) is a group of healthcare facilities serving 135 counties in the Pacific Northwest and Alaska, an area that encompasses nearly one-quarter of the landmass of the United States.1 Its mission is to provide top-quality care to America's veterans. Given the physical expanse of the area served, however, geographic constraints can impose challenging difficulties for rural veterans seeking consultations and diagnoses. The VISN 20 teledermatology project2 using remote imaging for diagnostic purposes was implemented in 2009 to help address this issue by providing co-managed dermatology care for these veterans.

Multiple studies have compared the diagnostic accuracy of face-to-face evaluation with remote imaging (teledermatology), and a majority report slightly better accuracy with the former. However, the American Telemedicine Association has determined that 24-bit images3 are suitable for remote diagnoses, and comparisons of the two approaches have yielded very similar management concordance and outcomes. Further, patient satisfaction is consistently high with teledermatology.4, 5

Digital-image quality depends on multiple factors: the capture device, image size, file format and compression (e.g., JPEG, BMP, TIFF, RAW), monitor size and resolution, and color depth. In our study, which used similar capture devices, image sizes, and monitors, color depth (measured in bits per pixel) was the principal variable. In-hospital views on the VA Computerized Patient Record System are displayed as 32-bit images, whereas the Citrix Access Gateway, the remote-access system used to view images from off-site locations, displays 16-bit images.
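The gap between these depths is easy to quantify. A minimal sketch, assuming the typical channel layouts (5-6-5 for 16-bit RGB, 8-8-8 for 24-bit; 32-bit displays commonly carry the same 8-8-8 color plus an 8-bit alpha channel):

```python
# Count the distinct colors each depth can represent.
# Channel layouts are typical assumptions, not taken from the study hardware.
def representable_colors(bits_per_channel):
    """Number of distinct colors for an (R, G, B) bit allocation."""
    r, g, b = bits_per_channel
    return (2 ** r) * (2 ** g) * (2 ** b)

for name, layout in [("16-bit (RGB 5-6-5)", (5, 6, 5)),
                     ("24-bit (RGB 8-8-8)", (8, 8, 8))]:
    print(f"{name}: {representable_colors(layout):,} colors")
# 16-bit -> 65,536 colors; 24-bit -> 16,777,216 colors
```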

We undertook a quality assurance review comparing diagnostic accuracy between 16-bit and 32-bit color (i.e., remote and local access) in a VISN 20 rural teledermatology project. Beginning in 2009, two of us, along with six board-certified dermatologists serving as consultants, evaluated a retrospective cohort of 618 VISN 20 patients over 18 months. The VA predominantly serves an older, Caucasian male population, both nationally and in VISN 20,6, 7 but because this was a quality improvement/quality assurance project, we did not collect detailed demographic data. We required all patients to have their lesions biopsied as recommended during our consultations, and we subsequently confirmed the diagnoses (821 lesions) with histopathology, the diagnostic gold standard. We required an exact diagnostic match, allowing only a narrow margin of equivalence. For example, we considered hypertrophic actinic keratosis equivalent to actinic keratosis. Conversely, we did not regard severely atypical nevus or hypertrophic actinic keratosis as equivalent, respectively, to melanoma in situ or squamous cell carcinoma in situ.

One physician performed consultations using both 16- and 32-bit imaging, and four used 16-bit imaging only, allowing comparisons both within an individual physician's findings and between physicians. We performed all analyses at the level of the biopsy (n=821). Because measures from biopsies are correlated within patient and within physician, we compared diagnostic accuracy between the 16- and 32-bit groups using a mixed-effects logistic regression model with one fixed effect (image group), a random intercept for the physician reader, and a random intercept for the patient. We adjusted for the physician reader because image groups were not randomized, and accounting for factors related to group assignment is necessary for causal inference on outcomes between image groups.

Of the 821 biopsies, 350 (43%) were performed in conjunction with 16-bit imaging and 471 (57%) with 32-bit imaging (see Table 1). Diagnostic accuracy was slightly higher for 32-bit than for 16-bit imaging (77% versus 74%), and the mixed-effects logistic regression model estimated the odds ratio of a correct diagnosis with 32-bit imaging as 1.21 (p=0.25; 95% confidence interval, 0.88–1.67), a difference that is not statistically significant. The most common diagnosis was basal cell carcinoma at 189 (24%), followed by seborrheic keratosis at 104 (13%) and squamous cell carcinoma at 63 (8%). There were 18 malignant melanomas (2%) and 7 melanomas in situ (1%) identified (see Table 2).
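As a rough cross-check, an unadjusted odds ratio and Wald confidence interval can be recomputed from the group counts. The correct/incorrect splits below are reconstructed from the rounded accuracies (77% of 471, 74% of 350), so they are assumptions rather than the raw data, and the crude estimate will not exactly reproduce the adjusted 1.21 from the mixed-effects model:

```python
import math

# Approximate 2x2 counts reconstructed from the reported rounded
# percentages -- assumptions, not the study's raw data.
a, b = 363, 108   # 32-bit: correct, incorrect (~77% of 471)
c, d = 259, 91    # 16-bit: correct, incorrect (~74% of 350)

or_unadj = (a / b) / (c / d)            # unadjusted odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
lo = math.exp(math.log(or_unadj) - 1.96 * se)
hi = math.exp(math.log(or_unadj) + 1.96 * se)
print(f"OR = {or_unadj:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
# OR = 1.18, with a CI spanning 1 -- consistent with no significant difference
```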

Table 1. Diagnostic accuracy as a function of image quality. n: Number. OR: Odds ratio. CI: Confidence interval.

Table 2. Most common diagnoses.
Diagnosis n %
Basal cell carcinoma 189 24%
Seborrheic keratosis 104 13%
Squamous cell carcinoma 63 8%
Actinic keratosis 61 8%
Squamous cell carcinoma in situ 35 4%
Epidermoid cyst 24 3%
Compound nevus 23 3%
Dermatofibroma 19 2%
Dermatitis 18 2%
Melanoma 18 2%
Skin tag 18 2%
Hemangioma 16 2%
Benign lichenoid keratosis 13 2%
Lentigo 13 2%
Sebaceous hyperplasia 12 2%
Dysplastic nevus 10 1%
Melanoma in situ 7 1%

The difference between 16- and 32-bit imaging is visible only at very high magnification or in a gradient of a single color (see Figure 1), where the 16-bit image shows obvious color banding: the right side of the image steps sharply through the gradient rather than blending smoothly. Even comparing close-up views of 8-bit (256 colors) and 24-bit (16.7 million colors) versions of a photo, it is difficult to distinguish the very low- from the very high-quality image (see Figure 2). When comparing 16- and 32-bit clinical images of a malignant melanoma from this study, we found them virtually indistinguishable (see Figure 3). At 400% magnification, the 16-bit image shows subtle color banding and a granular quality, but we are confident that these would not impede the clinician's view. Thus our results, showing an overall diagnostic accuracy of 76% (on the high end compared with other published data4), revealed no significant difference in diagnostic accuracy between 16- and 32-bit imaging and confirmed that using 16-bit imaging does not compromise quality of care.
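The banding in a single-color gradient can be reproduced numerically. A sketch, assuming simple truncation to the 5 bits a typical 16-bit (5-6-5) display allots per channel: a smooth 256-level ramp collapses into 32 discrete steps, which appear as visible bands.

```python
# Quantize an 8-bit single-channel ramp to 5 bits, as a 16-bit (5-6-5)
# display would, by discarding the three low-order bits.
gradient = list(range(256))                    # smooth 8-bit ramp
quantized = [(v >> 3) << 3 for v in gradient]  # keep only the top 5 bits

print(len(set(gradient)))   # 256 distinct levels in the original
print(len(set(quantized)))  # 32 distinct levels -> visible banding steps
```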


Figure 1. Color banding with 16-bit color occurs in a gradient of a single color.

Figure 2. Illustrating the subtle difference between very low (top) and very high (bottom) color depth versions of a photo.8

Figure 3. 16- versus 32-bit clinical images of a malignant melanoma. Insets are at 400× magnification.

An unavoidable weakness of this study lies in the homogeneity of the veteran population, which limits our ability to generalize the findings. In addition, the requirement for histopathologic correlation that we imposed may also slant the diagnoses toward suspected skin cancers.

To continue to refine teledermatologic imaging, we are expanding the scale of the project with quality assurance initiatives to ensure the highest level of care. All remote sites now have high-quality digital cameras, and we are focusing on continued training for those obtaining the images. Finally, although this project demonstrated equivalent diagnostic accuracy for images of varying color depth, all consults will likely be done with 32-bit images as the system is upgraded.


Jonathan Olson, Jill McKenzie, Gayle E. Reiber
University of Washington
Seattle, WA
Greg Raugi, Leslie Taylor
VA Puget Sound Health Care System
Seattle, WA

References:
1. http://www.census.gov/prod/cen2010/cph-2-1.pdf US 2010 Census of Population and Housing, Tables 1 and 41. Accessed 26 March 2012.
2. L. V. McFarland, G. J. Raugi, L. L. Taylor, G. E. Reiber, Implementation of an education and skills programme in a teledermatology project for rural veterans, J. Telemed. Telecare 18(2), p. 66-71, 2012.
3. E. Krupinski, A. Burdick, H. Pak, J. Bocachica, L. Earles, K. Edison, M. Goldyne, et al., American Telemedicine Association's practice guideline for teledermatology, Telemed. J. E Health, p. 289-302, 2007. doi:10.1089/tmj.2007.0129
4. E. M. Warshaw, Y. J. Hillman, N. L. Greer, E. M. Hagel, R. MacDonald, I. R. Rutks, T. J. Wilt, Teledermatology for diagnosis and management of skin conditions: a systematic review, J. Am. Acad. Dermatol. 64(4), p. 759-772, 2011. doi:10.1016/j.jaad.2010.08.026
5. M. A. Weinstock, F. Q. Nguyen, P. M. Risica, Patient and referring provider satisfaction with teledermatology, J. Am. Acad. Dermatol. 47(1), p. 68-72, 2002.
6. http://www.visn20.med.va.gov/VISN20/docs/VISN202010AnnualReport.pdf US Department of Veterans Affairs 2010 annual report. Accessed 6 April 2012.
7. http://www.va.gov/vetdata/Veteran_Population.asp Veteran population statistics for 2010 as reported on the National Center for Veterans Analysis and Statistics site. Accessed 6 April 2012.
8. http://en.wikipedia.org/wiki/Color_depth The properties of color depth are defined and described. Accessed 26 March 2012.