Research Article

High Resolution Automatic Panoramic Imaging by Maintaining Optimal Range for Cytopathological Analysis

Year 2018, Volume: 11 Issue: 2, 1 - 12, 15.11.2018

Abstract

Because the microscope has a narrow field of view, pathologists can see only a limited portion of the specimen during cytopathological assessment. To examine the entire specimen, they perform a three-dimensional scan by moving the microscope stage along the X, Y, and Z axes. This study aims to automate the cytopathological assessment process and to obtain a high-resolution panoramic image of the specimen with a wide field of view. Previous work on automating the panoramic stitching process does not take into account the microscope's depth of focus, measured in microns, when acquiring overlapping images. Focus differences therefore arise between overlapping images, and blurred images are captured during scanning. To solve this problem, this study extends the depth of focus to produce optimally focused overlapping images. To demonstrate the effectiveness of the proposed method, panoramic images were generated using two different scanning processes proposed in the literature. The resulting panoramic images were compared using no-reference image quality metrics, and the success of the proposed method is demonstrated with both quantitative and visual results.
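The core idea above, fusing a Z-stack of differently focused images into one optimally focused tile before stitching, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the Laplacian-energy sharpness measure, the window size, and the function names are choices made here for the sketch (the cited extended depth-of-focus literature, e.g. [4]-[6], surveys more sophisticated fusion schemes).

```python
import numpy as np

def box_sum(img, r=2):
    """Local sum over a (2r+1)x(2r+1) window built from shifted copies (wraps at edges)."""
    out = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out

def sharpness(img):
    """Absolute 4-neighbour Laplacian response: large where fine detail is in focus."""
    lap = (4.0 * img
           - np.roll(img, 1, 0) - np.roll(img, -1, 0)
           - np.roll(img, 1, 1) - np.roll(img, -1, 1))
    return np.abs(lap)

def fuse_focus_stack(stack, r=2):
    """For each pixel, keep the Z-slice whose local Laplacian energy is highest."""
    stack = np.asarray(stack, dtype=float)            # shape (Z, H, W)
    energy = np.stack([box_sum(sharpness(s), r) for s in stack])
    best = np.argmax(energy, axis=0)                  # per-pixel index of sharpest slice
    return np.take_along_axis(stack, best[None], axis=0)[0]
```

In the real system each slice of the stack would come from stepping the stage along Z at a fixed X-Y position; fusing before stitching keeps every overlapping tile sharp, so focus differences between tiles no longer degrade the panorama.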

References

  • [1] Schneider T. E., Bell A. A., Meyer-Ebrecht D., Böcking A., Aach T., “Computer aided cytological cancer diagnosis: cell type classification as a step towards fully automatic cancer diagnostics on cytopathological specimens of serous effusions”, Medical Imaging, International Society for Optics and Photonics, vol. 6514, pp. 6514-6524, 2007.
  • [2] Doğan H., Ekinci M., “Automatic panorama with auto-focusing based on image fusion for microscopic imaging system”, Signal, Image and Video Processing, vol. 8, pp. 5-20, 2014.
  • [3] Born M., Wolf E., “Principles of Optics (7th Ed)”, Cambridge University Press, 1999.
  • [4] Goldsmith N.T., “Deep focus; a digital image processing technique to produce improved focal depth in light microscopy”, Image Analysis – Stereology, vol. 19, pp. 163-167, 2011.
  • [5] Piccinini F., Tesei A., Zoli W., Bevilacqua A., “Extended depth of focus in optical microscopy: Assessment of existing methods and a new proposal”, Microscopy Research and Technique, vol. 75, pp. 1582-1592, 2012.
  • [6] Forster B., Van De Ville D., Berent J., Sage D., Unser M., “Complex wavelets for extended depth-of-field: A new method for the fusion of multichannel microscopy images”, Microscopy Research and Technique, vol. 65, pp. 33-42, 2004.
  • [7] Ma B., Zimmermann T., Rohde M., Winkelbach S., He F., Lindenmaier W., Dittmar K. E., “Use of autostitch for automatic stitching of microscope images”, Micron, vol. 38, pp. 492-499, 2007.
  • [8] Yang F., Deng Z. S., Fan Q. H., “A method for fast automated microscope image stitching”, Micron, vol. 48, pp. 17-25, 2013.
  • [9] Appleton B., Bradley A. P., Wildermoth M., “Towards Optimal Image Stitching for Virtual Microscopy”, in Digital Image Computing: Techniques and Applications (DICTA'05), Queensland, Australia, 2005, pp. 44-44.
  • [10] Sun C., Beare R., Hilsenstein V., Jackway P., “Mosaicing of microscope images with global geometric and radiometric corrections”, Journal of Microscopy, vol. 224, pp. 158-165, 2006.
  • [11] Loewke K. E., Camarillo D. B., Piyawattanametha W., Mandella M. J., Contag C. H., Thrun S., Salisbury J. K., “In vivo micro-image mosaicing”, IEEE Transactions on Biomedical Engineering, vol. 58, pp. 159-171, 2011.
  • [12] Wu Y., Fang Y., Liu X., Ren X., Guo J., Yuan X., “Millimeter scale global visual field construction for atomic force microscopy based on automatic image stitching”, In Manipulation, Automation and Robotics at Small Scales (MARSS), 2017, pp. 1-5.
  • [13] Hsu W. Y., Poon W. F., Sun Y. N., “Automatic seamless mosaicing of microscopic images: enhancing appearance with colour degradation compensation and wavelet‐based blending”, Journal of Microscopy, vol. 231, pp. 408-418, 2008.
  • [14] Thévenaz P., Unser M., “User‐friendly semiautomated assembly of accurate image mosaics in microscopy”, Microscopy Research and Technique, vol. 70, pp. 135-146, 2007.
  • [15] Legesse F. B., Chernavskaia O., Heuke S., Bocklitz T., Meyer T., Popp J., Heintzmann R., “Seamless stitching of tile scan microscope images”, Journal of Microscopy, vol. 258, pp. 223-232, 2015.
  • [16] Han S., Yang J., Wan H., “An automated wide-view imaging system of pathological tissue under optical microscopy”, in Biomedical Image and Signal Processing (ICBISP), 2017, pp. 1-6.
  • [17] Forster B., Van De Ville D., Berent J., Sage D., Unser M., “Extended Depth-of-Focus for Multichannel Microscopy Images: A Complex Wavelet Approach”, in International Symposium on Biomedical Imaging: Nano to Macro, 2004, pp. 660-663.
  • [18] Choi H., Cheng S., Wu Q., Castleman K. R., Bovik A. C., “Extended depth-of-field using adjacent plane deblurring and MPP wavelet fusion for microscope images”, in 3rd IEEE International Symposium on Biomedical Imaging: Nano to Macro, 2006, pp. 774-777.
  • [19] Tessens L., Ledda A., Pizurica A., Philips W., “Extending the Depth of Field in Microscopy Through Curvelet-Based Frequency-Adaptive Image Fusion”, in International Conference on Acoustics, Speech and Signal Processing - ICASSP ’07, 2007, pp. 861-864.
  • [20] Doğan H., Baykal E., Ekinci M., Ercin M. E., Ersöz Ş., “Optimal focusing with extended depth of focus in microscopic systems”, in 25th Signal Processing and Communications Applications Conference (SIU), Antalya, 2017, pp. 1-4.
  • [21] Li S., Kang X., Fang L., Hu J., Yin H., “Pixel-level image fusion: A survey of the state of the art”, Information Fusion, vol. 33, pp. 100-112, 2017.
  • [22] Sahu A., Bhateja V., Krishn A., Himanshi, “Medical image fusion with laplacian pyramids”, In 2014 International Conference on Medical Imaging, m-Health and Emerging Communication Systems (MedCom), 2014, pp. 448-453.
  • [23] Petrovic V.S., Xydeas C.S., “Gradient-based multiresolution image fusion”, IEEE Transactions on Image Processing, vol. 13, pp. 228-237, 2004.
  • [24] Denipote J.G., Paiva M.S.V., “A fourier transform-based approach to fusion high spatial resolution remote sensing images”, In 2008 Sixth Indian Conference on Computer Vision, Graphics Image Processing, 2008, pp. 179-186.
  • [25] Naidu V., “Discrete cosine transform-based image fusion”, Defence Science Journal, vol. 60, pp. 48-54, 2010.
  • [26] Pajares G., de la Cruz J.M., “A wavelet-based image fusion tutorial”, Pattern Recognition, vol. 37, pp. 1855-1872, 2004.
  • [27] Chai Y., Li H., Zhang X., “Multifocus image fusion based on features contrast of multiscale products in nonsubsampled contourlet transform domain”, Optik – International Journal for Light and Electron Optics, vol. 123, pp. 569-581, 2012.
  • [28] Nejati M., Samavi S., Shirani S., “Multi-focus image fusion using dictionary-based sparse representation”, Information Fusion, vol. 25, pp. 72-84, 2015.
  • [29] Xia X., Yao Y., Liang J., Fang S., Yang Z., Cui D., “Evaluation of focus measures for the autofocus of line scan cameras”, Optik - International Journal for Light and Electron Optics, vol. 127, pp. 7762-7775, 2016.
  • [30] Krotov E.P., “Active computer vision by cooperative focus and stereo”, New York: Springer-Verlag, 1989.
  • [31] Kailath T., “The Divergence and Bhattacharyya Distance Measures in Signal Selection”, IEEE Transactions on Communication Technology, Vol. 15, pp. 52-60, 1967.
  • [32] Lowe D.G., “Object recognition from local scale-invariant features”, in International Conference on Computer Vision, 1999, pp. 1150-1157.
  • [33] Fischler M. A., Bolles R. C., “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography”, Comm. of the ACM, vol. 24, pp. 381-395, 1981.
  • [34] Doğan H., Baykal E., Ekinci M., Ercin M. E., Ersöz Ş., “Determination of optimum auto focusing function for cytopathological assessment processes”, in 2017 Medical Technologies National Congress (TIPTEKNO), Trabzon, Turkey, 2017, pp. 1-4.
  • [35] Crete-Roffet F., Dolmiere T., Ladret P., Nicolas M., “The Blur Effect: Perception and Estimation with a New No-Reference Perceptual Blur Metric”, in SPIE Electronic Imaging Symposium, Conf. Human Vision and Electronic Imaging, 2007, pp. 6492-16.



Details

Primary Language Turkish
Subjects Engineering
Journal Section Articles (Research)
Authors

Hülya Doğan

Elif Baykal

Murat Ekinci

Mustafa Emre Ercin

Şafak Ersöz

Publication Date November 15, 2018
Published in Issue Year 2018 Volume: 11 Issue: 2

Cite

APA Doğan, H., Baykal, E., Ekinci, M., Ercin, M. E., & Ersöz, Ş. (2018). Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme. Türkiye Bilişim Vakfı Bilgisayar Bilimleri Ve Mühendisliği Dergisi, 11(2), 1-12.
AMA Doğan H, Baykal E, Ekinci M, Ercin ME, Ersöz Ş. Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme. TBV-BBMD. November 2018;11(2):1-12.
Chicago Doğan, Hülya, Elif Baykal, Murat Ekinci, Mustafa Emre Ercin, and Şafak Ersöz. “Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme”. Türkiye Bilişim Vakfı Bilgisayar Bilimleri Ve Mühendisliği Dergisi 11, no. 2 (November 2018): 1-12.
EndNote Doğan H, Baykal E, Ekinci M, Ercin ME, Ersöz Ş (November 1, 2018) Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme. Türkiye Bilişim Vakfı Bilgisayar Bilimleri ve Mühendisliği Dergisi 11 2 1–12.
IEEE H. Doğan, E. Baykal, M. Ekinci, M. E. Ercin, and Ş. Ersöz, “Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme”, TBV-BBMD, vol. 11, no. 2, pp. 1–12, 2018.
ISNAD Doğan, Hülya et al. “Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme”. Türkiye Bilişim Vakfı Bilgisayar Bilimleri ve Mühendisliği Dergisi 11/2 (November 2018), 1-12.
JAMA Doğan H, Baykal E, Ekinci M, Ercin ME, Ersöz Ş. Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme. TBV-BBMD. 2018;11:1–12.
MLA Doğan, Hülya et al. “Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme”. Türkiye Bilişim Vakfı Bilgisayar Bilimleri Ve Mühendisliği Dergisi, vol. 11, no. 2, 2018, pp. 1-12.
Vancouver Doğan H, Baykal E, Ekinci M, Ercin ME, Ersöz Ş. Sitopatolojik Değerlendirme Süreçleri için Optimum Aralığın Korunmasıyla Yüksek Çözünürlüklü Otomatik Panoramik Görüntüleme. TBV-BBMD. 2018;11(2):1-12.

Article Acceptance


The acceptance process of the articles sent to the journal consists of the following stages:

1. Each submitted article is sent to at least two referees at the first stage.

2. Referee assignments are made by the journal editors. The journal's referee pool contains approximately 200 referees, classified by their areas of interest, and each referee is sent articles on topics matching their expertise. Referees are selected so as to avoid any conflict of interest.

3. The authors' names are removed from the articles sent to the referees.

4. Referees are given instructions on how to evaluate an article and are asked to fill in an evaluation form.

5. Articles that receive positive opinions from both referees undergo a similarity check by the editors; the similarity score is expected to be below 25%.

6. A paper that has passed all stages is reviewed by the editor for language and presentation, and necessary corrections and improvements are made; the authors are notified when needed.


This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.