• Media type: E-Article; Text
  • Title: Semantic segmentation of non-linear multimodal images for disease grading of inflammatory bowel disease: A segnet-based application
  • Contributor: Pradhan, Pranita [Author]; Meyer, Tobias [Author]; Vieth, Michael [Author]; Stallmach, Andreas [Author]; Waldner, Maximilian [Author]; Schmitt, Michael [Author]; Popp, Juergen [Author]; Bocklitz, Thomas [Author]
  • Published: [Sétubal] : SCITEPRESS - Science and Technology Publications Lda., 2019
  • Issue: Published version
  • Language: English
  • DOI: https://doi.org/10.34657/9257; https://doi.org/10.5220/0007314003960405
  • ISBN: 978-989-758-351-3
  • Keywords: Conference Proceedings ; Semantic Segmentation ; Non-linear Multimodal Imaging ; Inflammatory Bowel Disease
  • Footnote: This data source also contains holdings records that do not lead to a full text.
  • Description: Non-linear multimodal imaging, the combination of coherent anti-Stokes Raman scattering (CARS), two-photon excited fluorescence (TPEF) and second harmonic generation (SHG), has shown its potential to assist the diagnosis of different inflammatory bowel diseases (IBDs). This label-free imaging technique can support the ‘gold-standard’ techniques such as colonoscopy and histopathology to ensure an IBD diagnosis in a clinical environment. Moreover, non-linear multimodal imaging can measure biomolecular changes in different tissue regions such as the crypt and mucosa regions, which serve as predictive markers for IBD severity. To achieve a real-time assessment of IBD severity, an automatic segmentation of the crypt and mucosa regions is needed. In this paper, we semantically segment the crypt and mucosa regions using a deep neural network. We utilized the SegNet architecture (Badrinarayanan et al., 2015) and compared its results with a classical machine learning approach. Our trained SegNet model achieved an overall F1 score of 0.75. This model outperformed the classical machine learning approach for the segmentation of the crypt and mucosa regions in our study.
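    Note: The record does not include the authors' code. The sketch below is a minimal, illustrative PyTorch rendering of a SegNet-style encoder-decoder as described by Badrinarayanan et al. (2015), not the paper's implementation; the layer counts, the three input channels (assumed to be the CARS, TPEF and SHG modalities) and the number of output classes are all assumptions made for illustration. SegNet's distinctive feature, preserved here, is that the decoder upsamples with the max-pooling indices stored by the encoder rather than with learned upsampling.

        # Minimal SegNet-style sketch (assumed architecture details, see note above).
        import torch
        import torch.nn as nn

        class MiniSegNet(nn.Module):
            def __init__(self, in_channels=3, num_classes=3):
                super().__init__()
                # Encoder block 1: conv -> batch norm -> ReLU, then pooling
                # that returns the indices needed later for unpooling.
                self.enc1 = nn.Sequential(
                    nn.Conv2d(in_channels, 64, 3, padding=1),
                    nn.BatchNorm2d(64), nn.ReLU(inplace=True))
                self.pool1 = nn.MaxPool2d(2, 2, return_indices=True)
                # Encoder block 2.
                self.enc2 = nn.Sequential(
                    nn.Conv2d(64, 128, 3, padding=1),
                    nn.BatchNorm2d(128), nn.ReLU(inplace=True))
                self.pool2 = nn.MaxPool2d(2, 2, return_indices=True)
                # Decoder: unpool with the stored encoder indices (SegNet's
                # hallmark), then convolve to refine the upsampled maps.
                self.unpool2 = nn.MaxUnpool2d(2, 2)
                self.dec2 = nn.Sequential(
                    nn.Conv2d(128, 64, 3, padding=1),
                    nn.BatchNorm2d(64), nn.ReLU(inplace=True))
                self.unpool1 = nn.MaxUnpool2d(2, 2)
                self.dec1 = nn.Conv2d(64, num_classes, 3, padding=1)

            def forward(self, x):
                x = self.enc1(x)
                x, idx1 = self.pool1(x)
                x = self.enc2(x)
                x, idx2 = self.pool2(x)
                x = self.unpool2(x, idx2)
                x = self.dec2(x)
                x = self.unpool1(x, idx1)
                return self.dec1(x)  # per-pixel class logits

        # Usage: a batch of 256x256 three-channel multimodal images yields
        # per-pixel logits over the (assumed) classes.
        logits = MiniSegNet()(torch.randn(1, 3, 256, 256))
        print(logits.shape)  # torch.Size([1, 3, 256, 256])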
  • Access State: Open Access
  • Rights information: Attribution - Non Commercial - No Derivs (CC BY-NC-ND)