Improving MR image quality with a multi-task model, using convolutional losses
Umeå universitet, Medicinska fakulteten, Institutionen för strålningsvetenskaper.
Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, Netherlands.
Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för datavetenskap. ORCID iD: 0000-0001-7119-7646
Umeå universitet, Medicinska fakulteten, Institutionen för strålningsvetenskaper. ORCID iD: 0000-0002-0532-232X
2023 (English). In: BMC Medical Imaging, E-ISSN 1471-2342, Vol. 23, no. 1, article id 148. Journal article (peer-reviewed). Published.
Abstract [en]

PURPOSE: During the acquisition of MRI data, patient-, sequence-, or hardware-related factors can introduce artefacts that degrade image quality. Four of the most significant tasks for improving MR image quality have been bias field correction, super-resolution, motion correction, and noise correction. Machine learning has achieved outstanding results on these tasks individually, yet multi-task methods are rarely explored.

METHODS: In this study, we developed a model to simultaneously correct all four of the aforementioned artefacts using multi-task learning. Two datasets were collected, one consisting of brain scans and the other of pelvic scans, which were used to train separate models with their corresponding artefact augmentations. Additionally, we explored a novel loss function that aims to reconstruct not only the individual pixel values but also the image gradients, to produce sharper, more realistic results. The differences between the evaluated methods were tested for significance using a Friedman test of equivalence followed by a Nemenyi post-hoc test.
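The gradient-reconstruction idea behind such a loss can be sketched with finite differences. This is a minimal illustration, not the paper's exact formulation: the function names, the finite-difference stand-in for the convolutional gradient operators, and the `alpha` weighting are all assumptions.

```python
import numpy as np

def image_gradients(img):
    """Finite-difference gradients along rows and columns.
    A simple stand-in for convolutional gradient kernels (hypothetical)."""
    gy = np.diff(img, axis=0)  # vertical gradient
    gx = np.diff(img, axis=1)  # horizontal gradient
    return gy, gx

def convolutional_loss(pred, target, alpha=0.5):
    """Pixel-wise MSE plus a gradient-reconstruction MSE term.
    The relative weight alpha is an illustrative choice."""
    mse = np.mean((pred - target) ** 2)
    gy_p, gx_p = image_gradients(pred)
    gy_t, gx_t = image_gradients(target)
    grad_mse = np.mean((gy_p - gy_t) ** 2) + np.mean((gx_p - gx_t) ** 2)
    return mse + alpha * grad_mse
```

Note that a constant intensity shift leaves the gradient term at zero, so the gradient term specifically penalizes blurring of edges rather than global intensity errors.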

RESULTS: Our proposed model generally outperformed other commonly used correction methods for individual artefacts, consistently achieving equal or superior results in at least one of the evaluation metrics. For images with multiple simultaneous artefacts, we show that the performance of a combination of models, each trained to correct an individual artefact, depends heavily on the order in which they are applied. This is not an issue for our proposed multi-task model. The model trained with our novel convolutional loss function always outperformed the model trained with a mean squared error loss when evaluated using Visual Information Fidelity, a quality metric connected to perceptual quality.
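The paired significance testing mentioned above (a Friedman test across methods evaluated on the same images) can be sketched with SciPy. The scores below are synthetic stand-ins for per-image quality metrics such as VIF, not the paper's data:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Synthetic per-image quality scores for three hypothetical methods,
# all measured on the same 20 images -- the paired design the
# Friedman test assumes.
rng = np.random.default_rng(0)
base = rng.uniform(0.4, 0.6, size=20)
scores_mse = base                                     # baseline model
scores_conv = base + 0.05                             # consistently higher
scores_other = base + rng.normal(0.0, 0.01, size=20)  # near-identical rival

stat, p = friedmanchisquare(scores_mse, scores_conv, scores_other)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4g}")
```

A significant Friedman result only says that at least one method differs; a Nemenyi post-hoc comparison (e.g. `posthoc_nemenyi_friedman` from the scikit-posthocs package) would then identify which pairs differ. That step is omitted here.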

CONCLUSION: We trained two models for multi-task MRI artefact correction of brain and pelvic scans. We used a novel loss function that significantly improves the image quality of the outputs compared with mean squared error. The approach performs well on real-world data, and it provides insight into which artefacts it detects and corrects. Our proposed model and source code were made publicly available.

Place, publisher, year, edition, pages
BioMed Central (BMC), 2023. Vol. 23, no. 1, article id 148
Keywords [en]
Image artefact correction, Machine learning, Magnetic resonance imaging
National subject category
Radiology and Image Processing
Identifiers
URN: urn:nbn:se:umu:diva-215277
DOI: 10.1186/s12880-023-01109-z
PubMed ID: 37784039
Scopus ID: 2-s2.0-85173046817
OAI: oai:DiVA.org:umu-215277
DiVA id: diva2:1805543
Research funder
Cancerforskningsfonden i Norrland, LP 18-2182
Cancerforskningsfonden i Norrland, AMP 18-912
Cancerforskningsfonden i Norrland, AMP 20-1014
Cancerforskningsfonden i Norrland, LP 22-2319
Region Västerbotten
Available from: 2023-10-17 Created: 2023-10-17 Last updated: 2024-07-04 Bibliographically reviewed
Part of thesis
1. Contributions to deep learning for imaging in radiotherapy
2023 (English). Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Bidrag till djupinlärning för bildbehandling inom strålbehandling
Abstract [en]

Purpose: The increasing importance of medical imaging in cancer treatment, combined with the growing popularity of deep learning, motivates the presented contributions: deep learning solutions with applications in medical imaging.

Relevance: The projects aim to improve the efficiency of MRI for automated tasks related to radiotherapy, building on recent advancements in the field of deep learning.

Approach: Our implementations build on recently developed deep learning methodologies, while introducing novel approaches in the main aspects of deep learning: physics-informed augmentations, network architectures, and implicit loss functions. To make future comparisons easier, we often evaluated our methods on public datasets, and made all solutions publicly available.

Results: The results of the collected projects include the development of robust models for MRI bias field correction, artefact removal, contrast transfer and sCT generation. Furthermore, the projects stress the importance of reproducibility in deep learning research and offer guidelines for creating transparent and usable code repositories.

Conclusions: Our results collectively strengthen the position of deep learning in the field of medical imaging. The projects offer solutions that are both novel and highly applicable, while emphasizing generalization to a wide variety of data and transparency of results.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2023. 100 pp.
Series
Umeå University medical dissertations, ISSN 0346-6612 ; 2264
Keywords
deep learning, medical imaging, radiotherapy, artefact correction, bias field correction, contrast transfer, synthetic CT, reproducibility
National subject category
Radiology and Image Processing
Identifiers
URN: urn:nbn:se:umu:diva-215693
ISBN: 9789180701945
ISBN: 9789180701952
Public defence
2024-01-26, E04, Norrlands universitetssjukhus, Umeå, 13:00 (English)
Opponent
Supervisors
Available from: 2023-11-08 Created: 2023-10-25 Last updated: 2024-07-02 Bibliographically reviewed

Open Access in DiVA

fulltext (2628 kB), 94 downloads
File information
File name: FULLTEXT01.pdf
File size: 2628 kB
Checksum: SHA-512
1b626aba266938c6e6e45a09b1c28bca51ab45bcad98c81ad52b53135e9697eee707919a5b2fdcdfe92b2a4fe8c262597c1bb505a23988fec1bac30ac3bf6cbe
Type: fulltext
MIME type: application/pdf

Other links

Publisher's full text · PubMed · Scopus

Person

Simkó, Attila; Löfstedt, Tommy; Garpebring, Anders; Nyholm, Tufve; Bylund, Mikael; Jonsson, Joakim

Search further in DiVA

By the author/editor
Simkó, Attila; Löfstedt, Tommy; Garpebring, Anders; Nyholm, Tufve; Bylund, Mikael; Jonsson, Joakim
By the organisation
Institutionen för strålningsvetenskaper; Institutionen för datavetenskap
In the same journal
BMC Medical Imaging
Radiology and Image Processing

Search further outside DiVA

Google · Google Scholar
Total: 94 downloads
The number of downloads is the sum of downloads for all full texts. It may include, for example, earlier versions that are no longer available.

Total: 365 hits