SwePub
Search the SwePub database


Search: WFRF:(Vitzthum Lucas)

  • Result 1-2 of 2
1.
  • Liu, Lianli, et al. (author)
  • Volumetric MRI with sparse sampling for MR-guided 3D motion tracking via sparse prior-augmented implicit neural representation learning
  • 2024
  • In: Medical physics (Lancaster). - : John Wiley & Sons. - 0094-2405. ; 51:4, s. 2526-2537
  • Journal article (peer-reviewed), abstract:
    • Background: Volumetric reconstruction of magnetic resonance imaging (MRI) from sparse samples is desirable for 3D motion tracking and promises to improve magnetic resonance (MR)-guided radiation treatment precision. Data-driven sparse MRI reconstruction, however, requires large-scale training datasets for prior learning, which are time-consuming and challenging to acquire in clinical settings.

      Purpose: To investigate volumetric reconstruction of MRI from sparse samples of two orthogonal slices, aided by sparse priors from two static 3D MRI scans, through implicit neural representation (NeRP) learning, in support of 3D motion tracking during MR-guided radiotherapy.

      Methods: A multi-layer perceptron network was trained to parameterize the NeRP model of a patient-specific MRI dataset; the network takes 4D coordinates of voxel locations and motion states as inputs and outputs the corresponding voxel intensities. By first training the network to learn the NeRP of two static 3D MRI scans with different breathing motion states, prior information about patient breathing motion was embedded into the network weights through optimization. This prior information was then augmented from two motion states to 31 motion states by querying the optimized network at interpolated and extrapolated motion-state coordinates. Starting from the prior-augmented NeRP model as an initialization point, we further trained the network to fit sparse samples of two orthogonal MRI slices, and the final volumetric reconstruction was obtained by querying the trained network at 3D spatial locations. We evaluated the proposed method using 5-min volumetric MRI time series with 340 ms temporal resolution for seven abdominal patients with hepatocellular carcinoma, acquired using a golden-angle radial MRI sequence and reconstructed through retrospective sorting. Two volumetric MRI scans, with inhale and exhale states respectively, were selected from the first 30 s of the time series for prior embedding and augmentation. The remaining 4.5-min time series was used to evaluate volumetric reconstruction: we retrospectively subsampled each MRI to two orthogonal slices and compared model-reconstructed images to ground truth images in terms of image quality and the capability to support 3D target motion tracking.

      Results: Across the seven patients evaluated, the peak signal-to-noise ratio between model-reconstructed and ground truth MR images was 38.02 ± 2.60 dB, and the structural similarity index measure was 0.98 ± 0.01. Throughout the 4.5-min period, gross tumor volume (GTV) motion estimated by deforming a reference-state MRI to the model-reconstructed and ground truth MRI showed good consistency. The 95th-percentile Hausdorff distance between GTV contours was 2.41 ± 0.77 mm, which is less than the voxel dimension. The mean GTV centroid position difference between ground truth and model estimation was less than 1 mm in all three orthogonal directions.

      Conclusion: A prior-augmented NeRP model has been developed to reconstruct volumetric MRI from sparse samples of orthogonal cine slices. Only one exhale and one inhale 3D MRI scan were needed to train the model to learn prior information about patient breathing motion for sparse image reconstruction. The proposed model has the potential to support 3D motion tracking during MR-guided radiotherapy for improved treatment precision, and promises a major simplification of the workflow by eliminating the need for large-scale training datasets.
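The coordinate-to-intensity mapping at the heart of the NeRP model can be illustrated with a minimal sketch: a small fully connected network that maps a 4D coordinate (voxel location plus motion state) to a voxel intensity. This is only an illustration of the idea, not the authors' implementation; the layer sizes, activation, and initialization are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights for a fully connected network with the given layer sizes."""
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def nerp_forward(params, coords):
    """Map 4D coordinates (x, y, z, motion state) to voxel intensities."""
    h = coords
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)      # ReLU hidden layers
    W, b = params[-1]
    return h @ W + b                        # linear output: one intensity per point

# 4 inputs (x, y, z, motion state) -> 1 intensity; hidden widths are illustrative
params = init_mlp([4, 64, 64, 1])
coords = rng.uniform(-1.0, 1.0, size=(10, 4))   # 10 query coordinates
intensity = nerp_forward(params, coords)
print(intensity.shape)
```

In the paper, such a network is first fitted to two static 3D MRI volumes (embedding the breathing-motion prior in the weights), then queried at interpolated motion-state coordinates to augment the prior, and finally fine-tuned on the two orthogonal cine slices.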
2.
  • Zakeri, Kaveh, et al. (author)
  • Predictive classifier for intensive treatment of head and neck cancer
  • 2020
  • In: Cancer. - : Wiley-Blackwell. - 0008-543X .- 1097-0142. ; 126:24, s. 5263-5273
  • Journal article (peer-reviewed), abstract:
    • Background: This study was designed to test the hypothesis that the effectiveness of intensive treatment for locoregionally advanced head and neck cancer (LAHNC) depends on the proportion of patients' overall event risk attributable to cancer.

      Methods: This study analyzed 22,339 patients with LAHNC treated in 81 randomized trials testing altered fractionation (AFX; Meta-Analysis of Radiotherapy in Squamous Cell Carcinomas of Head and Neck [MARCH] data set) or chemotherapy (Meta-Analysis of Chemotherapy in Head and Neck Cancer [MACH-NC] data set). Generalized competing event regression was applied to the control arms in MARCH, and patients were stratified by tertile according to the omega score, which quantifies the relative hazard of cancer versus competing events. The classifier was externally validated on the MACH-NC data set. The study tested for interactions between the omega score and treatment effects on overall survival (OS).

      Results: Factors associated with a higher omega score were younger age, better performance status, an oral cavity site, higher T and N categories, and p16-negative/unknown status. The effect of AFX on OS was greater in patients with high omega scores (hazard ratio [HR], 0.92; 95% confidence interval [CI], 0.85-0.99) and medium omega scores (HR, 0.91; 95% CI, 0.84-0.98) versus low omega scores (HR, 0.97; 95% CI, 0.90-1.05; P for interaction = .086). The effect of chemotherapy on OS was significantly greater in patients with high omega scores (HR, 0.81; 95% CI, 0.75-0.88) and medium omega scores (HR, 0.86; 95% CI, 0.78-0.93) versus low omega scores (HR, 0.96; 95% CI, 0.86-1.08; P for interaction = .011).

      Conclusions: LAHNC patients with a higher risk of cancer progression relative to competing mortality, as reflected by a higher omega score, selectively benefit from more intensive treatment.
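The tertile stratification described in the Methods can be sketched briefly. The scores below are synthetic stand-ins: in the study, each patient's omega score comes from generalized competing event regression fitted to the MARCH control arms, which is not reproduced here.

```python
import numpy as np

# Hypothetical illustration of splitting patients into low / medium / high
# tertiles by a relative-hazard ("omega") score. Scores are synthetic.
rng = np.random.default_rng(1)
omega = rng.uniform(0.1, 3.0, size=300)    # synthetic per-patient scores

cuts = np.quantile(omega, [1 / 3, 2 / 3])  # tertile boundaries
tertile = np.digitize(omega, cuts)         # 0 = low, 1 = medium, 2 = high

counts = np.bincount(tertile, minlength=3)
print(dict(zip(["low", "medium", "high"], counts.tolist())))
```

Treatment effects (HRs for AFX or chemotherapy) are then estimated separately within each tertile, and the interaction test asks whether those effects differ across tertiles.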
