Improving the modeling of medical imaging data for simulation
Villard, P; Littler, P; Gough, V; Vidal, F; Hughes, CJ; John, N; Luboz, V; Bello, F; Song, Y; Holbrey, R; Bulpitt, A; Mullan, D; Chalmers, N; Kessel, D; Gould, D
Abstract
PURPOSE: To use patient imaging as the basis for developing virtual environments (VE).
BACKGROUND: Basic interventional radiology skills are still taught through an apprenticeship on patients, though they could be learnt in high-fidelity simulations using VE. Ideally, imaging data sets for simulation of image-guided procedures would alter dynamically in response to deformation forces such as respiration and needle insertion. We describe a methodology for deriving such dynamic volume renderings from patient imaging data.
METHODS: With patient consent, selected routine imaging (computed tomography, magnetic resonance, ultrasound) of straightforward and complex anatomy and pathology was anonymised and uploaded to a repository at Bangor University. Computer scientists used interactive segmentation processes to label target anatomy for creation of a surface (triangular) and volume (tetrahedral) mesh. Computer modelling techniques used a mass-spring algorithm to model tissue deformations caused by needle insertion and intrinsic motion (e.g. respiration). These methods, in conjunction with a haptic device, provide output forces in real time to mimic the "feel" of a procedure. Feedback from trainees and practitioners was obtained during preliminary demonstrations.
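The mass-spring deformation mentioned in the methods can be illustrated with a minimal sketch. This assumes edges of the tetrahedral mesh act as Hookean springs integrated with explicit Euler steps; the class and parameter names (MassSpringMesh, stiffness, damping, dt) are hypothetical and not taken from the authors' software.

```python
# Minimal mass-spring sketch: mesh edges are treated as springs, nodes as point
# masses. Illustrative only; not the simulator described in the abstract.
import numpy as np

class MassSpringMesh:
    def __init__(self, positions, edges, masses, stiffness=500.0, damping=0.5):
        self.x = np.asarray(positions, dtype=float)   # node positions, shape (N, 3)
        self.v = np.zeros_like(self.x)                # node velocities, shape (N, 3)
        self.edges = list(edges)                      # (i, j) index pairs from the mesh
        self.m = np.asarray(masses, dtype=float)      # per-node masses, shape (N,)
        self.k = stiffness                            # spring stiffness (assumed uniform)
        self.c = damping                              # simple velocity damping
        self.rest = [np.linalg.norm(self.x[i] - self.x[j]) for i, j in self.edges]

    def step(self, dt, external=None):
        """Advance one explicit Euler step. 'external' is an optional (N, 3) array
        of applied forces, e.g. from a needle tip or a respiration model."""
        f = np.zeros_like(self.x)
        if external is not None:
            f += external
        for (i, j), rest_len in zip(self.edges, self.rest):
            d = self.x[j] - self.x[i]
            length = np.linalg.norm(d)
            if length < 1e-9:
                continue
            # Hooke's law along the edge direction
            fs = self.k * (length - rest_len) * (d / length)
            f[i] += fs
            f[j] -= fs
        f -= self.c * self.v
        self.v += dt * f / self.m[:, None]
        self.x += dt * self.v
        # Net nodal forces; at needle-contact nodes these could be negated and
        # rendered on a haptic device to approximate the "feel" of insertion.
        return f
```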
RESULTS: Data sets were derived from 6 patients and converted into deformable VEs. Preliminary content validation studies of a framework developed for training on liver biopsy procedures yielded favourable observations that are leading to further revisions, including implementation of an immersive VE.
CONCLUSION: It is possible to develop dynamic volume renderings from static patient data sets and these are likely to form the basis of future simulations for IR training of procedural interventions.
| Presentation Conference Type | Other |
| --- | --- |
| Conference Name | United Kingdom Radiology Congress (UKRC) |
| Start Date | Jan 1, 2008 |
| Publication Date | Jan 1, 2008 |
| Deposit Date | Feb 18, 2019 |
| Additional Information | Event Type: Conference |