Dr Christopher Hughes C.J.Hughes@salford.ac.uk
Director of CSE & Strategic Change
A generic approach to high performance visualization enabled augmented reality
Hughes, CJ; John, N; Riding, M
Abstract
Traditionally, registration and tracking within Augmented Reality (AR) applications have been built around limited bold markers, which allow for their orientation to be estimated in real-time. All attempts to implement AR without specific markers have increased the computational requirements, and some information about the environment is still needed. In this paper we describe a method that not only provides a generic platform for AR but also seamlessly deploys High Performance Computing (HPC) resources to deal with the additional computational load, as part of the distributed High Performance Visualization (HPV) pipeline used to render the virtual artifacts. Repeatable feature points are extracted from known views of a real object, and we then match the best stored view to the user's viewpoint, using the matched feature points to estimate the object's pose. We also show how our AR framework can then be used in the real world by presenting a markerless AR interface for Transcranial Magnetic Stimulation (TMS).
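For illustration only, the sketch below follows the general pattern the abstract describes: feature points extracted offline from a known view of the real object are matched against the live camera frame, and the resulting correspondences drive a pose estimate. It is a minimal sketch built on modern OpenCV primitives (ORB features and PnP with RANSAC), which are an assumption for clarity rather than the paper's actual pipeline; names such as ref_descriptors, ref_points_3d and camera_matrix are likewise hypothetical.

```python
# Hypothetical sketch of markerless, feature-based pose estimation.
# Reference features are assumed to have been extracted offline from a known
# view of the object, together with their 3D positions on the object.
import cv2
import numpy as np

def estimate_pose(frame, ref_descriptors, ref_points_3d, camera_matrix, dist_coeffs):
    """ref_descriptors[i] is the ORB descriptor of reference feature i;
    ref_points_3d[i] is that feature's known 3D position on the object."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_frame, des_frame = orb.detectAndCompute(frame, None)
    if des_frame is None:
        return None  # no features found in this frame

    # Match the stored reference descriptors against the live frame.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(ref_descriptors, des_frame),
                     key=lambda m: m.distance)[:50]
    if len(matches) < 6:
        return None  # too few correspondences for a stable pose

    # Build 3D-2D correspondences from the matched feature points.
    obj_pts = np.float32([ref_points_3d[m.queryIdx] for m in matches])
    img_pts = np.float32([kp_frame[m.trainIdx].pt for m in matches])

    # Recover the object's rotation and translation relative to the camera.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts,
                                           camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```

In a deployment like the one the abstract targets, the costly detection and matching stage is the kind of work that would be offloaded to the HPC/HPV back end, with the recovered pose returned to the AR client for rendering the virtual artifacts.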
Citation
Hughes, C., John, N., & Riding, M. (2006). A generic approach to high performance visualization enabled augmented reality. Presented at the Proceedings of the UK e-Science All Hands Meeting 2006, Nottingham, UK.
| Presentation Conference Type | Other |
| --- | --- |
| Conference Name | Proceedings of the UK e-Science All Hands Meeting 2006 |
| Conference Location | Nottingham, UK |
| Deposit Date | Feb 18, 2019 |
| Publicly Available Date | Feb 18, 2019 |
| Additional Information | ISBN: 0-9553988-0-0; Event Type: Conference; Funders: Engineering and Physical Sciences Research Council (EPSRC); Project: An Advanced Environment for Enabling Visual Supercomputing; Grant Number: GR/S46567/01 |
Files
AHM06.pdf (144 KB, PDF)