Dr Christopher Hughes C.J.Hughes@salford.ac.uk
Director of CSE & Strategic Change
A flexible approach to high performance visualization enabled Augmented Reality
Hughes, CJ; John, NW
Authors: CJ Hughes; NW John
Editors: IS Lim; D Duce
Abstract
Commonly, registration and tracking within Augmented Reality (AR) applications have been built around computer vision techniques that use limited bold markers, which allow their orientation to be estimated in real time. All attempts to implement AR without specific markers have increased the computational requirements, and some information about the environment is still needed. In this paper we describe a method that not only provides a flexible platform for supporting AR but also seamlessly deploys High Performance Computing (HPC) resources to deal with the additional computational load, as part of the distributed High Performance Visualization (HPV) pipeline used to render the virtual artifacts. Repeatable feature points are extracted from known views of a real object, and we then match the best stored view to the user's viewpoint, using the matched feature points to estimate the object's pose. We also show how our AR framework can be used in the real world by presenting a marker-less AR interface for Transcranial Magnetic Stimulation (TMS).
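The view-selection step the abstract describes (matching live-frame feature points against stored views of the object to pick the best view before pose estimation) can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: descriptor values, the `max_dist` threshold, and all names are invented, and binary descriptors are compared by Hamming distance as in common feature-matching pipelines.

```python
# Hedged sketch of marker-less view selection: each known view of the real
# object stores binary feature descriptors; the live frame's descriptors are
# matched to every stored view by Hamming distance, and the view with the
# most good matches is chosen as the basis for pose estimation.
# All descriptors and thresholds below are illustrative toy values.

def hamming(a: int, b: int) -> int:
    """Bit-level Hamming distance between two binary descriptors."""
    return bin(a ^ b).count("1")

def count_good_matches(frame_desc, view_desc, max_dist=2):
    """Count frame descriptors whose nearest stored descriptor is within max_dist bits."""
    good = 0
    for d in frame_desc:
        if min(hamming(d, v) for v in view_desc) <= max_dist:
            good += 1
    return good

def best_view(frame_desc, stored_views):
    """Return the key of the stored view best matching the live frame."""
    return max(stored_views, key=lambda k: count_good_matches(frame_desc, stored_views[k]))

# Toy example: three stored views; the live frame is one bit away from
# each "front" descriptor, so "front" wins the vote.
views = {
    "front": [0b10110010, 0b01100111],
    "left":  [0b11111111, 0b00000000],
    "top":   [0b10101010, 0b01010101],
}
frame = [0b10110011, 0b01100110]
print(best_view(frame, views))  # -> front
```

In a real pipeline the matched 2D–3D correspondences from the winning view would then feed a pose solver (e.g. a PnP estimator) to register the virtual artifact, with the heavy matching work offloaded to the HPC resources the paper describes.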
Citation
Hughes, C., & John, N. A flexible approach to high performance visualization enabled Augmented Reality. Presented at EG UK Theory and Practice of Computer Graphics 2007, University of Wales, Bangor, UK.
| Presentation Conference Type | Other |
| --- | --- |
| Conference Name | EG UK Theory and Practice of Computer Graphics 2007 |
| Conference Location | University of Wales, Bangor, UK |
| End Date | Jun 15, 2007 |
| Online Publication Date | Jun 1, 2007 |
| Publication Date | Jun 14, 2007 |
| Deposit Date | Dec 4, 2020 |
| Publicly Available Date | Dec 4, 2020 |
| Book Title | Theory and Practice of Computer Graphics 2007 |
| ISBN | 9783905673630 |
| DOI | https://doi.org/10.2312/LocalChapterEvents/TPCG/TPCG07/181-186 |
| Publisher URL | https://diglib.eg.org/handle/10.2312/LocalChapterEvents.TPCG.TPCG07.181-186 |
| Related Public URLs | http://diglib.eg.org/; http://www.eguk.org.uk/TPCG07/programme.html |
| Additional Information | Access Information: This paper may be used for non-commercial purposes. Event Type: Conference |
Files
181-186.pdf (248 KB, PDF), Accepted version with branding