We have developed a holistic framework for the design of intelligent AR interfaces. Building on our proposed taxonomy of everyday AR contexts, we introduce a framework for designing an intelligent AR interface that infers users' wants and needs and predicts the desired adaptations to the AR interface's design dimensions, virtual content, and interaction techniques. Depending on the context, the intelligent interface may apply adaptations system-wide or to individual apps.
This work is in preparation for publication.
Shakiba Davari
[DC] Context-Aware Inference and Adaptation in Augmented Reality Conference
2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2022.
@conference{Davari2022DC,
title = {[DC] Context-Aware Inference and Adaptation in Augmented Reality},
author = {Shakiba Davari},
doi = {10.1109/VRW55335.2022.00320},
year = {2022},
date = {2022-03-16},
pages = {938-939},
booktitle = {2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
publisher = {IEEE},
abstract = {The recent developments in Augmented Reality (AR) eyeglasses promise the potential for more efficient and reliable information access. This reinforces the widespread belief that AR glasses are the next generation of personal computing devices, providing efficient information access to the user all day, every day. However, to realize this vision of all-day wearable AR, the AR interface must address the challenges that the constant and pervasive presence of virtual content may cause. Throughout the day, as the user's context switches, an optimal all-day interface must adapt its virtual content display and interactions. The optimal interface, that is, the most efficient yet least intrusive, in one context may be the worst interface for another context. This work proposes a research agenda to design and validate different adaptation techniques and context-aware AR interfaces, and introduces a framework for the design of such intelligent interfaces.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}