Thesis defences

PhD Oral Exam - Majid Pourmemar, Computer Science

Predicting User Performance and the Evaluation of User Interfaces in Immersive Augmented Reality

Date & time
Thursday, January 18, 2024
1 p.m. – 4 p.m.

This event is free


School of Graduate Studies


Nadeem Butt



When studying for a doctoral degree (PhD), candidates submit a thesis that provides a critical review of the current state of knowledge of the thesis subject as well as the student’s own contributions to the subject. The distinguishing criterion of doctoral graduate research is a significant and original contribution to knowledge.

Once accepted, the candidate presents the thesis orally. This oral exam is open to the public.


Augmented Reality (AR) is the enhancement of the user's view with interactive, computer-generated content, and arises in several forms, including mobile AR, spatial AR, and, notably, immersive AR. Immersive AR has attracted considerable attention from both academia and industry: through head-mounted displays (HMDs) and optics technology, it lets users immerse themselves in virtual content registered to real-world objects. Providing robust interaction with virtual objects has led immersive headset designers and manufacturers to develop a broad array of interaction techniques and input modalities, so that users can interact naturally with virtual objects and experience a profound sense of presence. Well-known companies in this domain, such as Microsoft, Meta, Apple, and Magic Leap, have pioneered input modalities such as hand gestures, head pointing, eye tracking, and voice commands. The current state of the art indicates that specific input modalities, like hand gestures and head pointing, can increase both the physical and mental workload on users, in turn increasing error rates in object selection and manipulation within immersive environments. In response, this research applies statistical and contemporary machine learning techniques to establish guidelines for designers and developers of immersive AR environments, aiming to mitigate the effects of physical and mental workload on users, particularly during hierarchical menu selection, a commonplace task in many computer applications, including immersive AR.

This thesis, comprising four research papers, offers guidelines and recommendations for designers and developers of immersive AR headsets, aiming to reduce the workload and error rates induced by natural interactions such as hand gestures and head pointing during hierarchical menu selection. The first study identified the most effective combination of hierarchical menu type (e.g., radial or drop-down menus) and input modality (e.g., hand gestures or head pointing) in terms of workload and performance measures. The second study applied a machine learning approach, leveraging Large Language Models (LLMs) and the standard cognitive performance test WAIS-IV, to predict human performance during hierarchical menu selection tasks in an immersive AR environment. The third study introduced an analysis of, and an index for, the error rate in hierarchical menu selection, using both subjective and objective data gathered from users. The final study analyzed head-pointing and hand-gesture path data during menu selection tasks, deriving an index and relating it to both subjective and objective methods for calculating workload and error rate in immersive AR.


© Concordia University