MEinVR: Multimodal Interaction Paradigms in Immersive Exploration. Yuan, Z.Y.; Liu, Y.; Yu, L.Y. 21st IEEE International Symposium on Mixed and Augmented Reality (ISMAR) Adjunct, 2022, pp. 85–90. [PDF]
Existing natural language interfaces (NLIs) for visualization were developed for desktop environments. These include:
MEinVR builds upon these existing methods for a VR environment, with design goals to:
Questions:
Setup: Oculus Quest 2 running ChimeraX VR, combined with speech recognition.
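As a rough illustration of the multimodal idea (combining controller pointing with a spoken command), here is a minimal sketch. This is not the authors' implementation; the function names, command vocabulary, and fusion logic are all hypothetical.

```python
# Hypothetical sketch of controller + voice fusion, in the spirit of
# MEinVR's multimodal paradigm. Not the paper's actual code.

def fuse(selected_object: str, utterance: str):
    """Map a controller-selected object plus a spoken command to an action.

    `selected_object` stands in for whatever the controller ray hits;
    `utterance` is the speech recognizer's transcript. The verb table
    below is invented for illustration.
    """
    verbs = {"show": "display", "hide": "conceal", "color": "recolor"}
    for word in utterance.lower().split():
        if word in verbs:
            return (verbs[word], selected_object)
    return ("unknown", selected_object)


# Example: point at a molecular chain and say "show the chain".
print(fuse("chainA", "show the chain"))  # → ('display', 'chainA')
```

The point of fusing the two channels is that deixis ("show *this*") comes from the controller while the operation comes from speech, so neither input alone has to carry the full command.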
The authors basically punt by saying:
“Further studies are required to explore the prominent advantages of
each interaction input in different exploration tasks for various data.
Moreover, a comprehensive user study needs to be conducted to evaluate
the usability and effectiveness of our method in data exploration.”
I also didn't see any mention of code availability; only the corresponding author's email address is given.