diff --git "a/-9E4T4oBgHgl3EQf4A3h/content/tmp_files/load_file.txt" "b/-9E4T4oBgHgl3EQf4A3h/content/tmp_files/load_file.txt" new file mode 100644--- /dev/null +++ "b/-9E4T4oBgHgl3EQf4A3h/content/tmp_files/load_file.txt" @@ -0,0 +1,594 @@ +filepath=/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf,len=593 +page_content='A Framework for Active Haptic Guidance Using Robotic Haptic Proxies Niall L.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' Williams1, Jiasheng Li1, and Ming C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' Lin1 https://gamma.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content='umd.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content='edu/active haptic guidance/ Abstract— Haptic feedback is an important component of creating an immersive virtual experience.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' Traditionally, haptic forces are rendered in response to the user’s interactions with the virtual environment.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' In this work, we explore the idea of rendering haptic forces in a proactive manner, with the explicit intention to influence the user’s behavior through compelling haptic forces.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' To this end, we present a framework for active haptic guidance in mixed reality, using one or more robotic haptic proxies to influence user behavior and deliver a safer and more immersive virtual experience.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' We provide details on common challenges that need to be overcome when implementing active haptic guidance, and discuss example applications that show how active haptic guidance can be used to influence the user’s behavior.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' Finally, we apply active haptic guidance to a virtual reality navigation problem, and conduct a user study that demonstrates how active haptic guidance creates a safer and more immersive experience for users.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' I.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' INTRODUCTION In mixed reality (MR), the user is at least partially im- mersed in a 3D, computer-generated environment.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/-9E4T4oBgHgl3EQf4A3h/content/2301.05311v1.pdf'} +page_content=' Included within the mixed reality spectrum are augmented reality and virtual reality (VR).' 
A major factor that makes MR a unique medium is that it is interactive: the user is able to interact with the virtual environment (VE) through position-tracking sensors that update the VE according to the user's movements in the physical environment (PE). For example, when the user moves their head in the real world, the position of the camera in the virtual world moves as well. Interactions like these help to make users feel like they are really in the VE that they see through the head-mounted display (HMD). One key component to increasing the user's sense of presence in a VE is to improve the perceptual stimuli matching [8], wherein the user is provided with perceptual information that matches their actions (e.g. the viewing perspective updates as the user moves their head). In this work, we focus on the sense of touch provided by mechanical haptic feedback and how we can use robots to provide more realistic haptic sensations to improve the sense of immersion and safety in mixed reality. Robotic technology has in fact been used to provide haptic feedback in MR to improve the sense of virtual touch and virtual manipulation [10]. For example, MR can enhance robotics via telepresence, wherein humans can remotely operate robots to high precision using immersive controls afforded by VR.

This work is partially supported by the National Science Foundation and Lin's professorship. 1Authors are with the Department of Computer Science, University of Maryland, College Park. {niallw, jsli, lin}@umd.edu
Fig. 1. An image of a user in the physical environment (left) and virtual environment (right) in our implementation of active haptic guidance. The user is tethered to a robot in the physical environment and to a virtual dog companion in the virtual environment. The robot provides haptic feedback to the user according to the virtual companion's movements, which improves the user's sense of presence in the virtual world and encourages the user to avoid the boundaries of the virtual reality system's tracked space.

In this paper, we introduce the possibility of using robots to enhance the virtual experience through haptic feedback. Specifically, we use robots to guide the user as they navigate through a VE, and reconfigure and virtually expand the PE to align with the VE; we achieve this through manual haptic feedback that directs the user's locomotion behavior in the VE, thereby making the virtual experience more immersive and safer. To this end, we introduce the concept of active haptic guidance, which describes the problem of reconfiguring one or multiple robots in the PE in real time such that they provide haptic feedback to guide the user and influence their actions and motion in the VE, with the ultimate goal of improving the user's safety or level of immersion in MR. One major challenge with robots for active haptic feedback in MR is that the physical robots and their virtual counterparts must be co-located relative to the user, in order to provide the correct haptic feedback that aligns with the virtual counterpart. This problem can be exacerbated when the environments/interactions are dynamic (i.e. the physical and virtual haptic proxy must move synchronously) or when there is a decoupling between the user's physical and virtual locations (as is common with some VR interaction techniques like redirected walking [24]).
Main contributions: We introduce the concept of active haptic guidance for improved virtual locomotion, and conduct a user study to show an example of how active haptic guidance can be used to improve a user's safety and feelings of immersion in a virtual experience. Our framework is general, so it can be applied to use cases other than locomotion, and we provide examples of other possible use-cases for active haptic guidance. Our main contributions include:

- A formal description of the active haptic guidance problem and details on common challenges that are faced when implementing active haptic guidance. Active haptic guidance involves using robots to provide realistic haptic feedback to users in mixed reality, with the goal of influencing users' behaviors to improve their safety and/or sense of presence in the virtual environment.

- A prototype realization and user study showing the benefits of active haptic guidance. In our study, participants completed a virtual navigation task using real walking, either with or without active haptic guidance. Our results show that active haptic guidance can significantly improve the virtual experience by reducing the number of "breaks in presence" and keeping users a safe distance away from physical objects for longer.

II. BACKGROUND AND RELATED WORK

Haptic feedback can be utilized in any mixed reality setting, but in this work we mainly discuss applications of haptics to virtual reality (VR) settings, since our implementation was done in VR. In VR, the user wears a head-mounted display (HMD) through which they view a 3D, computer-generated virtual environment (VE) [15].
The user's position in the physical environment (PE) is tracked, so that whenever the user updates their position in the PE, the position of the virtual camera updates accordingly to provide an accurate viewing perspective of the VE. VR is an interactive experience, meaning that the user does not passively observe the virtual content, but instead the environment changes in response to the user's actions and movements. When the virtual experience feels sufficiently real, the user experiences a sense of presence, which describes the subjective feeling of really being in the environment [31]. Factors that contribute to a user's feelings of presence and immersion in a VE include the HMD refresh rate [3], the environment realism and visual quality [34], and perceptual stimuli matching [8], [33] (the process of providing users with perceptual information that matches their actions in the VE). In this paper, we focus on providing haptic stimuli for perceptual stimulus matching to improve the user's experience in VR.

Haptic feedback can be provided in a passive or an active manner. With passive haptics, objects are placed in the PE such that they align with the locations of objects in the VE, resulting in haptic feedback when the user tries to touch objects in the VE [11]. Conversely, active haptics involves a haptic proxy that dynamically alters its configuration in real time to provide the appropriate haptic force feedback, depending on the user's interactions with the VE. It is common to use robotic systems to render haptic forces. For example, Zhang et al. [36] used a robotic arm to provide haptic feedback during object assembly by aligning the arm's end effector with the handheld proxy. Siu et al. [28] used an array of actuated pins to match the contours of virtual objects.
Similarly, Zhao et al. [37] used robotic assembly to construct tangible representations of virtual objects, made from magnetically attached blocks. To recreate the feelings of grasping virtual objects, Kovacs et al. [18] used a wrist-worn device to provide on-demand haptic feedback when users grip virtual objects, while Sinclair et al. [27] used a force-resisting, handheld controller to render haptic forces for rigid and compliant objects. Suzuki et al. [32] used mobile robots to rearrange physical furniture to align with virtual furniture as the user moved through a virtual world. Robotic systems have also been used to aid in navigation through VEs, via handheld canes that use vibrations to provide information about the VE [38], [19], [29], mechanical staircases [13] to simulate uneven virtual terrain, or mobile tiles that simulate infinite walking in any direction [12].

The majority of prior work on active haptics for mixed reality requires the user to initiate interactions with the VE before the haptic forces are rendered. That is, the haptic forces are triggered by the user's interactions with the VE, so it is the user's actions that dictate when haptic forces are rendered. In this work, we make the distinction of using active haptics specifically to direct the user and influence their behavior in the VE (in addition to providing a more immersive experience, as all haptics aims to do). We define this use of haptics as active haptic guidance, since it is the haptic forces that direct the user's behaviors, rather than the other way around. We note that there already exists research on "haptic guidance," which Feygin et al.
use to refer to haptic feedback that is used to help people learn motor skills [7]. The distinction between our work on active haptic guidance and Feygin et al.'s work is that we use haptic feedback to discreetly influence the user's behavior in an effort to enhance their feelings of presence and level of safety in a mixed reality experience, while Feygin et al. use haptics to teach people motor skills.

III. PROBLEM DESCRIPTION

Here we describe the active haptic guidance problem, as well as constraints that need to be satisfied to effectively utilize haptics to guide users in MR.

A. Definitions

In virtual reality, the user is located in a physical environment (PE) and a virtual environment (VE) at the same time. Each environment consists of objects (either physical objects or virtual objects represented by textured meshes) and agents (the users and robots). Note that it is common to refer to virtual humans and animals as agents, but in this work we will consider all components of the VE as generic objects for simplicity, and we use "agents" to refer only to humans and robots in the PE. Let O = {o_1, o_2, ..., o_i} be a set of polygonal objects, where each object o is a mesh with vertices in R^3. Let U = {u_1, u_2, ..., u_j} be the set of users in an environment. Here, u represents the user's state in an environment, and usually describes their position and orientation in said environment.
For example, we can define u = {p, θ}, where p ∈ R^2 represents their position in the 2D plane and θ ∈ [0, 2π) represents their orientation in the environment. Let R = {r_1, r_2, ..., r_k} be the set of robots in an environment, and let A = {U ∪ R} be the set of all agents. Each of these sets O, U, R, and A may be empty. We define an environment E as a set of obstacles and agents; that is, E = {O, A}. To differentiate between the PE and VE, we denote the PE as E_P = {O_P, A_P} and the VE as E_V = {O_V, A_V}. For each user in virtual reality, they will have a representation in both the PE and VE, so |U_P| = |U_V| = n, where n is the number of users in virtual reality. Since we only consider agents to be users and robots in this work, |A_V| = n (i.e., the only agents in the VE are the users). In the VE, there are some objects that the user is likely to interact with, which will improve their sense of presence in the environment. We define this set of "objects of interest" O ⊂ O_V as the set of virtual objects for which we render haptic forces when the user interacts with them.
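To make these definitions concrete, the short sketch below (our own illustration, not code from the paper; all names are ours) shows one way the state of the two environments could be represented in Python:

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class AgentState:
    """State u = {p, theta} of a user or robot: planar position and heading in [0, 2*pi)."""
    p: np.ndarray       # position in R^2
    theta: float        # orientation

@dataclass
class Environment:
    """An environment E = {O, A}: polygonal objects plus agents (users U and robots R)."""
    objects: List[np.ndarray] = field(default_factory=list)  # each object: (n, 3) vertex array
    users: List[AgentState] = field(default_factory=list)    # U
    robots: List[AgentState] = field(default_factory=list)   # R (empty in the VE)

# The same user is represented in both environments, so |U_P| = |U_V|.
physical_env = Environment(users=[AgentState(np.zeros(2), 0.0)],
                           robots=[AgentState(np.array([0.5, 0.0]), 0.0)])
virtual_env = Environment(users=[AgentState(np.zeros(2), 0.0)])
```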
With these definitions of the PE and VE, we can now describe the two main conditions that need to be met to provide active haptic guidance to users in MR. First, the robots in the physical environment need to provide the appropriate haptic feedback to influence the user's configuration. Second, we need to ensure that the robots that provide haptic feedback are co-located (relative to the user) with the virtual objects of interest with which the haptic forces are associated.

B. Influential Haptics Constraint

The first condition that needs to be met in order to implement active haptic guidance is that the rendered haptic forces should influence the user's behavior such that they update their physical and virtual configurations. We dub this constraint the influential haptics (IH) constraint. For simplicity, we formalize this constraint using one user, one robot, and one virtual object of interest, but this constraint applies to any group of agents and virtual objects for which we render haptic forces. Given the user's physical and virtual configurations u_P and u_V, a virtual object of interest o, and a robot r that provides haptic feedback for o, we wish to render a haptic force F that compels the user to update u_P and u_V to some goal configurations u*_P and u*_V. Thus, fulfilling the IH constraint requires completing the following steps:

1) Compute the goal configurations u*_P and u*_V.
2) Detect or initiate an interaction I between o and u_V.
3) Update the configuration of r to render a haptic force F(I, u_V, u_P, u*_P, u*_V, r) that minimizes an objective function f(u_V, u_P, u*_P, u*_V).

In practice, computing F(I, u_V, u_P, u*_P, u*_V, r) depends heavily on the mechanics of the haptic proxy r and the objective function f(u_V, u_P, u*_P, u*_V). The objective function is usually a distance function that measures the error between the user's current configurations (u_P, u_V) and the goal configurations (u*_P, u*_V), and it depends on the user's configuration space. By rendering F, the user hopefully updates their configuration such that they move closer to u*_P and u*_V.
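As a minimal illustration of these steps, the sketch below (again our own, with an illustrative Euclidean objective and a hypothetical render_force callback standing in for the proxy-specific force F) shows one possible IH update:

```python
import numpy as np

def objective(u_p, u_v, u_p_goal, u_v_goal):
    """f(u_V, u_P, u*_P, u*_V): distance between the user's current and goal
    configurations in both environments (positions only, for simplicity)."""
    return np.linalg.norm(u_p - u_p_goal) + np.linalg.norm(u_v - u_v_goal)

def influential_haptics_step(u_p, u_v, u_p_goal, u_v_goal, render_force, tol=0.05):
    """One iteration of the IH loop: while the user is away from the goal
    configurations, ask the proxy to render a force that pulls them toward u*_P."""
    error = objective(u_p, u_v, u_p_goal, u_v_goal)
    if error > tol:
        # Direction of the desired pull in the physical environment; how this is
        # realized as an actual force F depends on the proxy's mechanics.
        pull = u_p_goal - u_p
        render_force(pull / (np.linalg.norm(pull) + 1e-9), error)
    return error

# Example usage with a stand-in force renderer that just prints the command.
err = influential_haptics_step(np.array([0.0, 0.0]), np.array([2.0, 1.0]),
                               np.array([1.0, 0.0]), np.array([3.0, 1.0]),
                               lambda direction, magnitude: print(direction, magnitude))
```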
Computing u*_P and u*_V is a matter of determining how we want the user to behave. In mixed reality (MR), two main reasons to influence the user's behavior are to ensure their safety and to deliver a more immersive experience. In MR systems, the user tries to navigate through the PE and the VE at the same time, but the PE is partially or fully occluded. Thus, in order to prevent the user from bumping into physical objects that they cannot see, locomotion interfaces for MR usually display a notification that prompts them to reposition themself to a safer position away from nearby objects. By using haptics to warn users (either overtly or subtly), we can decrease the likelihood that the user collides with unseen physical obstacles or exits the designated tracking area.

In addition to ensuring user safety, influencing the user's behavior can be useful for improving the user's sense of presence in the VE. In MR, providing perceptual stimuli that align with the content rendered on the visual display enhances the user's feeling that they are really in the VE that they are seeing. To this end, haptic feedback can significantly improve the user's sense of presence in the VE [11]. In the case of active haptic guidance, the haptic feedback can be used as an additional narrative element that encourages users to explore a particular area or interact with particular objects in the VE (e.g. pairing visual distractors [21] with haptic feedback to direct the user's attention).

C. Relative Co-location Constraint

The second main constraint that should be met when using active haptic guidance is that the physical robots that render the haptic forces and their associated virtual objects should be co-located relative to the user.
That is, the position of the robot and the virtual object should be the same relative to the user's configuration in the PE and VE. This is done to ensure that the user perceives a congruent VE that is augmented by haptic forces, rather than perceiving a VE along with misaligned haptic forces, which may break their sense of presence in the virtual experience. We call this the relative co-location (RC) constraint. Given the user's physical and virtual configurations u_P and u_V, a virtual object of interest o, and a robot r that provides haptic feedback for o, we wish to update r such that we minimize the error in the relative positions between u_V and o and between u_P and r. Fulfilling the RC constraint requires completing the following steps:

1) Compute the configurations of o and r relative to u_V and u_P, respectively. Usually, these are just positions p_o and p_r of o and r relative to the user in the respective environment.
2) Compute a goal configuration r* for the haptic proxy that minimizes an objective function f(p_o, p_r).
3) Update the configuration of r to move it towards r*.

In practice, updating the robot's configuration in step 3 is a motion planning problem where we aim to find a path through the configuration space that brings r close to r*, and it depends on the mechanics of the haptic proxy. Since MR is an interactive technology, the relative positions p_o and p_r are constantly changing as the user explores and interacts with the VE. Thus, evaluating and fulfilling the RC constraint must be done constantly to ensure that any perceptual stimuli mismatch is minimized. Failure to adequately meet this constraint can degrade the user experience, since it increases the likelihood that the user notices a discrepancy between the visual stimuli and the haptic stimuli [14], [20].
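A minimal sketch of the RC computation is given below, assuming planar configurations and the AgentState type from the earlier sketch (the helper names are ours); moving the robot toward r* in step 3 is left to the proxy's motion planner:

```python
import numpy as np

def relative_position(agent_pos, agent_theta, target_pos):
    """Express a target's position in the agent's local (user-relative) frame."""
    c, s = np.cos(-agent_theta), np.sin(-agent_theta)
    return np.array([[c, -s], [s, c]]) @ (target_pos - agent_pos)

def rc_error(u_p, u_v, robot_pos, obj_pos):
    """Objective f(p_o, p_r): mismatch between the object's position relative to the
    virtual user and the robot's position relative to the physical user."""
    p_o = relative_position(u_v.p, u_v.theta, obj_pos)
    p_r = relative_position(u_p.p, u_p.theta, robot_pos)
    return np.linalg.norm(p_o - p_r)

def rc_goal_for_robot(u_p, u_v, obj_pos):
    """Goal configuration r*: the physical point whose user-relative position
    matches the virtual object's user-relative position (steps 1 and 2)."""
    p_o = relative_position(u_v.p, u_v.theta, obj_pos)
    c, s = np.cos(u_p.theta), np.sin(u_p.theta)
    return u_p.p + np.array([[c, -s], [s, c]]) @ p_o
```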
Furthermore, knowing how much error between their relative positions the user will tolerate is a subjective measure [2], [17], so it is usually not the case that the robot must reach r* exactly. Note that this relative co-location constraint is not unique to the active haptic guidance problem (unlike subsection III-B); other work on active haptics for virtual reality also has to deal with the problem of ensuring the co-location of robotic agents and their virtual counterparts.

IV. PROTOTYPE REALIZATION EXAMPLES

In this section, we provide details on our prototype implementation of an application of active haptic guidance. In particular, we implement an active haptic-driven locomotion application to provide a safer and more immersive virtual navigation experience for users. We discuss other potential use-cases for active haptic guidance in the supplementary materials posted on our project page.

A. Natural Walking in Virtual Reality

In VR, it is common for the PE to be much smaller than the VE. To enable users to explore large VEs, many different locomotion interfaces such as teleportation, joystick navigation, and walking-in-place have been developed [6]. Ideally, users explore the VE using natural, everyday walking since it improves their sense of presence [33] and performance in tasks [9], [22], [26]. One technique that enables natural walking in VR is redirected walking (RDW) [24]. RDW works by slowly rotating the VE around the user's virtual camera while they walk, which causes them to unconsciously adjust their physical trajectory to counteract the VE rotations and remain on their intended path in the VE.
It works because the human perceptual system tends to believe the user's visual stimuli over other stimuli (proprioceptive, vestibular, etc.) when they conflict, as is the case in RDW [23]. Using RDW, we can steer the user along paths in the PE that direct them away from objects and edges of the tracked space, resulting in a safer and more immersive virtual experience. To help mask the VE rotations, researchers make use of distractors, which grab the user's attention to decrease the likelihood that they attend to the rotations of the VE [4], [21], [35]. In our prototype implementation, we use a virtual dog as a distractor in conjunction with an RDW algorithm known as steer-to-center, which rotates the VE such that the user is steered towards the center of the PE at all times [23].

B. Virtual Experience and Equipment

For our implementation, a user u_1 completed a navigation task in a virtual city and had a virtual dog as a companion (only a single user participated at a time, so |U_P| = |U_V| = 1). Additionally, u_1 held a position-tracked leash that was tethered to a differential wheeled robot r_1. The PE was an empty rectangular room with four walls (represented by the boundaries of the VR tracking space). Thus, E_P = {O_P, A_P}, where A_P = {u_1, r_1}. The virtual dog served as a distractor and was the only object of interest in E_V (|O| = 1), meaning that the robot only rendered haptic forces associated with the virtual dog. Our application was implemented using one HTC VIVE Cosmos VR HMD with two VIVE trackers, and one robot car (ELEGOO UNO Robot Car kit). We attached one VIVE tracker to the robot to track its location and orientation data, and the other was attached to the leash handle to calculate the distance between u_1 and r_1.
The robot was equipped with an HC-06 Bluetooth LE adapter, which connected to the PC to transmit robot movement commands. The Unreal 4.22 game engine was used to render the VE.

C. Virtual Companion and Robot Behavior

Here we describe the behavior of the virtual dog companion and how the robot matches the virtual companion's movements and provides haptic feedback.

1) Virtual Dog Companion Behavior: The virtual dog has two main behavior states: following and distracting. When the user walks around and is not at risk of leaving the tracking space, the dog is in follow mode. In this mode, the dog walks slightly ahead of the user as they walk, and remains in one spot when the user stands still. When the user reaches a boundary of the tracked space, the VR system initiates what is called a reset, wherein the user reorients themself such that they face towards the inside of the tracking space in the PE. To ensure that their orientation in the VE is not altered, the VR system applies redirection that effectively cancels out their physical rotation in the virtual space. When a reset is initiated, the virtual dog enters distract mode. In distract mode, we compute a goal position in the VE for the dog to move towards. The idea behind distract mode is that the user is likely to pay attention to the virtual dog as it runs to a goal position, which allows the system to apply stronger redirection (away from the obstacles in the PE) without interfering with the user's experience [21]. During a reset, the goal position is selected by first computing the vector from the user towards the center of the physical space.
The goal position is then determined to be either the endpoint of this vector in the VE, or a virtual object near the vector's endpoint that was labeled as a potential goal position during development. Potential goal positions are virtual objects that a dog would be likely to interact with, such as a fire hydrant or a lamp post. If the vector intersects with a virtual object (e.g. a virtual building) and there are no potential goal objects nearby, the goal position is simply the point furthest along the vector that does not intersect with any objects. See Figure 2 for a visualization of this process, and the sketch after the figure caption for a code-level summary.

Fig. 2. Our method of automatically choosing a suitable virtual goal position for the virtual companion. When the user gets close to a boundary of the physical space, they need to be reoriented away from the boundary before they continue walking. In order to pick a goal destination for the virtual companion and robotic haptic proxy, we cast a ray from the physical user to the center of the tracked space and then superimpose this vector onto the user's virtual position. If the endpoint of this vector is near a pre-defined potential goal position, that is chosen as the current goal position. Otherwise, we choose the furthest point along the vector that does not intersect with any objects in the virtual environment.
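The sketch below summarizes this selection procedure in code (our own illustration; clear_distance is a hypothetical raycast helper, and the distance thresholds are placeholders):

```python
import numpy as np

def select_dog_goal(user_phys_pos, tracked_space_center, user_virt_pos,
                    potential_goals, clear_distance, near_radius=1.5):
    """Pick the dog's goal position in the VE during a reset (illustrative sketch).

    potential_goals: positions of pre-labeled virtual objects (fire hydrants, lamp posts, ...).
    clear_distance(start, end): hypothetical raycast helper returning the distance to the
    first intersected virtual object along the segment, or None if the segment is clear.
    """
    # 1) Vector from the physical user to the center of the tracked space.
    to_center = tracked_space_center - user_phys_pos
    # 2) Superimpose that vector onto the user's virtual position.
    endpoint = user_virt_pos + to_center

    # 3a) Prefer a pre-defined goal object near the endpoint.
    for goal in potential_goals:
        if np.linalg.norm(goal - endpoint) < near_radius:
            return goal

    # 3b) Otherwise, the furthest point along the vector that hits no virtual object.
    hit = clear_distance(user_virt_pos, endpoint)
    if hit is not None:
        direction = to_center / (np.linalg.norm(to_center) + 1e-9)
        return user_virt_pos + direction * hit
    return endpoint
```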
2) Robot Haptic Proxy Behavior: The physical robot's main purpose is to provide haptic feedback to make the user's virtual experience feel more immersive and to encourage the user to walk away from nearby objects or tracking space boundaries in the PE. In both follow and distract mode, the physical robot needs to update its position such that it is aligned with the position of the virtual dog, relative to the user in either environment. Checking if a position update is necessary is easily achieved by computing the vector from the virtual user to the virtual dog and comparing it to the vector from the user's HMD to the robot. To compute the trajectory that the robot will follow, we compute a circular arc path based on the robot's position, forward direction, and destination position (determined by the relative position of the virtual dog and user). The ideal path for a differential drive robot is a circular arc since it only requires one set of wheel velocities [5]. The wheel velocities are computed with the ratio (2r + d) / (2r - d), where r is the arc radius and d is the distance between the robot wheels. Note that we do not use typical PID-based drift correction due to possible unexpected complications that may arise from the tethering to the user [1], [25], [30].
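For illustration, a sketch of how such an arc command could be computed for a differential drive robot is given below; the chord geometry, speed scaling, and sign conventions are our assumptions rather than the paper's exact controller:

```python
import numpy as np

def arc_wheel_speeds(robot_pos, robot_fwd, target_pos, wheel_sep, base_speed=0.3):
    """Left/right wheel speeds for driving a circular arc from the robot's current
    pose to a target point.

    For a chord of length L at angle alpha to the robot's forward direction, the
    tangent arc has radius r = L / (2 sin |alpha|), and the outer/inner wheel speed
    ratio is (2r + d) / (2r - d) for wheel separation d."""
    chord = target_pos - robot_pos
    L = np.linalg.norm(chord)
    fwd = robot_fwd / np.linalg.norm(robot_fwd)
    # Signed angle between the forward direction and the chord.
    alpha = np.arctan2(fwd[0] * chord[1] - fwd[1] * chord[0], fwd @ chord)

    if abs(np.sin(alpha)) < 1e-3:              # target is (almost) straight ahead
        return base_speed, base_speed

    r = L / (2.0 * np.sin(abs(alpha)))         # arc radius
    ratio = (2.0 * r + wheel_sep) / (2.0 * r - wheel_sep)
    inner, outer = base_speed, base_speed * ratio
    # Turning left (alpha > 0): the right wheel is the outer wheel, and vice versa.
    return (inner, outer) if alpha > 0 else (outer, inner)
```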
D. Maintaining Active Haptic Guidance Constraints

This section describes how our active haptic-driven locomotion application satisfies the IH and RC constraints.

1) Directing Users With Haptic Feedback: Since the virtual object of interest is a dog, the user is attached to the robot by an elastic tether that resembles a leash. When the robot moves away from the user in the PE, it simulates the sensation of a dog tugging on its leash, thereby improving the realism of the virtual experience. Additionally, this tugging encourages the user to follow the robot rather than "fight" it, allowing us to further influence the user's movement patterns in the PE and VE. By triggering the robot to move away from the user and towards the center of the PE when they get too close to the tracking space boundaries, the tugging force on the leash encourages the user to turn and walk towards the robot and away from the tracked space boundaries.

2) Maintaining Co-location: Normally, maintaining relative co-location between a haptic proxy and a virtual object is a matter of updating the position of the haptic proxy whenever the virtual object's position changes. We also do this in our implementation by updating the position of the robot to match the movements of the virtual dog. However, our implementation requires additional work to maintain co-location due to a new problem which we refer to as the haptic proxy distortion (HPD) problem.

Fig. 3. A visualization of the haptic proxy distortion problem (panels: virtual environment before rotation; virtual environment after rotation; physical environment and superimposed virtual relative positions). Left: Initially, the virtual user and virtual companion have a particular relative position. Middle: After rotating the virtual environment around the virtual user, the relative position of the companion changes since the companion is rotated along with the rest of the environment. Right: In the physical space, the haptic proxy has not been updated, so its position coincides with the virtual companion's relative position before rotation (opaque robot and vector). The new relative position of the virtual companion, which the haptic proxy needs to match, is shown as the translucent robot and dashed-line vector.
In our implementation, we make use of a locomotion interface called redirected walking (RDW) that enables natural walking in VR. RDW works by rotating the entire VE around the virtual camera that represents the user's viewpoint in the VE. Consequently, the virtual dog companion may change its position relative to the virtual user without the dog actually moving to a new destination in the VE (see Figure 3). Thus, as we apply redirection, the relative position of the virtual dog changes constantly, while the relative position of the physical robot does not. To resolve this discrepancy in relative position, we check the relative positions of the virtual dog and physical robot on each frame, and update the robot's destination in the PE to minimize the difference in relative position. The user will perceive this as the haptic proxy "sliding" across the floor around them, which might result in unsmooth motion that detracts from the user experience. In practice, this did not seem to be a major problem for users, but we acknowledge that there may be better solutions to the HPD problem, and leave that for future work. This HPD problem adds onto the errors in relative co-location between the haptic proxy and the virtual companion, which makes it harder to satisfy the RC constraint. Note that the HPD problem is not specific to our implementation; it is present in any application that uses haptic proxies and creates a mismatch between the user's positions in the physical and virtual environments, as is common for locomotion interfaces for mixed reality.
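A per-frame correction for the HPD problem described above might look like the sketch below, which compares the dog's offset from the virtual user with the robot's offset from the HMD and, when they diverge, sends the robot to the point that restores the match. The function name and the 0.1 m tolerance are our assumptions, and for brevity the sketch treats the physical and virtual frames as sharing a common orientation; a full implementation would express both offsets in the user's heading frame before comparing them.

```python
import numpy as np

def hpd_correction(virtual_user_pos, virtual_dog_pos, hmd_pos, robot_pos,
                   tolerance=0.1):
    """Per-frame check for haptic proxy distortion: if the robot's offset from the
    HMD no longer matches the dog's offset from the virtual user (e.g., after a
    redirected-walking rotation), return a new physical destination for the robot
    that restores relative co-location; otherwise return None."""
    virtual_offset = np.asarray(virtual_dog_pos, float) - np.asarray(virtual_user_pos, float)
    physical_offset = np.asarray(robot_pos, float) - np.asarray(hmd_pos, float)
    if np.linalg.norm(virtual_offset - physical_offset) <= tolerance:
        return None  # still co-located closely enough; no new destination needed
    # Drive the robot to the point that reproduces the dog's offset around the HMD.
    return np.asarray(hmd_pos, float) + virtual_offset
```

The returned destination can then be fed to the circular-arc trajectory computation described earlier, so the robot "slides" toward the corrected position over the following frames.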
Fig. 2. Selecting the virtual companion's goal position when the user must be reoriented. (1) The user reached the tracked space boundary, so a reorientation is required; compute the vector from the user to the center of the physical space. (2) Superimpose the physical user-to-center vector onto the virtual user to determine the goal position of the virtual companion. (3) If there is a potential pre-defined goal position near the endpoint of the user-to-center vector (e.g., a fire hydrant), set that as the goal position. (4) If there is no pre-defined goal position near the endpoint of the user-to-center vector, use the vector endpoint as the goal position. (5) If the superimposed user-to-center vector intersects with a virtual object, use the furthest non-intersecting point along the vector as the goal position.

V. EXPERIMENTS & RESULTS

A. Experiment Design and Procedure

To evaluate the effectiveness of our prototype implementation of active haptic-driven locomotion, we conducted a user study where participants completed a navigation task. The study design was approved by our university's Institutional Review Board. The goal of our user study was to evaluate how effective the haptic guidance was at improving users' sense of presence in the VE and keeping users away from the boundaries of the VR system's tracked space. We used a between-participants design, where one group of participants completed a navigation task with active haptic guidance enabled, and the other group completed the same task without any haptic guidance. The navigation task had a time limit of 5 minutes and 30 seconds, after which the experiment ended regardless of whether the participant reached the goal destination.
Participants were unaware of this time limit so that they did not rush to complete the task. We recruited 20 participants (13 male, 5 female, 2 did not report) through online advertising and oral recruitment. Participants' ages ranged from 18 to 28 (µ = 24.59, σ = 2.37). All participants were able to walk without any assistance. The study consisted of three sections and lasted about 15 minutes for each participant. First, we briefed participants on the experiment procedures and had them complete a pre-study Simulator Sickness Questionnaire (SSQ) [16]. Next, the user put on the HMD and completed the task in the VE. The VE was a city environment with several streets and blocks, and was populated with common objects such as bus stops, stores, park squares, and virtual humans that roamed around the environment (see Figure 1 and the supplementary video). To mask any potentially distracting noises from the robot as it moved, participants wore headphones and background music was played for the duration of the experiment task. Participants started the task at one intersection in the city, and their task was to reach a green question mark in the environment that indicated their destination, which was one block away from their starting position. During the experiment, we recorded how many times users reached the bounds of the PE and the time taken to complete the task. Once participants finished the task, they completed another SSQ survey and a questionnaire with questions on a 7-point Likert scale that measured their sense of presence in the VE (7 = high presence, 1 = low presence).
Finally, the experiment concluded with open-ended questions where participants could provide additional comments.

B. Results

The metrics we used to measure the effectiveness of our active haptic-driven locomotion interface were the number of breaks in presence (BiPs), the completion rate and time taken to complete the task, and participants' subjective feelings of presence in the VE. A BiP is incurred when the user reaches the boundaries of the tracking space and is forced to reorient away from the boundary before continuing to walk.

TABLE I
Performance results from our user study. The "with haptics" group of participants incurred significantly fewer breaks in presence (BiPs), completed the experiment much more quickly (Time) and with a much higher success rate (Completed), and reported a higher sense of presence in the virtual experience (Presence). These results show that haptic guidance can be effective for improving users' virtual experience.

                 BiPs            Time (s)            Presence          Completed
Haptics       µ       σ        µ        σ          µ       σ          Total #
With          0.90    0.74     195.20   22.25      4.63    1.77       10
Without       18.90   5.17     309.40   65.14      3.57    1.64       1
Based on the results in Table I, the presence of our active haptic guidance companion resulted in significantly fewer BiPs, notably lower completion times and higher completion rates, and slightly higher (and above-average) presence levels. Meanwhile, participants who completed the navigation task without any haptic guidance incurred a large number of BiPs, usually did not finish the task in time, and reported below-average levels of presence. These results support the notion that active haptic guidance can be used to help keep users safe and feel more immersed in mixed reality experiences.

VI. CONCLUSIONS & FUTURE WORK

In this work, we presented the active haptic guidance problem for mixed reality (MR), which describes the use of one or more robots to provide haptic feedback to users in order to create a richer virtual experience, while also influencing the user's behavior to improve their safety and immersion in the virtual world. As a prototype realization, we implemented active haptic guidance in a VR locomotion application that enables the user to explore a large VE while located in a much smaller PE. By combining active haptic guidance and redirected walking, we increased the effective area of the PE while also decreasing the likelihood that the user exits the VR system's tracked area. The concept of active haptic guidance is general and can be applied to MR applications other than locomotion; we discuss other potential use cases for active haptic guidance in the supplementary materials on our project page.

Limitations and Future Work: One limitation of our work is the haptic proxy distortion problem, in which the haptic proxy and the associated virtual object can become desynchronized due to mismatches between the user's physical and virtual configurations.
Solving this problem requires continuously updating the position of the haptic proxy, and our proposed solution in this work is likely not the optimal one. Additionally, our system uses only a rough estimation of drift to readjust the haptic proxy position, instead of a more accurate method like PID-based drift correction. Future work in this area should investigate the use of more realistic companions and behavior models, and should explore how active haptic guidance can be applied to other types of VR experiences and applications, such as social mixed reality settings with other users.

REFERENCES

[1] K.-E. Årzén, "A simple event-based PID controller," IFAC Proceedings Volumes, vol. 32, no. 2, pp. 8687–8692, 1999.
[2] M. Azmandian, M. Hancock, H. Benko, E. Ofek, and A. D. Wilson, "Haptic retargeting: Dynamic repurposing of passive haptics for enhanced virtual reality experiences," in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 2016, pp. 1968–1979.
[3] W. Barfield and C. Hendrix, "The effect of update rate on the sense of presence within virtual environments," Virtual Reality, vol. 1, no. 1, pp. 3–15, 1995.
[4] H. Chen and H. Fuchs, "Supporting free walking in a large virtual environment: Imperceptible redirected walking with an immersive distractor," in Proceedings of the Computer Graphics International Conference, 2017, pp. 1–6.
[5] H. Chitsaz, S. M. LaValle, D. J. Balkcom, and M. T. Mason, "Minimum wheel-rotation paths for differential-drive mobile robots," The International Journal of Robotics Research, vol. 28, no. 1, pp. 66–80, 2009.
[6] M. Di Luca, H. Seifi, S. Egan, and M. Gonzalez-Franco, "Locomotion vault: The extra mile in analyzing VR locomotion techniques," in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021, pp. 1–10.
[7] D. Feygin, M. Keehner, and R. Tendick, "Haptic guidance: Experimental evaluation of a haptic training method for a perceptual motor skill," in Proceedings 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS 2002). IEEE, 2002, pp. 40–47.
[8] C. Hendrix and W. Barfield, "Presence within virtual environments as a function of visual display parameters," Presence: Teleoperators & Virtual Environments, vol. 5, no. 3, pp. 274–289, 1996.
[9] E. Hodgson, E. Bachmann, and D. Waller, "Redirected walking to explore virtual environments: Assessing the potential for spatial interference," ACM Transactions on Applied Perception (TAP), vol. 8, no. 4, pp. 1–22, 2008.
[10] W. Hoenig, C. Milanes, L. Scaria, T. Phan, M. Bolas, and N. Ayanian, "Mixed reality for robotics," in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2015, pp. 5382–5387.
[11] B. E. Insko, Passive Haptics Significantly Enhances Virtual Environments. The University of North Carolina at Chapel Hill, 2001.
[12] H. Iwata, H. Yano, H. Fukushima, and H. Noma, "CirculaFloor: A locomotion interface using circulation of movable tiles," in IEEE Virtual Reality 2005. IEEE Computer Society, 2005, pp. 223–230.
[13] H. Iwata, H. Yano, and F. Nakaizumi, "Gait Master: A versatile locomotion interface for uneven virtual terrain," in Proceedings IEEE Virtual Reality 2001. IEEE, 2001, pp. 131–137.
[14] G. Jansson and M. Öström, "The effects of co-location of visual and haptic space on judgments of form," in EuroHaptics. Citeseer, 2004, pp. 516–519.
[15] J. Jerald, The VR Book: Human-Centered Design for Virtual Reality. Morgan & Claypool, 2015.
[16] R. S. Kennedy, N. E. Lane, K. S. Berbaum, and M. G. Lilienthal, "Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness," The International Journal of Aviation Psychology, vol. 3, no. 3, pp. 203–220, 1993.
[17] L. Kohli, "Redirected touching," Ph.D. dissertation, The University of North Carolina at Chapel Hill, 2013.
[18] R. Kovacs, E. Ofek, M. Gonzalez Franco, A. F. Siu, S. Marwecki, C. Holz, and M. Sinclair, "Haptic Pivot: On-demand handhelds in VR," in Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, 2020, pp. 1046–1059.
[19] A. Kunz, K. Miesenberger, L. Zeng, and G. Weber, "Virtual navigation environment for blind and low vision people," in International Conference on Computers Helping People with Special Needs. Springer, 2018, pp. 114–122.
[20] R. Ocampo and M. Tavakoli, "Visual-haptic colocation in robotic rehabilitation exercises using a 2D augmented-reality display," in 2019 International Symposium on Medical Robotics (ISMR). IEEE, 2019, pp. 1–7.
[21] T. C. Peck, H. Fuchs, and M. C. Whitton, "Evaluation of reorientation techniques and distractors for walking in large virtual environments," IEEE Transactions on Visualization and Computer Graphics, vol. 15, no. 3, pp. 383–394, 2009.
[22] ——, "An evaluation of navigational ability comparing redirected free exploration with distractors to walking-in-place and joystick locomotion interfaces," in 2011 IEEE Virtual Reality Conference. IEEE, 2011, pp. 55–62.
[23] S. Razzaque, Redirected Walking. The University of North Carolina at Chapel Hill, 2005.
[24] S. Razzaque, Z. Kohn, and M. C. Whitton, "Redirected walking," in Proceedings of EUROGRAPHICS, vol. 9. Citeseer, 2001, pp. 105–106.
[25] D. E. Rivera, M. Morari, and S. Skogestad, "Internal model control: PID controller design," Industrial & Engineering Chemistry Process Design and Development, vol. 25, no. 1, pp. 252–265, 1986.
[26] R. A. Ruddle and S. Lessels, "The benefits of using a walking interface to navigate virtual environments," ACM Transactions on Computer-Human Interaction (TOCHI), vol. 16, no. 1, pp. 1–18, 2009.
[27] M. Sinclair, E. Ofek, M. Gonzalez-Franco, and C. Holz, "CapstanCrunch: A haptic VR controller with user-supplied force feedback," in Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, 2019, pp. 815–829.
[28] A. F. Siu, E. J. Gonzalez, S. Yuan, J. B. Ginsberg, and S. Follmer, "shapeShift: 2D spatial manipulation and self-actuation of tabletop shape displays for tangible and haptic interaction," in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018, pp. 1–13.
[29] A. F. Siu, M. Sinclair, R. Kovacs, E. Ofek, C. Holz, and E. Cutrell, "Virtual reality without vision: A haptic and auditory white cane to navigate complex virtual worlds," in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 2020, pp. 1–13.
[30] S. Skogestad, "Simple analytic rules for model reduction and PID controller tuning," Journal of Process Control, vol. 13, no. 4, pp. 291–309, 2003.
[31] M. Slater and S. Wilbur, "A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments," Presence: Teleoperators & Virtual Environments, vol. 6, no. 6, pp. 603–616, 1997.
[32] R. Suzuki, H. Hedayati, C. Zheng, J. L. Bohn, D. Szafir, E. Y.-L. Do, M. D. Gross, and D. Leithinger, "RoomShift: Room-scale dynamic haptics for VR with furniture-moving swarm robots," in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 2020, pp. 1–11.
[33] M. Usoh, K. Arthur, M. C. Whitton, R. Bastos, A. Steed, M. Slater, and F. P. Brooks Jr., "Walking > walking-in-place > flying, in virtual environments," in Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques. ACM Press/Addison-Wesley Publishing Co., 1999, pp. 359–364.
[34] R. B. Welch, T. T. Blackmon, A. Liu, B. A. Mellers, and L. W. Stark, "The effects of pictorial realism, delay of visual feedback, and observer interactivity on the subjective sense of presence," Presence: Teleoperators & Virtual Environments, vol. 5, no. 3, pp. 263–273, 1996.
[35] N. L. Williams and T. C. Peck, "Estimation of rotation gain thresholds considering FOV, gender, and distractors," IEEE Transactions on Visualization and Computer Graphics, vol. 25, no. 11, pp. 3158–3168, 2019.
[36] L. Zhang, Y. Liu, H. Bai, Q. Zou, Z. Chang, W. He, S. Wang, and M. Billinghurst, "Robot-enabled tangible virtual assembly with coordinated midair object placement," Robotics and Computer-Integrated Manufacturing, vol. 79, p. 102434, 2023.
APPENDIX

Additional Applications of Active Haptic Guidance: Here we discuss other potential applications of active haptic guidance in immersive experiences:

Wood Carving Application: In wood carving, the grain of the wood affects the direction in which the artist carves; that is, sometimes the artist carves "with the grain" and sometimes "against the grain." Using active haptics, one could accurately render the different resistance forces that arise from carving with or against the grain of a virtual wooden block, which in turn influences how the user carves their virtual wooden sculpture. In addition to providing a more realistic experience, this could be used to guide the user toward a more appealing final sculpture (e.g., by altering the direction of the grain to subtly change their hand movements, which changes the shape of the final carved surface).
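As a rough illustration of how such grain-dependent resistance might be computed, the sketch below models it as anisotropic damping that penalizes motion against the grain more heavily than motion with it. The function name, gain values, and the linear interpolation between with-grain and against-grain damping are illustrative assumptions for this sketch, not the rendering model of any particular system.

```python
import numpy as np

def grain_resistance_force(tool_velocity, grain_dir, k_with=0.4, k_against=1.5):
    """Illustrative anisotropic resistance: carving against the grain
    (velocity opposing grain_dir) is damped more than carving with it."""
    speed = np.linalg.norm(tool_velocity)
    if speed < 1e-6:
        return np.zeros_like(tool_velocity)
    v_hat = tool_velocity / speed
    g_hat = grain_dir / np.linalg.norm(grain_dir)
    alignment = np.dot(v_hat, g_hat)            # +1 with the grain, -1 against it
    # Interpolate the damping gain: low when aligned, high when opposed.
    k = k_with + (k_against - k_with) * (1.0 - alignment) / 2.0
    return -k * speed * v_hat                   # resistance opposes the tool's motion

# Example: moving the tool against the grain yields a larger opposing force.
f_against = grain_resistance_force(np.array([-0.1, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]))
f_with    = grain_resistance_force(np.array([ 0.1, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]))
```

To guide the user, the virtual grain direction itself could be slowly rotated toward the carving directions that produce the desired final surface.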
Immersive Cooperative Application: A major appeal of mixed reality experiences is the ability to connect with other users in shared virtual experiences. Central to these shared experiences is the ability to touch the other person, which can provide a greater sense of companionship and connection between users. Haptic forces can be used to encourage users to interact with or follow other users who are present in the same virtual experience, which may improve the users' sense of presence due to the enhanced realism.

Virtual Cooking Training Application: Given a seated VR experience in which the user practices their cooking skills in a virtual environment, a mobile tabletop robot can provide haptic feedback that represents the feedback provided by cooking utensils. For example, when spreading brownie batter in a baking pan, the user feels haptic forces when the virtual spreading utensil gets too close to the edges of the virtual baking pan. These forces could be rendered using a mobile robot with a flat surface that serves as a wall that the user's physical hand bumps into.
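One way to realize this is to track the user's hand above the virtual pan and, whenever the hand comes within a small margin of a pan edge, drive the tabletop robot so that its flat face sits at that edge. The sketch below is a minimal, hypothetical version of that logic; the function name, the axis-aligned rectangular pan model, and the 3 cm margin are assumptions made for illustration, not the implementation used in our study.

```python
import numpy as np

def wall_proxy_target(hand_pos, pan_center, pan_half_extents, margin=0.03):
    """Hypothetical planner: return (position, face_normal) for the tabletop
    robot when the hand nears a pan edge, or None to keep the robot away.
    All positions are 2D (x, y) coordinates on the table, in meters."""
    offset = hand_pos - pan_center                   # hand position in the pan's frame
    candidates = []
    for axis in (0, 1):                              # check the near edge along each axis
        dist_to_edge = pan_half_extents[axis] - abs(offset[axis])
        if dist_to_edge < margin:
            target = pan_center.astype(float)
            target[axis] += np.sign(offset[axis]) * pan_half_extents[axis]
            target[1 - axis] = hand_pos[1 - axis]    # slide along the edge to meet the hand
            normal = np.zeros(2)
            normal[axis] = -np.sign(offset[axis])    # flat face points back into the pan
            candidates.append((dist_to_edge, target, normal))
    if not candidates:
        return None
    _, position, face_normal = min(candidates, key=lambda c: c[0])
    return position, face_normal

# Example: a hand 5 mm from the +x edge of a 20 cm x 30 cm pan triggers a wall placement.
print(wall_proxy_target(np.array([0.095, 0.0]), np.array([0.0, 0.0]), np.array([0.10, 0.15])))
```

In practice the returned pose would be fed to the mobile robot's motion controller, with the robot retreating to a home position whenever no edge is within the margin.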