CHS: Small: Direct physical grasping, manipulation, and tooling of simulated objects

  • Healey, Christopher G. (PI)
  • St. Amant, Robert A. (CoPI)

Project Details

Description

This project will develop and evaluate an approach to physically grasping, manipulating, and applying tools to simulated objects, and a means of exploring three-dimensional information. The project is founded on the concept of tool use, in which handheld tool objects are used to modify the properties or appearance of target objects. The project integrates a wide range of findings in human-computer interaction and visualization, from bimanual and tangible user interfaces to augmented reality. At the core of this research is a desktop augmented reality system with a stereoscopic or monoscopic display, a haptic pointing device, and a camera focused on the user's hands. In one hand the user holds a physical wireframe cube that contains virtual objects; in the other hand the user holds the pointing device, its tip visually augmented to show its function as one of several possible tools: (a) a probe for pointing at, selecting, and moving objects; (b) a magnifying or semantic lens for filtering, recoding, and elaborating information; and (c) a cutting plane that shows slices or projection views. On the display, users watch the immediate, direct effects of their actions with the tools on the simulated object. The system will support visualization with fluid and natural interaction techniques, improving the ability of users to explore and understand 3D objects to some extent as if they were holding the objects in their hands. The project will provide societal benefits by generating theoretical and practical advances across multiple disciplines, such as a deeper understanding of the psychological theory and computer algorithms needed to give a person the seamless impression of manipulating an object that appears to be physically in front of them but exists only virtually. It should be relatively straightforward to transition the practical outcomes of this work to low-cost hardware such as head-mounted 3D displays, so that the project could ultimately provide everyday users with a means of directly creating and modifying objects they have in mind for 3D printing. University students will help to develop and evaluate the system, exposing them to a new and potentially transformative approach to augmenting a simulated space with physical tools and providing a larger population of students with opportunities for exciting hands-on computer science learning. Orthopedic surgeons have been recruited to assist in exploring the use of the system for preoperative surgical planning, for example by permitting the exploration of complex 3D bone structures.
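To make the cutting-plane tool concrete, the sketch below resamples a voxel volume along an arbitrary plane, the kind of slice view a tracked stylus could drive. This is a minimal illustration under assumed inputs (a synthetic NumPy volume, a hypothetical plane point and normal, and an arbitrary slice resolution), not the project's actual implementation.

```python
# Minimal sketch of a cutting-plane slice through a voxel volume.
# All inputs here (volume, plane point/normal, slice size) are
# illustrative assumptions, not the project's implementation.
import numpy as np
from scipy.ndimage import map_coordinates

def slice_volume(volume, point, normal, size=128, spacing=1.0):
    """Return a size x size 2D slice of `volume` through `point`
    perpendicular to `normal`, using trilinear interpolation."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    # Build two in-plane axes orthogonal to the normal.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(n @ helper) > 0.9:          # avoid a degenerate cross product
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper)
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    # Sample a size x size grid of 3D coordinates lying in the plane.
    coords = np.linspace(-size / 2, size / 2, size) * spacing
    gu, gv = np.meshgrid(coords, coords)
    pts = (np.asarray(point, dtype=float)[:, None, None]
           + u[:, None, None] * gu + v[:, None, None] * gv)
    # Trilinear interpolation; points outside the volume map to 0.
    return map_coordinates(volume, pts, order=1, mode="constant", cval=0.0)

# Example: slice a synthetic 64^3 volume with an off-axis plane.
vol = np.random.rand(64, 64, 64)
img = slice_volume(vol, point=(32, 32, 32), normal=(1, 1, 0.5))
print(img.shape)  # (128, 128)
```

In an interactive setting, `point` and `normal` would be updated each frame from the tracked pose of the handheld device, so the slice follows the user's hand motion directly.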

Development and evaluation of the novel interaction techniques in this hands-on augmented reality system will lead to a better understanding of how performance may be improved in data exploration with augmented tool use. The research will investigate: the spatial collocation of user actions and system effects; physical constraints between the hands, objects, and the environment; and a greater role for proprioception. Experimental results will give insight into questions that cross the boundaries between relatively disparate areas of research in HCI and visualization, specifically the potential benefits of proprioception in bimanual tasks, the extent to which mechanical constraints and stabilization can improve performance in precise interaction tasks, and how these factors may compensate for the visual errors associated with presenting 3D information on a stereoscopic or monoscopic display. The research will extend beyond the laboratory to a real and important medical domain, which will help validate the work in practice.

Status: Finished
Effective start/end date: 1/8/14 – 31/7/18

Funding

  • National Science Foundation: US$496,858.00

ASJC Scopus Subject Areas

  • Human-Computer Interaction
  • Computer Science(all)
