The race is on to find ways of delivering virtual touch. Virtual reality systems have so far concentrated on vision and hearing, but some set-ups now include sensors on gloves that allow monitoring of movements, and possibly some production of movements. There are also systems that, with feedback from remote sites, allow surgery across thousands of miles via telerobotics. One such system, based in Perth, was recently demonstrated to Queen Elizabeth II, who was reportedly "amazed". However, there is a long way to go before she, or any of us, can enjoy the experiences available on the holodecks of the Federation starships depicted in the TV series Star Trek.
Impressive as some virtual reality systems are with respect to vision and hearing, creating virtual touch is a challenge we have hardly begun to face. We don't even agree on what touch is. Some of us use the word to include only the skin (cutaneous) sensations such as pressure, shear, tickle, itch, temperature change, and chemical reactions. Others include in touch the position senses of kinesthesis and proprioception, terms which are themselves used to mean different things by different people.
To achieve realism in a virtual system we shall need to capture all the interacting features of our senses, regardless of what we call them. In view of this, I will side-step labelling issues for now and use the term 'haptics' to cover all senses to do with the reception or production of cutaneous and movement/position information.
What determines how real a percept is? One feature important for realism, common to all the senses, is externalization.