InnerSense
A multi-sensory system for inner-VR-headset experiences.
We feel the real world through a range of senses. Similarly, experiencing a virtual environment through multiple sensory modalities may heighten both our presence within a scenario and our reaction to it. Researchers and developers have long been interested in adding more dimensions to the VR experience, such as haptics, stereo audio, and even olfaction. Most of these add-ons, however, especially haptic devices, target the hands. This is reasonable: most of the time, we touch the world with our fingertips. But they overlook the eye area, which plays an important role in ambient perception and is very sensitive to a range of stimuli such as temperature and load; because this area is covered by the headset, there is little space left to deliver any stimulation. In addition, the field of view (FoV) of a VR headset is very narrow compared with that of our eyes, so information about the virtual experience is lost.
To address these issues, we managed to fit tiny actuators inside this covered space, introducing InnerSense - a multi-sensory system for inner-VR-headset experiences comprising inner-headset airflow, thermal feedback on the cheeks, and luminance enhancement. The system is controlled by an Arduino Mega, the headset used for our experiments was an Oculus Quest 2, and we developed several test scenarios in Unity.
Sensation on the Face
The two pictures above show high-density thermal sensitivity maps of the human face and mean tactile sensitivity maps of the whole human body. They indicate that our cheeks are sensitive to warming and very sensitive to cooling. The right picture shows that the eye area has relatively high sensitivity to tactile and nociceptive stimulation. The area covered by the headset is therefore a promising site for applying certain types of ambient stimulation.
Vision of a VR Headset
The FoV of the Oculus Quest 2 is about 104° horizontally and 96° vertically. Its horizontal FoV is noticeably narrower than that of our eyes, which is about 124°. The vertical FoV of our eyes is about 120°, far wider than what the Quest 2 provides. This led us to the idea of using extra LEDs to compensate for the limited visual field of the headset display.
Fabrication and Early Experiments
System Overview
The hardware was built on an Arduino Mega, with the electronic components soldered onto a shield. The system was powered by a 12 V source and connected to a MacBook running Unity.
Thermal feedback was provided by two individual heating pads, each controlled by an IRFZ44N N-channel MOSFET; the pads were stitched to the facial interface. To drive the inner-headset airflow, two independent centrifugal fans were controlled by an L293D motor driver. Luminance enhancement was achieved with an array of six LEDs.
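As a rough illustration of how these components hang together, here is a minimal Arduino setup sketch. All pin assignments are assumptions for illustration, not the prototype's actual wiring.

```cpp
// Minimal Arduino Mega setup for the InnerSense shield.
// All pin numbers are illustrative assumptions, not the actual wiring.
const int HEAT_L = 2, HEAT_R = 3;          // IRFZ44N gates for the heating pads (PWM)
const int FAN_EN_L = 5, FAN_EN_R = 6;      // L293D enable pins for the two fans (PWM)
const int FAN_IN1 = 22, FAN_IN2 = 23;      // L293D direction pins (fans spin one way)
const int LEDS[6] = {7, 8, 9, 10, 11, 12}; // six-LED luminance array (PWM)

void setup() {
  Serial.begin(9600);                      // commands arrive from Unity over USB serial
  pinMode(HEAT_L, OUTPUT);
  pinMode(HEAT_R, OUTPUT);
  pinMode(FAN_EN_L, OUTPUT);
  pinMode(FAN_EN_R, OUTPUT);
  pinMode(FAN_IN1, OUTPUT);
  pinMode(FAN_IN2, OUTPUT);
  digitalWrite(FAN_IN1, HIGH);             // fixed direction: blow into the headset
  digitalWrite(FAN_IN2, LOW);
  for (int i = 0; i < 6; i++) pinMode(LEDS[i], OUTPUT);
}

void loop() {
  // Command parsing goes here (see the protocol sketch under Applications).
}
```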
Thermal Feedback on Cheeks
The two individual heating pads work independently, which potentially provides directional cues. The temperature rises from room temperature (about 26 °C) to a peak of 69 °C. By applying different voltages through the MOSFETs, the pads can settle at six steady levels (a control sketch follows the list):
Level 1: Room temperature, 26 °C
Level 2: Body temperature, 37 °C
Level 3: 45 °C
Level 4: 53 °C
Level 5: 60 °C
Level 6: Peak, 69 °C
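Here is a minimal sketch of how a level might be driven, reusing the heater pins from the overview sketch; the duty-cycle values are assumptions standing in for per-pad calibration.

```cpp
// Map a thermal level (1-6) to a PWM duty cycle on a heating-pad gate.
// The duty values are illustrative; real ones would come from calibration.
const int HEAT_DUTY[6] = {0, 70, 110, 150, 200, 255}; // level 1 = pad off (room temp)

void setHeatLevel(int gatePin, int level) {
  level = constrain(level, 1, 6);
  analogWrite(gatePin, HEAT_DUTY[level - 1]); // mean voltage scales with duty cycle
}

// Example: setHeatLevel(HEAT_L, 4); // left pad settles toward ~53 °C
```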
Inner Headset Airflow
Two customized rings were fixed to the lenses. One end of each ring connects tightly to a centrifugal fan; on the other end, seven holes are spaced evenly around the ring, letting air flow from outside the headset to the inside.
When the full 12 V is applied, the fans run at full capacity, producing maximum wind. Lowering the PWM duty cycle reduces the fans' RPM, producing a milder breeze. Since the two fans can run separately at different airflow strengths, they can also produce a directional cue for users, as sketched below.
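A hedged sketch of the fan control, again reusing the pin constants from the overview sketch; the strength values in the usage example are arbitrary.

```cpp
// Set one fan's strength (0-255) via its L293D enable pin;
// the direction pins stay fixed, so PWM only scales the RPM.
void setFanStrength(int enablePin, int strength) {
  analogWrite(enablePin, constrain(strength, 0, 255));
}

// Directional airflow: a stronger breeze from the left than from the right.
// setFanStrength(FAN_EN_L, 255);  // full wind
// setFanStrength(FAN_EN_R, 90);   // mild wind
```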
Luminance Augmentation
The luminance array consists of six LEDs. Four of them are attached to the rings, and two are attached to the inner edges of the headset by two pairs of magnets, giving each assembly the advantage of easy mounting and removal. By controlling the PWM duty cycle, each LED can produce graded luminance.
Four LEDs sit at the top and bottom of the left and right rings, and the other two sit on the outer side of each ring (the left of the left ring and the right of the right ring). By judging the position of a virtual light source relative to the user, the LED array can supply extra luminance to fill the gap between the eyes' FoV and the display. This can serve different purposes, such as navigation, conveying extra information, and atmosphere rendering.
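To make the idea concrete, here is a hedged sketch of how the side LEDs could fill in an off-screen light source. It assumes Unity reports the light's horizontal angle relative to the user's gaze, reuses the LEDS[] array from the overview sketch, and invents both the index assignment and the linear falloff.

```cpp
// Drive the two side LEDs from the horizontal angle (in degrees) of a
// virtual light source relative to the user's gaze, as reported by Unity.
// Negative angles are to the user's left. The LED indices and the linear
// falloff are illustrative assumptions.
const int LED_SIDE_L = 4, LED_SIDE_R = 5;            // indices into LEDS[]

void updateSideFill(float azimuthDeg, int maxBrightness) {
  // The display covers roughly +/-52 deg (half of the 104-deg horizontal FoV);
  // light beyond that spills onto the side LED of the matching ring.
  float spill = (abs(azimuthDeg) - 52.0) / 38.0;     // 0 at the FoV edge, 1 at 90 deg
  int duty = constrain((int)(spill * maxBrightness), 0, maxBrightness);
  analogWrite(LEDS[LED_SIDE_L], azimuthDeg < 0 ? duty : 0);
  analogWrite(LEDS[LED_SIDE_R], azimuthDeg > 0 ? duty : 0);
}
```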
Applications
We created several scenes in Unity to test our prototype.
Directional Cues
Each module can provide a directional cue in the virtual world. Heat and airflow deliver non-visual directional information. In our experiments, if only one fan was on, or if there was a noticeable RPM difference between the two fans, participants instantly felt the difference and could tell where the “wind” came from. Similarly, when the two heating pads differed by more than one level, participants could tell whether they were approaching a fireplace from a certain direction. The LEDs served as a more direct cue: through a variety of brightness combinations, participants could receive extra information. Furthermore, by combining two or all of the modules, InnerSense creates multi-modal cues for VR users; for example, with airflow and heating together, participants described a sensation of hot wind.
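To show how such cues might be triggered from Unity, here is a hedged sketch of a one-line serial protocol built on the helpers above; the message format ('F' fan, 'H' heat, 'L' LED) is an invented assumption, not the exact protocol of the prototype.

```cpp
// Parse one-line commands from Unity, e.g. "F 0 255" (left fan to full),
// "H 1 3" (right pad to level 3), "L 4 180" (left side LED brightness).
void loop() {
  if (Serial.available()) {
    char module = Serial.read();           // 'F', 'H', or 'L'
    int index = Serial.parseInt();         // which fan / pad / LED
    int value = Serial.parseInt();         // strength, level, or duty
    Serial.readStringUntil('\n');          // discard the rest of the line
    switch (module) {
      case 'F': setFanStrength(index == 0 ? FAN_EN_L : FAN_EN_R, value); break;
      case 'H': setHeatLevel(index == 0 ? HEAT_L : HEAT_R, value);       break;
      case 'L': analogWrite(LEDS[constrain(index, 0, 5)], value);        break;
    }
  }
}
// "Hot wind from the left" is then just two messages: "F 0 255" and "H 0 4".
```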
Immersive Entertainment
The combination of the modules can not only provide directional cues but also render a particular atmosphere, thereby influencing users' emotions. For instance, the LEDs and heating pads were good at simulating extremely strong light (sun dazzle) and sparks. In the roller coaster scene, with the wind blowing onto participants' eyes, the immersive experience was dramatically enhanced.
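For a flavor of how such an atmosphere effect could be scripted, here is a hedged sketch of a spark effect built on the helpers above; every timing and brightness constant is an assumption.

```cpp
// Crude "sparks": brief random bursts on random LEDs over a warm baseline.
void playSparks(unsigned long durationMs) {
  setHeatLevel(HEAT_L, 3);                    // warm baseline on both cheeks
  setHeatLevel(HEAT_R, 3);
  unsigned long start = millis();
  while (millis() - start < durationMs) {
    int led = random(0, 6);                   // pick one of the six LEDs
    analogWrite(LEDS[led], random(180, 256)); // bright, uneven flash
    delay(random(30, 120));
    analogWrite(LEDS[led], 0);
    delay(random(50, 200));
  }
  setHeatLevel(HEAT_L, 1);                    // let the pads drift back to room temp
  setHeatLevel(HEAT_R, 1);
}
```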
By Ling Qin, Christian Rottger, Jonathan Tang.
Jan. 2021 - July 2021
Sponsored by: Gravity Sketch, Dyson School of Design Engineering at Imperial College London, and Department of Mechanical Engineering at Technical University of Munich