In the last few years, more and more people have started talking about “virtual reality”. The possibility of complete immersion in a virtual world via new technologies such as the Oculus Rift fascinates gamers, developers and UX designers alike. Looking around a virtual environment simply by turning your head, or moving virtual objects with your own hands, offers a completely new and extremely direct way of interacting. As a consequence, many users of VR applications really feel as if they were inside the virtual environment. It is exactly this feeling, called “immersion”, that makes users expect to interact with virtual objects just as naturally as they would with real ones. Unfortunately, this is not possible with contemporary setups: VR glasses only offer visual access to the virtual world. A user touching a virtual object will therefore not feel any haptic feedback.
To discover how integrating the tactile sense into a virtual reality application affects the immersion of its users, we at Centigrade developed the prototype “DeepGrip” – an application combining visual and haptic feedback in a virtual reality.
Modelling, testing, tricking and tinkering – Force Feedback and Virtual Reality
With DeepGrip we wanted to develop a prototype that allows users to interact with objects in a virtual reality and to receive haptic feedback from these objects. At the beginning of the project, the big question was therefore which virtual objects we wanted to simulate. Simulating fixed resistance as well as weight and other physical forces all seemed interesting. To combine as many of these aspects as possible, we finally decided that users of DeepGrip should interact with virtual magnets.
Repulsion and attraction between different magnets placed in a virtual room could be simulated when users approached them. Furthermore, users holding a magnet in their own virtual hand could pick up a magnetic object and feel both the touch of this object and its weight. The challenge in developing the prototype was thus to simulate different magnetic forces as realistically as possible. Our project goal was to imitate the behavior of a real magnet by combining feeling and seeing so convincingly that users would not be able to distinguish between virtual and real magnetic forces.
To realize the prototype, we combined the head-mounted display Oculus Rift DK2 with the force feedback device Novint Falcon. Both devices originate from the gaming industry and were originally developed to make the experience of a game as intense as possible.
Since the Oculus Rift has not been released to the public yet, the Development Kit 2, provided for developers, had to be used and adapted for the project. Waiting for the Novint Falcon was especially exciting – it took its time arriving in Germany from America. Would it meet our expectations of a realistic force simulation? How would it feel to move around in a virtual world using its control element?
When the Falcon arrived, we were a little disappointed at first. The included tutorial was obsolete, so we were not able to test the device properly. Only the test environment gave us a small foretaste of the possibilities the Falcon could open up. At least we could convince ourselves that the Falcon was able to create strong physical forces within a 3D range of 10 x 10 x 10 cm. In fact, the created forces could be so strong that a user holding the control element would have difficulty fighting against them. With that, the most important preconditions were met and we could start developing DeepGrip.
To develop the application, we used the game engine Unity3D. Since this engine is designed to support as many input and output devices as possible and to give developers as much freedom as they want, integrations for both the Oculus Rift and the Novint Falcon were already available. That is why the first successful combination of these two devices could be realized with surprisingly little effort.
Preparing the 3D scene in Unity3D was also easy, because all necessary models, animations and textures could be created externally and then simply imported into Unity. As a testing scenario we chose a warehouse, with users slipping into the role of a virtual warehouse worker. From the perspective of this worker, users would find themselves standing in front of a virtual table in the warehouse. On this table, different magnetic objects were placed to interact with. In addition, users would hold a magnet in their virtual hand.
Since the Oculus Rift provided the needed integration for Unity, the project’s visual component could be implemented very quickly. After creating the warehouse and placing the model of a male worker, we just had to position the camera on the worker’s neck. For this purpose we removed the model’s head, so that users wearing the Oculus could look at their virtual body from the worker’s perspective.
At this point the Novint Falcon came into play, letting users control the movements of the worker in addition to their view. Our goal was to give users the possibility of moving the worker’s arm just as if it were their own: they would control the virtual arm by moving their own, with their movements transferred from the control element of the Novint Falcon to Unity. To make this connection between reality and the virtual world as convincing as possible, the virtual worker would not hold the magnet directly in his hand, but rather hold a handle fixed to the magnet. We therefore attached a grip-like handle to the Falcon (a real piece of handiwork, too!), so that every user would feel the same thing in their real hand as they saw in their virtual hand.
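In code, such a mapping essentially boils down to scaling and offsetting the device position. The following Python sketch illustrates the idea; the ranges, names and origin are assumptions for illustration, as the actual DeepGrip implementation lives in Unity scripts and may differ:

```python
# Hypothetical sketch: mapping the Falcon's ~10 x 10 x 10 cm workspace
# onto the reach of the virtual worker's arm. All constants are assumed.

FALCON_RANGE_M = 0.10   # usable Falcon workspace per axis, in meters
ARM_RANGE_M = 0.60      # assumed reach of the virtual arm per axis

def device_to_world(device_pos, origin=(0.0, 1.0, 0.5)):
    """Scale a Falcon position (meters, centered on zero) into world space."""
    scale = ARM_RANGE_M / FALCON_RANGE_M
    return tuple(o + p * scale for o, p in zip(origin, device_pos))

# A 2 cm grip movement to the right moves the virtual hand about 12 cm.
hand = device_to_world((0.02, 0.0, 0.0))
```

The amplification (here a factor of six) is what lets a small desktop device drive an arm with a much larger reach, at the cost of some precision.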
Now, to simulate the magnetic forces, we defined spherical magnetic fields in Unity and specified their strengths.
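The article does not show the exact falloff we used, but a clamped inverse-square model is one plausible way to sketch such a field. The Python snippet below is an illustration only; all names and constants are assumptions, not the actual DeepGrip code:

```python
import math

# Hypothetical clamped inverse-square force model for the virtual magnets.
# Clamping near contact keeps the force device within its output limits.

def magnet_force(pos_a, pos_b, strength=1.0, attract=True, max_force=8.0):
    """Force vector (assumed newtons) exerted on magnet A by magnet B."""
    dx = [b - a for a, b in zip(pos_a, pos_b)]
    dist = math.sqrt(sum(c * c for c in dx)) or 1e-6  # avoid div by zero
    magnitude = min(strength / dist**2, max_force)    # clamp near contact
    sign = 1.0 if attract else -1.0                   # flip for repulsion
    return [sign * magnitude * c / dist for c in dx]

# Halving the distance quadruples the force, until the clamp kicks in.
pull = magnet_force((0.0, 0.0, 0.0), (0.5, 0.0, 0.0))
```

Each frame, a force like this would be accumulated over all magnets in range and sent to the Falcon as the output force.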
Soon we noticed that the Novint Falcon needed a really high frame rate to ensure a realistic force simulation. As soon as the Oculus Rift was connected, the integration scripts forced a frame rate of 75 FPS. This might seem high at first glance, but the Novint Falcon expects up to 1000 updates per second to realistically map the human tactile sense. With only 75 FPS a realistic force simulation could not be implemented – when tested, one could only feel a strong jerk when approaching a magnet.
For this reason we had to adjust the relevant vendor scripts and completely deactivate vertical synchronization, which increased the frame rate to 130 FPS. By foregoing dynamically generated shadows, another 20 FPS could be gained. Of course, 150 FPS is still a far cry from the 1000 updates per second the Novint Falcon asks for, but surprisingly it was enough to replicate a real magnet’s behavior: repulsion, attraction and the simulation of weight could all be realized at this frame rate. Unfortunately, fixed resistances – such as the table the worker was standing in front of – could not be simulated at this rate. Since we did not succeed in improving the rate much further, we had to trick the system a little at this point. To simulate the impact with the table, we constructed a platform out of Lego bricks and positioned it directly underneath the Novint Falcon. Since users are shielded from their real environment by the Oculus Rift while working with the application, they cannot perceive this “trick” visually, but they can feel a real resistance when touching the virtual table.
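A quick back-of-the-envelope calculation illustrates why the update rate matters so much for hard contacts: between two updates the hand keeps moving, so a stiff virtual surface is penetrated by several millimeters before any counter-force is applied, which is felt as a jerk. The numbers below (hand speed, stiffness) are illustrative assumptions, not measured values:

```python
# Why low update rates break hard contacts: the deeper the undetected
# penetration per frame, the larger the sudden spring force on contact.

HAND_SPEED = 0.5     # m/s, a moderate arm movement (assumed)
STIFFNESS = 2000.0   # N/m, an assumed spring constant for a "hard" surface

def first_contact_force(fps):
    """Spring force after one frame of undetected penetration."""
    penetration = HAND_SPEED / fps   # meters traveled between updates
    return STIFFNESS * penetration   # Hooke's law: F = k * x

for fps in (75, 150, 1000):
    print(f"{fps:5d} FPS -> {first_contact_force(fps):5.2f} N")
```

At 75 FPS the first contact force is over 13 N, at 1000 updates per second only 1 N – which is why rigid surfaces feel like jolts at low rates while softer force fields, such as our magnets, remain usable.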
Real or simulated?
Finally, to determine how realistic the simulation of the magnetic forces really was, we carried out a study with 38 subjects. Since we wanted to know whether the participants were able to judge the authenticity of the magnetic forces, we split them up into two groups. The participants of the first group would work with the prototype and consequently only feel the simulated magnetic forces. For the second group we created a test setup with real magnets. Of course, the test subjects were not informed which of the two groups they belonged to; they only knew that both groups existed. After the test we would ask each participant whether they believed they had been working with real or simulated magnets.
To make sure that the statements of both groups were comparable, the same scenario had to be used for both test groups. This meant that the participants of both groups wore the Oculus Rift during the test and found themselves in the same virtual environment. Also, in both groups the movement of the worker’s arm was controlled with the Novint Falcon. In the group working with real magnets, however, the Falcon only functioned as the object to be moved and tracked the movement of the hand – it did not simulate any forces.
Thus, the only difference between the two test setups was the platform below the Novint Falcon. This is where we placed real magnets for the second group. In contrast, for the group working with the simulation we only used the Lego platform.
The test participants came from different age groups and professions. About half of the subjects had plenty of experience with computer science and computer games, while the other half had only little computer knowledge. Nevertheless, all participants were able to complete the test without any problems.
The test always started with a few explanations – what is the goal of this test, what is the participant’s task, what should he or she concentrate on, and so on. After that, the subject was asked to put on the Oculus Rift and the application was started. At the beginning of the test, we gave the participant a few seconds to look around the virtual room and inspect the virtual body. At this point, the worker’s shoes in particular often provided amusement – nearly every participant exclaimed “hey, my shoes are orange!” while looking down at their virtual body.
Once the test persons had oriented themselves in the virtual environment, the test leader guided their hand to the Novint Falcon and instructed them to slowly move the arm upwards. Even though some participants felt a kind of uncertainty during the first seconds of the upward movement, everyone intuitively grasped how the system worked the moment they noticed that the virtual arm of the worker followed their own arm movements. After that, the subjects had the opportunity to grab a magnetic object from the table with the magnet in their virtual hand and then move it to a repulsive and an attractive magnetic area. In a second test round, all participants were asked to concentrate on the magnetic forces and consider whether they felt real or simulated.
After each test we conducted an interview with the subject, which opened with exactly this question: real or simulated? Some questions about the general impression and the operation of the system followed. Afterwards, every participant filled out a questionnaire in which they could assess the experienced degree of realism and the controllability of the system in detail.
Especially older subjects with only limited knowledge of and affinity for computers described the system control as “simple” and were partly surprised at how easily they could orient themselves in the virtual environment. Many test persons said that they had the feeling of actually “being in this room”. Moreover, more than half of the participants were of the opinion that especially the identification with the virtual model and the possibility of direct interaction with the objects had strengthened this impression. In particular, the handle, as a connecting element between reality and the virtual world, was positively highlighted by some of the subjects. One of the participants said: “When I felt the handle in my hand, I completely immersed into that world.” Both the controls and the immersion in the scene were rated positively in both groups, which was confirmed in the analysis of the questionnaires.
When asked to judge the authenticity of the tested magnet, most of the subjects needed a little time to reflect. Although many came to a statement quickly, they were often unsure whether they were really right. In both groups there were people who could not settle on one of the answers until the end of the interview; some even changed their mind during the interview. Exactly this kind of ambivalence in both groups indicates that losing the direct, visual connection to an object in your own hand can make it really difficult to assess its authenticity.
The evaluation of the final answers of all subjects resulted in a neck-and-neck race between “real” and “simulated”.
In the group working with real magnets, 9 persons believed that the magnetic forces were simulated, while 10 thought that they were real. In the other group, which worked with the simulation, 9 subjects judged the forces to be real, while 10 classified them as simulated. What a great result! We were almost surprised ourselves at how similarly the subjects in both groups had experienced the magnetic forces. This shows that by combining visual and haptic feedback in a virtual reality, magnetic forces can be simulated so realistically that subjects are not able to identify the simulation as such.
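As a rough sanity check (our interpretation, not part of the original study write-up): pooling both groups, 20 of 38 answers were correct, which under pure guessing would follow a Binomial(38, 0.5) distribution. A small Python sketch with an exact two-sided binomial test shows that this result is statistically indistinguishable from chance:

```python
from math import comb

# Exact two-sided binomial test: sum the probabilities of all outcomes
# at most as likely as the observed count k, under chance level p.

def binom_two_sided_p(k, n, p=0.5):
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    return sum(x for x in pmf if x <= pmf[k] + 1e-12)

print(binom_two_sided_p(20, 38))  # -> ~0.87, far from any significance level
```

A p-value this large means the participants' answers carry essentially no information about which setup they used – exactly the outcome the study aimed for.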
With the prototype DeepGrip it is possible to simulate realistic magnetic forces in a virtual environment. Moreover, it allows users to interact with objects in a virtual world in an intuitive way. Through the combination of visual and haptic feedback DeepGrip succeeds in strengthening the immersion of its users.
The result of the study on combining sight with touch makes clear that virtual environments can strongly benefit from a convincing haptic component. After participating in the test, many subjects expressed the wish for more applications in this field.
Virtual reality applications combining seeing and feeling could be used profitably in many different areas. Training and simulation for students in both medical and industrial fields could be improved. For example, future doctors could practice performing surgery in a virtual environment during their education, without the risk of causing real damage to a real patient. Workers in industry could learn how to handle special machines through visual-haptic VR applications, because their movements can be guided and thus trained there. Moreover, remote control of machines could be facilitated and improved, because with visual-haptic VR, users could handle machines anywhere in the world as if they were directly in front of them.
Given the many possibilities it offers and the increase in quality for virtual environments demonstrated in this study, it becomes clear how important visual-haptic VR will become in the next few years. Although it has so far barely been considered outside of the gaming context, its impact on industry and medicine will be an exciting one to watch and participate in.
DeepGrip shows that it is possible to develop a VR application that convinces users of the authenticity of simulated physical forces, even with a relatively small budget and timeframe. This is why we at Centigrade are really curious about the possibilities, opportunities and projects in which this combination of seeing and feeling in a virtual world will be used in the future.