Capacity planning is often a challenging task in medium-sized and large companies. As the number of employees grows, it becomes difficult at some point to distribute them effectively and, above all, efficiently across task areas and projects. Such planning relies on large amounts of data ("Big Data"), which is often difficult to visualize and interpret.
We at Centigrade have been working with new technologies such as Virtual Reality (VR), Augmented Reality (AR) and other forms of 3D visualization for quite some time. This opens up completely new fields of application. One of them could be personnel capacity planning supported by Augmented Reality, or, in a later expansion stage, the planning of machines and materials as well. By extending data visualization from 2D representations to 3D projections in real space, completely new interaction and design concepts can be applied, allowing users to explore the data in a natural way. This would not only be more enjoyable, it could also make data analysis more efficient. It would be particularly exciting to feed the usage data from such planning rounds back into the Big Data pool and thus, over time, arrive at predictive capacity planning with ever better suggestions. However, using such a still quite new technology raises questions:
- Can an AR application enable more effective capacity planning than conventional desktop solutions and efficiently support the HR department?
- What is the added value in terms of usability and utility when using such an application?
- How efficiently can it be operated?
- How do users rate the user experience?
As part of my master's thesis, I spent the last six months developing a holographic application to find answers to these questions: DeepData. A project report.
Microsoft HoloLens AR glasses and web technology
For this application, Microsoft's HoloLens AR glasses were used. They allow the wearer to view computer-generated 3D graphics and geometries through semi-transparent displays. This is made possible by various sensors that analyze the room in real time; based on this information, 3D objects are projected into the viewer's field of vision. Microsoft uses two "light engines" for this. These work like small projectors and transport the image to the eye via total internal reflection through the display. The objects are then displayed with correct perspective in the field of vision and, based on the tracking data, remain in place in the room so that they can be viewed from all sides.
The hardware allowed for several development options: building the application with DirectX, with Unity 3D, or with web technologies. We decided on web technologies. This meant the application could be changed quickly after the fact and developed cross-platform: employees can also use it via smartphone or laptop, without a HoloLens. We also wanted to find out whether it is possible to develop AR applications with web technologies at all, and how well this works.
Understanding HR employees
In order to develop the application according to the needs of the users, we first had to understand them: who is a typical user? What tasks does this person have, and what is their goal in planning meetings? These questions were answered with the help of a requirements analysis built around a fictional employee.
This approach enabled us to derive individual requirements for the software. At this stage it also quickly became clear that the employees attached great importance to effectiveness and efficiency in capacity planning. One requirement for the application was that it had to be immediately clear which employees were overloaded and where planned and completed hours diverge strongly. The latter was to be implemented first.
New concepts and interaction designs for capacity planning
Once we had formed an idea of the future user, we developed the first concept drawings and interaction designs. We used not only simple scribbles but also 3D wireframes, which were created with the help of Virtual Reality.
The concept is fundamentally based on two views, each consisting of simple bars and coordinate axes. The right-hand view shows how much capacity each employee has available in each week and which hours have already been planned for them, a typical view in many capacity planning rounds. The employee icons can then be used to open a menu for selecting individual projects. The project view (on the left side of the picture) then opens, showing the planned and completed hours per employee and week.
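The data behind these two views can be imagined as a simple structure per employee: available capacity per week, plus planned hours per project and week. The following sketch is a hypothetical reconstruction, not the real DeepData schema; all field and project names are assumptions. It shows how an overload, the key signal of the bar display, could be derived:

```javascript
// Hypothetical capacity record: weekly capacity plus planned hours per project.
const employees = [
  {
    name: 'A. Planner',
    capacity: { CW40: 40, CW41: 40 },               // available hours per week
    planned: {
      ProjectX: { CW40: 30, CW41: 25 },
      ProjectY: { CW40: 20, CW41: 10 },
    },
  },
];

// Sum the planned hours of all projects for one calendar week.
function plannedTotal(employee, week) {
  return Object.values(employee.planned)
    .reduce((sum, perWeek) => sum + (perWeek[week] || 0), 0);
}

// An employee is overloaded in a week when planned hours exceed capacity.
function isOverloaded(employee, week) {
  return plannedTotal(employee, week) > employee.capacity[week];
}

console.log(plannedTotal(employees[0], 'CW40')); // 50
console.log(isOverloaded(employees[0], 'CW40')); // true: 50 planned > 40 available
```

In the application, a result like this would simply be mapped to the height and color of a bar in the room.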
The entire application is controlled by gaze and gestures. To interact, the user can either simply look at an object or select it by executing an air-tap gesture. With this kind of interaction, no additional hardware is required and the user can work with the program in a natural way.
Efficient, clear, playful …
… this is how the prototype application was meant to work. Users should be able to identify at a glance which employees are over- or underloaded. The arrangement of the elements should be clear and well structured, so that the personnel planners would not lose track and would actually enjoy using the application.
To get a better feel for the 3D objects, I implemented the concept directly in a 3D modeling program. This allowed the individual elements to be viewed from several perspectives and gave a better sense of their effect on the user.
As you can see above, we changed the concept so that the user sees all employees of the company in one view (blue and green bars), using the whole room for information retrieval. This lets the user explore the data in a new, natural way. If she looks at an avatar or a bar in a view, a menu opens with the actual data for this bar. If she clicks on the avatar, a menu for selecting individual projects appears as well.
Performance in the AR application: From code to 3D
The concept graphics now had to be turned into a real application on the HoloLens. A big problem at the start of the implementation was the performance of the AR glasses. Initial tests quickly showed that the tracking and the creation of spatial data cost a lot of performance, so no complex geometries with elaborate shadows or the like could be used. To solve this problem, I created my own geometries from scratch, starting with a simple cube consisting of individual points and polygons. Thanks to its full parametrization, it could be generated and changed dynamically. To further improve performance and render edges more clearly, I developed a simple shader that made simple, performant shading possible.
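A fully parametrized cube of this kind can be sketched in plain JavaScript. This is a hypothetical reconstruction, not the original DeepData code: the function generates the eight corner vertices and twelve triangles of a box with arbitrary width, height, and depth, which is all a 3D bar chart needs, delivered as flat typed arrays ready for the GPU:

```javascript
// Generate a minimal cube mesh: 8 vertices and 12 triangles (two per face).
// width, height, and depth are free parameters, so bars of any size can be
// produced from the same function.
function createCube(width, height, depth) {
  const [w, h, d] = [width / 2, height / 2, depth / 2];
  // The 8 corners, centered on the origin.
  const vertices = new Float32Array([
    -w, -h, -d,   w, -h, -d,   w,  h, -d,  -w,  h, -d, // back face
    -w, -h,  d,   w, -h,  d,   w,  h,  d,  -w,  h,  d, // front face
  ]);
  // 12 triangles, each a triple of indices into the vertex list.
  const indices = new Uint16Array([
    0, 2, 1,  0, 3, 2, // back
    4, 5, 6,  4, 6, 7, // front
    0, 1, 5,  0, 5, 4, // bottom
    3, 7, 6,  3, 6, 2, // top
    0, 4, 7,  0, 7, 3, // left
    1, 2, 6,  1, 6, 5, // right
  ]);
  return { vertices, indices };
}

const bar = createCube(0.1, 0.4, 0.1); // a thin bar, 40 cm tall
console.log(bar.vertices.length / 3, bar.indices.length / 3); // 8 vertices, 12 triangles
```

Because every attribute is computed rather than loaded from a model file, bar heights can be animated simply by regenerating or scaling the geometry, with no heavyweight assets involved.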
After creating the individual views and their geometries, I was able to put the interaction design into action. I used principles from the field of computer games, such as raycasting, to make individual objects interactive. Raycasting made it possible to calculate which cube the user was currently looking at and which actions should be triggered accordingly within the application. Individual cubes could thus be re-colored, or object positions changed, based on the user's gaze. It also enabled gesture interaction: individual menus were generated as soon as the user looked at a particular object and executed the air-tap gesture.
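Gaze selection of this kind boils down to casting a ray from the head position along the view direction and testing it against each cube; the closest hit is the object the user is looking at. A sketch of how this could look, using the classic slab method for ray vs. axis-aligned box intersection (the helper names are assumptions, not the original implementation):

```javascript
// Ray vs. axis-aligned bounding box (slab method).
// Returns the distance t along the ray to the hit, or null if there is none.
function intersectBox(origin, dir, boxMin, boxMax) {
  let tNear = -Infinity, tFar = Infinity;
  for (const axis of ['x', 'y', 'z']) {
    if (dir[axis] === 0) {
      // Ray is parallel to this slab: it must already lie inside it.
      if (origin[axis] < boxMin[axis] || origin[axis] > boxMax[axis]) return null;
      continue;
    }
    let t1 = (boxMin[axis] - origin[axis]) / dir[axis];
    let t2 = (boxMax[axis] - origin[axis]) / dir[axis];
    if (t1 > t2) [t1, t2] = [t2, t1];
    tNear = Math.max(tNear, t1);
    tFar = Math.min(tFar, t2);
    if (tNear > tFar || tFar < 0) return null; // ray misses or box is behind us
  }
  return tNear;
}

// Pick the cube the user is currently gazing at: the closest hit wins.
function pickGazedCube(headPos, gazeDir, cubes) {
  let best = null, bestT = Infinity;
  for (const cube of cubes) {
    const t = intersectBox(headPos, gazeDir, cube.min, cube.max);
    if (t !== null && t < bestT) { bestT = t; best = cube; }
  }
  return best; // e.g. highlight it, or open its menu on an air-tap
}
```

On each frame the picked cube would be highlighted; when the air-tap gesture arrives, the action bound to that cube (opening a menu, selecting a project) is triggered.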
The concept study – usability tested according to ISO standard
To answer the research questions posed at the beginning, the users themselves had to be involved again at this point. They tested not only the new application but also the previous solutions, so that the two types of software could be compared. In addition to the actual users, we also interviewed employees who had nothing to do with HR or capacity planning. This allowed the two groups to be compared and more general statements to be made about the software.
I divided the study into four sections. We started with general questions to learn more about the test subjects themselves, for example whether they plan personnel at all or already had experience with Augmented Reality. We then measured how long it took them to answer questions that might come up in planning meetings, once with the old solutions and once with DeepData, to learn more about the efficiency of the holographic application. The study was rounded off by the standardized questionnaire based on ISO 9241-110, which contains 21 questions on the seven dialogue principles of good usability, and by an interview on the user experience with DeepData. The latter was intended to let the ten participants give their own assessments of the software.
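With 21 items for seven dialogue principles, each principle is covered by three questions. Aggregating a participant's answers per principle can be sketched as follows; the rating scale and the order of the items are assumptions for illustration, not the exact questionnaire used:

```javascript
// The seven dialogue principles of ISO 9241-110.
const principles = [
  'suitability for the task', 'self-descriptiveness', 'controllability',
  'conformity with user expectations', 'error tolerance',
  'suitability for individualization', 'suitability for learning',
];

// answers: 21 ratings in questionnaire order (here assumed to be a 1..5 scale),
// where items 0-2 belong to principle 0, items 3-5 to principle 1, and so on.
function scorePerPrinciple(answers) {
  const scores = {};
  principles.forEach((principle, i) => {
    const items = answers.slice(i * 3, i * 3 + 3);
    scores[principle] = items.reduce((a, b) => a + b, 0) / items.length;
  });
  return scores;
}
```

Averaging these per-principle scores across all ten participants then yields the profile of strengths and weaknesses discussed below.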
The general questions showed that five of the ten participants plan staffing capacities every week; for the sake of simplicity, I will call them team leaders from here on. In addition, eight of the test persons had already used an Augmented Reality device at least once, in most cases Microsoft's HoloLens, but also devices such as the DAQRI smart helmet or smartphones.
The time measurements for the questions from planning meetings showed that the team leaders were faster with the previous solutions in some areas, partly because of their existing experience with that software. For extensive questions, however, or questions that previously required several programs, DeepData showed a clear efficiency advantage. The participants achieved better results with DeepData especially when it came to recognizing worked and planned hours.
The data also showed that regular employees were generally faster with DeepData, because the previous solutions were very confusing and the data was presented more clearly in the holographic application. The collected times also showed that, after a little practice, users without AR experience could retrieve information just as quickly or even faster.
Review of the questionnaire revealed several things. First, there are several areas in which the software performs well overall: task adequacy, controllability, conformity with expectations, and suitability for learning. On the other hand, the application has major weaknesses in other areas. One is self-descriptiveness, since at the beginning one always had to open a legend to find out which information was currently being displayed. Error tolerance and individualization also performed poorly: the former because users had to put a lot of effort into correcting their own errors, the latter, for example, because of the non-existent filter settings.
Further analysis of the individual dialogue principles across the user groups led to the following findings. The team leaders in particular rated task adequacy as poor, mainly because the application could only be used to retrieve information; no actual planning could be carried out. To really assign employees to individual projects, one currently still has to use the previous solutions. It also became apparent that employees without AR experience had difficulties controlling the application, often because of problems with gesture control, for example when the system did not recognize a test person's hand: to be recognized correctly, the hand must be held within a certain sensor area.
At the end of the study, I interviewed each test person about the software. This yielded a lot of interesting information about Augmented Reality and about DeepData itself. Each participant had a different view of DeepData and its use for capacity planning:
“I think it’s great that you can now see how many hours an employee has worked.”
“The visualization was very intuitive and it was fun to gather information through movement, but the field of view of the glasses was a bit small to see all the information at once.”
“I really liked the control with my head in interaction with the gestures. But after a while the glasses get heavier and heavier.”
“I found it great to be able to get such an overview of all employees and had fun learning the software.”
“I can’t imagine Augmented Reality with current hardware as a replacement for desktop tasks because I’m simply faster with normal 2D programs and they’re easier to use.”
“I could imagine an AR application eventually replacing a normal desktop application. For that, however, the hardware would have to become a little more comfortable to wear, and it would probably only be used in special cases.”
“The dry topic of personnel planning could be approached with more joy, and AR opens up completely new possibilities for organizing it. However, I would like to see a way to filter the data and a way to actively plan the employees.”
The opinions about the application are partly positive, partly negative; from them, many implications can be derived for the future use of AR in capacity planning, HR, and other areas.
Future of AR in capacity planning and HR
The “DeepData” application from Centigrade makes it possible to explore the workload of employees and their planned and completed hours in an Augmented Reality environment. This is a first step toward making the dry topic of capacity planning more enjoyable. During the project, I gained a lot of knowledge about developing Augmented Reality applications and about programming web-based HoloLens applications in general.
I was able to show that such an application, connected to the company's own Big Data pool and using AR, is more efficient and faster than existing solutions in many areas, especially because it can display more information. The usability of the holographic application is at an acceptable but still very expandable level, and there are further steps to take here. Above all, usability could be significantly improved by adding functionality such as filter options or the display of critical values in DeepData.
In general, I can therefore say that capacity planning can be supported with Augmented Reality and that personnel data can be explored more naturally. This makes the work more fun and can increase employee motivation. However, this comes with certain prerequisites: the controls must be kept very simple, and the hardware must become even more powerful and comfortable. Here I am curious about the planned release of the “HoloLens 2”, the new version of Microsoft's AR glasses.
In a further development step, the application should also be better adapted to its tasks so that it can be used even more effectively. Overall, I can imagine that, with a further developed version of the software, the entire planning of personnel, machines, and materials could be carried out in AR. The expansion into predictive capacity planning, linking the usage data from planning rounds with further information from the company's Big Data pool, could be particularly exciting and make the work easier. I therefore look forward to the future of Augmented Reality, both for applications like this and for other industrial fields of application.
If you are interested in further AR and VR projects, or want to learn more about our services, you will find more information about Augmented Reality, Virtual Reality, and 3D visualization on our service page.