HOLOLENS USED FOR PRECISE POSITION TRACKING OF THIRD-PARTY DEVICES - AUTONOMOUS VEHICLES

A completely new area of HoloLens usage is proposed. The HoloLens is an augmented reality device which provides high-precision location information. Such information is normally used to accurately position holograms within the real space with respect to the viewer (the user of the HoloLens). The information is precise enough to be used for reporting the position for the purpose of autonomous driving. Several experiments were executed in large areas (20 m x 40 m) in order to find out the potential error coming from vibrations or other effects when moving the HoloLens. The results show that the technology can be used for spaces which are previously known to the system; a pre-scan of the space is needed. The big advantages of the system are its readiness for indoor positioning applications with no additional infrastructure needed, simultaneous localization and mapping, complex space mapping, and the achieved precision. The disadvantage is mainly the cost.

LIDARs (light detection and ranging) are often used as part of autonomous SLAM (simultaneous localization and mapping) systems [9]. Their cost can be as low as $100, hence they are readily employable even for low-cost, small autonomous vehicles (Figure 1).
A disadvantage of LIDARs can be the fact that the devices usually scan the environment only in one plane, and they can have problems with glass obstacles (imagine e.g. glass doors) [10]. LIDARs, when correctly used with appropriate software for the SLAM task, provide reliable information on the current position with a high level of accuracy, typically in the range of millimeters (defined as the mean error distance between the estimated location and the real location) [11].
An interesting device for very precise position tracking using a depth camera is the Intel RealSense D435 (Figure 2). The measured depth reaches as far as 10 m. In [12] it was even used for assessing health treatment outcomes (gait analysis and facial tracking).
There are other proposed ways to track the position accurately, to a certain level, using some kind of small infrastructure: RFID [13] or RSSI [14].
A promising technology for position tracking is the utilization of ultra-wide band (UWB) localization. One of the best working devices is SEWIO [15]. As reported, since it is a relatively new system, the accuracy and the tag's battery life have to be further improved. The UWB technology needs a grid of anchors, whose proper placement is crucial for obtaining good positioning results [16]. The worst scenario for UWB localization systems is a narrow corridor, where the estimated position of the robot fluctuates in the order of tens of centimeters [16]. A remedy for this can be a denser anchor grid. Nevertheless, the needed infrastructure is relatively easy to implement.

Introduction
Precise position tracking is an important element on the way to the so-called Industry 4.0. A device that operates autonomously needs to know its current position and adjust its movement according to the required trajectory. Positioning is a hot topic both outdoors and indoors. Outdoor positioning relies mainly on the built infrastructure of satellite constellations around the Earth: GLONASS, BeiDou, GPS or Galileo. The precision of GPS, in cooperation with the other global positioning systems, can reach the order of tens of millimeters [1]. Combining multiple systems in order to enhance the accuracy of positioning information is an interesting concept, especially in urban areas [2]. The demand for precise positioning is mainly due to the expansion of autonomous vehicle control into real life. Precise outdoor and indoor positioning very often requires some kind of infrastructure (external referencing systems). In the outdoor case, this is mainly a global positioning system, the mobile network [3] or other purpose-built proprietary infrastructure [4]. Such systems are susceptible to failures due to unavailability of the required infrastructure [5].
As a result, there is a demand to solve precise positioning for various purposes without any additional infrastructure [6]. This demand is accented especially for indoor position tracking. There is huge potential for autonomous vehicle control in well-known spaces for tedious manipulation tasks. To name a few examples: transportation of medical material within hospitals [7] or paper manipulation in data centers [8]. In [7] a POWERLINK interface was used for fast data transmission. The Pathfinder used front and rear LIDARs and special software available from GitHub for precise position tracking.
One can scan the environment in which the vehicle is moving using the HoloLens. A special program in Unity and Microsoft Visual Studio was created and loaded into the HoloLens. The program took the system-generated location information from the HoloLens and sent it to a Ubuntu-based server where a Node-RED server was installed.
A simple piece of code generating the actual position string is added to the Main Camera object within the Unity scene. This GameObject moves with the movement of the glasses.
A string is created which is composed of the position (x, y, z) and rotation (a, b, c) information. This string is sent both to the glasses (so that the user can see his actual position) and to the server via UDP.
For sending the information, UDP was preferred since it is a simple connectionless protocol without any handshaking. There was no need for any guarantee of delivery, ordering or protection, since the packets are sent with high frequency and hence the loss of a few packets is not crucial.
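The actual sender runs as C# inside Unity; as a language-neutral illustration of the same idea (composing the pose string and a fire-and-forget UDP send), a minimal Python sketch follows. The `|` delimiter, the three-decimal formatting and the host/port values are assumptions for illustration, not taken from the original program:

```python
import socket

def format_pose(x, y, z, a, b, c):
    # Compose the position (x, y, z) and rotation (a, b, c) string.
    # The '|' delimiter and three-decimal precision are assumptions.
    return f"{x:.3f}|{y:.3f}|{z:.3f}|{a:.3f}|{b:.3f}|{c:.3f}"

def send_pose(message, host="127.0.0.1", port=5005):
    # Fire-and-forget UDP send: connectionless, no handshake, no delivery
    # guarantee -- occasional packet loss is acceptable because the pose
    # is sent at high frequency.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), (host, port))

# Report one example pose to the (assumed) server address.
send_pose(format_pose(1.25, 0.0, 3.40, 0.0, 90.0, 0.0))
```

The same string can be rendered on the glasses and transmitted to the server, as described above.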
In the code, the SendUDPMessage method from the UDPCommunication class was used; the class is defined to be available only for non-Unity environments, hence the statement "#if !UNITY_EDITOR". When the code is deployed to the HoloLens, all the code within the "#if !UNITY_EDITOR" pragmas is compiled and executed.
The UDPCommunication class is enclosed entirely in the "#if !UNITY_EDITOR" statement, since this functionality is not supported in the Unity editor (but it is supported on the HoloLens). For sending, async and threading principles are used. A helper function for a memory stream was used as well.
The messages are sent to a convenient server where Node-RED is installed. In a simple flow, it was possible to parse the received data and send it immediately to an MSSQL database. The Node-RED interface (Figure 4) was chosen for its ease of implementation and the availability of immediate responses to invalid data obtained from the UDP messages. Moreover, it makes debugging of the whole application easy.
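The Node-RED flow itself is graphical; its parse-and-validate step can be sketched in Python as follows. The `|`-separated field layout is an assumption carried over from the sender sketch, and the actual database write is performed by an MSSQL node in the flow, so it is not reproduced here:

```python
def parse_pose(message):
    # Split the UDP payload into position (x, y, z) and rotation (a, b, c).
    # Malformed payloads are rejected (None) instead of being forwarded
    # to the database, mirroring the flow's handling of invalid data.
    parts = message.split("|")
    if len(parts) != 6:
        return None
    try:
        values = [float(p) for p in parts]
    except ValueError:
        return None
    keys = ("x", "y", "z", "a", "b", "c")
    return dict(zip(keys, values))

record = parse_pose("1.250|0.000|3.400|0.000|90.000|0.000")
```

A valid payload yields a keyed record ready for an INSERT; anything else is dropped before it reaches the database.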
The HoloLens provides positioning without any additional infrastructure. Based on the above survey of currently used technologies, it was decided to examine another SLAM device which, to the best of the authors' knowledge, had not yet been used for this purpose.
In the past, several SLAM methodologies were proposed and implemented on autonomous robots, such as LSD-SLAM, REMODE and ORB-SLAM. They can be distinguished by the type of method used: ORB-SLAM and REMODE use a feature-based method, while LSD-SLAM implements a direct method. The ORB-SLAM and LSD-SLAM visualizations are built in, while REMODE utilizes the ROS/RViz feature. REMODE has no camera trajectory module, but ORB-SLAM and LSD-SLAM do [17].
The Microsoft HoloLens (Figure 3), which is direct-method based, has built-in visualization and no camera trajectory module; it is a head-mounted device used to project augmented reality (AR) into the user's field of view. The HoloLens is an optical see-through device with many sensors: a gyroscope, a magnetometer and an accelerometer. It also includes positional tracking subsystems: a depth camera with a 120° x 120° FOV (field of view) and four grayscale cameras. Nowadays, the big limitations of the HoloLens are its cost (thousands of USD) and, from the user's point of view, the very disturbing small field of view of 30° horizontally and 17.5° vertically. It is used for many purposes ranging from educational deployments to easy robot manipulation programming [18].
The HoloLens keeps a database of environments in which the device has been operated. If a new environment is encountered, it is added to the database. The HoloLens possesses a unique algorithm based on the depth camera and the four grayscale cameras in cooperation with the other sensors (gyroscope, accelerometer). The device constantly monitors its neighborhood up to 5 meters away from its cameras. It uses a proprietary correlation algorithm to find its actual position based on the data coming from the various sensors.
In this project, the focus was on autonomous vehicle control in a known environment. As already stated, such a scenario is widely applicable to many real-world uses.
The application was started when leaning on the middle column (start of the red dots, column 2) and looking directly at the uppermost column (column 3). In this way it was certain that the coordinate system was positioned well. The vestibule and the connecting corridors were pre-scanned with the help of the HoloLens. The vehicle was moved close to the walls of the corridors in order to evaluate the measured position. As visible from Figure 6, the measured positions follow the walls of the corridor quite precisely.

Tests
The preciseness of position determination was tested. On the testing polygon in the vestibule of the Institute for Nanomaterials, Advanced Technologies and Innovation, Liberec, a sufficiently large space (roughly 50 m x 30 m) was at disposal for testing autonomous vehicles. The polygon is shown in Figure 5. The measurement was then continued in the free space of the vestibule. The vehicle was again moving along the walls in order to compare the measured positions with the real ones. The trajectory is visible in Figure 8. The position error was up to ±28 cm in the x and y axes.
The potential of monitoring the elevation of an autonomous vehicle was tested further. One can imagine a use case in which the vehicle drives to elevated areas for freight loading. In this case, the HoloLens was simply mounted on the head while moving on the staircase. The results are shown in Figure 9. Another set of tests focused on movement within corridors. As stated in [16], such a setup is challenging for precise position monitoring with UWB-based positioning devices. The corridor was 2.5 m wide. Compared to the results from [16], smaller deviations from the real movement were achieved. The movement was a simple inline movement along the walls. Figure 6 shows maximum deviations not exceeding approximately ±20 cm from the projected trajectory. The red and green lines are measurements when the vehicle was moving along the "horizontal" long corridor, and the blue and black lines are measurements when moving along the "vertical" short corridor (see Figure 5).
The measurements were repeated many times. The trajectory always started at one of the columns (column 2 from Figure 5) and finished at the same place. The error was measured as the absolute value of the difference of the x and y position coordinates. The staircase measurements are shown as x-y and x-z rectangular projections; the 3D trajectory is clearly visible. The measured and real elevations were compared, and the fluctuations were no more than 32 cm (Figure 10).
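The error evaluation described above can be sketched as a simple per-axis comparison of measured against reference coordinates (illustrative only; the coordinate values below are made up and are not the paper's measurements):

```python
def position_errors(measured, reference):
    # Absolute per-axis error between measured and reference (x, y) points,
    # as used for the start/finish-column evaluation.
    return [(abs(mx - rx), abs(my - ry))
            for (mx, my), (rx, ry) in zip(measured, reference)]

def max_error(errors):
    # Worst-case deviation over both axes, e.g. the reported bounds
    # of roughly +-20 cm (corridors) or +-28 cm (free space).
    return max(max(ex, ey) for ex, ey in errors)

errs = position_errors([(1.10, 2.05), (3.00, 4.50)],
                       [(1.00, 2.25), (3.20, 4.50)])
```

The same comparison against the known wall positions yields the deviation bounds quoted in the text.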
Several tests were performed in order to determine the feasibility of using the HoloLens for position tracking. This approach was used to construct an autonomous vehicle with a new control concept. The next task will be the design of a system for controlling multiple vehicles in one area. The idea is to develop an industrial autonomous vehicle, and for such cases the cooperation of several units is a necessity [19]. The proposed solution may currently be costly, but a strong decline in price is anticipated as AR platforms spread.

Conclusion
The performed experiments showed the feasibility of the HoloLens for easy and readily available position tracking. A program for the HoloLens was presented. The program reports the current position.

Figure 4 The Node-RED interface for sending data to the database

Figure 7 Errors in position location at the start/finish column (green: x direction, red: y direction)

Figure 10 Measured points when moving along the staircase; detail of the points depicted in the graph