With the contribution of the LIFE programme of the European Union - LIFE14 ENV/GR/000611
You will no longer get lost in big cities
The “Augmented Reality” mode of Google Maps is already available to so-called “Local Guides” (registered users who support the development of Google Maps services). Access to the AR function for the general public is limited because the function is still under continuous development: it remains in an “alpha” state.
The purpose of the function is to help pedestrians navigate in the densely built-up areas of urban environments, where the high concentration of reinforced concrete can jam the satellite signal, so it is sometimes difficult for Google Maps to determine a pedestrian's exact position. The disruptive electromagnetic emissions of the many electronic devices concentrated in densely populated city districts cannot be neglected either. Near massive buildings (skyscrapers, for example) it is often impossible to determine a pedestrian's exact location, so the displayed position may differ from the actual one; the proposed route can then be mismatched, or at least not the optimal one.
The human factor cannot be forgotten either: users may receive relevant and correct information from the navigator yet be unable to interpret it correctly, for example when they cannot map a route drawn in 2D onto the 3D reality of the city. Being in unfamiliar surroundings confuses many people.
The “Augmented Reality” function is meant to be the long-awaited solution. The project was first announced in May 2018.
The standard navigation interface of the Google Maps application remains the same and is shown while the smartphone is held horizontally. If the device is flipped to a vertical position, the map shrinks to the bottom part of the display, the main camera switches on, and the free part of the screen shows the live camera view. After a few seconds, once the application has “oriented itself” according to what it “sees” through the camera, it projects animated arrows and other helpful information onto the camera view, marking the suggested route for the pedestrian simply and unambiguously.

The application may even ask the user to “take a look around” if it cannot “see” any known objects that could be used to determine the user's exact position. This can happen when something (a tree or a large lorry, for example) blocks the view, or when only the pavement and a small part of the buildings are visible, so the camera view is insufficient. For proper localization in the urban landscape, the application needs to “see” objects it already “knows”: mostly building facades with their unique permanent identifiers (distinctive shop windows, permanent corporate and shop signs, and so on), or street name signs, of course. Obviously, if a “known” object changes its appearance significantly (for instance, a building undergoes a reconstruction so substantial that its appearance changes completely), that object may become “unknown”. The application then “asks” for additional information, requesting that the user “look around” to find another “known” object.
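The decision described above — localize when enough known objects are in view, otherwise ask the user to look around — can be sketched in a few lines. This is purely an illustration of the idea, not Google's implementation; the landmark names and the minimum-match threshold are invented for the example.

```python
# Illustrative sketch of the "look around" decision. All landmark names
# and the MIN_MATCHES threshold are invented; a real system matches
# visual features, not string labels.

KNOWN_LANDMARKS = {"cafe_sign", "street_name_sign", "bank_facade"}
MIN_MATCHES = 2  # assumed minimum number of recognized landmarks

def localization_status(objects_in_view):
    """Return 'localized' if enough known landmarks are visible,
    otherwise 'look_around' to prompt the user to pan the camera."""
    matches = KNOWN_LANDMARKS & set(objects_in_view)
    return "localized" if len(matches) >= MIN_MATCHES else "look_around"

# A lorry blocking the view leaves only one recognizable landmark:
print(localization_status(["lorry", "cafe_sign"]))        # → look_around
# With a clear view of the facades, localization succeeds:
print(localization_status(["cafe_sign", "bank_facade"]))  # → localized
```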
If the pedestrian starts walking again while still holding the phone vertically (the navigator's augmented mode), a warning pops up for safety's sake, asking the user to flip the phone back to the horizontal position (which switches the navigator back into the classic map mode). The reason is that a pedestrian absorbed in the information on the display (the camera view augmented with animated navigation elements) might not pay attention while walking and so endanger himself or other pedestrians. The augmented mode is meant for finding the right direction in moments of confusion, while the pedestrian is standing still and trying to get his bearings.
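The safety rule above amounts to a small state machine: device orientation selects the mode, and detected walking while in AR mode triggers the warning. The following is a minimal sketch of that assumed behavior, not Google's actual code.

```python
# Sketch of the assumed mode/warning logic. Orientation and walking
# detection would come from the device's sensors in a real application.

def navigator_action(orientation, is_walking):
    """orientation: 'horizontal' (map mode) or 'vertical' (AR mode)."""
    if orientation == "horizontal":
        return "map_mode"
    # Vertical orientation means the augmented mode is active.
    if is_walking:
        return "warn_flip_to_horizontal"  # safety warning pops up
    return "ar_mode"

print(navigator_action("vertical", is_walking=True))   # → warn_flip_to_horizontal
print(navigator_action("vertical", is_walking=False))  # → ar_mode
```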
The company tried different types of animated navigation elements; the best solution turned out to be arrows drawn horizontally, at the height of the smartphone, above the recorded surroundings. Other variants were also tested, for example a blue navigation line, but none was as successful, because users subconsciously tried to follow the animated guide elements exactly and walk along them, which put them at risk.
Another improvement is the option to study the surroundings in the navigator's augmented mode: besides the animated navigation arrows showing which way to walk, it also displays additional information about the objects visible in the live camera view. This information pops up in interactive balloons linked to the descriptions of those objects (photo galleries, reviews, …) in the Google database.
The technical background of this helper, for which a bright future can be predicted, is based on self-learning elements of the system, in particular the Street View database (from Google) and the Visual Positioning Service (also developed by Google). The Visual Positioning Service compares identification elements (significant parts of buildings and other chosen “known” objects) from its database with the surroundings “seen” through the smartphone camera. When a match is found, the topographic coordinates attached to the recognized objects are used to set the topographic position of the device from which the application “looks” (which in effect also means the coordinates of the device's user). In the near future this is how Google plans to provide the general public with a new generation of complex planning, information and navigation services, which Google calls “global localization”. It is another step in the development of the Internet of Things (IoT).
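The core idea — recognized objects with known coordinates constrain the device's own position — can be illustrated very simply. A real visual positioning system solves a full camera-pose estimation problem; the sketch below merely averages the matched landmarks' coordinates, weighted by match confidence, as a rough stand-in. All coordinates and confidence values are invented for the example.

```python
# Deliberately simplified illustration of visual positioning: estimate
# the device's position from landmarks matched against a database of
# objects with known coordinates. Real systems compute camera pose;
# here a confidence-weighted average serves only as an illustration.

def estimate_position(matched_landmarks):
    """matched_landmarks: list of (lat, lon, confidence) tuples
    for database objects recognized in the camera view."""
    total = sum(c for _, _, c in matched_landmarks)
    lat = sum(la * c for la, _, c in matched_landmarks) / total
    lon = sum(lo * c for _, lo, c in matched_landmarks) / total
    return lat, lon

# Two recognized objects (invented coordinates and confidences):
landmarks = [
    (48.1486, 17.1077, 0.9),  # e.g. a recognized shop sign
    (48.1490, 17.1081, 0.6),  # e.g. a recognized street name sign
]
lat, lon = estimate_position(landmarks)
print(round(lat, 4), round(lon, 4))
```

The weighting reflects the intuition that a high-confidence match should pull the estimate toward its known coordinates more strongly than an uncertain one.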