Google's AR glasses prototypes take to the streets

Google's AR glasses, which translate, transcribe and provide street navigation, began real-world testing in early August.

The glasses will be visible on the streets of the United States

Google's AR glasses were first announced only a few months ago. At the time they were still in the design phase, but we already knew they would use augmented reality to translate and transcribe conversations between people.

On Tuesday 19 July, the American group finally announced that the project, called Iris, will be tested on the streets of the United States. Google wants to see how the glasses perform and what problems users encounter in their daily lives. "And as we develop experiences such as AR navigation, it will help us take into account factors such as weather and busy intersections - which can be difficult, sometimes impossible, to recreate entirely indoors," explains Juston Payne in a blog post.


A real-world, strictly regulated test of the AR glasses

To reassure the public, Google explained that the glasses look like ordinary glasses but contain an in-lens display as well as visual and audio sensors, and that a light will come on whenever images are being recorded. In addition, all image data is recorded for analysis purposes only and can be deleted on request.

A Google support page also explains that the test areas are pre-selected and strictly delimited, and that the activities testers may carry out are fully regulated. Finally, testers will have to undergo training on the use of the AR glasses, on the protocols to be followed, and on confidentiality and security.

Three features at the heart of the prototype testing

The American giant wants to analyse how three major features work and perform.

The first two are the translation and transcription of a conversation between two people who speak different languages, which were the primary objectives of these AR glasses. Thanks to the in-lens display, the glasses can project translated or untranslated text in augmented reality within the wearer's field of view, allowing the wearer to follow and take part in a conversation with someone who does not speak their language.

The final feature is augmented-reality navigation. The glasses will be able to project directions to a destination, so users will no longer need to look at their phone to get around and can simply follow the precise directions displayed in front of them. The glasses will also be able to recognise changes of street and crossings.

Street testing will make it easier for Google to improve these features, as the recorded image data will allow it to spot bugs and problems as they arise.
