We are now all familiar with Amazon's Alexa as a potentially unique, very easy universal interface for all our applications (with this ethical caveat: the AI tool is misleading, because Alexa is not there first to facilitate our lives … but foremost to increase Amazon's business).
The next step, which will be proposed very soon, will be to have your Alexa not on your smartphone but directly … in your Augmented Reality (AR) glasses. Always "on", this will enable you:
- to quickly give instructions to the various applications on your smartphone, as you can do today with Siri,
- and also to give orders to your headset, with instructions like: "show me that", "guide me there", etc.
But soon this universal voice interface will be replaced by a new killer one: the Brain Computer Interface (BCI). With this ultimate interface you will get rid of the low-speed, limited-bandwidth interface that is your mouth, and connect your high-speed brain to a high-speed AI in the cloud that will understand all your thoughts and instructions almost at the speed of light.
At SXSW 2018, we could see the first convincing real-life demo, by Neurable. A player on stage, wearing a VR headset equipped with 6 BCI sensors, could play a high-speed war game without any joystick. The AI sitting on top of the 6 sensors could read the player's brain waves and translate them into executable instructions much faster than he could have managed with a joystick. As an example, at one point in the game he was attacked by 4 laser beams launched by the enemy and was able to destroy all of them as fast as he could think it!
Neurable is also promising us that within the next 2–3 years, these 6 BCI sensors in the headset – which do not look so nice on your head today – will be replaced by small sensors at the tip of the temples of our AR glasses … just like the Alexa glasses.
Today, VR applications and usage are limited by the quality of the rendering in the headset. We need better images, without the visible pixels of today's displays. Companies like Apple are working on the next generation of headsets, with 8K or even 16K resolution.
Therefore, in the immediate future, most usage will be in Augmented Reality (AR), where numerous applications have already been developed. There have been many AR applications in the B2B area, with productivity gains in the range of 25%. For a change, and as a counterexample to the saying that "technology comes from the C" (the consumer), this time in AR the technology came from the B2B area before eventually penetrating B2C. This might be linked to the failed launch of Google Glass to the general public four years ago.
The best "concept demo" is Keiichi MATSUDA's "Hyper-Reality" on YouTube. It shows the full potential, with up to 80% of your field of vision covered by superimposed imagery. For some people this is a dream; for others, a nightmare.
It raises the question of what reality is for us. For the younger generation, reality will be "augmented". If there is no layer of augmentation at a given spot or location, they will be at a loss. As a consequence, if there is no superimposed information about the product on their AR glasses – including the coupon and the best fit for them – the younger generation will most probably not shop at your place. Transverse usage and transposition will demand it.
On the bright side, with such a killer BCI interface, the fear expressed by Elon MUSK – of insufficient bandwidth to communicate with the cloud and the Artificial Intelligence above – should be overcome. Let's face it: it is easier to take off your glasses when you no longer want this BCI than to go back into surgery to remove the brain implant Elon MUSK is suggesting we should wear (Neuralink).
InnoCherche – May 2018