By Bertrand Braunschweig (Scientific coordinator of the “Confiance.ai” project at SystemX Technology Research Institute)
As our lives become increasingly connected and governed by digital devices, the question of how we interact with these devices grows ever more important. The desktop metaphor, with its windows, icons, and folders, is about fifty years old, as is the mouse, invented by Doug Engelbart in 1968. Today, most of us would never consider interacting with a computer through a command line, as in the early days of computing under MS-DOS!
We now have a variety of interaction modes, from the familiar windows-icons-folders system to voice control (especially for personal assistants like Alexa, Siri, etc.), touch screens, virtual reality, and haptic interfaces that respond to our movements. We also benefit from smart keyboards that reconfigure themselves depending on the application and that guess the words we want to type based on co-occurrence frequencies (for example, when I type "propose demain" ["suggest tomorrow"] on my Android phone, it suggests "morning" or "evening").
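The word prediction behind such smart keyboards can be illustrated with a minimal bigram model that counts which words follow which. This is a toy sketch of the co-occurrence idea, not the actual implementation used by any real keyboard; the corpus, function names, and parameters are illustrative assumptions:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count co-occurrence (bigram) frequencies over whitespace-tokenized sentences."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def suggest(counts, prev_word, k=2):
    """Return the k words most frequently observed after prev_word."""
    return [w for w, _ in counts[prev_word.lower()].most_common(k)]

# Tiny illustrative corpus: "morning" follows "tomorrow" more often than "evening".
corpus = [
    "see you tomorrow morning",
    "see you tomorrow evening",
    "tomorrow morning works for me",
]
model = train_bigrams(corpus)
print(suggest(model, "tomorrow"))  # most frequent continuations first
```

Real predictive keyboards use far larger corpora and personalized language models, but the ranking-by-frequency principle is the same.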