How does AI help improve the user experience?

Portable technology continues to improve the design and functionality of devices. Applying Artificial Intelligence (AI) to wearable development makes it possible to create sophisticated new algorithmic models that improve the user experience. Machine learning, computer vision and facial recognition are some of the tools AI works with; they collect valuable data that is then used to optimize devices.

If you still don’t know what wearables are and how they work, we invite you to read our previous entries.

What is Artificial Intelligence?

Artificial Intelligence is understood as the use of algorithmic combinations to equip machines with capabilities similar to those of human beings. Although it may sound like science fiction, this technology is already part of our daily lives. Stuart Russell and Peter Norvig, experts in computer science, propose four types of AI:

  • Systems that think like humans, automating activities such as decision-making, problem solving and learning.
  • Systems that act like humans, such as robots.
  • Systems that think rationally, simulating the logical reasoning of humans.
  • Systems that act rationally, behaving as rational agents that pursue the best outcome.

Nowadays, AI is present in many everyday devices, such as mobile phones with facial recognition, virtual voice assistants like Siri and Alexa, customer-service bots and mobile applications.

AI in the wearables market

According to the study Wearable Artificial Intelligence (AI) Market Size, Share & Trends Analysis Report and Forecast, 2019-2025: “AI finds its most important application in the consumer electronics segment, due to the increasing consumer preference for smart devices to improve their living standards.” According to this research, the health field and the sports industry are the main areas of development and use of wearable Artificial Intelligence.

There are many AI applications in wearables, some with a great impact on people’s lives. For example, some systems learn to interpret visual signals and transmit that information so that blind and visually impaired people can recognize buildings, roads, street furniture, pavements, sidewalks, curbs and corners. Another is machine learning that monitors the physiological and emotional arousal signals of children with autism spectrum disorder or epilepsy, enabling the creation of emergency protocols. There are also examples beyond the health field, such as natural language processing algorithms that translate speech in real time during face-to-face conversations, without the need for a human translator.
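To give a rough idea of how a wearable might monitor physiological signals, here is a minimal, purely illustrative sketch: a rolling z-score detector that flags readings deviating sharply from the recent baseline, which could then trigger an alert or emergency protocol. All data, thresholds and function names are hypothetical; real systems use far more sophisticated trained models.

```python
# Illustrative sketch: flag unusual physiological readings with a rolling z-score.
# Data, window size and threshold are hypothetical, chosen for demonstration only.
from statistics import mean, stdev

def flag_anomalies(samples, window=5, threshold=2.5):
    """Return indices of samples that deviate strongly from the recent window."""
    anomalies = []
    for i in range(window, len(samples)):
        recent = samples[i - window:i]          # sliding baseline window
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            anomalies.append(i)                  # reading is an outlier
    return anomalies

# Simulated heart-rate stream (beats per minute) with one sudden spike.
heart_rate = [72, 74, 73, 75, 72, 74, 73, 140, 75, 74]
print(flag_anomalies(heart_rate))  # → [7] (the 140 bpm spike)
```

In a real device, a detection like this would feed a richer decision layer (trained on each user's own patterns) rather than a fixed threshold, but the core idea of comparing new readings against a learned baseline is the same.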

Below, we share two examples of wearables that use Artificial Intelligence to enhance their features and functionality.


These smart glasses help blind and visually impaired people to “read their surroundings”. Through computer vision, gesture recognition and natural language processing, they identify and analyze the different objects that make up the environment, converting them into audio information for the user.


Empower Me is an application aimed at people with autism, with the objective of helping them develop social and cognitive skills. Combining Google Glass hardware with Affectiva’s emotion recognition software, it helps users address issues such as social interaction, language, behavioral self-control and work skills.