Researchers at the University of California, Berkeley, have introduced a new device that pairs wearable biosensors with artificial intelligence (AI) software. It works by recognizing the hand gesture a person intends to make from the patterns of electrical signals in the forearm.

According to the research announcement on the UC Berkeley website, such devices play an essential role in enabling "better prosthetic control and seamless interaction" with electronic devices. The new technology shows how complex robotic medical procedures, or even everyday tasks like typing and gaming, could be performed without the use of hands.

The research team has successfully taught the algorithm to recognize 21 individual hand gestures, such as a thumbs-up, a fist, a flat hand, holding up individual fingers, and counting numbers. The researchers report that the device is not yet ready for commercial use, but say it could be made available soon with a few tweaks.

The paper notes that there are other ways to improve human-computer interaction, such as using cameras and computer vision. The new device, however, protects an individual's privacy because it stores all data locally. According to the engineers, this also speeds up computing time and ensures that personal biological data remains protected.

“When Amazon or Apple creates their algorithms, they run a bunch of software in the cloud that creates the model, and then the model gets downloaded onto your device,” said Jan Rabaey, professor of electrical engineering at UC Berkeley and senior author of the paper. “In our approach, we implemented a process where the learning is done on the device itself. And it is extremely quick. You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better. So, it is continuously learning, which is how humans do it,” he noted.

To build the hand gesture recognition system, the team collaborated with Ana Arias, a professor of electrical engineering at UC Berkeley, to design a flexible armband that can read electrical signals at 64 different points on the forearm.

These electrical signals are fed into an electrical chip programmed with an AI algorithm that associates the signal patterns in the forearm with specific hand gestures. In addition, the device uses a type of advanced AI called a hyperdimensional computing algorithm, which lets it update itself as new information comes in.
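The paper itself describes the team's hyperdimensional computing approach in detail; as a rough illustration of the general technique rather than the authors' implementation, the sketch below shows how 64-channel forearm readings might be encoded into high-dimensional vectors and classified by comparing them against per-gesture prototypes. The dimensionality, quantization scheme, similarity measure, and synthetic data here are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM = 10_000        # hypervector dimensionality (a common choice in HD computing)
N_CHANNELS = 64     # electrode sites on the forearm band
N_LEVELS = 16       # quantization levels for normalized signal amplitude
N_GESTURES = 21     # number of gestures reported in the paper

# Random bipolar "item" hypervectors: one per channel and one per amplitude level.
channel_hvs = rng.choice([-1, 1], size=(N_CHANNELS, DIM))
level_hvs = rng.choice([-1, 1], size=(N_LEVELS, DIM))

def encode(sample):
    """Encode one 64-channel reading (values in [0, 1]) into a single hypervector.

    Each channel's amplitude is quantized to a level, the channel hypervector is
    bound to its level hypervector (element-wise product), and the 64 results
    are bundled (summed, then thresholded back to +/-1).
    """
    levels = np.clip((sample * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = channel_hvs * level_hvs[levels]
    bundled = bound.sum(axis=0)
    return np.where(bundled >= 0, 1, -1)

def train(samples, labels):
    """Bundle the encoded training samples of each gesture into a class prototype."""
    prototypes = np.zeros((N_GESTURES, DIM))
    for x, y in zip(samples, labels):
        prototypes[y] += encode(x)
    return prototypes

def classify(sample, prototypes):
    """Return the gesture whose prototype is most similar (cosine) to the sample."""
    hv = encode(sample)
    sims = prototypes @ hv / (np.linalg.norm(prototypes, axis=1) * np.linalg.norm(hv) + 1e-9)
    return int(np.argmax(sims))

# Toy usage with synthetic data: one noisy "signature" per gesture.
signatures = rng.random((N_GESTURES, N_CHANNELS))
samples = [np.clip(signatures[g] + rng.normal(0, 0.05, N_CHANNELS), 0, 1)
           for g in range(N_GESTURES) for _ in range(5)]
labels = [g for g in range(N_GESTURES) for _ in range(5)]
prototypes = train(samples, labels)
test = np.clip(signatures[3] + rng.normal(0, 0.05, N_CHANNELS), 0, 1)
print(classify(test, prototypes))  # most likely prints 3 for this toy data
```

Prototype-based classification of this kind is attractive for a wearable because training amounts to accumulation and comparison of vectors, with no gradient descent required, which keeps the computation small enough to stay on the device.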

“In gesture recognition, your signals are going to change over time, and that can affect the performance of your model. We were able to greatly improve the classification accuracy by updating the model on the device,” said Ali Moin, who helped design the device as a doctoral student.
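Moin's point about signal drift fits naturally with a prototype-based scheme like the sketch above: adaptation can be as simple as folding newly observed, correctly labeled samples back into the matching gesture prototype on the device. The helper below extends that earlier sketch and is likewise an illustrative assumption, not the published update rule.

```python
def update_on_device(prototypes, sample, label, weight=1.0):
    """Fold a freshly collected, labeled sample into its gesture's prototype.

    Because each prototype is just a running bundle (sum) of encoded samples,
    the model keeps learning locally without retraining from scratch, which
    helps it track changes in the forearm signals over a session.
    """
    prototypes[label] += weight * encode(sample)
    return prototypes
```

Calling such an update whenever the wearer confirms a gesture lets the prototypes shift along with the signals, which is one simple way on-device updating can raise classification accuracy over time.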

Finally, the researchers say that what makes the device unique is that it integrates biosensing, signal processing and interpretation, and artificial intelligence into a single system that is relatively small and flexible and runs on a low power budget.