
Researchers leveraging AI to train (robotic) dogs to respond to their masters


An international collaboration is working to shape how a mechanical man’s best friend interacts with its owner, using a combination of AI and edge computing known as edge intelligence.

The project is sponsored through a one-year seed grant from the Institute for Future Technologies (IFT), a partnership between New Jersey Institute of Technology (NJIT) and Ben-Gurion University of the Negev (BGU).

Assistant Professor Kasthuri Jayarajah in NJIT’s Ying Wu College of Computing is researching how to design a socially assistive model of her Unitree Go2 robotic dog that dynamically adapts its behavior and style of interaction to the characteristics of the people it works with.

The overarching project goal is to make the dog come “alive” by adapting wearable sensing devices that can detect physiological and emotional signals tied to stable personality traits, such as introversion, or to transient states, including pain and comfort levels.

The invention could help combat loneliness among the elderly in home and healthcare settings and serve as an aid in therapy and rehabilitation. Jayarajah’s initial work, in which robotic dogs understand and respond to gestural cues from their partners, will be presented at the International Conference on Intelligent Robots and Systems (IROS) later this year.

Co-principal investigator Shelly Levy-Tzedek, associate professor in the Department of Physical Therapy at BGU, is an experienced researcher and leader in rehabilitation robotics, with a focus on studying the effects of age and disease on the control of the body.

The researchers note that wearable devices are increasingly accessible, and that everyday models such as earphones can be repurposed to infer wearer states such as brain activity and microexpressions. The project aims to combine such multimodal wearable sensors with traditional robot sensors (e.g., visual and audio) to objectively and passively track user attributes.
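
As a rough illustration of what such passive tracking could look like, here is a minimal Python sketch that classifies a transient wearer state from windowed wearable signals. The sensor channels, window length, and labels are hypothetical stand-ins for the example, not the project’s actual pipeline.

```python
# Hypothetical sketch: classifying a wearer's transient state (e.g., comfort
# vs. discomfort) from windowed wearable signals. Channels, window sizes, and
# labels are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of raw samples (n_samples, n_channels)
    with simple per-channel statistics."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.ptp(window, axis=0)])

# Toy data: 200 windows of 3-channel signals (e.g., heart rate, accelerometer
# magnitude, in-ear EEG band power), each 100 samples long, with binary labels.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 100, 3))
labels = rng.integers(0, 2, size=200)

X = np.stack([extract_features(w) for w in windows])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("predicted state for a new window:", clf.predict(X[:1]))
```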

According to Jayarajah, while the concept of socially assistive robots is exciting, long-term sustained use is a challenge due to cost and scale. “Robots like the Unitree Go2 are not yet up for big AI tasks. They have limited processing power compared to big GPU clusters, not a lot of memory and limited battery life,” she said.
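
One common way to fit a model within such compute and memory constraints, shown here purely as an illustration, is post-training quantization. The sketch below applies PyTorch’s dynamic quantization to a placeholder network; the layers and sizes are invented for the example and do not reflect the project’s actual models.

```python
# Hypothetical sketch: shrinking a trained model for on-robot inference with
# PyTorch dynamic quantization (8-bit weights for linear layers).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(),
    nn.Linear(128, 32), nn.ReLU(),
    nn.Linear(32, 4),   # e.g., scores for 4 candidate interaction behaviors
)

# Replace Linear layers with int8-weight equivalents; activations stay fp32.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 64)   # one placeholder feature vector
print(quantized(x))      # same interface, smaller memory footprint
```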

Initial steps in the project include building on traditional sensor fusion, as well as exploring carefully designed deep-learning architectures that use commodity wearable sensors to extract user attributes and adapt the robot’s motion commands.
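
A late-fusion design is one plausible shape for such an architecture: encode each modality separately, then combine the summaries to score candidate motion adaptations. The sketch below is a generic PyTorch illustration under that assumption; the dimensions, encoders, and command set are invented for the example.

```python
# Hypothetical sketch of a late-fusion architecture: one encoder per modality
# (wearable time series, robot camera features), concatenated into a shared
# head that scores candidate motion adaptations.
import torch
import torch.nn as nn

class LateFusionPolicy(nn.Module):
    def __init__(self, wearable_dim=3, vision_dim=512, n_commands=4):
        super().__init__()
        # GRU summarizes the wearable time series into one state vector.
        self.wearable_enc = nn.GRU(wearable_dim, 32, batch_first=True)
        # Robot vision features (e.g., from a pretrained backbone) get projected.
        self.vision_enc = nn.Sequential(nn.Linear(vision_dim, 32), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, n_commands)
        )

    def forward(self, wearable_seq, vision_feats):
        _, h = self.wearable_enc(wearable_seq)   # h: (1, batch, 32)
        fused = torch.cat([h.squeeze(0), self.vision_enc(vision_feats)], dim=-1)
        return self.head(fused)                  # logits over motion commands

policy = LateFusionPolicy()
logits = policy(torch.randn(2, 50, 3), torch.randn(2, 512))
print(logits.shape)  # torch.Size([2, 4])
```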


