Will AI Be Helping Us to Get Dressed in the Morning?

Most of us have experienced a small cut or a blister that makes it difficult and uncomfortable to dress ourselves in the morning. We gingerly cajole a piece of clothing over the affected limb or body part, hoping to minimize the inevitable discomfort. Now scientists and engineers are exploring whether AI can help injured and sick patients with the task of putting their clothes on.

In a study of Medicare beneficiaries, a team evaluated the types of assistance required in daily living activities. Of all the activities examined, dressing was found to place the highest burden on caregivers, while simultaneously having the fewest assistive technologies available. With a shortfall in human help, some form of robotic support would be beneficial in hospitals, care homes and private homes alike.

To evaluate the possibilities, a team at the Georgia Institute of Technology examined whether a robot, powered by AI, could dress a human in a gown. One of the challenges with this task was assessing how the patient would experience the forces exerted by the gown's cloth. Visual sensors, such as cameras, provide limited insight in this case, mostly because the clothing, the robot's arms, and the patient's arm continuously occlude the very objects the robot is trying to manipulate.

Humans have the upper hand here as we already know how to dress ourselves. Based upon this knowledge, we can predict how someone else will perceive the experience, while also compensating for any injury or pain they may be suffering.

Rather than work with cameras, the team focused purely on a feedback sensor in the robot arm's gripper. This can be considered analogous to a human's sense of feel and touch in the hands. Leveraging the power of NVIDIA Tesla V100 GPUs on the Amazon Web Services cloud, deep learning networks learned from 11,000 simulations of pulling the sleeve of a gown over a human's arm. From these examples, the AI started to determine the maximum allowable forces that could be applied when drawing the cloth over the surface of the patient's arm and past potential hindrances, such as the elbow and shoulder joints. Incredibly, the learning process took only a single day to complete.
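The core idea of using only gripper force feedback can be illustrated with a minimal sketch. The actual study trained deep networks on simulated haptic data; the function, threshold value, and readings below are illustrative assumptions, not the researchers' implementation.

```python
# Hypothetical sketch: advance the sleeve only while the gripper's
# measured force stays below an assumed safe limit. The real system
# learned force limits from 11,000 simulations; this hard-coded
# threshold is purely illustrative.

MAX_FORCE_N = 10.0  # assumed safe force limit, in newtons

def safe_pull(force_readings, max_force=MAX_FORCE_N):
    """Count how many motion steps complete before the force limit is hit.

    force_readings: force magnitudes (N) sampled from the gripper's
    force sensor as the robot draws the cloth along the arm.
    Returns the number of steps completed before a snag was detected,
    or the full count if the motion finished within the limit.
    """
    steps = 0
    for f in force_readings:
        if f > max_force:
            break  # cloth likely caught (e.g. at the elbow); stop pulling
        steps += 1
    return steps
```

For example, with readings of 1.2, 3.4, 12.0 and 2.0 N, the sketch stops after two steps because the third reading exceeds the limit.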

The robot, named PR2, is now capable of drawing the sleeve of a gown up to the shoulder on one side of the body. The process takes around 10 seconds. Developing this further to implement the full dressing procedure will take quite some work, but at least much of the groundwork has been achieved.

Another aspect of bringing the solution to market is figuring out how to bring down the overall cost. In separate research, a team at the Bristol Robotics Laboratory used machine learning together with low-cost force sensors in a similar experiment. Their solution could distinguish between many different cloth types, compensating for the resultant snagging when one material was pulled and drawn over another.

With millions of Americans each year requiring help and support in getting dressed due to old age, disease or injury, an AI-based robot could be just the thing to give them back their dignity.

Varsha Shivam

Varsha Shivam is Marketing Manager at Arago and currently responsible for event planning and social media activities. She joined the company in 2014 after graduating from the Johannes Gutenberg University of Mainz with a Master’s Degree in American Studies, English Linguistics and Business Administration. During her studies, she worked as a Marketing & Sales intern at IBM and Bosch Software Innovations in Singapore.