Diligent Robotics Brings Socially Intelligent Robots to Healthcare Teams

Published: September 24, 2018

Picture your typical hospital scene: Patients being admitted at the front desk, doctors performing consultations, nurses administering medicine … and robots wandering the hallways toward the supply closet?

Robots in the storeroom may not be the norm quite yet, but it’s happening in Austin thanks to WNCG professor Andrea Thomaz and her company, Diligent Robotics.

Dr. Thomaz launched Diligent Robotics with co-founder Dr. Vivian Chu based on the idea of a “human-centered robotics company.” They aim to create technology that fosters collaboration between humans and robots, using technology to support—not replace—human tasks. To this end, Thomaz and Chu are working specifically to develop artificial intelligence that allows robots to participate smoothly in everyday situations.

The busy milieu of a hospital is a great environment to put these concepts to the test, and it’s also a field that stands to benefit from this technology in an immediately observable way.

“The average nurse spends up to 28% of their time performing non-value-added labor, per various studies,” says Thomaz, referring to menial tasks such as tracking down a needed piece of equipment or retrieving supplies of medicines or other materials. Here’s where she and Chu saw a chance to leverage the benefits of human-robot teamwork. “Our goal is to help nurses make full use of their specialized skills, letting robots handle tedious fetching tasks and other routine work.”

Stepping—or, rather, rolling—up to the task is their newly announced robot, Moxi. Described as “friendly, sensitive and intuitive,” Moxi is built to integrate fully into a hospital support team. This is accomplished largely through two major features. First, Moxi’s face is programmed to react in ways that communicate the robot’s intentions and convey specific social cues. Though it sounds simple, seeing these visual expressions is instrumental in building a sense of trust among patients and staff.

The second key element is Moxi’s arm. Though other companies have hospital service robots in development, Diligent Robotics has given Moxi greater influence over its environment thanks to a flexible arm. Rather than being a simple cart to load up and send back and forth, Moxi is meant to perform tasks autonomously, such as picking out the items to assemble supply kits, or fetching and delivering an item without anyone needing to load or unload it on either end.

Moxi’s first run of trials has just begun at a number of hospitals, including Texas Health Dallas, The University of Texas Medical Branch (UTMB Health) and Houston Methodist Hospital.

 

Moxi: your friendly neighborhood healthcare support robot.
Photo by Daniel Cavazos, courtesy of Diligent Robotics

 

Early feasibility tests at Austin’s Seton Medical Center were quite promising. Moxi’s predecessor, Poli, was able to use an internal floor map to navigate to the supply closet and assemble supply kits for delivery. The robot was also being trained to locate stands for IV lines and check equipment maintenance stickers.

If Moxi can successfully master these and other basic logistical chores, healthcare professionals can put more time toward directly serving their patients. More time means more thorough attention to each patient, which promotes safety as well as improved quality of care.

But how do we get there? What is needed in order to get a robot—essentially a computer—to integrate seamlessly with our everyday lives?

Robots are already in wide use today for complex tasks, mainly in manufacturing. However, they benefit from operating in a kind of closed universe, sans humans. Robots work in a structured environment and in most instances do a single, repeated job. Everything is preset and pre-determined, and the machine has only to do exactly what was programmed.

Putting robots in a collaborative setting with humans changes that dynamic. In order to change a solo robot into more of a “team player,” Thomaz and Chu are developing technology that will improve upon the robots’ artificial intelligence (AI). More specifically, the answer lies in the ways we can teach machines to have “social intelligence.” While most people have heard of AI, the idea of social intelligence may be less familiar. Broadly, this concept encompasses the types of knowledge that allow people to interact. We deal with these ideas every day, but teaching a machine exactly how we do that is much harder.

“One of the biggest challenges facing the field of robotics and artificial intelligence is figuring out how to get robots into human environments,” Thomaz explained during a presentation on social robots for TEDxPeachtree. “This is a dynamic situation—the robot has to be spontaneous [and] figure out what to do as it’s going.”

Even a small task such as fetching supplies requires a fair amount of spontaneity that we tend to overlook. A short walk down a hallway sounds easy for a typical person, but it’s actually a complex process to teach to a robot. The robot must recognize what item it should fetch, navigate a busy corridor without getting in the way, pick out the supplies, and bring them where they are needed.

How will it react to obstacles in its path or meeting someone coming out of the same doorway it’s trying to enter? How can the robot recognize when it needs to stay out of the way of doctors rushing to a patient’s room?
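To make that breakdown concrete, here is a minimal sketch of such a fetch-and-deliver task written as a simple step-by-step pipeline in Python. It is purely illustrative: the class, the step names, and the robot methods (confirm_request, navigate_to, pick, hand_over) are all hypothetical stand-ins, not Diligent Robotics’ actual software, and a real system would add error handling and replanning at every step.

# Illustrative sketch of the fetch-and-deliver task described above.
# All names are hypothetical; this is not Diligent Robotics' code.
from dataclasses import dataclass
from enum import Enum, auto


class Step(Enum):
    IDENTIFY_ITEM = auto()
    NAVIGATE_TO_STOREROOM = auto()
    PICK_ITEM = auto()
    DELIVER = auto()
    DONE = auto()


@dataclass
class FetchTask:
    item: str
    destination: str
    step: Step = Step.IDENTIFY_ITEM


def run_fetch_task(robot, task: FetchTask) -> None:
    """Advance through the fetch task one step at a time.

    `robot` is assumed to expose navigation and manipulation primitives;
    each call may fail or be interrupted, which is exactly where the
    'spontaneity' the article mentions becomes necessary.
    """
    while task.step is not Step.DONE:
        if task.step is Step.IDENTIFY_ITEM:
            robot.confirm_request(task.item)      # match the request to inventory
            task.step = Step.NAVIGATE_TO_STOREROOM
        elif task.step is Step.NAVIGATE_TO_STOREROOM:
            robot.navigate_to("supply_closet")    # avoid people along the way
            task.step = Step.PICK_ITEM
        elif task.step is Step.PICK_ITEM:
            robot.pick(task.item)                 # the flexible arm grasps the item
            task.step = Step.DELIVER
        elif task.step is Step.DELIVER:
            robot.navigate_to(task.destination)
            robot.hand_over(task.item)            # no human loading or unloading needed
            task.step = Step.DONE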

 

[video:https://youtu.be/d5gHIGl10wY align:center autoplay:0]

Watch Moxi go on a supply run.
Video by Lyn Graft, courtesy of Diligent Robotics

 

These are skills that we as people learn from a young age over the course of repeated interactions with others. You can think of these as manners or social customs, rules that help us understand our environment and other people. This knowledge helps us carry out a social role in a meaningful way. Robots, however, have none of this background knowledge.

Social interaction carries an inherent amount of uncertainty, unlike the fully structured manufacturing environments where robots are widely deployed today. In this case, the hospital hallway is “semi-structured”—the fixed end points of the route, such as the stockroom, the hallway, and the nurses’ station, may not change, but people moving in unpredictable ways along that same path pose an extra challenge for the robot.
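One way to picture the “semi-structured” distinction is a navigation loop that plans over the fixed floor map but re-checks for people on every cycle and re-routes when the path is blocked. The sketch below is again a hypothetical illustration in Python, assuming made-up robot methods (plan_path, detect_people, path_blocked, follow, at); it is not the company’s algorithm.

# Illustrative sketch only: fixed waypoints, dynamic obstacles.
# Method names on `robot` are hypothetical assumptions.
import time

WAYPOINTS = ["stockroom", "hallway", "nurses_station"]  # the static part of the route


def navigate(robot, goal: str) -> None:
    """Drive toward `goal`, replanning whenever a person blocks the path."""
    path = robot.plan_path(goal)                         # planned on the static floor map
    while not robot.at(goal):
        people = robot.detect_people()                   # the dynamic, unpredictable part
        if robot.path_blocked(path, people):
            path = robot.plan_path(goal, avoid=people)   # re-route around people
        robot.follow(path, step_time=0.1)
        time.sleep(0.1)


def supply_run(robot) -> None:
    # The end points never change, but every traversal has to adapt on the fly.
    for waypoint in WAYPOINTS:
        navigate(robot, waypoint)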

The difference may seem subtle, but it’s key to the problems Diligent Robotics is trying to solve. On their website, the company shared their hope to create robots so well integrated that “hospital staff view [the robots] as a competent and useful member of the care team.” Diligent Robotics is working to create the algorithms and the learning methods by which the robots will be able to accomplish this goal.

The company is looking forward to continued growth throughout 2018. In January, Diligent Robotics announced that they raised $2.1 million in seed funding. True Ventures led the investment round, with Pathbreaker Ventures, Boom Capital and Next Coast Ventures also joining.

Previously, Diligent Robotics won a Small Business Innovation Research grant from the National Science Foundation totaling $725,000 over two phases. The first phase in 2016 funded work on a prototype and allowed the aforementioned feasibility tests at three hospitals in Austin. The second phase, awarded in October 2017, funded continued work on developing prototypes, as well as going towards long-term pilot testing in acute care units.

“We’re specifically focusing on the development of socially intelligent robots that function in care-oriented environments such as hospitals,” the company states. “With robots as trusted and reliable members of a team, we hope to inspire people to use their ingenuity, passion and skills to address bigger and more pressing challenges.”

 

The Diligent Robotics team: (from left to right) CEO Andrea Thomaz,
Moxi, CTO Vivian Chu, and Head of Product Agata Rozga

More information

http://diligentrobots.com/