After working with healthcare professionals for years, Dr. Laurel Riek and her team have found that one of the busiest and most crowded areas in a hospital is the Emergency Department. In the "ED," doctors and nurses have to quickly assess and treat a broad spectrum of illnesses and conditions.

"Even in pre-COVID times, patients may be situated in hallways for hours," Dr. Riek told Tech Briefs. "There are also many different people — physicians, nurses, technicians, EMTs, family members — which not only adds to the crowdedness, but also adds additional cognitive load to the people within it."

The scene is difficult enough for a human to navigate through, let alone a hospital's latest assistant: the humanoid robot.

Robots have been used more frequently in hospitals around the world, especially as doctors are looking for ways to provide COVID-19 care without spreading the illness via human contact.

In early 2020, a robot named "Tommy" was used to deliver supplies in an Italian hospital. The Spain-based PAL Robotics, similarly, placed its TIAGo Delivery and TIAGo Conveyor robots in two Barcelona healthcare facilities this year.

(Below: Watch Moxi the Robot operate at Texas Health Presbyterian Hospital Dallas.)

For robots like Tommy and TIAGo to find their way through a busy Emergency Department, however, they must understand the environment and the clusters of people within it.

Computer scientists at the University of California San Diego have developed a more accurate navigation system that allows robots to better negotiate busy clinical environments, especially the emergency room. Simply put, the "Safety Critical Deep Q-Network," or SafeDQN, spots a group doing critical work and steers around them.

SafeDQN features an algorithm that looks at two factors of a hospital cluster: how many people are in a group, and how quickly they're moving.

When a patient’s condition worsens, for example, a team immediately gathers around them to render aid. Clinicians’ movements in this case are quick, alert, and precise.

The navigation system detects this kind of situation, and then directs the robots to stay out of the way.
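The two-factor heuristic described above can be sketched in a few lines. This is an illustrative simplification, not the authors' code; the function names, threshold, and units are hypothetical:

```python
# Illustrative sketch of the two-factor cluster heuristic: group size
# and movement speed. Names and the threshold value are hypothetical.

def acuity_score(num_people: int, avg_speed_m_s: float) -> float:
    """Combine group size and average movement speed into one score.

    Larger, faster-moving clusters suggest critical care in progress.
    """
    return num_people * avg_speed_m_s

def should_avoid(num_people: int, avg_speed_m_s: float,
                 threshold: float = 4.0) -> bool:
    """Flag a cluster the robot should steer around."""
    return acuity_score(num_people, avg_speed_m_s) >= threshold

# A resuscitation team: five clinicians moving quickly (~1.5 m/s).
print(should_avoid(5, 1.5))   # → True: high score, steer around
# Two people standing and talking (~0.2 m/s).
print(should_avoid(2, 0.2))   # → False: low score, safe to pass
```

In a real system the inputs would come from perception (person detection and tracking), but the decision logic reduces to a score of this shape.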

The UCSD team trained the algorithm using videos from YouTube, including clips from documentaries and reality shows like “Trauma: Life in the ER” and “Boston EMS.” The set of more than 700 videos is available for other research teams to train their own algorithms and robots.

The team, led by Professor Laurel Riek and Ph.D. student Angelique Taylor, detailed its findings in a paper for the International Conference on Robotics and Automation, taking place May 30 to June 5 in Xi’an, China.

“Our system was designed to deal with the worst case scenarios that can happen in the ED,” said Taylor, who is part of Riek’s Healthcare Robotics lab at the UC San Diego Department of Computer Science and Engineering.

Researchers tested their algorithm in a simulation environment, and compared its performance to other state-of-the-art robotic navigation systems. The SafeDQN system generated the most efficient and safest paths in all cases, according to the UCSD engineers.

In a short Q&A with Tech Briefs below, Dr. Riek talks about the emerging role for robots in hospitals as they improve their navigation capabilities.

Tech Briefs: What are the parts of a busy hospital environment that have been challenging for robots to detect and navigate through, and how has your technology addressed those shortcomings? How is your robot able to navigate better than other robots?

Dr. Laurel Riek: In this paper, led by my PhD student Angelique Taylor, we explored the problem of how a robot could understand and model activity in the ED, particularly with regard to patient acuity. Here, we observed that a high-acuity patient (for example, someone having a heart attack or stroke) is likely to have a higher number of healthcare workers around them who are moving quickly. We used this intuition to design our system, called SafeDQN, which allows robots to understand the kind of task healthcare workers are engaging in, so that they do not interrupt life-saving care delivery.

For our evaluation, we simulated four scenarios, or maps, where a robot was delivering supplies to a clinician in a busy ED. Each scenario included places where high-acuity patients were being treated in hallways, and others where clinicians might be treating low-acuity patients. The robot needed to determine the safest path (the one not interrupting care of high-acuity patients) while also being the quickest.
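The kind of setup Dr. Riek describes can be illustrated with a toy example. SafeDQN itself is a deep Q-network operating on far richer input; the tabular Q-learning sketch below is a drastic simplification, and the grid, rewards, and parameters are all hypothetical. The key idea it shows is that a large negative reward for entering a cell where a high-acuity patient is being treated pushes the learned policy onto a longer but safer route:

```python
import random

# Toy tabular Q-learning sketch (hypothetical grid, rewards, parameters).
# The robot starts at (0, 0) and must deliver supplies to (0, 2); the
# direct route passes a team treating a critical patient at (0, 1).

ROWS, COLS = 2, 3
START, GOAL = (0, 0), (0, 2)
HIGH_ACUITY = {(0, 1)}                           # active critical care
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]     # up, down, left, right

Q = {((r, c), a): 0.0 for r in range(ROWS) for c in range(COLS)
     for a in ACTIONS}

def step(state, action):
    r = min(max(state[0] + action[0], 0), ROWS - 1)
    c = min(max(state[1] + action[1], 0), COLS - 1)
    nxt = (r, c)
    if nxt == GOAL:
        return nxt, 10.0, True         # delivered the supplies
    if nxt in HIGH_ACUITY:
        return nxt, -20.0, False       # interrupted critical care
    return nxt, -1.0, False            # small cost per move

alpha, gamma, eps = 0.5, 0.9, 0.2
random.seed(0)
for _ in range(2000):
    s, done = START, False
    while not done:
        if random.random() < eps:
            a = random.choice(ACTIONS)               # explore
        else:
            a = max(ACTIONS, key=lambda a: Q[(s, a)])  # exploit
        nxt, reward, done = step(s, a)
        best_next = max(Q[(nxt, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
        s = nxt

# The greedy action at the start should be "down" — detouring through
# the bottom row rather than cutting past the high-acuity team.
print(max(ACTIONS, key=lambda a: Q[(START, a)]))
```

Replacing the table with a neural network, and the hand-coded grid with perceived group size and movement speed, gives the general flavor of a deep Q-network approach.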

Tech Briefs: Did you compare your system to other existing navigation methods?

Dr. Laurel Riek: We compared our system to three traditional navigation methods that do not take patient acuity level into account. These include: 1) RandomWalk, where a robot navigates by randomly selecting an action until it reaches its goal, 2) A*Search, which uses simple rules (heuristics) to find the shortest path, and 3) Dijkstra’s algorithm, which models the world as nodes in a graph and then calculates the shortest path through that graph.

We found that SafeDQN generates the safest, quickest paths for mobile robots when navigating in a simulated ED environment. It was significantly better than RandomWalk, A*Search, and Dijkstra’s algorithm. To our knowledge, this is the first work that presents an acuity-aware navigation method for robots in safety-critical settings.
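For reference, the Dijkstra baseline can be written in a few lines of Python. The corridor graph and its weights below are hypothetical; the closing comment notes how an acuity-aware planner would differ, by inflating the cost of edges that pass active critical care:

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path by cumulative edge weight.

    graph: {node: [(neighbor, weight), ...]}
    Returns (total_cost, path), or (inf, []) if the goal is unreachable.
    """
    pq = [(0, start, [start])]
    visited = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in visited:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []

# Hypothetical ED corridor graph; weights are travel times in seconds.
corridors = {
    "supply_room": [("hall_a", 5), ("hall_b", 9)],
    "hall_a": [("trauma_bay", 3)],     # shorter, but passes a trauma bay
    "trauma_bay": [("clinician", 2)],
    "hall_b": [("clinician", 4)],      # longer route around the bay
}

print(dijkstra(corridors, "supply_room", "clinician"))
# Dijkstra picks the route through the trauma bay regardless of what is
# happening there. An acuity-aware planner would inflate the weight of
# edges near a high-acuity cluster (e.g. hall_a -> trauma_bay: 3 + 20),
# shifting the plan onto the longer but safer hall_b route.
```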

Tech Briefs: In the next ten years, what role do you envision for robots in a hospital?

Dr. Laurel Riek: Robotic systems that are carefully co-designed with clinicians and patients, and that truly understand the context of care delivery, may be helpful for reducing the workload for healthcare workers and improving patient experience. Currently, ED workers get interrupted approximately every six minutes, which has a huge negative impact on patient safety, patient experience, and healthcare worker well-being. Here, robots might fetch supplies or equipment for healthcare workers, or deliver food and blankets to patients. However, it is important that the sensing and navigation systems on such robots take patient acuity into account when performing these tasks, so as not to be an additional source of interruption or be a safety risk themselves. Our work aims to address this gap.

It is also possible that robots may be able to support patients who are feeling isolated — perhaps by connecting them with family and friends (for example, a video chat on wheels), or by giving them updates on their care (“You’ll be taken for an X-Ray in 10 minutes.”). For some populations, pet-like robots may be able to provide comfort and support when there is no healthcare worker or volunteer available.

Tech Briefs: Where have you tested this system? What has been the reaction from both patients and doctors?

Dr. Riek: To date we have tested this system in simulation, but have plans to test it within a realistic medical training center this summer. Our clinical colleagues have been very supportive in helping shape our design of the system, and are eager to test it out. We are also looking forward to working with patients and family members, as they are also important stakeholders. It is important the robots we design are accessible, understandable, and user-friendly. We are exploring this question on several tangential projects.

Tech Briefs: How has COVID-19 influenced your work and demonstrated its importance?

Dr. Riek: Just after the pandemic began, I had six clinical colleagues, who all work in different specialties, write me to ask if I could build them robots to support telemedicine in hospitals. We thus started several new projects which are exploring how we can use robots in this way. Here, we are focused on two key ideas: 1) Addressing health equity by designing tele-medical robots that are low cost so they are more accessible to lower-resourced hospitals and clinics, and 2) Building robots that can afford a sense of touch. Here it is important both for the clinician to be able to conduct physical exams remotely, as well as for an isolated patient to feel more connected to a clinician, beyond just a face on a tablet.

This work (SafeDQN) is especially relevant as it contextualizes care delivery in the ED, which is the front line for COVID-19. Considering this context, and how to build helpful technology, will be helpful both for the current pandemic and beyond.

What do you think about the emerging role of robots in hospitals? Share your questions and comments below.