When computers were first invented, they took up entire rooms. We’ve all seen – and stifled a laugh at – black-and-white reels of newsreaders telling us how groundbreaking these sophisticated machines were. Compare them with the phones we have in our pockets now, which are a hundredth the size and heaven knows how many times more powerful, and you can see how things have changed.

It’s become apparent that when things are first developed, they are often big and bulky, but as the technology becomes more workable and highly specialised, the machine gets streamlined and shrunk, turning into a sleek piece of kit that is a fraction of its previous size.

This is now happening in radiology. The world of medical devices related to cancer screening and medical imaging has long been at the forefront of technological innovation and design – these machines may be large, but they are constantly shrinking and changing and now seem to have reached the point where their size is tied to the level of automation and robotics built into them.

The technology in this sector is becoming more integrated into the machinery and, with this development, ergonomics and patient comfort are becoming a fully fledged design element.

Integrate, integrate, integrate

Jay Hill is the general manager for GE Healthcare’s imaging division and has been an electrical engineer for 25 years. Imaging is a $9-billion segment of GE Healthcare, and Hill is responsible for the central engineering teams that are developing platforms across those imaging technologies and product lines.

Hill describes his work in terms of modalities – common platform elements that can be used across networks to create a set of integrated ideas available to all the teams. The modalities work together to create systems applicable across every facet of radiology – particularly when it comes to automation.

But starting off broadly, we ask Hill what current imaging trends he’s been seeing, specifically for motion control.

“We’re not always getting the explicit ‘I want robotics’ request from our clients,” he responds over the phone from GE Healthcare’s base in Wisconsin. “But what we’re getting is problems that they have in the clinical setting and, in some of those cases, we think that motion control and robotics can help.” Doctors and imaging specialists are struggling with the machinery they have, as it’s impractically bulky and many of the programmes don’t work well together. The healthcare workers want products that are as light and fast as possible, with components that are standardised across systems.

The ways robotics can help range across many different areas. In imaging, modalities and workflow are a big deal for staff, who want ease, speed and accuracy in positioning patients and capturing a good image. The real goal, however, is twofold: the first is minimal disruption and the highest comfort for the patient, and the second is higher productivity for imaging teams, meaning fewer errors in imaging modalities and processes.

“Say I’m in a digital radiology department room. Staff want the ability to move the tube and detector to the area of interest, and to be able to accurately align and position them,” Hill explains. “For years, we’ve had automation and feedback controls in those systems so that when you move either the tube or the detector, the other one follows and auto-aligns. The staff want the ability to make sure the system is robust and to check how patients are positioned between the machine and the detector.”

In other words, they want to make things simple and quick. The ability to bring high-quality, fast imaging to patients who can’t be moved in the emergency room or ICU is particularly important.
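The auto-alignment Hill describes is, at its core, a feedback control loop: one component tracks the other until the positioning error falls within tolerance. A minimal proportional-control sketch of that idea follows – all function names, gains and units here are hypothetical, not GE Healthcare’s actual implementation:

```python
# Minimal sketch of tube/detector auto-alignment as a proportional
# feedback loop: the tube steps toward the detector's position each
# cycle until the alignment error is within tolerance.
# All names, gains and units are hypothetical.

def auto_align(tube_pos: float, detector_pos: float,
               gain: float = 0.5, tolerance: float = 0.1) -> float:
    """Return the tube's next position, stepped toward the detector."""
    error = detector_pos - tube_pos
    if abs(error) <= tolerance:
        return tube_pos              # already aligned within tolerance
    return tube_pos + gain * error   # proportional correction

# Simulate the detector being moved to 40.0cm while the tube starts at 0.0
tube, detector = 0.0, 40.0
for _ in range(20):
    tube = auto_align(tube, detector)
# After a handful of cycles, the tube has converged toward 40.0
```

In a real system the loop would run against encoder feedback and motor drivers rather than plain floats, but the structure – measure error, apply a proportional correction, stop inside a tolerance band – is the same.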

Moreover, comfort is an issue. “Where patient movement is a problem – especially in high-volume screening applications, like mammography – we’re hearing more about the comfort and experience for the patient being screened,” Hill reports. Those discomforts can be a factor in people not sticking with the screening regime, according to research on compliance rates for mammography. “One main reason cited is comfort or, rather, the perceived pain and discomfort of those exams. We use control technologies, feedback loops and patient control over the positioning so that women can be more comfortable during the exam process.”

The theory is that making the exam more comfortable removes one more reason for a woman not to come back for screening. Hill sees all these things contributing to increased overall productivity for hospital staff.

R&D manufacturing

“We’re big believers in robotics as a sophisticated form of motor and motion control,” Hill says. Ergonomics and human factors are going to play a big part in the future of motor and motion control. Patients don’t necessarily want robotics, unless it will improve their comfort level, and doctors and staff don’t want added bells and whistles unless they will improve speed and workflow. Hill and his team, however, find that the answer is usually more automation.

“We continue to see advances in the supply base that we draw from – in smaller components, lighter components, to some extent, more power in a smaller package, and that’s all good for us, because the bulkiness of a system is one thing that can reduce access for a clinician,” he continues.

“We do a lot with industrial design, and in selecting and integrating the best components, but we see that the industry giving us smaller, lighter solutions does affect many aspects of what we need to do because it’s less obtrusive for the clinician and, generally, lighter means faster.” Lighter and smaller pieces allow things to move faster, which can help with precision and also with workflow. The clinician doesn’t want to wait for something to translate, so the faster a piece of equipment can get where it needs to be, the better.

A lighter touch

Sensors and controls are also important because, as the machines become more sophisticated, they can be more integrated, and this allows staff to do more with manual control.

“There’s a lot happening right now with gesture recognition,” Hill adds, speaking on the future of this sector. “I haven’t seen too many clinical implementations of that yet, but it’s an interesting area. There’s a lot of that folded into our system-design thinking and we are a system-design house, so we look at all these things and how we can pull those together, then how we can manage with robotics and systematic approaches where we have intelligence-led devices that are able to do some localised computation and control, and then connect back to central supervisory systems. It’s allowed us to be more flexible in how we build the systems.”

Hill believes they’re able to bring the actuation point closer to where the motion is needed, which enables savings on weight, power and assembly costs.
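The architecture Hill outlines – intelligence-led devices doing localised computation and control, then reporting back to a central supervisory system – can be sketched in miniature. Every class and method name below is invented for illustration:

```python
# Hypothetical sketch of edge actuation with central supervision:
# each actuator runs its own local control loop, then reports its
# state back to a supervisory system. All names are invented.

class EdgeActuator:
    """A device that does its own localised computation and control."""
    def __init__(self, name: str):
        self.name = name
        self.position = 0.0

    def move_to(self, target: float, step: float = 1.0) -> dict:
        # Local control loop: step toward the target without
        # involving the central system at all
        while abs(target - self.position) > step:
            self.position += step if target > self.position else -step
        self.position = target
        return self.status()

    def status(self) -> dict:
        return {"name": self.name, "position": self.position}

class Supervisor:
    """Central system that aggregates state reported by edge devices."""
    def __init__(self):
        self.reports = {}

    def receive(self, report: dict):
        self.reports[report["name"]] = report

supervisor = Supervisor()
arm = EdgeActuator("collimator_arm")
supervisor.receive(arm.move_to(12.0))
```

The point of the split is the one Hill makes: the motion happens locally, close to the actuation point, while the supervisory layer only needs the summarised state – which is what allows the flexibility in how systems are built.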

Gesture recognition might sound like something from the movies, but Hill believes it’s viable and a great advance for the industry. “We’ve had some research looking into that. We’ve looked at many different aspects of how gesture recognition could allow you to guide motion control. I don’t think the industry is at the Minority Report level,” he adds with a laugh, “where being able to move your hands and have data visualisations occur or have equipment respond is possible quite yet, but I think it’s a potential area for the industry.”

A decade from now

Most in the industry think the robotic element of equipment will continue to grow. “I’ve seen customers who have robotic deployment or provisioning systems where they can take things from the supply room to the floor with robotic systems. We’re doing robotics in imaging systems and I don’t think we’re that far away, in the grand scheme of things, from systems that can dispatch themselves to where they’re most needed, which should help with asset performance and utilisation.”

He also sees the capabilities of small motion-control systems and the use of sensors and motors growing, and predicts that they’ll proliferate at lower price points in more conventional imaging.

“I think there’s going to be more awareness in devices of each other,” he says, enabled by technology like the internet of things (IoT) and advanced systemware. “So as things become more connected in the same clinical environment and controlled by the same procedure, they’ll recognise that and be able to configure and communicate more automatically.”
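The mutual awareness Hill anticipates can be pictured as devices registering with a shared registry for their clinical environment and learning about their peers automatically. The sketch below is purely illustrative – the registry, device names and methods are all invented:

```python
# Hypothetical sketch of devices in one clinical environment
# recognising each other via a shared registry, in the spirit of
# the IoT-style awareness described. All names are invented.

class DeviceRegistry:
    """Shared registry for one clinical environment."""
    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.name] = device
        # Notify every device, including the newcomer, of the roster
        for d in self.devices.values():
            d.on_peers_changed(list(self.devices))

class ImagingDevice:
    def __init__(self, name: str, registry: DeviceRegistry):
        self.name = name
        self.peers = []
        registry.register(self)

    def on_peers_changed(self, peer_names):
        # Each device keeps track of every peer except itself
        self.peers = [n for n in peer_names if n != self.name]

    def configure_for(self, procedure: str) -> str:
        return f"{self.name}: configured for {procedure} with {self.peers}"

registry = DeviceRegistry()
tube = ImagingDevice("xray_tube", registry)
detector = ImagingDevice("detector", registry)
```

Once both devices are registered, each knows about the other and can configure itself for a shared procedure – the “configure and communicate more automatically” behaviour described above, in toy form.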

Hill believes that this awareness will feed back into the supporting infrastructure, such as the service infrastructure maintained for customers. These advances will mean that customers are better off over time, with better awareness of how their machines are being used, and whether they’re being used well or not.

This integration of components could also provide operational feedback to those customers, so that they can adjust how they use the machinery (even automated tools) or how they train the people who use it.

At present, the rise of robotics and motorised automation in radiological equipment seems to be about ease of use: the easier things are for the doctor, the easier they are for the patient, and vice versa. Bulky machines are giving way to ergonomically designed, lightweight systems that allow swift motion and simple operation.

While this is great for the human factors of medical devices, it is also a genuine technological breakthrough. Fluid robotic processes are now reliable enough to be entrusted with providing services to the patient, while delivering faster workflow for doctors and staff. What will occur next in this fast-moving process – or, as Hill puts it, how can engineers continue to make adjustments in design, building and maintenance so that everyone sees better utility from the equipment investments they have made?

It’s an interesting question, and one that engineers and manufacturers will have to find answers for.