By Scott Simmie

 

Odds are you’ve heard of remote surgery by now.

That’s where a surgeon, watching screens that provide incredibly detailed 3D video in real time, conducts the operation using a controller in each hand. The inputs from those controllers are translated into scaled-down movements of robotic arms fitted with the appropriate medical instruments: the arms move a precise fraction of the distance the operator’s hands travel. As a result, these systems allow for far greater control, particularly during very fine or delicate procedures.
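As a rough sketch of the idea, motion scaling can be as simple as multiplying each axis of the hand’s displacement by a fixed ratio. The function and the 5:1 ratio below are purely illustrative assumptions, not the control law of any actual surgical system:

```python
def scale_motion(hand_delta_mm, scale_factor=0.2):
    """Map a hand movement (mm, per axis) to a scaled-down instrument movement.

    A scale_factor of 0.2 (a 5:1 ratio) means a 10 mm hand motion becomes
    a 2 mm instrument motion -- finer control for delicate work.
    The numbers here are hypothetical, chosen only for illustration.
    """
    return tuple(axis * scale_factor for axis in hand_delta_mm)
```

So a 10 mm hand movement along one axis becomes a 2 mm instrument movement, which is what lets a surgeon’s ordinary hand motions drive very fine work at the instrument tip.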

The surgeon might be at a console in the operating theatre where the patient is. Or they could be operating on someone remotely: you could have a specialist in Montreal perform an operation on someone elsewhere in the world – provided you have a fast, low-latency data connection.

The video below does a really good job of explaining how one of the best-known systems works. 

 

THE IMPORTANCE OF FEEL

 

Conducting standard surgery (or a variety of other tasks) without robots involves constant tactile feedback. If a doctor is moving an instrument through tissue – or even probing inside an ear – they can feel what’s going on. Think of cutting a piece of fruit: you adjust the pressure on the knife depending on how easily the fruit slices. When you push a spoon into a bowl of jello, that constant feedback through the utensil helps inform how hard or softly you need to push.

This tactile feedback is very much a part of our everyday lives – whether it’s brushing your teeth or realising there’s a knot in your hair while combing it. Even when you scratch an itch, you’re making use of this feedback to determine the appropriate pressure and movements (though you have the additional data reaching your brain from the spot being scratched).

But how do you train someone to perform delicate operations like surgery – even bomb defusal – via robotics? How do you give them an accurate, tactile feel for what’s happening at the business end? How much pressure is required to snip a wire, or to stitch up a surgical opening?

That’s where a company from Quebec called Haply Robotics comes in.

“Haply Robotics builds force-feedback haptic controllers that are used to add the sense of touch to VR experiences, and to robotic control,” explains Product Manager Jessica Henry. “That means that our controller sits on the human interface side and lets the human actually use their hand to do a task that is conveyed to a robot that’s performing that task.”

We met some of the Haply Robotics team during the fall at the IROS 2023 conference in Detroit. We had an opportunity for a hands-on experience, and were impressed.

 

INVERSE3

 

That’s the name of Haply’s core product.

“The Inverse3 is the only haptic interface on the market that has been specially designed to be compact, lightweight, and completely portable,” says the company’s website. “Wireless tool tracking enables you to move freely through virtual environments, while our quick tool change mechanism allows you to easily connect and swap VR controllers, replica instruments, and other tools to leverage the Inverse3’s unmatched power and precision for next-generation force-feedback control.

“The Inverse3 replicates tactile sensory input required for simulating technical tasks. It can precisely emulate complex sensations like cutting into tissue or drilling into bone – empowering students, surgeons, and other healthcare professionals to hone and perfect medical interventions before ever performing them in the clinical environment.”

Haply Robotics has produced an excellent video that gives you both a look at the product – and how it works:

 

WHAT DOES IT FEEL LIKE?

 

While at IROS, we had a chance to put our hands on the Inverse3.

In one of the simulations (which you’ll see shortly), the objective was to push a small sphere through a virtual gelatin-like substance. As you push the ball against that barrier, you begin to feel resistance through the handle of the Inverse3. Thanks to force feedback, the resistance builds as you continue to push. Finally, when you’ve applied precisely the right amount of pressure, the ball passes through the gelatin with a satisfying, almost liquid ‘pop.’ The sensation was amazing; it felt exactly the way you’d anticipate a real-world object would.
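One common way to render that kind of sensation is a simple spring model: resistance grows with penetration depth until a rupture threshold, at which point the force drops away and you feel the ‘pop.’ The sketch below is our own hedged illustration of that general technique, with made-up numbers; it is not Haply’s or its partners’ actual rendering code:

```python
def membrane_force(penetration_mm, stiffness=0.8, rupture_mm=6.0):
    """Resistive force (N) for a sphere pressed into a gel-like membrane.

    Force grows linearly with penetration depth (a basic spring model)
    until the rupture depth, after which the membrane 'pops' and
    resistance drops to zero. Stiffness and rupture depth are
    hypothetical values for illustration only.
    """
    if penetration_mm <= 0:
        return 0.0   # not touching the surface yet
    if penetration_mm >= rupture_mm:
        return 0.0   # popped through: no more resistance
    return stiffness * penetration_mm
```

In a real haptic loop, a function like this would run roughly a thousand times per second, feeding the computed force to the device’s motors; the sudden drop from peak force to zero at the rupture depth is what your hand perceives as the pop.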

“Touch adds more information, as opposed to just having the visual information,” explains Henry. “You also have the tactile information, so you have a rich amount of information for your brain to make a decision. You can even introduce different haptic boundaries, so you can use things like AI in order to add some kind of safety measure. If the AI can say ‘don’t go there’ – it can force your hand out of the boundary with haptic cues. So it’s not just visual, it’s not just audio.”
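Boundaries like the ones Henry describes are often implemented as ‘virtual walls’: if the tool tip enters a no-go region, the device renders a restoring force that pushes the operator’s hand back out. Here is a minimal one-dimensional sketch of that technique, with hypothetical stiffness values, not any vendor’s actual safety code:

```python
def boundary_force(tip_x, wall_x=0.0, stiffness=2.0):
    """One-dimensional 'virtual wall' at x = wall_x.

    If the tool tip crosses into the forbidden region (x > wall_x),
    return a restoring force proportional to the penetration depth,
    pushing the hand back toward the boundary. Outside the region,
    no force is applied. Values are illustrative assumptions.
    """
    depth = tip_x - wall_x
    if depth <= 0:
        return 0.0                 # outside the forbidden region
    return -stiffness * depth      # push back toward the boundary
```

The deeper the tip strays past the wall, the harder the device pushes back, which is exactly the ‘don’t go there’ cue described above: a constraint you feel rather than merely see.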

 

SIMULATION, TRAINING…AND MORE

 

The Inverse3 is already in use for simulation training in the medical industry. In fact, many existing devices for robotic surgery do not have haptics – and there’s clearly a demand.

“Robotic surgical consoles don’t use haptics yet, and we’re hearing that surgeons are asking for that to be added because it’s missing that sense,” says Henry. “A mistake they can make is to push an instrument too far in because it’s just visual. If you had haptics on your handles, you would intuitively know to pull back.”

Remember how we tried pushing a virtual object through a gel-like substance? You’ll see that in this video around the :24 mark:

THE HAPLY STORY

 

Well, it’s not the entire Haply Robotics story, but here it is in a nutshell.

The idea for the product – or rather, the need for such a product – first surfaced in 2016, when the three co-founders were working on haptic devices at Canada’s National Research Council. Existing devices at the time were large and tended not to offer the greatest user experience, and they saw an opportunity to create something better. The company has been in business since 2018 – with these three at the helm:

  • Colin Gallacher (MEng, MSc, President)
  • Steve Ding (MEng, Electrical lead)
  • Felix Desourdy (BEng, Mechanical lead)

The trio put their heads together and – a lot of R&D later – produced the Inverse3.

The company manufactures the physical product, which contains three motors to provide haptic feedback. Haply Robotics also makes an API, but the coding for the simulations comes from outside partners. Fundamental VR, for example, is a company devoted to developing virtual training simulations for everything from ophthalmology to endovascular procedures; it coded the gelatin simulation we tried.

“Studies confirm that VR significantly improves the effectiveness of medical education programs. Adding real haptics increases accuracy and delivers full skills transfer,” says the Fundamental VR website. In fact, it cites research showing a 44 per cent improvement in surgical accuracy when haptics are part of the VR experience.

“In the training space, when you’re using it for simulation, a surgeon’s work is very tactile and dexterous,” says Haply’s Jessica Henry. “We enable them to train using those instruments with the proper weights, the proper forces, that they’d encounter in surgery as opposed to textbooks or cadavers. It’s a more enriched way of interacting.”

And it really, really feels real.

Below: Haply’s Jessica Henry manipulates the Inverse3

 

 


INDRO’S TAKE

 

It’s always great discovering another new company in the robotics field, particularly one with an innovative solution like the Inverse3. It’s also great when these companies are Canadian.

“Haply Robotics has identified a clear void in the marketplace and created a solution,” says Indro Robotics CEO Philip Reece. “With the growth in remote robotics – not just surgery – I can see a wide range of use cases for the Inverse3. Congratulations to the Haply team on being ahead of the curve.”

For more info on the product, check out the Haply Robotics website.