Research at U of Alberta focuses on robotics for medical applications


By Scott Simmie

 

You’ve probably heard of the “Three Ds” by now: Robots are perfect for tasks that are Dirty, Dull and Dangerous. In fact, we recently took a pretty comprehensive look at why inspection robots can tick all of these boxes – while saving companies from unplanned downtime.

Generally, that maxim holds true. But a recent conversation with two researchers from the University of Alberta got us thinking that some innovative robotics applications don’t truly fit this description. Specifically, certain medical or healthcare use-cases.

The people we spoke to carry out their research under the umbrella of a body that intersects the robotics and healthcare sectors. It’s called the Telerobotic and Biorobotic Systems Group in the Electrical and Computer Engineering Department of the U of A. It’s under the direction of Prof. Mahdi Tavakoli, who is kind of a big name in this sector. Within that group, there are three separate labs:

  • CREATE Lab (Collaborative, Rehabilitation, Assistive robotics research)
  • HANDS Lab (Haptics and Surgery research)
  • SIMULAT-OR Lab (A simulated operating room featuring a da Vinci Surgical System)

Broadly, the research can be thought of as belonging to one of two realms: Rehabilitation/assistive and surgical. But what does that actually mean? And how has a robot from InDro been modified to become a smart device that can assist people with certain disabilities?

Let’s dive in.

Below: Could a robotic platform like the Ranger Mini be put to use helping someone with mobility issues? We’ll find out…

Ranger Mini 3.0

HELPING PEOPLE (AND EVEN SURGEONS)

 

We spoke with researchers Sadra Zargarzadeh and Mahdi Chalaki. Sadra is a Master’s student in Electrical and Computer Engineering and previously studied Mechanical Engineering at Iran’s Sharif University of Technology. Mahdi is also a Master’s student in the same department, and studied Mechanical Engineering at the University of Tehran.

Sadra’s research has focused on healthcare robotics with an emphasis on autonomous systems leveraging Large Language Model AI.

“I’ve always had a passion for helping people that have disabilities,” he explains. “And in the rehab sector we often deal with patients that have some sort of fine motor skill issue or challenge in executing tasks the way they’d like to. Robotics has the potential to mitigate some of these issues and essentially be a means to remove some of the barriers patients are dealing with – so I think there’s a very big potential for engineering and robotics to increase the quality of life for these people.”

That’s not dirty, dull or dangerous. But it is a very worthwhile use-case.

 

SMART WALKER

 

People with mobility and/or balance issues often require the help of walkers. Some of these devices are completely manual, while others have their own form of locomotion that keeps pace with the user’s desired speed. Direction is generally controlled with two hands on some form of steering device: equal pressure from each hand is usually required to travel in a straight line, and pushing harder on one side or the other steers the device.

But what about someone whose stroke has left them with partial paralysis on one side? They might well be unable to compensate – meaning that despite their intent to travel straight ahead, the device would turn. That’s where Mahdi’s research comes in.

“Robotic walkers or Smart Walkers have been studied for more than 20 years,” he says. “But in almost all of them, their controllers assume you have the same amount of force in both of your hands. And people with strokes often don’t have the same strength in one side of their body as they have on the other side.”

So how can robotics compensate for that? Using an AgileX Ranger Mini with InDro Commander from InDro Robotics as the base, Mahdi and others got to work. They built a steering structure and integrated a force sensor, a depth-perception camera and some clever algorithms. The camera zeroes in on the user’s shoulders and translates their movement into user intent.

“We know, for example, if you are just trying to use your right hand to turn left, the shoulder angle increases. If you’re trying to turn right, the shoulder angle on the right arm decreases.”

By interpreting those shoulder movements in conjunction with the force applied by each hand, the Smart Walker translates the data into the desired steering action. As a result, the user doesn’t have to push as hard with their compromised side – which also reduces cognitive load. The wrist torque required from the user drops by up to 80 per cent.
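For the technically curious, here’s a rough sketch of how force and shoulder-pose signals might be fused into a steering command. All function names, gains and the sign conventions are our own illustrative assumptions – this is not the U of A team’s actual controller.

```python
# Hypothetical sketch: fuse hand forces and camera-derived shoulder angle
# into a (linear, angular) velocity command for a smart walker base.
# Gains and conventions are illustrative assumptions only.

def steering_command(force_left, force_right, shoulder_angle_deg,
                     k_force=0.05, k_shoulder=0.02):
    """Return (linear_velocity, angular_velocity) for the walker base.

    force_left / force_right: forward push from each hand (N).
    shoulder_angle_deg: change in shoulder angle seen by the depth
    camera (positive = intent to turn left, by assumption).
    """
    # Forward speed follows the total push, regardless of asymmetry.
    linear = 0.1 * (force_left + force_right)

    # Raw force asymmetry is how a conventional walker would steer...
    force_turn = k_force * (force_right - force_left)

    # ...but the shoulder angle captures *intent*, so a user with a
    # weaker side doesn't need to push harder just to go straight.
    intent_turn = k_shoulder * shoulder_angle_deg

    # Weight intent more heavily than raw force (illustrative 80/20 mix).
    angular = 0.8 * intent_turn + 0.2 * force_turn
    return linear, angular
```

With equal pushes and level shoulders the command is straight ahead; an asymmetric push with no shoulder rotation produces only a small turn, because intent dominates the mix.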

Of course, there’s much more to this device than we’ve outlined here. Enough, in fact, that a scientific paper on it can be found here. You can also check out the video below:

 

ROBOTS IN THE O-R

 

While the Smart Walker is a great example of robotics being put to use on the assistive and rehabilitation side of things, let’s not forget that the Telerobotic and Biorobotic Systems Group also carries out work on the surgical side. Sadra explains that robotic devices – particularly in conjunction with AI – could prove of great benefit assisting a surgeon.

“My research centres around the use of Generative AI. With the growth of Large Language Models (LLMs) such as ChatGPT, we want to see how these AI tools can translate into the physical world in robots. A big section of my projects has focused on Generative AI for surgical autonomy.”

For example, a robotic device with plenty of AI onboard might be able to handle tasks such as suctioning blood. Machine Vision and Machine Learning could help that device determine where and how much suction needs to be applied. And, if you push this far enough, a surgeon might be able to initiate that process with a simple voice command like: “Suction.”

“How can we have task planners and motion planners through generative AI such that the surgeon would communicate with the robot with natural language – so they could ask the robot to complete a task and it would execute?” asks Sadra. “This would allow robots to become more friendly to the average individual who doesn’t have robotics knowledge.”

On the flip side of the coin, there’s also the potential for robotic devices to alert the surgeon to something that requires attention. In breast cancer surgery, for example, an AI-enhanced robot with realtime data from an imaging device might notice remaining tumour tissue – and give the all-clear to close the incision only once all cancerous material has been excised.

In other words, some of the algorithms Sadra works on focus on that human-robot interface while leveraging powerful Large Language Models.

“Exactly. And we look at this process in three stages: We think about high-level reasoning and task planning, then mid-level motion planning, then lower-level motion control. This is not only for surgery; it’s a similar workflow for assistive robotics.”
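That three-stage workflow can be sketched in code. To be clear, the LLM call, skill names and waypoints below are hypothetical placeholders we’ve invented for illustration – not the lab’s implementation.

```python
# Minimal sketch of the three-stage workflow: high-level task planning
# (where an LLM would sit), mid-level motion planning, low-level control.
# Every name and value here is a made-up placeholder.

def plan_tasks(command: str) -> list[str]:
    """High level: turn a natural-language request into a skill sequence.
    A real system would query an LLM here; we stub it with a lookup."""
    skills = {"suction": ["locate_blood_pool", "move_to_target", "apply_suction"]}
    return skills.get(command.lower(), [])

def plan_motion(skill: str) -> list[tuple[float, float, float]]:
    """Mid level: expand a skill into Cartesian waypoints (stubbed)."""
    if skill == "move_to_target":
        return [(0.0, 0.0, 0.1), (0.05, 0.02, 0.01)]
    return []

def execute(waypoints):
    """Low level: a real controller would servo the arm through each pose."""
    for wp in waypoints:
        pass  # send wp to the joint-space motion controller

# A surgeon's voice command like "Suction" would flow down the stack:
for skill in plan_tasks("Suction"):
    execute(plan_motion(skill))
```

The point of the layering is that natural language only ever touches the top layer; the lower layers stay deterministic and verifiable.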

The head of the lab, Professor and Senior University of Alberta Engineering Research Chair in Healthcare Robotics Dr. Mahdi Tavakoli, describes AI in this field as “a game-changer,” enabling the next level of human-robot interface.

“Our focus is clear: We’re building robots that collaborate with humans — robots that can understand our language, interpret context, and assist with the kinds of repetitive or physically demanding tasks that free people up to focus on what they do best: The creative, the social, the human. We see the future in ‘collaborative intelligence,’ where people stay in control and robots amplify human capabilities.”

Fun fact: Many of today’s most powerful LLMs are Generative Pre-trained Transformers (GPTs) – which is where ChatGPT gets its name.

 

WHAT’S NEXT?

 

We asked the researchers if the plan is to ultimately explore commercialisation. Apparently it’s a little more complex when it comes to surgery due to regulatory issues, but this is definitely on the roadmap. Sadra has been doing research through a program called Lab2Market and says there’s been very positive feedback from clinicians, physical and occupational therapists and manufacturers.

Program head Dr. Tavakoli says the lab is “thinking big” about how such innovations can help diversify the Canadian economy. In Alberta specifically, which has traditionally been a resource-dominated economy, he says robotics presents a huge opportunity for growth.

“That’s part of why we’ve launched Alberta Robotics: To build a regional ecosystem for robotics research, education, and innovation. So, the University of Alberta is open for business when it comes to robotics; people should be watching for what will come out of Alberta in robotics!”

Below: A promotional video for the da Vinci Surgical System. Will research at the U of A someday enable machines like this to take verbal commands from a surgeon?

INDRO’S TAKE

 

The research being carried out at the University of Alberta is fascinating – and carries huge potential in both the surgical and rehabilitation/assistive spheres. We’re pleased to know that three Ranger Mini platforms with InDro Commander are being put to work for this purpose – which is unlike any other use-case we’ve seen for our robots.

“I’m incredibly impressed with what they’re doing,” says InDro Founder and CEO Philip Reece. “It’s researchers like these, quietly carrying out advanced and focused work, who make breakthroughs that ultimately become real-world devices and applications. We’re pleased to put a well-deserved spotlight on their work.”

You can check out a list of researchers and alumni – and see a photo of Sadra and Mahdi – right here.

Dual manipulator Rosie the robot used for Industry 4.0 research


By Scott Simmie

 

At least some of you will remember The Jetsons.

The television series, created by Hanna-Barbera Cartoons Inc., was a space-age version of The Flintstones (another Hanna-Barbera production). It originally aired in 1962-1963 with later episodes created in a reboot from 1985 to 1987.

But while Fred Flintstone drove a stone-age car (complete with stone wheels) that he powered by pushing his feet along the ground, George Jetson and his family lived in Orbit City, where Jetson commuted to his two-hour-per-week job via a flying car with a bubble top. And instead of having dinosaurs (including pterodactyls) help carry out tasks, the Jetsons lived in a future where they were surrounded by automated devices. You could think of their surroundings as the 1960s vision of the Smart Home.

And an integral part of that home? Well, that would be Rosey (later changed to ‘Rosie’) the robot.

Rosey was the family’s robotic maid. She carried out tasks that weren’t performed by the many other automatic conveniences that filled the Jetsons’ home. She had two manipulator arms and an internally stored vacuum that could be deployed on demand.

She was very useful around the house, carrying out tasks to save the family time.

And this story? Well, it’s about our own Rosie – which is also very space-age.

Below: A Rosie the robot publicity cel, signed by show creators William Hanna and Joseph Barbera. The cel was auctioned in 2018; image by Heritage Auctions

Rosie the Robot from The Jetsons Heritage Auctions image

THE ROSIE STORY

 

So. What is Rosie? We asked Head of R&D Sales Luke Corbeth for a snapshot.

“Rosie is a dual arm mobile manipulation robot designed for pick and place in an industry 4.0 setting,” he says. In other words, it has two arms and manoeuvres on a wheeled platform, and is capable of moving objects from one location to another or even manipulating a single object with both end effectors.

And Rosie has a few tricks up her sleeve. Or, more accurately, sleeves.

“The actual robot is very unique because it has six mounting points for the arms. So you can mount the arms on top, high on the side or low on the side to access shelving of different heights. In fact, you could actually mount one arm directly on the top right, for example, and then mount the second one on the bottom left. So you could grab something from the top of the shelf and from the floor at the same time, which is kind of cool, right?”

Yes, indeed.

Rosie’s home is not with the Jetsons (she has no vacuum cleaner), but in a new lab at Polytechnique Montréal that hasn’t yet officially launched. It’s called the Intelligent-Cyber Physical System Lab, or I-CPS. So we contacted Lionel Birglen, a professor with the Department of Mechanical Engineering. We wanted to learn more about what the lab does, what he does – and what plans he has for Rosie (which InDro built and shipped in 2023).

Dr. Birglen is a PhD mechanical engineer specialising in robotics. He’s particularly interested in – and an expert on – manipulators and end effectors, and has designed and built many himself. He’s written two books, holds three patents, and is the author or contributing author of at least 94 research papers. He’s also – get this – been listed among the top two per cent of most-cited scientists in the world in his area of specialisation.

So it kinda goes without saying, but he’s a pretty big deal in this field.

Dr. Birglen has a deep interest in the role robotics will play in the future of industry. And, within that realm, he’s intensely interested in ensuring that robots, particularly those that will be sharing space with human beings on a factory or warehouse floor, will be safe.

And – he emphasises – he doesn’t trust simulations for important work like this.

“Because simulations lie. They lie all the time,” he says. “You have to understand that reality is infinitely more complex than anything you can have in simulation – so actual experiments are absolutely essential to me. They are essential to my work, to my understanding of what robotic manipulation is.”

“I believe in math, but I know that reality is different. It’s more complex, more complicated, and includes so many un-modelled phenomena.”

 

ROSIE’S JOURNEY

 

Dr. Birglen knew he wanted a new robot for use in the new lab (which we’ll get to shortly). And he knew he wanted a robot with two manipulator arms.

“Dual-arm robots are, in my opinion, the future for industry applications,” he says.

And while humanoid bipeds grab a lot of attention, they’re far more complex (and expensive) than wheeled robots. Plus, he says, most factory applications take place on a single level and don’t require climbing stairs.

“From a factory perspective, a wheeled platform makes a lot of sense because typically in factories you don’t have, say, five levels connected by stairs.”

So he knew he wanted an autonomous, wheeled, dual-arm robot. And he started, initially, to think of a company other than InDro for the build.

“I came across InDro almost by accident,” he explains. “Giovanni Beltrame told me about you because he has purchased many, many robots from you. He said: ‘Those guys can build and assemble the robot for you. They’re close and they do a great job.’ So that’s how I came in contact with you.” (We’ve written previously about the amazing work Dr. Beltrame is doing with robots and space. You can find that here.)

And so, after a number of calls with Luke Corbeth and the engineering team to settle on design and performance parameters, work on Rosie began.

Below: Technologist Tirth Gajera (‘T’) puts the finishing touches on Rosie in 2023

Rosie and Tirth T

THE LAB

 

Polytechnique Montréal’s Intelligent-Cyber Physical System Lab (I-CPS) is set up as a highly connected Industry 4.0 factory. Faculty from four different departments – computer engineering, electrical engineering, industrial engineering and mechanical engineering (Dr. Birglen) – are involved with the lab. Interns and students, under supervision, also work in the facility.

“So we have four departments involved in this lab and the idea is to build a small scale factory of the future, meaning that everything is connected. We are building a mini-factory inside this lab,” he says.

So think of cameras that can track objects on shelves – and people and robots within the environment. Think of smart tools like a CNC machine, which will eventually be operated by Rosie. And, perhaps just as important as the connectivity within the lab, there’s connectivity to other research institutes in Quebec, including Université Laval, Université de Sherbrooke and École de Technologie Supérieure (ÉTS). All of those institutes are working with similar mini-factories, and they’re all connected. There’s even a relationship (and connectivity) with manipulator manufacturer Kinova. Funding came via a significant grant from the Canada Foundation for Innovation, or CFI.

“So think of our lab as like one node of this network of mini-factories around Quebec,” explains Dr. Birglen. That connectivity of all components is still a work-in-progress, but “ultimately the goal is that there is a cyber-connection between these different mini-factories, these different laboratories around Quebec, so that one part of one node can work in collaboration with another node in realtime.”

Plus, of course, a lot of learnings will take place within the individual labs themselves.

“We want to bring collaborative robots to work in tandem with humans,” he says. “We want our robots to safely move around people, we want robots to help people. And we also want robots to learn how to work from people.”

 

SAFETY, SAFETY, SAFETY

 

As mentioned earlier, there’s a huge emphasis on safety. And while there are international safety standards for collaborative robots, even a ‘safe’ cobot can pose a threat.

“All the collaborative robots that you have currently on the market more or less follow this technical standard and they are more or less safe, but they’re still dangerous,” explains Dr. Birglen. “And the classical example that we’ve all heard, and which is true, is that if a safe cobot has a knife in its hand and is moving around – it is very dangerous.”

So safety in the lab(s) is paramount – and that means safety at multiple levels. There must be:

  • Safety at the task level – tasks must not endanger people
  • Safety at the control level
  • Safety in terms of collision detection, mitigation and obstacle avoidance
  • Safety at the data-security level

Plus – and this really interests Dr. Birglen – you must ensure safety with any additional mechanical innovations that are introduced.

“What you develop – any mechanical system you develop – must be as intrinsically safe as possible. And actually, one of the topics I’m currently working on is developing end effectors and tooling that are intrinsically safe.”

Below: A LinkedIn post from Luke Corbeth shows Rosie, using both arms, inside the I-CPS lab

THE FUTURE

 

And why is research like this so important? What difference will it make to have robots and humans working safely together, with safe manipulators and end effectors that might even be able to, for example, lift an object in concert with a human being? And why the focus on interconnectedness between all of these facilities?

Well, there’s obviously the value of the research itself – which will lead to greater efficiencies, improved manipulators, gripping technologies, new algorithms and AI enhancements – as well as enhanced safety down the road. But there’s a much bigger picture, says Dr. Birglen, especially if you can get your head around thinking about the future from a global perspective.

China, he says, is no longer a developing nation. The days when the words “Made in China” meant poor quality are – with rare exceptions – gone. The country is, in fact, highly developed – and working at breakneck speed when it comes to innovation and adoption of robotics at scale. A revolution is underway with massive implications for competitive advantage that simply cannot be ignored. So the research at I-CPS is not merely important from an academic perspective; it’s strategic when viewed through a global economic lens.

“We as a country – meaning Canada – are in competition with other countries for manufacturing, for producing goods and services. China is a developed country and it is very, very, very good in robotics,” he states. “You know how in the past we saw China as producing low quality goods, low quality robots? That’s over, man. That’s finished.”

And?

“If they are investing in robotics like mad and we are not, we’re going to be a leftover – Canada is going to sink as a rich country. If you want to produce wealth in the 21st Century, you need robots, you need automation, you need integration. In short, you need to be the leader of the pack or you’re going to be eaten.”

It’s a stark warning – and it’s true.

I’ll step outside my role as author for a moment: I say this having lived in China back when it was still a developing country in the late 1980s – and having returned several times since. The transformation has been nothing short of astonishing. How, you might ask, did it achieve all this?

The answer has its genesis with former Chinese leader Deng Xiaoping, who led the country from 1978 to 1989. He didn’t merely open the door to reform; he created policies that began sending waves of students from what had been a largely closed country abroad to study. There was an emphasis on careers that could help modernise the nation, including all aspects of engineering, aerospace, construction, transportation and architecture. That’s where all this began.

Thankfully (and with credit to federal funding agencies like CFI), there are projects like I-CPS underway – and academics like Dr. Lionel Birglen with the vision to push the needle safely forward.

Below: “Baxter” – the original dual-arm robot. Baxter is still at Polytechnique Montréal, but Rosie is the mobile future. Photo by Luke Corbeth

Baxter
Rosie

INDRO’S TAKE

 

We’re obviously pleased Polytechnique Montréal selected InDro to build Rosie. And we’re particularly pleased to see that she’s being deployed at I-CPS, as part of an integrated and networked research project that has such potentially profound implications for the future.

“I believe Dr. Birglen is correct in his assessment of the importance of robotics and automation in the future,” says InDro Robotics Founder and CEO Philip Reece. “And when you throw innovations with drones and even autonomous Uncrewed Aerial Vehicles capable of carrying large cargo loads and passengers into the mix, we are actually heading into a Jetsons-like future,” he adds.

“I think there’s a growing understanding of the implications of this kind of future from not only the private sector, but also federal regulators and funding agencies. At InDro our mission will always focus on continued innovation. Sometimes those innovations are our own inventions, but a key piece of the puzzle is R&D work carried out by academics like Lionel Birglen. We’re confident that Rosie’s arms are in the right hands.”

Interested in learning more about a custom robotics solution? Feel free to contact us here.

InDro Robotics ROS-based drone an R&D powerhouse


By Scott Simmie

 

InDro Robotics is pleased to unveil details of its highly capable new R&D drone.

Running the Robot Operating System (ROS) and with powerful onboard compute capabilities, the drone is perfect for advanced Research and Development.

“It’s a drone geared toward R&D first and foremost,” explains Luke Corbeth, Head of R&D Sales. “It truly is a flying robot – and you can program and use it in a very similar fashion to all our other robots.”

There’s real demand in the research world for open-source drones that can be programmed to run highly complex algorithms. These kinds of drones can be used to study swarm behaviour, object detection and identification, mapping in GPS-denied locations and much more.

For some researchers, the budget go-to has been the Crazyflie, a micro-drone that uses a Raspberry Pi for compute. Its advantage is that it’s quite affordable. But its low cost, 27-gram weight and relatively low computing power mean it has limitations – including the inability to add sensors of any significant weight.

“This drone can do so much more,” says Corbeth. “With the NVIDIA Xavier NX onboard for compute, it can effectively map entire environments. And when it comes to landing and object recognition, it’s truly phenomenal. It can even land on a moving vehicle.”

Below: A look at InDro’s new drone, which comes complete with LiDAR, a depth-perception camera, 5G connectivity – and much more.

InDro ROS drone

THE BACK STORY

 

If you’ve been following the latest news from InDro, you’ll be aware we have an incubation agreement with Cypher Robotics. That company builds solutions for cycle counting and precision scanning in the industrial/supply chain space. InDro assisted with the development of its signature product, Captis.

Captis integrates an autonomous ground robot with a tethered drone. As the Captis robot autonomously navigates even narrow stock aisles, the drone ascends from a tether attached to that ground robot. The drone then scans the barcodes (it’s code-agnostic) of the products on the shelves. All of that data is transferred seamlessly, in real-time, to the client’s Warehouse Management System (WMS), Warehouse Control System (WCS) and Warehouse Execution System (WES) software.

The capabilities of Captis led to a partnership with global AI fulfilment experts GreyOrange and leading global telco innovator Ericsson. The product debuted at the recent MODEX2024 conference (one of the biggies in the automated supply chain world), where it gained a *lot* of attention.

While working on the project, it was always clear the drone – thanks to multiple modifications – would be highly suitable as a research and development tool. It’s capable of machine vision/object recognition, machine learning, and can find its way around in completely unfamiliar, GPS-denied environments.

“In fact, I have one client that’s using it for research in mines,” says Corbeth.

 

THE JETSON DIFFERENCE

 

NVIDIA has made quite a name for itself – and quite a profit for its shareholders – with its powerful AI-capable processors. The Jetson Xavier NX features a 6-core NVIDIA Carmel Arm®v8.2 64-bit processor running at speeds of up to 1.9 GHz. Its graphics processor unit features a 384-core NVIDIA Volta™ architecture with 48 Tensor Cores. Put it all together, and the computing power is astonishing: The Xavier NX is rated with a maximum achievable output of 21 TOPS – trillion operations per second. (We were going to try to count, but thought it more efficient to rely on NVIDIA’s specs for this.)

The LiDAR unit currently shipping with the drone also has some flex. It’s the Ouster 32-channel OS1 (Rev6.2). With a maximum range of 200 metres (90 metres on a dark, 10 per cent reflectivity target), its powerful L3 chip is capable of processing up to 5.2 million points per second in the sensor line’s highest-resolution, 128-channel configuration (again, we didn’t count). Hostile environment? No problem. The LiDAR can operate from -40°C to 60°C and carries an IP68 Ingress Protection rating.

“The OS1 is designed for all-weather environments and use in industrial automation, autonomous vehicles, mapping, smart infrastructure, and robotics,” states its manufacturer. “The OS1 offers clean, dense data across its entire field of view for accurate perception and crisp detail in industrial, automotive, robotics, and mapping applications.”

The unit uses open-source ROS and C++ drivers, and comes with Ouster’s Software Development Kit. Its ability to accurately sense its environment (at distances as close as 0.5 metres), combined with the NVIDIA processor and the depth camera, also allows this machine to do something pretty extraordinary: It can recognise and land on a moving platform.

“That’s a very challenging problem to solve and requires not only specific sensing but also really powerful onboard compute. This drone can do it,” explains Corbeth.
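One common way to frame that problem is velocity-matched tracking: servo the drone toward the platform while feeding forward the platform’s estimated velocity, descending as the errors shrink. The sketch below is our own simplified illustration under those assumptions – the gains, rates and perception inputs are invented, and InDro’s actual landing pipeline is considerably more involved.

```python
# Illustrative sketch of landing on a moving platform: P-control on the
# horizontal position error plus feed-forward of the platform's velocity,
# with a steady descent. All values are assumptions for illustration.

def landing_velocity(drone_pos, platform_pos, platform_vel,
                     kp=1.2, descend_rate=0.4):
    """Return the (vx, vy, vz) velocity setpoint for the drone.

    drone_pos / platform_pos: (x, y, z) positions in metres.
    platform_vel: (vx, vy) of the platform, e.g. from vision tracking.
    """
    ex = platform_pos[0] - drone_pos[0]   # horizontal position errors
    ey = platform_pos[1] - drone_pos[1]

    # Feed-forward keeps pace with the vehicle; P-term closes the gap.
    vx = kp * ex + platform_vel[0]
    vy = kp * ey + platform_vel[1]
    vz = -descend_rate                    # constant descent until touchdown logic
    return vx, vy, vz
```

Without the feed-forward term, a pure position controller perpetually lags a moving target; adding the platform’s velocity estimate is what makes touchdown on a vehicle feasible.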

Already, word about the product has been spreading. A number of units have already been sold to academic institutions for research purposes – and the team has been hard at work building and testing for the next set of orders (as seen below).

THE FORGE CONNECTION

 

Like all new products, the new drone required custom parts. We looked no further than InDro Forge, our rapid prototyping and limited production run facility in Ottawa.

Using state-of-the-art additive and subtractive tools, the Forge team created custom mounts using carbon fibre and other strong-but-lightweight materials, while also ensuring the frame was robust enough to take on even the most challenging environments where these drones will be deployed.

“InDro Forge has been critical to the finished product,” says Corbeth. “We wanted a look, feel and quality that matches this drone’s capabilities – and InDro Forge delivered.”

InDro ROS drone

INDRO’S TAKE

 

We’re obviously excited about the capabilities of this new drone, and we’re not alone. Interest in this product from researchers has already been significant. In fact, we’re not aware of any other drone on the market offering this combination of specific capabilities.

It was that void – in concert with our partnership with Cypher Robotics – that led to its creation.

“InDro has always placed a great emphasis on the development of innovative new products,” says CEO Philip Reece. “We build new products at the request of clients and also develop our own when we see a market opportunity. In this case, the requirements for Cypher Robotics dovetailed nicely with demand for such a drone from researchers.”

Production of the new drone is moving at a swift pace. If you’re interested in a briefing or demo, you can contact us here.

QUEBEC’S HAPLY ROBOTICS MAKES THE VIRTUAL FEEL REAL


By Scott Simmie

 

Odds are you’ve heard of remote surgery by now.

That’s where a surgeon, looking at screens that provide incredibly detailed 3D video in realtime, conducts the operation using a controller for each hand. The inputs on those controllers are translated into scaled-down movement of robotic arms fitted with the appropriate medical devices. The robotic arms are capable of moving a precise fraction of the distance of the operators’ hands. As a result, these systems allow for far greater control, particularly during really fine or delicate procedures. 

The surgeon might be at a console in the operating theatre where the patient is. Or they could be operating on someone remotely. You could have a specialist in Montreal perform an operation on someone elsewhere in the world – providing you’ve got a speedy data connection.

The video below does a really good job of explaining how one of the best-known systems works. 

 

THE IMPORTANCE OF FEEL

 

Conducting standard surgery (or a variety of other tasks) without robots involves constant tactile feedback. If a doctor is moving an instrument through tissue – or even probing inside an ear – they can feel what’s going on. Think of cutting a piece of fruit: you adjust the pressure on the knife depending on how easily the fruit slices. When you put a spoon into a bowl of jello, that constant feedback from the utensil helps inform how hard or softly you need to push.

This tactile feedback is very much a part of our everyday lives – whether it’s brushing your teeth or realising there’s a knot in your hair while combing it. Even when you scratch an itch, you’re making use of this feedback to determine the appropriate pressure and movements (though you have the additional data reaching your brain from the spot being scratched).

But how do you train someone to perform delicate operations like surgery – even bomb defusal – via robotics? How do you give them an accurate, tactile feel for what’s happening at the business end? How much pressure is required to snip a wire, or to stitch up a surgical opening?

That’s where a company from Quebec called Haply Robotics comes in.

“Haply Robotics builds force-feedback haptic controllers that are used to add the sense of touch to VR experiences, and to robotic control,” explains Product Manager Jessica Henry. “That means that our controller sits on the human interface side and lets the human actually use their hand to do a task that is conveyed to a robot that’s performing that task.”

We met some of the Haply Robotics team during the fall at the IROS 2023 conference in Detroit. We had an opportunity for a hands-on experience, and were impressed.

 

INVERSE3

 

That’s the name of Haply’s core product.

“The Inverse3 is the only haptic interface on the market that has been specially designed to be compact, lightweight, and completely portable,” says the company’s website. “Wireless tool tracking enables you to move freely through virtual environments, while our quick tool change mechanism allows you to easily connect and swap VR controllers, replica instruments, and other tools to leverage the Inverse3’s unmatched power and precision for next-generation force-feedback control.

“The Inverse3 replicates tactile sensory input required for simulating technical tasks. It can precisely emulate complex sensations like cutting into tissue or drilling into bone – empowering students, surgeons, and other healthcare professionals to hone and perfect medical interventions before ever performing them in the clinical environment.”

Haply Robotics has produced an excellent video that gives you both a look at the product – and how it works:

 

WHAT DOES IT FEEL LIKE?

 

While at IROS, we had a chance to put our hands on the Inverse3.

In one of the simulations (which you’ll see shortly), the objective was to push a small sphere through a virtual gelatin-like substance. As you start pushing the ball against that barrier, you begin to feel resistance through the handle of the Inverse3. Using force-feedback, you continue to push and feel that resistance increase. Finally, when you’ve applied precisely the right amount of pressure, the ball passes through the gelatin. The sensation, which included a satisfying, almost liquid ‘pop’ as the ball passed through, was amazing. It felt exactly the way you’d anticipate a real-world object would.
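At its core, a force-feedback demo like that one is a loop that reads the tool position and renders a resistive force back through the handle. Here’s a much-simplified sketch of the idea – the stiffness and puncture threshold are invented numbers, not Haply’s or Fundamental VR’s actual model:

```python
def membrane_force(depth_mm, stiffness=0.8, puncture_depth=6.0):
    """Resistive force (N) felt while pressing a ball into a gelatin membrane.

    Force grows with penetration depth until the puncture threshold,
    then drops to near zero as the ball 'pops' through.
    """
    if depth_mm < puncture_depth:
        return stiffness * depth_mm   # spring-like resistance before puncture
    return 0.05                        # residual drag after the pop
```

The sudden drop from peak resistance to near zero is what your hand perceives as the liquid pop.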

“Touch adds more information as opposed to just having the visual information,” explains Henry. “You also have the tactile information, so you have a rich amount of information for your brain to make a decision. You can even introduce different haptic boundaries so you can use things like AI in order to add some kind of safety measure. If the AI can say ‘don’t go there’ – it can force your hand out of the boundary with haptic cues. So it’s not just visual, it’s not just audio.”
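The ‘don’t go there’ boundary Henry describes is often implemented as a virtual fixture: a spring force that pushes the hand back out whenever the tool crosses a forbidden plane. A minimal sketch, with invented stiffness and geometry:

```python
def boundary_force(tool_z, boundary_z=0.0, stiffness=200.0):
    """Virtual-fixture force that pushes the hand out of a forbidden region.

    If the tool tip dips below boundary_z, render a spring force proportional
    to the penetration depth; otherwise render no force at all.
    """
    penetration = boundary_z - tool_z
    if penetration > 0:
        return stiffness * penetration  # upward restoring force (N)
    return 0.0
```

Because the force is zero until the boundary is crossed, the operator feels nothing at all until the moment the safety system intervenes.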

 

SIMULATION, TRAINING…AND MORE

 

The Inverse3 is already in use for simulation training in the medical industry. In fact, many existing devices for robotic surgery do not have haptics – and there’s clearly a demand.

“Robotic surgical consoles don’t use haptics yet, and we’re hearing that surgeons are asking for that to be added because it’s missing that sense,” says Henry. “A mistake they can make is to push an instrument too far in because it’s just visual. If you had haptics on your handles, you would intuitively know to pull back.”

Remember how we tried pushing a virtual object through a gel-like substance? You’ll see that in this video around the :24 mark:

THE HAPLY STORY

 

Well, it’s not the entire Haply Robotics story, but here it is in a nutshell.

The idea for the product – or rather, the need for such a product – first surfaced in 2016. The three co-founders were working on haptic devices at Canada’s National Research Council. Existing devices at the time were large and tended not to offer the best user experience. They saw an opportunity to create something better. The company has been in business since 2018 – with these three at the helm:

  • Colin Gallacher (MEng, MSc, President)
  • Steve Ding (MEng, Electrical lead)
  • Felix Desourdy (BEng, Mechanical lead)

The trio put their heads together and – a lot of R&D later – produced the Inverse3.

The company manufactures the physical product, which contains three motors to provide haptic feedback. Haply Robotics also makes an API, but the coding for the simulations comes from outside partners. Fundamental VR, for example, is a company devoted to developing virtual training simulations for everything from ophthalmology to endovascular procedures. It coded that gelatin simulation.

“Studies confirm that VR significantly improves the effectiveness of medical education programs. Adding real haptics increases accuracy and delivers full skills transfer,” says the Fundamental VR website. In fact, it cites research showing a 44 per cent improvement in surgical accuracy when haptics are part of the VR experience.

“In the training space, when you’re using it for simulation, a surgeon’s work is very tactile and dexterous,” says Haply’s Jessica Henry. “We enable them to train using those instruments with the proper weights, the proper forces, that they’d encounter in surgery as opposed to textbooks or cadavers. It’s a more enriched way of interacting.”

And it really, really feels real.

Below: Haply’s Jessica Henry manipulates the Inverse3

 

 


INDRO’S TAKE

 

It’s always great discovering another new company in the robotics field, particularly one with an innovative solution like the Inverse3. It’s also great when these companies are Canadian.

“Haply Robotics has identified a clear void in the marketplace and created a solution,” says Indro Robotics CEO Philip Reece. “With the growth in remote robotics – not just surgery – I can see a wide range of use-cases for the Inverse3. Congratulations to the Haply team on being ahead of the curve.”

For more info on the product, check out the Haply Robotics website.

InDro Commander module streamlines robotics R&D


By Scott Simmie

 

Building robots is hard.

Even if you start with a manufactured platform for locomotion (very common in the case of ground robots), the work ahead can be challenging and time-consuming. How many sensors will require a power supply and data routing? What EDGE processing is needed? How will a remote operator interface with the machine? What coding will allow everything to work in unison and ensure the best data and performance possible? How will data be transmitted or stored?

That’s the hard stuff, which inevitably requires a fair bit of time and effort.

It’s that hurdle – one faced by pretty much everyone in the robotics R&D world – that led to the creation of InDro Commander.


WHAT INDRO COMMANDER DOES

 

InDro Commander is a platform-agnostic module that can bolt on to pretty much any means of locomotion. In the photo above, it’s the box mounted on top of the AgileX bunker (just above the InDro logo).

Commander is, as this webpage explains, “a single box with critical software and hardware designed to simplify payload integration and enable turn-key teleoperations.” Whether you’re adding LiDAR, thermal sensors, RTK, Pan-Tilt-Zoom cameras – or pretty much any other kind of sensor – Commander takes the pain out of integration.

The module offers multiple USB inputs for sensors, allowing developers to decide on a mounting location and then simply plug them in. A powerful Jetson EDGE computer handles onboard compute functions. The complete Robot Operating System software libraries (ROS1 and ROS2) are bundled in, allowing developers to quickly access the code needed for various sensors and functions.
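The bundled ROS libraries route sensor data with a publish/subscribe pattern: each sensor driver publishes messages on a named topic, and any number of downstream consumers subscribe. This toy stand-in (plain Python, no ROS installation required; the topic name and message shape are hypothetical) shows the shape of that pattern:

```python
class TopicBus:
    """Toy publish/subscribe bus mimicking how ROS routes sensor data."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, callback):
        """Register a callback to receive every message on a topic."""
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        """Deliver a message to every subscriber of the topic."""
        for cb in self._subs.get(topic, []):
            cb(msg)

bus = TopicBus()
readings = []
bus.subscribe("/lidar/scan", readings.append)            # downstream consumer
bus.publish("/lidar/scan", {"ranges": [1.2, 0.9, 3.4]})  # driver publishes a scan
```

In real ROS the bus also handles serialization, discovery, and transport across processes and machines, but the plug-in-a-sensor-and-subscribe workflow is essentially this.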

“Our engineering team came up with the concept of the InDro Commander after integrating and customizing our own robots,” says Philip Reece, CEO of InDro Robotics. “We realized there were hurdles common to all of them – so we designed and produced a solution. Commander vastly simplifies turning a platform into a fully functioning robot.”

Account Executive Luke Corbeth takes it further:

“The Commander serves as a ‘brain-box’ for any UGV,” he says. “It safely houses the compute, connectivity, cameras, sensors and other hardware in an IP54 enclosure.”

It also comes in several options, depending on the client’s requirements.

“There are three ‘standard versions’ which are bundles to either be Compute Ready, Teleoperations Ready or Autonomy Ready,” adds Corbeth.

“I’ve realized over time that the value of Commander is our ability to customize it to include, or more importantly, not include specific components depending on the needs of the project and what the client already has available. In reality, most Commanders I sell include some, but not usually all, of what’s in the Commander Navigate. We’re also able to customize to specific needs or payloads.”

Below: Commander comes in multiple configurations


COMMANDER DOES THE WORK

 

With InDro Commander, developers can spend more time on their actual project or research – and far less time on the build.

“For end-users wanting a fully customized robot, Commander saves a huge amount of time and hassle,” says InDro Engineering Lead Arron Griffiths. “Customers using this module see immediate benefits for sensor integration, and the web-based console for remote operations provides streaming, real-time data. Commander also supports wireless charging, which is a huge bonus for remote operations.”

Commander serves as the brains for several InDro ground robots, including Sentinel. This machine was recently put through its paces over 5G in a test for EPRI, the Electric Power Research Institute.

 

5G OPERATIONS

 

Depending on the model, Commander can also serve as a Plug & Play device for operations over 4G or 5G networks. In fact, InDro was invited by US carrier T-Mobile to a 2022 event in Washington State. There, we demonstrated the live, remote tele-operation of a Sentinel inspection robot.

Using a simple Xbox controller plugged into a laptop at T-Mobile HQ in Bellevue WA, we operated a Sentinel in Ottawa – more than 4,000 kilometres away. There was no perceptible lag, and even untrained operators were able to easily control the robot and cycle between the Pan-Tilt-Zoom camera, a thermal sensor, and a wide-angle camera used for situational awareness. Data from all sensors was displayed on the dashboard, making it easy to switch views.
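Under the hood, gamepad teleoperation comes down to mapping controller stick axes onto velocity commands for the robot. A hedged sketch of that mapping – the speed limits and deadzone value are illustrative, not InDro’s actual tuning:

```python
def stick_to_twist(stick_x, stick_y, max_linear=1.5, max_angular=1.0, deadzone=0.1):
    """Map normalized gamepad stick axes (-1..1) to robot velocity commands.

    A deadzone ignores small stick noise so the robot doesn't creep
    when the operator's thumb is at rest.
    """
    def shape(axis, limit):
        if abs(axis) < deadzone:
            return 0.0
        return axis * limit

    # Forward/back on the stick drives linear speed; left/right drives turning.
    return {"linear": shape(stick_y, max_linear), "angular": shape(stick_x, max_angular)}
```

In a ROS setup these values would typically be published as a velocity message at a fixed rate, with the link latency determining how immediate the robot feels.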

Below: T-Mobile’s John Saw, Executive Vice President, Advanced & Emerging Technologies, talks about InDro Commander-enabled robots teleoperating over 5G networks 

 

FUTURE-PROOF

 

Platforms change. Needs evolve. New sensors hit the market.

With Commander on board, developers don’t need to start from scratch. The modular design enables end-users to seamlessly upgrade platforms down the road by simply unbolting Commander and affixing it to the new set of wheels (or treads).

Below: Any sensor, including LiDAR, can be quickly integrated with InDro Commander


INDRO’S TAKE

 

You likely know the saying: “Necessity is the mother of invention.”

InDro developed this product because we could see its utility – both for our own R&D, and for clients. We’ve put Commander to use on multiple custom InDro robots, with many more to come. (We have even created a version of this for Enterprise drones.)

“On the commercial side, our clients have really benefited from the inherent modularity that the Commander provides,” says Luke Corbeth.

“Since the ‘brains’ are separate from the ‘body,’ this simplifies their ability to make the inevitable repairs or upgrades they’ll require. These clients generally care about having a high functioning robot reliably completing a repetitive task, and Commander allows us to operate and program our robots to do this.”

It can also save developers money.

“On the R&D side, the customizable nature of the Commander means they only purchase what they don’t already have,” adds Corbeth.

“For instance, many clients are fortunate enough to have some hardware already available to them – whether it’s a special camera, LiDAR or a Jetson – so we can support the integration of their existing systems remotely, or they can send this hardware directly to us. This cuts down lead times and helps us work within our clients’ budgets as we build towards the dream robot for their project.”

Still have questions or want to learn more? You can get in touch with Luke Corbeth here.

uPenn robotics team cleans up at SICK LiDAR competition


By Scott Simmie

 

There’s nothing we like more than success stories – especially when technology is involved.

So we’re pleased to share news that a team of bright young engineers from the University of Pennsylvania were the winners of a prestigious competition sponsored by SICK, the German-based manufacturer of LiDAR sensors and industrial process automation technology.

The competition, called the SICK TiM $10K Challenge, involves finding innovative new uses for the company’s TiM-P 2D LiDAR sensor. Laser-based LiDAR sensors scan the surrounding environment in real time, producing highly accurate point clouds and maps. Paired with machine vision and AI, LiDAR can be used to detect objects – and even avoid them.
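As a rough illustration of how a 2D LiDAR scan becomes obstacle information, here’s a minimal sketch that finds the nearest valid return in a scan. The angle conventions and maximum range are assumptions for the example, not the TiM-P’s actual specifications:

```python
import math

def nearest_obstacle(ranges, angle_min=-math.pi / 2, angle_increment=math.pi / 180,
                     max_range=10.0):
    """Find the closest valid return in a 2D LiDAR scan.

    ranges: list of distances (m), one per beam, sweeping from angle_min
    in steps of angle_increment. Values >= max_range count as no-return.
    Returns (distance, angle_rad) of the nearest obstacle, or None.
    """
    best = None
    for i, r in enumerate(ranges):
        if r < max_range and (best is None or r < best[0]):
            best = (r, angle_min + i * angle_increment)
    return best
```

An avoidance layer would then steer away from, or stop short of, any return closer than a safety threshold.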

And that’s a pretty handy feature if your robot happens to be an autonomous garbage collector. We asked Sharon Shaji, one of five UPenn team members (all of whom earned their Master’s in Robotics this year), for the micro-elevator pitch:

“It’s an autonomous waste collection robot that can be used specifically for cleaning outdoor spaces,” she says.

And though autonomous, it obviously didn’t build itself.

Below: Members of the team during work on the project.


THE COMPETITION

 

When SICK announced the contest, it set a very simple criterion: “The teams will be challenged to solve a problem, create a solution, and bring a new application that utilizes the SICK scanner in any industry.”

SICK received applications from universities across the United States. It then whittled those down to 20 submissions it felt had real potential, and supplied those teams with the TiM-P 270 LiDAR sensor free of charge.

Five students affiliated with UPenn’s prestigious General Robotics, Automation, Sensing and Perception Laboratory, or GRASP Lab, put in a team application. It was one of three GRASP lab teams that would receive sensors from SICK.

That Lab is described here as “an interdisciplinary academic and research center within the School of Engineering and Applied Sciences at the University of Pennsylvania. Founded in 1979, the GRASP Lab is a premier robotics incubator that fosters collaboration between students, research staff and faculty focusing on fundamental research in vision, perception, control systems, automation, and machine learning.”

Before we get to building the robot, how do you go about building a team? Do you just put smart people together – or is there a strategy? In this case, there was.

“One thing we all kept in mind when we were looking for teammates was that we wanted someone from every field of engineering,” explains Shaji. In other words, a multidisciplinary team.

“So we have people from the mechanical engineering background, electrical engineering background, computer science background, software background. We were easily able to delegate work to every person. I think that was important in the success of the product. And we all knew each other, so it was like working with best friends.”

 

GENESIS

 

And how did the idea come about?

Well, says the team (all five of whom hopped on a video call with InDro Robotics), they noticed a problem in need of a solution. Quite frequently on campus – and particularly after events – they’d noticed that the green space was littered. Cans, bottles, wrappers – you name it.

They also noticed that crews would be dispatched to clean everything up. And while that did get the job done, it wasn’t perhaps the most efficient way of tackling the problem. Nor was it glamorous work. It was arguably a dirty and dull job – one of the perfect types of tasks for a robot to take on.

“Large groups of people were coming in and manually picking up this litter,” says Shaji.

“And we realised that automation was the right way to solve that problem. It’s unhygienic, there are sanitation concerns, and it’s physically exhausting. Robots don’t get tired, they don’t get exhausted…we thought this was the best use-case to move forward with.”

Below: Working on the mechanical side of things


GETTING STARTED

 

You’d think, with engineers, the first step in this project would have been to kick around design concepts. But the team focussed initially on market research. Were there similar products out there already? Would there be a demand for such a device? How frequently were crews dispatched for these cleanups? How long, on average, does it take humans to carry out the task? How many people are generally involved? Those kinds of questions.

After that process, they began discussing the nuts and bolts. One of the big questions here was: How should the device go about collecting garbage? Specifically, how should it get the garbage off the ground?

“Cleaning outdoor spaces can vary, because outdoor spaces can vary,” says team member Aadith Kumar. “You might have sandy terrain, you might have open parks, you might have uneven terrain. And each of these pose their own problems. Having a vacuum system on a beach area isn’t going to work because you’re going to collect a lot of sand. The vision is to have a modular mechanism.”

A modular design means flexibility: Different pickup mechanisms would be swappable for specific environments without requiring an entirely new robot. A vacuum system might work well in one setting, a system with the ability to individually pick items of trash might work better somewhere else.
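In software terms, that modularity maps naturally onto a strategy pattern: each pickup mechanism implements a common interface, and the robot selects whichever installed module suits the terrain. A hypothetical sketch – the class names and terrain ratings are invented for illustration, not the team’s actual code:

```python
class PickupMechanism:
    """Interface each swappable pickup module implements."""

    def suited_for(self, terrain):
        raise NotImplementedError

    def collect(self, item):
        raise NotImplementedError

class BrushSweeper(PickupMechanism):
    def suited_for(self, terrain):
        return terrain in ("grass", "pavement")

    def collect(self, item):
        return f"swept {item} into collection box"

class VacuumUnit(PickupMechanism):
    def suited_for(self, terrain):
        return terrain == "pavement"  # sand or grass would clog a vacuum

    def collect(self, item):
        return f"vacuumed {item}"

def choose_mechanism(terrain, modules):
    """Pick the first installed module rated for the terrain, or None."""
    for m in modules:
        if m.suited_for(terrain):
            return m
    return None

picked = choose_mechanism("grass", [VacuumUnit(), BrushSweeper()])
```

The point of the pattern is that adding a new mechanism later means writing one new class, not reworking the robot.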

The team decided their initial prototype should focus on open park space. And once that decision was made, it became clear that a brush mechanism, which would sweep the garbage from the grass into a collection box, would be the best solution for this initial iteration.

“We considered vacuum, we considered picking it up, we considered targeted suction,” says Kumar. “But at the end of the day, for economics, it needed to be efficient, fast, nothing too complicated. And the brush mechanism is tried and tested.”

Below: Work on the brush mechanism

 

 


SAUBERBOT

 

The team decided to call its robot the SauberBOT. “Sauber” is the German word for “clean”. But that sweeping brush mechanism would be just one part of the puzzle. Other areas to be tackled included:

  • Depth perception camera for identifying trash to be picked up
  • LiDAR programmed so that obstacles, including people, could be avoided
  • Autonomy within a geofenced location – i.e., the boundaries of the park to be cleaned
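Geofencing of that kind typically reduces to a point-in-polygon test against the boundary of the work area. A standard ray-casting sketch (the park coordinates are invented):

```python
def inside_geofence(x, y, polygon):
    """Ray-casting point-in-polygon test for a geofenced work area.

    polygon: list of (x, y) vertices in order. Returns True if (x, y)
    lies inside. Casts a horizontal ray from the point and counts how
    many polygon edges it crosses: odd means inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray at height y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A hypothetical 50 m x 30 m rectangular park.
park = [(0, 0), (50, 0), (50, 30), (0, 30)]
```

The robot’s autonomy layer would refuse, or replan, any motion whose target falls outside the fence.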

There was more, of course, but one of the most important pieces of the puzzle was the robotic platform itself: The means of locomotion. And that’s where InDro Robotics comes in.

 

THE INDRO CONNECTION

 

Some team members had met InDro Account Executive Luke Corbeth earlier in the year, at the IEEE International Conference on Robotics and Automation, held in Philadelphia in 2022. Corbeth had some robotic platforms from AgileX – which InDro distributes in North America – at the show. At the time the conference took place, the SICK competition wasn’t yet underway. But the students remembered Corbeth – and vice versa.

Once the team formed and entered the contest, discussions with InDro began around potential platforms.

The team was initially leaning toward the AgileX Bunker – a really tough platform that operates with treads, much like a tank. At first glance, those treads seemed like the ideal form of locomotion because they can operate on many different surfaces.

But Luke steered them in a different direction, toward the (less-expensive) Scout 2.0.

“He was the one who suggested the Scout 2.0,” says team member Rithwik Udayagiri.

“We actually were thinking of going for the Bunker – but he understood that for our use-case the Scout 2.0 was a better robot. And it was very easy to work with the Scout.”

Corbeth also passed along the metal box that houses the InDro Commander. This enabled the team to save more time (and potential hassle) by housing all of their internal components in an IP-rated enclosure.

“I wanted to help them protect their hardware in an outdoor environment,” he says. “They had a tight budget, and UPenn is a pretty prominent robotics program in the US.”

But buying from InDro raises the question: Why not build their own? A team of five roboticists would surely be able to design and build something like that, right? Well, yes. But they knew they would have plenty of work ahead without building a platform from scratch. Taking that on would have diverted them from their core R&D tasks.

“We knew we would do it in a month or two,” says the team’s Rithwik Udayagiri. “But that would have left us with less time for market research and actually integrating our product, which is the pickup mechanism. We would have been spending too much time on building a platform. So that’s why we went with a standalone platform.”

It took a little longer than planned to get the recently released Scout 2.0 in the hands of the UPenn team. But because of communication with Luke (along with the InDro-supplied use of the Gazebo robot simulation platform), the team was able to quickly integrate the rest of the system with Scout 2.0 soon after it arrived.

“The entire project was ROS-based (Robot Operating System software), and they used our simulation tools, mainly Gazebo, to start working on autonomy,” explains Corbeth. “Even though it took time to get them the unit, they were ready to integrate their tech and get it out in the field very quickly. The one thing that blew me away was how quickly they put it together.”

It wasn’t long before SauberBOT was a reality. The team produced a video for its final submission to SICK. The SauberBOT team took first place, winning $10,000 plus an upcoming trip to Germany, where they’ll visit SICK headquarters.

Oh, and SauberBOT? The team says it cleans three times faster than a typical human crew.

Here’s the video.

 

A CO-BOT, NOT A ROBOT

 

Team SauberBOT knows some people are wary of robots. Some believe they will simply replace human positions and put people out of work.

That’s not the view of these engineers. They see SauberBOT – and other machines like it – as a way of helping to relieve people from boring, physically demanding and even dangerous tasks. They also point out that there’s a labour shortage, particularly in this sector.

“The cleaning industry is understaffed,” reads a note sent by the team. “We choose to introduce automation to the repetitive and mundane aspects of the cleaning industry in an attempt to do the tasks that there aren’t enough humans to do.”
 
 
And what about potential job losses?

“We intend to make robots that aren’t aimed to replace humans,” they write.

“We want to equip the cleaning staff with the tools to handle the mundane part of cleaning outdoor spaces and therefore allow the workforce to target their attention to the more nuanced parts of cleaning which demand human attention.”

In other words, think of SauberBOT as a co-operative robot meant to assist but not replace humans. These are sometimes called “co-bots.”

Below: Testing out the SauberBOT in the field

INDRO’S TAKE

 

We’re obviously pleased to have played a small role in the success of the UPenn team. And while we often service very large clients – including building products on contract for some global tech giants – there’s a unique satisfaction that comes from this kind of relationship.

“It’s very gratifying,” says Corbeth. “In fact, it’s the essence of what I try to do: Enable others to build really cool robots.”

The SauberBOT is indeed pretty cool. And InDro will be keeping an eye on what these young engineers do next.

“The engineering grads of today are tomorrow’s startup CEOs and CTOs,” says InDro Robotics Founder/CEO Philip Reece.

“We love seeing this kind of entrepreneurial spirit, where great ideas and skills lead to the development of new products and processes. In a way, it’s similar to what InDro does on a larger scale. Well done, Team SauberBOT – there’s plenty of potential here for a product down the road.”

If you’ve got a project that could use a robotic platform – or any other engineering challenge that taps into InDro’s expertise with ground robots, drones and remote teleoperations – feel free to get in touch with Luke Corbeth here.