By Scott Simmie


Odds are you’ve heard of remote surgery by now.

That’s where a surgeon, looking at screens that provide incredibly detailed 3D video in realtime, conducts the operation using a controller for each hand. The inputs on those controllers are translated into scaled-down movement of robotic arms fitted with the appropriate medical devices. The robotic arms are capable of moving a precise fraction of the distance of the operators’ hands. As a result, these systems allow for far greater control, particularly during really fine or delicate procedures. 

The surgeon might be at a console in the operating theatre where the patient is. Or they could be operating on someone remotely. You could have a specialist in Montreal perform an operation on someone elsewhere in the world – providing you’ve got a speedy data connection.

The video below does a really good job of explaining how one of the best-known systems works. 




Conducting standard surgery (or a variety of other tasks) without robots involves constant tactile feedback.  If a doctor is moving an instrument through tissue – or even probing inside an ear – they can feel what’s going on. Think of cutting a piece of fruit; you adjust the pressure on the knife depending on how easy the fruit is to slice. When you put a spoon into a bowl of jello, that constant feedback from the utensil helps inform how hard or soft you need to push.

This tactile feedback is very much a part of our everyday lives – whether it’s brushing your teeth or realising there’s a knot in your hair while combing it. Even when you scratch an itch, you’re making use of this feedback to determine the appropriate pressure and movements (though you have the additional data reaching your brain from the spot being scratched).

But how do you train someone to perform delicate operations like surgery – even bomb defusal – via robotics? How do you give them an accurate, tactile feel for what’s happening at the business end? How much pressure is required to snip a wire, or to stitch up a surgical opening?

That’s where a company from Quebec called Haply Robotics comes in.

“Haply Robotics builds force-feedback haptic controllers that are used to add the sense of touch to VR experiences, and to robotic control,” explains Product Manager Jessica Henry. “That means that our controller sits on the human interface side and lets the human actually use their hand to do a task that is conveyed to a robot that’s performing that task.”

We met some of the Haply Robotics team during the fall at the IROS 2023 conference in Detroit. We had an opportunity for a hands-on experience, and were impressed.




The Inverse3 is the name of Haply’s core product.

“The Inverse3 is the only haptic interface on the market that has been specially designed to be compact, lightweight, and completely portable,” says the company’s website. “Wireless tool tracking enables you to move freely through virtual environments, while our quick tool change mechanism allows you to easily connect and swap VR controllers, replica instruments, and other tools to leverage the Inverse3’s unmatched power and precision for next-generation force-feedback control.

“The Inverse3 replicates tactile sensory input required for simulating technical tasks. It can precisely emulate complex sensations like cutting into tissue or drilling into bone – empowering students, surgeons, and other healthcare professionals to hone and perfect medical interventions before ever performing them in the clinical environment.”

Haply Robotics has produced an excellent video that gives you both a look at the product – and how it works:




While at IROS, we had a chance to put our hands on the Inverse3.

In one of the simulations (which you’ll see shortly), the objective was to push a small sphere through a virtual gelatin-like substance. As you start pushing the ball against that barrier, you begin to feel resistance through the handle of the Inverse3. You continue to push, and the force-feedback resistance increases. Finally, when you apply precisely the right amount of pressure, the ball passes through the gelatin. The sensation – which included a satisfying, almost liquid ‘pop’ as the ball broke through – was amazing. It felt exactly as you’d anticipate a real-world object would.
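For readers curious how that ‘pop’ can be rendered in software, the underlying idea is a simple penalty-based force model: resistance grows with penetration depth until a rupture threshold, then vanishes. Here’s a minimal sketch in Python – all names, gains and thresholds are hypothetical illustrations, not Haply’s actual API:

```python
# Minimal penalty-based haptic rendering sketch for the "ball through
# gelatin" demo. All constants and names are hypothetical -- this is
# not Haply's API, just the force-feedback principle.

STIFFNESS_N_PER_M = 400.0   # virtual gelatin stiffness
RUPTURE_DEPTH_M = 0.02      # penetration at which the membrane "pops"

def membrane_force(penetration_m: float, ruptured: bool) -> tuple[float, bool]:
    """Return (resistive force in newtons, ruptured flag).

    Force grows linearly with penetration (Hooke's law) until the
    rupture depth is reached; after that the membrane offers no
    resistance, which the user feels as a sudden 'pop'.
    """
    if ruptured or penetration_m >= RUPTURE_DEPTH_M:
        return 0.0, True
    if penetration_m <= 0.0:
        return 0.0, False
    return STIFFNESS_N_PER_M * penetration_m, False

# Simulate the handle advancing 1 mm per servo tick.
ruptured = False
for tick in range(25):
    depth = tick * 0.001
    force, ruptured = membrane_force(depth, ruptured)
    # Force peaks near 7.6 N just before rupture, then drops to zero.
```

A real haptic loop runs this kind of update at a kilohertz or faster, which is what makes the transition feel instantaneous.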

“Touch adds more information, as opposed to just having the visual information,” explains Henry. “You also have the tactile information, so you have a rich amount of information for your brain to make a decision. You can even introduce different haptic boundaries so you can use things like AI in order to add some kind of safety measure. If the AI can say ‘don’t go there’ – it can force your hand out of the boundary with haptic cues. So it’s not just visual, it’s not just audio.”
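The haptic boundary Henry describes can be sketched the same way: a ‘virtual wall’ that applies a restoring force whenever the tool strays into a forbidden region. Again, the names and gains below are hypothetical, not Haply’s API:

```python
# Toy "haptic boundary": if the tool tip crosses into a forbidden
# region, a restoring force pushes the hand back out. Names and gains
# are hypothetical illustrations.

FORBIDDEN_X_MIN = 0.10   # boundary plane position, metres
GAIN_N_PER_M = 600.0     # stiffness of the virtual wall

def boundary_force_x(tool_x_m: float) -> float:
    """Force along x pushing the tool back out of the forbidden region.

    Zero outside the region; inside, proportional to penetration depth
    and directed back toward the safe side.
    """
    penetration = tool_x_m - FORBIDDEN_X_MIN
    if penetration <= 0.0:
        return 0.0
    return -GAIN_N_PER_M * penetration  # negative: push back out

# At 2 cm past the boundary, the wall pushes back with roughly 12 N.
```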




The Inverse3 is already in use for simulation training in the medical industry. In fact, many existing devices for robotic surgery do not have haptics – and there’s clearly a demand.

“Robotic surgical consoles don’t use haptics yet, and we’re hearing that surgeons are asking for that to be added because it’s missing that sense,” says Henry. “A mistake they can make is to push an instrument too far in because it’s just visual. If you had haptics on your handles, you would intuitively know to pull back.”

Remember how we tried pushing a virtual object through a gel-like substance? You’ll see that in this video around the :24 mark:



Well, it’s not the entire Haply Robotics story, but here it is in a nutshell.

The idea for the product – for the need for such a product – first surfaced in 2016. The three co-founders were working on haptic devices at Canada’s National Research Council. Existing devices at the time were large and tended not to have the greatest user experience. They saw an opportunity to create something better. The company has been in business since 2018 – with these three at the helm:

  • Colin Gallacher (MEng, MSc, President)
  • Steve Ding (MEng, Electrical lead)
  • Felix Desourdy (BEng, Mechanical lead)

The trio put their heads together and – a lot of R&D later – produced the Inverse3.

The company manufactures the physical product, which contains three motors to provide haptic feedback. Haply Robotics also makes an API, but the coding for the simulations comes from outside partners. Fundamental VR, for example, is a company devoted to developing virtual training simulations for everything from ophthalmology to endovascular procedures. It coded that gelatin simulation.

“Studies confirm that VR significantly improves the effectiveness of medical education programs. Adding real haptics increases accuracy and delivers full skills transfer,” says the Fundamental VR website. In fact, it cites research showing a 44 per cent improvement in surgical accuracy when haptics are part of the VR experience.

“In the training space, when you’re using it for simulation, a surgeon’s work is very tactile and dexterous,” says Haply’s Jessica Henry. “We enable them to train using those instruments with the proper weights, the proper forces, that they’d encounter in surgery as opposed to textbooks or cadavers. It’s a more enriched way of interacting.”

And it really, really feels real.

Below: Haply’s Jessica Henry manipulates the Inverse3



Haply Robotics Jessica



It’s always great discovering another new company in the robotics field, particularly one with an innovative solution like the Inverse3. It’s also great when these companies are Canadian.

“Haply Robotics has identified a clear void in the marketplace and created a solution,” says InDro Robotics CEO Philip Reece. “With the growth in remote robotics – not just surgery – I can see a wide range of use-cases for the Inverse3. Congratulations to the Haply team on being ahead of the curve.”

For more info on the product, check out the Haply Robotics website.

Robots on earth help prepare for research on the moon


By Scott Simmie


What could small robots on earth have to do with exploration on the moon?

Quite a lot, as it turns out. Professors and engineering students at Polytechnique Montréal have been busy writing algorithms and running experiments with robots and drones with one goal in mind: To enable them to explore unfamiliar and even hostile surroundings far beyond the reach of GPS or other forms of precision location technology.

“What we want to do is to explore environments including caves and surfaces on other planets or satellites using robotics,” explains Dr. Giovanni Beltrame (Ph.D.), a full professor at Polytechnique’s Departments of Computer Engineering and Software Engineering.

Before we get to the how, let’s address the why.

“Caves and lava tubes can be ideal places for settlement: They can be sealed and provide radiation shielding. There’s also a chance of finding water ice in them,” says Dr. Beltrame.

Of course, it’s also less risky – and less expensive – to send robots to other planets and moons rather than human beings. They don’t require life support, don’t get tired (with the exception of having to recharge), and they can gather and process data quickly.

Just think of all the data that’s been acquired on Mars by the twin Rovers and the Mars helicopter.

Below: A selfie taken by NASA’s Perseverance rover November 1, 2023, during the 960th Martian day of its mission. The rover was built with a focus on astrobiology, searching for signs of ancient microbial life on the red planet. Image courtesy of NASA.

Mars rover Perseverance



It’s a pretty ambitious vision. But for Beltrame and his team, it’s also very real. And it requires a lot of work and research here on earth.

“So to get there (space) and do this with multiple robots, we’ve developed all sorts of technologies – navigation, perception, communication, coordination between the robots, and human-robot interfaces,” he says.

“We’re doing all these things, because our goal is to use a swarm of robots to do planetary exploration. There’s more, but that’s it in a nutshell.”

When you go to the moon, there’s no equivalent of GPS. And environments like caves can be really tricky – both in terms of robots understanding where they are, and also communicating with other robots beyond line of sight.

With the right technologies and algorithms, that communication is possible. And much of Beltrame’s research has involved testing this on earth. In particular, he’s focusing on how groups of robots could take on such tasks collaboratively.

“So our primary activities focus on swarm robotics,” he says.

Generally that starts with simulation models. But there are limits to simulations – and real-world testing is a big part of what’s going on at Polytechnique.

“So we do have this deployment philosophy that we try our technologies in simulation, but then we want to go to deploying robots. You can have the best simulation in the world, but there’s still a reality gap and it’s extremely important to try things on the real robots,” he says.

“We have a saying in the lab, which is: ‘Everything works in simulation’. You can always make your algorithm work in simulation, and then you get out in the field and things go wrong. So one thing we do in the lab is we always do the full stack. That’s why we need to have real robots. And we don’t only do experiments with real robots in the lab, we do them in the field.”



The lab he’s referring to is known as Polytechnique’s MIST, which stands for Making Innovative Space Technology. Dr. Beltrame is the director of the lab, which focuses on computer engineering targeted towards space technologies. In addition to the researchers, the lab is home to a *lot* of robots. There are big ones, small ones, wheeled ones, flying ones (drones) – literally “hundreds” of robots at the lab.

But as Dr. Beltrame emphasised, proving that something will truly work requires testing in environments that are similar to what might be found on the moon or elsewhere. Locations where he’s carried out fieldwork include:

  • Lava Beds National Monument in California (with NASA JPL)
  • The Kentucky mega-cave with the CoSTAR team
  • Tequixtepec in Mexico with SpéléoQuébec

Just check out the images below of field work, courtesy of Dr. Beltrame:



Some of the robots used in the MIST lab – and perhaps eventually on the moon – arrived via InDro Robotics, a North American distributor for AgileX. In fact, Polytechnique has purchased a number of AgileX products, including platforms that InDro has modified to help speed the R&D process. These include:

  • 24 LIMOs and simulation table
  • AgileX Scout Mini
  • AgileX Scout 2.0
  • Two AgileX Bunker Mini platforms, with custom builds by InDro

We’ve written about the LIMO before – a small, affordable and versatile robot capable of perceiving its environment and even Simultaneous Localization and Mapping out of the box. It’s also an ideal size, particularly when doing multi-agent/swarm robotics, for use in the lab. (You’d run out of space pretty fast with something much larger).

“The LIMOs are a very good platform for Simultaneous Localization and Mapping – and perception in general,” says Beltrame.

He says they’re a good choice “because they have a 3D camera, they’re lighter, agile, and are sufficiently low in cost. So we can use them in large numbers. Another good thing about the LIMOs is that once you have a lot of similar robots that are reasonably agile, you can actually make a full deployment of software (across all robots).”

That makes them an ideal platform for multi-agent research and development.

“For example, we developed this tool called Swarm SLAM where many robots collaborate to have a better perception of the environment. We’re currently testing it with the full fleet of LIMOs. That’s something we would have believed impossible with larger robots for logistical reasons.”
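Collaborative mapping of this kind hinges on inter-robot loop closures: when one robot recognises a place another has mapped, the measured relative transform lets both maps be expressed in a single frame. A toy 2D illustration of that frame alignment (not the actual Swarm SLAM implementation):

```python
import math

def compose(a, b):
    """Compose two 2D poses (x, y, theta): return a applied before b."""
    x, y, th = a
    bx, by, bth = b
    return (x + bx * math.cos(th) - by * math.sin(th),
            y + bx * math.sin(th) + by * math.cos(th),
            th + bth)

# Hypothetical inter-robot measurement: robot B's map origin sits 2 m
# ahead of robot A's origin, rotated 90 degrees.
a_T_b = (2.0, 0.0, math.pi / 2)

# A pose from B's local trajectory, expressed in B's own frame...
pose_in_b = (1.0, 0.0, 0.0)

# ...re-expressed in A's frame so the two maps can be fused:
pose_in_a = compose(a_T_b, pose_in_b)
```

Once every robot’s poses can be rewritten into a shared frame like this, their individual maps stitch together into one collective map of the environment.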

Though the focus is firmly on space, the Polytechnique Montréal research has applications on earth. Swarms of robots could aid in disaster response, Search & Rescue, and more.




The LIMO isn’t the only AgileX product in Polytechnique’s stable. And while Beltrame likes all of them, he has a soft spot for one in particular.

“I would say that my favorite robot is the Scout Mini,” he says. “It’s fast, it’s agile and the control is extremely precise.”

In fact, Beltrame often takes the Scout Mini with him when doing school presentations. It’s small enough to be carried in the trunk of his car and hand-carried to classrooms. His team has also used the platform to test new code for path planning and sophisticated energy calculations. It’s capable of tracking the additional energy required for climbing inclines, for example, then calculating when the robot needs to return home to wirelessly recharge.

As always, InDro works with clients to deliver precisely what they need. This saves time for those institutions and corporations on builds, allowing them to get on with the business of R&D.

“We’ve done quite a bit of integration for them,” says Luke Corbeth, InDro’s Head of R&D Sales.

“For example (see picture below), we provide a top plate with all required hardware mounted and integrated. They then add their own sensors, protective structure, etc. So this is a great example of how we work with clients on a case-by-case basis depending on their needs as robotics isn’t one-size-fits-all.”

Polytechnique mini bunkers



With all of this research, what comes next? Will the work being done today at Polytechnique eventually find its way off this planet?

“The answer is it’s going to happen very soon,” says Beltrame. Sometime later this year, a rocket will head toward the moon carrying three small robots. It’s called the CADRE mission.

“A trio of small rovers will work as a team to explore the moon autonomously, mapping the subsurface in 3D, collecting distributed measurements, and showing the potential of multirobot missions,” says NASA’s JPL website. One of Beltrame’s students is working on that mission with JPL.

“This is one example of how the work that we’ve been doing in this lab, in the end – through students that were here – become real missions,” says Beltrame.

And that’s not all. As early as 2026, a Canadian-built rover could land on the moon in Canada’s first moon mission.

Its task? To explore the moon’s south polar region in search of “water ice.” This ice is critical to long-term human habitation on the moon – and can also be converted to fuel, both for energy on the moon and potentially to refuel other spacecraft with destinations further afield.

“I have an engineer from the Canadian Space Agency that’s a student of mine that’s developed the Mission Planner. So the idea is that we – our lab – developed the Mission Planner for the Canada rover that’s going to the moon.”

Here’s a look at that planned mission, from the CSA:




There was some big news this week from Polytechnique Montréal. On January 24 it announced the formation of ASTROLITH, a body for “research in space resource and infrastructure engineering.”

It’s the first Canadian group dedicated to lunar engineering, according to a news release.

“Comprising experts from all seven Polytechnique departments, ASTROLITH will pursue the mission of helping to develop next-generation technologies and training the engineers of tomorrow to ensure Canada’s presence in space and lunar exploration, as well as addressing critical needs on our planet within the context of climate change, resource management and sustainable development,” reads the release.

So while the emphasis is on the moon, ASTROLITH will also result in some very practical – and urgent – use-cases on our home planet.

“As encapsulated in its Latin motto Ad Lunam pro Terra, ASTROLITH is dedicated to developing technologies with direct impacts here on Earth: enabling development of infrastructure in the Far North or facilitating the energy transition, for example,” says the release.

“Indeed, the research unit’s founding members are already involved in developing technologies in various areas related to space and extreme environments, from design of resilient habitats and infrastructures for remote regions to deployment of cislunar communications technologies to development of advanced robotics systems for prospecting and mining, among many others. Their work is bolstered by contributions from specialists in life-cycle analysis, sustainable development and space-related policy development.”

The team is composed of academics and researchers that span all seven Polytechnique departments. Beltrame, not surprisingly, is on the team – which is pictured below. (He’s in the back row, centre.)




We find the work being carried out at Polytechnique Montréal, the MIST lab – and now ASTROLITH – both fascinating and important. It’s also a terrific example of how dedicated researchers and students can develop and test projects in the lab that eventually have real-world (and off-world) applications.

“I’m incredibly impressed with the work being carried out here, and the fact it can be put to positive use-cases both on earth and in space,” says InDro Robotics CEO Philip Reece.

“We wish Dr. Beltrame and his colleagues well, and we’ll certainly be watching these lunar missions with great interest. It’s always a pleasure when InDro can support teams doing important work like this.”

You can find more about the MIST lab here. And if you’d like to talk about AgileX robots (or any other robotic solution), connect with an InDro expert here

Aerometrix methane detection operation poised for new growth


By Scott Simmie


This job, on occasion, stinks.

But it’s all in a day’s work for Aerometrix, Canada’s only company specialising in methane detection using drones. It’s not the methane itself that smells – it’s actually an odourless gas – but it’s the locations where methane can be emitted.

Imagine flying a massive landfill on a hot day in California. Further imagine that, in order to keep the dust down, the landfill operators have recently sprayed the location with leachate – the slimy runoff juice created by the landfill itself. It’s very biologically active, and it smells really bad.

“It’s horrendous – horrible,” chuckles Eric Saczuk, who often carries out the complex flights.

“It just ends up just suffusing through you and anything that you’re wearing. It even seems like it goes into your skin.”

Thankfully, not all missions are like that. But all of them do achieve results.

And now, for multiple reasons, Aerometrix is poised to be taking on many more of them – branching out into detection at oil and gas refineries.

Below: Flight Operations Lead Eric Saczuk prepares for an Aerometrix flight


Eric Saczuk Aerometrix



When it comes to climate change, methane is an invisible threat. Though we often hear about CO2 emissions, methane is a serious problem when it comes to greenhouse gases.

“Methane has more than 80 times the warming power of carbon dioxide over the first 20 years after it reaches the atmosphere,” states the Environmental Defense Fund.

“Even though CO2 has a longer-lasting effect, methane sets the pace for warming in the near term.”

So there’s increasing urgency to detect and mitigate methane emissions. At landfills, for example, once the emission points are detected, the gas can be trapped – and even used to generate electricity.

“The main emphasis recently has been on landfills, both in Canada and the US,” explains Aerometrix Co-Founder Philip Reece. “We flew 16 missions over the last 12 months.”

And these missions aren’t simply popping a drone up for a brief flight. Nor are they automated. Every Aerometrix flight has to be carried out manually.

“Over large sites like the Vancouver landfill, that took us three days of flying, six hours a day,” says Saczuk. “We fly these missions at five metres above the ground – and there’s lots of things that can get in the way at that level.”

Aerometrix flights deploy either the DJI M300 or M350 drone. But the secret sauce is not so much the drone as the sensors. (And it’s most certainly not the leachate.)




Aerometrix deploys two different sensors to detect methane. One of them is called an Open Path Laser Spectrometer (OPLS), developed by NASA for use on the Mars Rover. It was designed to detect trace gases. In the case of methane detection, the laser is tuned to a specific frequency that is absorbed when it encounters that particular gas. The greater the absorption, the higher the methane concentration.
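That absorption-to-concentration relationship is the Beer-Lambert law: received intensity falls off exponentially with concentration times path length. A hedged sketch, with an illustrative (not instrument-calibrated) absorption coefficient:

```python
import math

def methane_ppm(received_intensity, emitted_intensity, path_length_m,
                absorption_per_ppm_m=1e-6):
    """Estimate methane concentration from laser absorption.

    Beer-Lambert: I = I0 * exp(-k * c * L), so
    c = ln(I0 / I) / (k * L).

    The absorption coefficient default here is purely illustrative --
    the OPLS instrument's real calibration constant is not public in
    this article.
    """
    absorbance = math.log(emitted_intensity / received_intensity)
    return absorbance / (absorption_per_ppm_m * path_length_m)

# No absorption (received == emitted) means zero methane along the path;
# the dimmer the returned beam, the higher the estimated concentration.
```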

The sensor requires “clean air” for accurate readings – meaning there can’t be any prop wash or turbulence caused by the drone itself. Aerometrix engineers built a brace that holds the sensor well forward of the drone for this purpose. Having that sensor and rod, of course, upsets the balance of the drone. In fact, Saczuk estimates the weight of the rod and sensor at roughly 800 grams, perched about 1.5 metres forward of the drone.

And while the flight controller is capable of compensating for that, Saczuk always performs a calibration once the drone is in the air.

“We take off to maybe three or four meters above the ground. Once the drone is airborne, we go into the controller and initiate the calibration. So the drone calculates its revised centre of gravity and knows what its steady state is. The two front propellers then spin a little bit faster to keep the nose from dipping.”
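The numbers above make it easy to see why that calibration matters. A rough moment-balance sketch – the motor-arm distance is assumed for illustration, not the M300’s real geometry:

```python
G = 9.81  # gravitational acceleration, m/s^2

def extra_front_thrust_n(payload_kg, offset_m, front_arm_m,
                         n_front_motors=2):
    """Extra steady-state thrust each front motor must add to counter
    the nose-down pitching moment of a forward-mounted sensor boom.

    front_arm_m is the assumed horizontal distance from the centre of
    gravity to the front motors -- illustrative, not the M300's actual
    geometry.
    """
    pitch_moment = payload_kg * G * offset_m        # N*m, nose-down
    return pitch_moment / (front_arm_m * n_front_motors)

# The ~0.8 kg sensor and rod perched ~1.5 m forward produce roughly an
# 11.8 N*m nose-down moment; with a hypothetical 0.32 m motor arm, each
# front motor must add about 18 N of thrust to hold the nose level --
# which is exactly why the front propellers spin faster after calibration.
```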

Flying manually, Saczuk uses Tripod Mode to limit the drone’s speed. The most accurate readings occur when flying at about five metres per second.

On an ideal flight day, there would be a steady wind of around eight metres per second. If the breeze is coming from the east and blowing over the landfill, this provides a couple of advantages. First, by positioning operations at the eastern end you can avoid most of the smell. But the main reason is that the drone begins its flight in clean air not contaminated by methane. That allows the methane, once detected, to really contrast with the surrounding environment.

“What we don’t like is no wind, because then the methane just goes up vertically and it’s variable – it just gets pushed around by little vortices here and there,” says Saczuk.

The drone will make multiple passes (in this example, north and south) over the site. When the laser hits methane, some of those rays will be absorbed and some reflected, depending on the concentration. Flying multiple paths allows enough data to be gathered to create a visualization of methane in a vertical plane.

“We’ll do this on the upwind and downwind sides of the site as well as a full perimeter to understand where the main emission sources likely are,” he adds.

“We shouldn’t be seeing much methane on that eastern side, assuming the wind is coming from the east. And then as we fly the western edge, that would capture all of the methane that’s being pushed by the wind and that would be the downwind curtain.”

While Saczuk is piloting, there’s a second controller that displays the data. A Raspberry Pi onboard the drone takes the data from the sensor and merges it with the flight data from the aircraft. So Saczuk can see the invisible gas while piloting.

The goal is to obtain a really good cross-section, as illustrated below. Feel free to try your hand at the equation.

Flux Curtain
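The equation behind a flux curtain is an area integral: total emission rate Q = ∬ c(y, z) · u dy dz, where c is the excess methane concentration across the vertical plane and u is the wind component passing through it. A toy version over a measurement grid (units, grid and values purely illustrative):

```python
def methane_flux_g_per_s(conc_g_per_m3, wind_m_per_s, cell_dy_m, cell_dz_m):
    """Integrate a flux curtain: sum over grid cells of
    concentration * wind speed normal to the plane * cell area.

    conc_g_per_m3: 2D list of excess methane concentration per cell.
    wind_m_per_s:  wind component perpendicular to the curtain
                   (assumed uniform here for simplicity).
    """
    cell_area = cell_dy_m * cell_dz_m
    return sum(c * wind_m_per_s * cell_area
               for row in conc_g_per_m3 for c in row)

# A 2x3 curtain of cells 10 m wide by 5 m tall, with a uniform
# 0.002 g/m^3 methane excess and an 8 m/s wind:
curtain = [[0.002] * 3 for _ in range(2)]
flux = methane_flux_g_per_s(curtain, 8.0, 10.0, 5.0)  # roughly 4.8 g/s
```

In practice the concentration grid is built from the drone’s repeated passes at different altitudes, and the downwind-minus-upwind difference isolates the site’s own emissions.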



The second sensor deployed is called a Laser Falcon. The sensor, mounting hardware and accessories will set you back close to $60k CDN. It is mounted directly on the drone and faces downward.

In this case, the laser is factory tuned for methane detection – it is the only gas the Laser Falcon can detect.

“It’s an active sensor that will detect the amount of absorption that’s happening. The scattering of the laser in the air tells the sensor how much methane there is not at a point – but through a column of air.”

In both cases, the data is crunched to make the invisible visible. The result is called a “flux curtain” or “flux plane” – with differing colours representing different concentrations of methane, measured in parts per million. In the graphic below, the greatest concentrations are seen in the middle of the image, just below the centre.

Methane Detection



In December the Honourable Steven Guilbeault, Minister of Environment and Climate Change, announced draft methane regulations. These regulations aim to reduce methane emissions by 75 per cent by the year 2030, compared with emissions from 2012. The focus is on the oil and gas industry.

“Oil and gas facilities are the largest industrial emitters of methane in Canada—they release about half of total methane emissions,” reads the draft.

“These releases occur during normal operation of equipment and from leaks. To comply with Canada’s existing methane Regulations, industries had to adopt practices to monitor for leaks and ensure that repairs happen to reduce the amount of gas intentionally vented into the air.

“Under the draft methane amendments, the Government of Canada is enhancing the emissions-monitoring requirements through a risk-based approach to structure inspections for fugitive emissions—facilities with equipment that has greater potential for emissions must undertake more frequent inspections. All inspections must be conducted using instruments with a standard minimum detection limit, and repair timelines will depend on emissions rates. Further, the draft regulations introduce an audit system, requiring one annual third-party inspection to validate company program results.”

In other words, it won’t be long before oil and gas facilities will need to bring experts like Aerometrix onboard to verify that the reported data is accurate.

“Lowering methane emissions from our oil and gas sector is one of the fastest and most cost-effective ways we can cut the pollution that is fueling climate change,” said Minister Guilbeault in this news release.

“As the world’s fourth largest oil and gas producer, we have both the responsibility and the know-how to do everything we can. At this time of robust profit margins and high energy prices, there has never been a better time for the oil and gas sector to invest in slashing methane emissions.”




In creating Aerometrix, Co-Founders Philip Reece and Michael Whiticar developed a solution to a significant and largely invisible problem. Now, with even greater emphasis on reducing methane emissions, Aerometrix has attracted a major new investor.

That investor is Omar Asad, the company’s new Director. He sees great potential ahead.

“The cutting-edge technology utilised by Aerometrix is unmatched and has already translated into significant savings for clients,” says Asad. “What’s more, we offer both a much-needed and innovative solution – while helping to reduce methane emissions at a critical time.”

Asad’s investment, in conjunction with Canada’s impending methane legislation, paves the way for accelerated growth.

Below: Eric Saczuk points to the second controller, highlighting real-time methane detection


Aerometrix Flux Curtain



Philip Reece, of course, is also the Founder and CEO of InDro Robotics. And he’s clearly pleased with both the investment – and the growth trajectory.

“Landfill detection alone has kept Aerometrix busy and profitable,” says Reece. “With the pending legislation we are poised for significant growth in the oil and gas sector.

“Not only is using these sensors with drones more accurate than traditional hand-held walk-arounds, but Aerometrix has racked up years of experience in turning our findings into clear and actionable data. This company, particularly with Omar onboard, is ready for the next phase of growth.”

Interested in learning more about methane detection by Aerometrix? Contact them here.

Good dogs: A look at the newest Unitree quadrupeds


By Scott Simmie

When people think of robots, they often picture industrial robotic arms doing repetitive work on assembly lines: Precision welding, picking and placing objects – those sorts of applications. Or perhaps a wheeled platform carrying a load from one location in a factory to another.

In recent years, however, new algorithms and technologies have led to an increase in the number of quadruped robots. These are the four-legged devices that inevitably remind observers of dogs, since they have roughly the same shape and move with a similar gait. They’re also (depending on the robot) roughly the same size as medium to large dog breeds.

The most well-known of these is likely Spot, built by Boston Dynamics. Built primarily for industrial inspections, this machine has also taken the spotlight (excuse the pun) with choreographed performances with the likes of Cirque du Soleil.

In fact, videos of Spot dancing went so viral that Boston Dynamics produced a video to clarify that its robot is capable of much more:



Why four legs? Why not just wheels, like most mobile robotic platforms?

Good question. And we put that to InDro Robotics Account Executive Luke Corbeth.

“In most predictable environments, wheels or tracks will suffice,” he says.

“Quadrupeds excel at unpredictable terrain. You can start looking at complex infrastructure like refineries, where there might be stairs or pipes that need to be stepped over. Quadrupeds are also suitable for Search and Rescue, where there might be rubble on the ground or potentially unsafe conditions. Robots like these are very good at navigating terrain that would be impossible for a robot with wheeled or tracked locomotion.”




Unitree Robotics is one of a small but growing number of firms specializing in these robots. Its founder is Wang Xinxing, an engineer who started working on quadrupeds roughly a decade ago at Shanghai University. He built his first quadruped, XDog, by designing and building virtually everything, including motor drive boards, the master-slave architecture, the legs – and more.

All that hard work led to the founding of Unitree in 2016. And Wang and his team of engineers have never stopped trying to push the envelope. As the Unitree website explains, the company puts a heavy emphasis on R&D:

“Unitree attaches great importance to independent research and development and technological innovation, fully self-researching key core robot components such as motors, reducers, controllers, LiDAR and high-performance perception and motion control algorithms, integrating the entire robotics industry chain, and reaching global technological leadership in the field of quadruped robots. At present, we have applied for more than 150 domestic patents and granted more than 100 patents, and we are a national high-tech certified enterprise.”

We’re going to explore two new models from Unitree in just a moment, but it’s worth taking a look back at the early days. This video was uploaded seven years ago – after XDog was already in development for more than a year.




One of the new Unitree quadrupeds is the GO2. This is a step up from the GO1 EDU, which has been popular for research and development, corporate innovation parks, and even entertainment. (Yes, like Spot, the GO series can dance – but it does *much* more than that.)

The GO2 is a significant redesign of the GO1 series. Unitree has dropped some of the GO1's multiple cameras and developed its own LiDAR module, called the L1, which offers a 360° x 90° hemispherical field of view. With a minimal blind spot – it can detect objects as close as 0.05 m away – Unitree says the GO2 is 200 per cent better at recognizing its surroundings than the GO1 series.

Thanks to that LiDAR, the GO2 can map even unfamiliar surroundings and avoid obstacles, making it well-suited to Simultaneous Localization and Mapping (SLAM) applications. In conjunction with the LiDAR, the GO2 features the new NVIDIA Orin Nano for powerful onboard AI-enhanced edge computing.
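To give a rough feel for how a minimum detection distance like the L1's 0.05 m figure comes into play, here's a minimal sketch – not Unitree's actual SDK, and the maximum range is an assumed figure for illustration – of filtering raw range readings into 2D points before they're handed to a mapping pipeline:

```python
import math

MIN_RANGE_M = 0.05   # the L1 LiDAR's stated minimum detection distance
MAX_RANGE_M = 30.0   # assumed sensor ceiling for this sketch

def ranges_to_points(ranges, angles):
    """Convert (range, bearing) readings to 2D points, discarding
    anything outside the sensor's usable window."""
    points = []
    for r, a in zip(ranges, angles):
        if MIN_RANGE_M <= r <= MAX_RANGE_M:
            points.append((r * math.cos(a), r * math.sin(a)))
    return points

# Example scan: one reading inside the blind spot, one valid, one out of range
pts = ranges_to_points([0.03, 1.0, 99.0], [0.0, math.pi / 2, math.pi])
# Only the valid reading survives: a point roughly 1 m to the robot's side
```

A real SLAM stack does far more (registration, loop closure, map updates), but every pipeline starts by rejecting readings the sensor can't physically resolve.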

“From my experience, the LiDAR does a much better job at SLAM than the depth cameras on the previous models,” says Corbeth. “The obstacle avoidance is really good out-of-the-box and it can obviously be improved on with development (GO2 is Open Source). And the Orin is a really notable upgrade when it comes to computing power.”




One of the more intriguing features is that the GO2 is integrated with ChatGPT and can respond to voice commands. You could ask it to explain Einstein's Theory of Relativity and it would speak the answer to you. More useful, though, is that you can instruct the GO2 to carry out certain tasks by voice.

“If you say: ‘Hey, go back to where I first turned you on,’ then it’s going to return home. So that’s a practical use. This is one of the first robots that can accept voice commands out-of-the-box and literally action some of those voice commands.”

You can even ask the GO2, via ChatGPT, to generate code for new tasks. Think about that for a moment.
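Under the hood, a voice-command system boils down to mapping a recognized phrase to a robot action. Here's a toy dispatch layer – purely illustrative, not Unitree's implementation, with made-up phrases and action names – assuming a speech-to-text or LLM layer has already turned audio into a string:

```python
def make_dispatcher():
    """Map recognized phrases to robot actions.
    The action bodies here just log what a real robot would do."""
    log = []
    actions = {
        "return home": lambda: log.append("navigating to power-on point"),
        "sit": lambda: log.append("sitting"),
    }

    def dispatch(phrase):
        action = actions.get(phrase.strip().lower())
        if action is None:
            log.append(f"no action for: {phrase}")
        else:
            action()
        return log[-1]

    return dispatch

dispatch = make_dispatcher()
result = dispatch("Return home")   # -> "navigating to power-on point"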

It’s also capable of wireless charging. The GO2 can rest itself on a small optional pad and be ready for its next mission without human intervention. There’s also an option for a servo arm if a manipulator is useful for your application. It’s faster than the GO1 EDU, capable of trotting along at 5 metres/second. The GO2 also has a significantly longer run time – between two and four hours, depending on how strenuously it’s working. Battery capacity and endurance have jumped by 150 per cent compared to the previous model.

“The locomotion – their internal algorithm for how the robot moves – is much improved. So it can go faster, it’s more reliable, it’s quieter,” adds Corbeth. Firmware upgrades are OTA (over the air), with user authorisation. The GO2 connects via 4G, Wi-Fi 6 and Bluetooth.

Unitree Go2 Quadruped



Though the GO2 could be used for basic industrial applications, it’s intended more for R&D and education (there’s even the option of drag-and-drop block coding). InDro Robotics is also capable of modifying the robot with our InDro Backpack – which enables data-dense 5G operation with an easy-to-use dashboard and comprehensive documentation. The Backpack also contains USB slots for additional sensors, as well as the Robot Operating System (ROS) code necessary for seamless integration.

“Anything the GO1 could do, the GO2 can do better, faster, longer,” says Corbeth.

There are even variants available – the GO2 Enterprise and GO2 Enterprise Plus – with some additional bells and whistles intended for law enforcement, Search and Rescue and other First Responder applications. Those features include dual backup communication links, a searchlight and emergency flashing lights, an additional camera and the ability for two-way voice communication.

Here’s a look at the basic GO2 in action:



Unitree’s other new quadruped is the B2. It’s an incredibly powerful, enterprise-level machine that can be deployed in even the most demanding conditions. Use-cases include:

  • Industrial asset monitoring and surveillance
  • Search and Rescue/First Responder work
  • Carrying heavy payloads/cargo over even rough terrain
  • Working in water (Ingress Protection rating IP67)

Capable of moving at 6 metres per second (21.6 km/hour), Unitree says the B2 is the world’s fastest enterprise-level quadruped.

“That’s really fast – like ridiculously fast,” says InDro’s Corbeth.

“The B2 is designed less for development and more for real commercial applications. It’s also Open Source, which differentiates it from quadrupeds like Spot, or those made by Ghost Robotics, ANYmal, etc. So we have the option to deploy proprietary software on it that we’ve built and designed, or our partners have built and designed.”

Like its predecessor the B1, the B2 is strikingly large. It weighs 60 kg and measures 1098 mm x 450 mm x 645 mm.

B2 Robot



Straight from the box, the B2 is ready for a variety of use-cases. Built for strength and endurance, this machine has been tested carrying a 45 kilogram load 7.98 kilometres on a single charge (or a 20 kg load more than 15 km). If it’s not carrying a load, it can walk more than 20 kilometres non-stop.

The B2 can handle slopes of 45° with ease, even in rough terrain. It can even walk on greasy or oil-covered floors without falling down. (You’ll see an impressive demo involving banana peels shortly.)

Unitree has measured a 170 per cent improvement in joint performance over the B1, with 360 Nm (newton-metres, or 265.5 foot-pounds) of torque. Run-time is vastly improved, with the B2 capable of operating between four and six hours on a mission (depending on terrain, payload and speed). The heavy-duty battery is designed for rapid swapouts, and autonomous wireless charging via pad is an option.
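The figures quoted above are internally consistent; a quick check of the two unit conversions (6 m/s to km/h, and 360 Nm to foot-pounds):

```python
def ms_to_kmh(v):
    """Metres/second to kilometres/hour: 3600 s/h over 1000 m/km."""
    return v * 3.6

def nm_to_ftlb(t):
    """Newton-metres to foot-pounds (1 Nm ≈ 0.737562 ft-lb)."""
    return t * 0.737562

speed_kmh = ms_to_kmh(6)        # B2 top speed: 6 m/s -> 21.6 km/h
torque_ftlb = nm_to_ftlb(360)   # B2 joint torque: 360 Nm -> ~265.5 ft-lb
```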

From the factory, the B2 is equipped with a 32-wire automotive-grade LiDAR, two depth cameras, a high-resolution optical camera, and a high-capacity 45Ah (2250Wh) battery.

“And the B2 can be further customized, either directly from the factory or by InDro Robotics for specific use-case scenarios,” says Account Executive Corbeth. “We can integrate additional sensors, including thermal and even gas-detecting modules according to client needs. And, of course, we can also outfit the B2 with the InDro Robotics Backpack, which enables 5G operation and allows for rapid integration of additional sensors.”

All of those are great options to have, but Corbeth emphasizes “this quadruped is also capable of starting work straight out of the box.”




Make no mistake. This robot has been built to thrive in punishing conditions, including operating in water. It’s also very strong, capable of bearing a load of 120kg while standing. Control and perception are managed by multiple processors, including an NVIDIA Jetson Orin NX, three Intel Core i7s and an Intel Core i5. (These can vary if you’re looking for a custom factory build.) Plus, of course, InDro has expertise in modifying all of the Unitree quadrupeds pending client needs.

“InDro Robotics does have the ability to outfit these with any sensors that aren’t standard from Unitree,” explains Corbeth.

There’s also the option of wheels: the lower legs can be swapped out for wheeled versions. On flat terrain, these are more efficient than walking.

“This option combines the best of both worlds between a legged and a wheeled robot – you get the speed and efficiency of a wheeled robot, yet with the other legs it can also climb stairs and manage rubble or other obstacles on the ground,” he adds.

And how does this new machine compare to the competition? Unitree says its measured parameters are superior – and there’s agreement from Corbeth.

“Compared with Spot, ANYmal and Ghost Robotics, I think we’re very competitive on the hardware side. I actually think Unitree has got to the point hardware-wise where it’s now superior to pretty much all the other options.”

Have a look for yourself:



As a North American distributor for Unitree, we obviously have faith in their products. We’ve also been partnered long enough to see the company’s commitment to continuously and meticulously advancing its products. These are excellent and durable quadrupeds, as our many clients will attest.

InDro also takes pride in supplementing Unitree’s documentation to get clients up and running quickly, and on those rare occasions when something goes wrong – we know how to repair them.

“Unitree is quickly becoming a world leader in the quadruped sector,” says InDro Robotics CEO Philip Reece.

“The new models are exceptionally well-built, with significant gains in power, run-time and processing abilities. Plus, add-ons like the InDro Backpack make these quadrupeds even more versatile for virtually any use-case scenario.”

Interested? Get in touch with us HERE to arrange a demo.

TRL – Technology Readiness Levels – explained


By Scott Simmie


So: You’ve got a great idea for a new technology product or process.

That’s the first step: A concept that you’ve put some thought into. Of course, there’s a long road ahead before that brilliant idea becomes an actual commercial product. But how do you gauge that progress as you move along the development path? How would you describe where you’re at in a way that others might quickly grasp?

Luckily, there’s a tool for that. It’s called the Technology Readiness Levels scale, or TRL.

“It’s a standard measuring stick for everyone to communicate where they are with development,” explains InDro Robotics Engineering Lead Arron Griffiths.

The TRL tool was first developed by NASA researcher Stan Sadin back in 1974 with seven basic levels. It would take another 15 years before the levels were formally defined, during which time two additional levels were added. There are now nine steps up the ladder, where TRL 9 is the equivalent of a working product that could be mass-produced or commercialized.

Which means, of course, that Level 1 is at the very beginning of the technology development process.

“Level 1 is universally seen as a napkin idea – where you’ve jotted down a concept,” says Griffiths.

That’s a perfect analogy for TRL 1.

For greater clarity, each level on the scale offers a short definition, a description, and examples of activities. The short definition for Level 1 is “Basic Principles Observed and Reported.” The description is “Lowest level of technology readiness. Scientific research begins to be translated into applied research and development (R&D).”

In terms of examples, Level 1 “Activities might include theoretical studies of a technology’s basic properties.” And yes, that could include a napkin sketch.

Below: Aerospace is one of many industries to use TRLs. The noise-reducing chevron nozzles seen on the cowling below would have gone through each of the nine levels. Photo via Wikimedia Commons by John Crowley.

TRL chevrons



Great! You’ve got that napkin sketch done.

Obviously there’s a lot to do between that initial idea and a finished product suitable for commercialization. To get to TRL 2, you simply need to put more thought into it. You’re not actually building or programming yet, just putting greater clarity and focus on what you hope to accomplish.

TRL 2 is defined as “Technology concept and/or application formulated.” Here’s its description:

“Invention begins. Once basic principles are observed, practical applications can be invented. Applications are speculative, and there may be no proof or detailed analysis to support the assumptions.”

You could think of this stage as refining the idea, with activities limited to research and/or analytical studies.

TRL 3 means you’re actually beginning the R&D process. This might include some lab or analytical studies. At this stage you’re trying to validate predictions you’ve made about separate elements or components of the technology. The components you’re working with aren’t yet integrated – nor is it expected that the components you’re working with are at their final version.


Before we move along, it’s worth noting there are actually two different TRL scales in use. The first (and the one we’re using here) is the NASA scale. But the European Union has its own TRL scale. 

“So there is some cloudiness,” explains Griffiths. “Typically the top and bottom of the scales are the same, but the middle moves around a bit. You have to be sure people are reading from the same scale. Typically when I talk to a client, I will show them the scale I am using.”

Griffiths also emphasizes that during R&D, the phase of development doesn’t always slot neatly into one of the TRL stages. 

“It’s typical to say: ‘We’re roughly about TRL 6’ – it’s not an exact science.”

Below: The InDro Commander module, with LiDAR sensor. This popular commercial product, which allows for rapid integration of ROS-based robots and sensors (and more), is TRL 9.

Teleoperated Robots



American inventor Thomas Edison once said: “Genius is one per cent inspiration and ninety-nine per cent perspiration.” The same could be said of the process of inventing a product for commercialization. Once that napkin sketch is done (the one per cent), there’s still a lot of methodical slogging ahead. (Trust us, we know.)

TRL 4 is the stage where you’re putting things together. Basic components are integrated and readied for testing in a simulated environment. The short definition, via Canada’s Department of National Defence, is “Component(s)/subsystem(s) and/or process validation in a laboratory environment.”

The logical progression continues with the next step.

“TRL 5 means it’s ready for testing in a lab environment,” explains InDro Lead Engineer Griffiths. He also adds that this middle stage – TRL 4 through 7 – “is always the difficult part.”

Once TRL 5 is passed, it’s time to see whether the integrated components work together in a simulated or lab environment. At this stage, TRL 6, the product is considered to be getting pretty close to its desired configuration. Yes, there will be further tweaking to come, but you’re getting there.

Below: InDro’s Street Smart Robot (the large white unit). The product has been built but not yet deployed in winter conditions. This would be at TRL 7. Every other robot in this image would have made it to TRL 9.


SSR Street Smart Robot



All that hard work has been paying off. Your product is assembled and has been tested in simulation or other lab environment. Now it’s time to get it out into the real world to see how it performs. Congratulations, you’ve reached TRL 7, where “Prototype system [is] ready (form, fit, and function) for demonstration in an appropriate operational environment.”

“TRL 7 is more like a long-term deployment. Once you can show it to be working in a real-world environment – outside of the lab – then you get to Levels 8 and 9,” says Griffiths.

These final two levels are usually pretty exciting. Once the product/solution has been proven to work in its final form – and in the environment where it’s expected to be deployed as a product – you’ve reached TRL 8. Just one more to go.




Remember that Street Smart Robot you just saw a picture of? Well, it’s ready to go. And once the wintry conditions take hold in Ottawa, we’ll be operating that machine in ice and snow on Ottawa streets. Specifically, on bike lanes in Ottawa, where it will detect hazardous conditions (including potholes) that might pose challenges for safe cycling. City of Ottawa maintenance crews will then be notified of the problem (and its location) so they can address the issue. (You can read more about the SSR here.)

And once the SSR is operating smoothly in those intended conditions? We will have achieved TRL 9, meaning “Actual solution proven through successful deployment in an operational setting.”
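Collected in one place, the ladder looks like this. Levels 1, 2, 7 and 9 use the wording quoted in this article; the remaining entries paraphrase the standard NASA definitions, so treat them as a convenient summary rather than official text:

```python
# Short TRL definitions (NASA-style scale, as discussed in this article)
TRL = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof-of-concept of key elements",
    4: "Component/subsystem validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "System/subsystem prototype demonstrated in a relevant environment",
    7: "Prototype ready (form, fit, function) for demonstration "
       "in an operational environment",
    8: "Actual system completed and qualified in its final form",
    9: "Actual solution proven through successful deployment "
       "in an operational setting",
}

def describe(level):
    """Return a one-line summary for a given readiness level."""
    return f"TRL {level}: {TRL[level]}"
```

As Griffiths notes, real projects rarely sit cleanly on one rung – "roughly about TRL 6" is often as precise as it gets.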



It’s easy enough to describe these levels. And in doing so, it can appear to be a straightforward, linear path where engineers move seamlessly from one level to the next. Reality is not quite so simple. Depending on the project, progress in the early stages can be made very rapidly.

“Most people get up to Level 5 fairly quickly,” says Griffiths. “You can even get to Level 5 in a day if you’re doing software development – you can literally go from an idea all the way up to a basic rudimentary prototype.”

But – as flagged earlier – things get a little trickier once you hit those middle levels.

“You can think of it as walking up a hill to Level 5,” he says. “Then there’s this valley. A lot of stuff dies in Level 6 and 7. There’s not a lot of success there because once you push the technology into actual environments the success rate is very low. So a lot of time is spent in Levels 5 and 6 trying to make a system that can make it to Level 7 successfully, and then on to Level 8 – where you’re essentially across the valley.”

Below: A graphic outlines the short definitions of Technology Readiness Levels

Technology Readiness Levels



The TRL scale is extremely useful in the R&D world, in that it concisely conveys where a product is along the path to commercial development. And while it’s great for engineers, it’s also useful to help clients understand where one of our products is along that journey.

We’ve scaled this ladder many times over the years. Sometimes it’s a relatively easy climb. But, like all Research and Development companies, we’ve also had a few products that never made it beyond the valley Arron Griffiths described. That’s R&D.

“The Technology Readiness Level scale is a really useful tool, and part of our daily language at InDro Robotics,” says CEO Philip Reece. “Each level represents unique challenges – and that valley Arron described can sometimes be a disappointing bit of landscape. But we learn something even with the occasional failure.

“Thankfully, we have a creative and tenacious engineering team that seems to thrive on difficult challenges – and InDro now has a growing stable of products that have achieved TRL 9 and gone on to commercial success.”

If you’re working on your own project and would like to know where it is on the TRL scale, you can use this assessment tool from Industry, Science and Economic Development Canada.


That’s a wrap: Another great Aerial Evolution Association of Canada Conference


By Scott Simmie


What a great show.

The Aerial Evolution Association of Canada (formerly Unmanned Systems Canada – Systèmes Télécommandés Canada) held its annual conference and trade exhibition November 7-10 in Ottawa. The event had an excellent turnout, along with the usual selection of high-quality learning sessions.

There was plenty of discussion around the coming world of Advanced Air Mobility, where new and transformative aircraft (many of which are innovative new autonomous drone designs with detect-and-avoid features) will routinely deliver heavy cargo and even passengers over dense urban centres and to regional communities not currently served by traditional aircraft.

Another timely topic was the increasing use of drones in the conflict in Ukraine, as well as the latest developments in Counter-UAS technologies (including both detection and mitigation). There was even a live demonstration of a new kinetic C-UAS drone that uses a net to disable and capture a rogue RPAS.

Reps from Transport Canada and NAV Canada were on hand to discuss proposed changes on the regulatory landscape and – always an important part of these gatherings – hear questions and concerns directly from the industry. These open exchanges have long been a hallmark of the annual event.

AEAC Plenary



There was a notable emphasis this year on Indigenous use of drones and other technologies, including a powerful session about detecting unmarked burial sites on the grounds of former residential schools. The concept of data sovereignty – who owns data captured on unceded territories – was also discussed. There was even a presentation on how drones have helped to capture important First Nations cultural events. Plus, of course, the employment and opportunities that RPAS education and initiatives are creating for Indigenous entrepreneurs and communities.

Below, one of the Indigenous panels, moderated by Kristin Kozuback (C)



SAIT’s Shahab Moeini talked about a program using UAS to detect land mines with AI, machine vision and sensor fusion. Many previous and current efforts have used magnetometers, but these metal-detecting sensors are largely ineffective given that many land mines are made of plastics and other non-metallic materials. Machine Learning is being used to train drones to recognise the many different types of land mines – even if only a portion of the device is visible above ground.

“Land mines,” said Moeini, “are the nastiest creation of mankind.”

Below: Shahab Moeini, who runs SAIT’S Centre for Innovation and Research in Unmanned Systems (CIRUS)



Among the many excellent and innovative presentations, one by Spexi Geospatial caught our attention. The Vancouver-based company has built software that allows pilots of micro-drones to automatically fly and capture hexagonal areas the company calls “Spexigons.” Each Spexigon covers 22 acres, and when an adjacent Spexigon is flown, the data and imagery are seamlessly connected. With enough Spexigons captured, you’ve got a high-resolution version of Google Earth – and a ton of use-cases for the data.

The Spexi software carries out the flights automatically using DJI sub-250g drones, flying standardized capture missions to produce imagery at scale. The data is uploaded to the cloud where it’s stitched together to form highly detailed images of very large areas with a resolution of 3cm/pixel. (A satellite, by contrast, captures at 30cm/pixel while a standard airplane generally captures at 10cm/pixel.)
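Those resolution figures add up quickly. Here's the back-of-envelope arithmetic for a single Spexigon – the acres-to-square-metres constant is the only number not taken from the article:

```python
ACRE_M2 = 4046.86   # square metres per acre (standard conversion)
GSD_M = 0.03        # Spexi's stated ground sample distance: 3 cm/pixel

spexigon_m2 = 22 * ACRE_M2            # one 22-acre Spexigon, ~89,031 m^2
pixels = spexigon_m2 / (GSD_M ** 2)   # ~99 million pixels per Spexigon
```

At 30 cm/pixel, a satellite covering the same area would produce roughly a hundredth as many pixels – which is the whole point of flying low with drones.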

During one recent mission, “over 10,000 acres of imagery was captured in three days,” explained Spexi COO Alec Wilson.

“We’ve made it super simple to get images in and out at scale… And we’re super-excited to be able to start building bigger and better platforms for the drone industry.”

Below: Spexi’s Alec Wilson explains how the system works…

Alec Wilson Spexi



This year’s conference saw an increased emphasis on Women in Drones.

Though this has been on the agenda at past events, the 2023 event somehow had a different feel: the recognition that women are not only increasingly entering and shaping this male-dominated sector, but that many are high-level subject matter experts making significant contributions.

While progress has been made, there’s still work to do on the equity front. And there was a strong sense the AEAC is committed to achieving that.

Below: The close of the Women in Drones breakfast

AEAC Women in Drones Breakfast



One of the most memorable parts of any Aerial Evolution Association of Canada conference is the awards ceremony. Individuals and organizations that have made outstanding contributions to the RPAS industry are nominated, voted for by their peers, and selected for recognition. Recipients range from student engineers (the RPAS CTOs of tomorrow) through to service providers, manufacturers – and even government agencies.

Those honoured at this year’s conference include:

  • Dr. Frederique Pivot: Pip Rudkin Individual Achievement Award
  • Jacob Taylor: 2023 Indigenous Innovation Award
  • National Research Council of Canada Aerospace Research Centre: 2023 Organizational Achievement Award
  • Bryan Kikuta, Toronto Metropolitan University: 2023 Mark Cuss Memorial Scholarship
  • Ana Pereira, University of Victoria: Best Student Oral Presentation Award (judged)
  • Aman Basawanal, Carleton University: Best Student Technical Paper Award (judged)

Below: The National Research Council Team receives its award



There was one more award recipient to whom we’d like to give a special shout-out. It’s Katelin (Kate) Klassen, who received the 2023 Aerial Evolution Ellevatus Award “for her outstanding dedication in uplifting, empowering, and inspiring women in the Canadian RPAS sector.”

Kate is truly a pioneer in this field. A multi-rated private pilot and flight instructor with traditional aircraft, Kate has been a significant force in the drone field for years. She’s an educator (her online courses have trained more than 10,000 pilots), a lobbyist (she’s taken part in multiple consultations with regulators – including being co-chair of the CanaDAC Drone Advisory Committee), and a true advocate for RPAS education. Her knowledge of the Canadian Aviation Regulations (CARs) is legendary – and she has inspired and encouraged countless women (and men) in this industry.

Plus, she’s truly an all-round awesome human being – always willing to share her time and expertise. Congratulations, Kate – and all the other winners!

Kate (C) looking justifiably happy…

Kate Ellevatus



Though they didn’t receive any awards, three key members of the Association certainly merit public recognition for their contributions. Jordan Cicoria (CEO of Aerium Analytics) did an outstanding job as Conference Chair. In fact, he’s overseen the last two in-person conferences, while also taking the helm of the virtual gathering during the peak of the pandemic. That’s a *lot* of work, and Jordan has carried out these tasks both professionally and modestly while juggling a plethora of moving parts.

A lot of work on the conference – and elsewhere – came from AEAC Executive Director Declan Sweeney. Declan worked hard behind the scenes (and on countless calls) with sponsors, exhibitors, membership drives – you name it. He’s also deeply involved in the annual student competition. Declan does it all with professionalism, and a great sense of humour.

Equally deserving of recognition is AEAC Chair of the Board Michael Cohen (also the CEO of Qii.AI).

Michael has served the Association well, and was key in the transition and rebranding from Unmanned Systems Canada / Systèmes Télécommandés Canada to the Aerial Evolution Association of Canada. This was far more than a name change; it was an organizational shift to reflect the coming era of Advanced Air Mobility. He’s been instrumental in the Association’s push toward greater Diversity, Equity and Inclusion – which was reflected in the conference agenda.

The Association also benefits greatly from Michael’s extensive knowledge and background; he’s a former commercial jet pilot – a distinct advantage when discussing the Big Picture (and the minutiae) with regulators.

Thank you, all.

Below: Jordan Cicoria (L) with Declan Sweeney, followed by Michael Cohen (R) with Transport Canada’s Ryan Coates

Jordan Declan
Michael Cohen Ryan Coates



As always, we were pleased to participate at the annual Aerial Evolution Association of Canada conference. In addition to the sessions, the networking and the trade exhibit – it’s of tremendous value to have the industry and the regulators together for collaborative discussions. There’s been tremendous progress in this sector over the past decade, and much of that is due to regulators truly working with the industry to safely advance RPAS use in Canadian airspace, including BVLOS flight and other more complex operations. Technology that was seen almost as a threat in the early days is now being accepted as a useful – and critical – adjunct to the overall world of aviation.

InDro Robotics staff appeared on multiple panels. CEO and AEAC Board Member Philip Reece took part in the Counter-UAS panel and a live demo of a kinetic C-UAS drone at Area X.O’s Drone and Advanced Robotics Training and Testing (DARTT) facility. (That’s Philip below.)

Philip Reece



We’d be lying if we didn’t tell you that a true highlight for us was seeing Kate Klassen receive the Ellevatus Award.

“One might easily conclude we’re happy simply because Kate is a flight instructor and regulatory expert with InDro Robotics,” says CEO Philip Reece. “But that’s really just a sliver of the truth. Kate’s contributions over the years have been plentiful, significant, and lasting. We’d be applauding this recognition just as loudly even if she didn’t work with InDro.”

We are, however, very happy – and fortunate – that she does.