Dual manipulator Rosie the robot used for Industry 4.0 research

By Scott Simmie

 

At least some of you will remember The Jetsons.

The television series, created by Hanna-Barbera Cartoons Inc., was a space-age version of The Flintstones (another Hanna-Barbera production). It originally aired in 1962-1963 with later episodes created in a reboot from 1985 to 1987.

But while Fred Flintstone drove a stone-age car (complete with stone wheels) that he powered by pushing his feet along the ground, George Jetson and his family lived in Orbit City, where Jetson commuted to his two-hour-per-week job via a flying car with a bubble top. And instead of having dinosaurs (including pterodactyls) help carry out tasks, the Jetsons lived in a future where they were surrounded by automated devices. You could think of their surroundings as the 1960s vision of the Smart Home.

And an integral part of that home? Well, that would be Rosey (later changed to ‘Rosie’) the robot.

Rosey was the family’s robotic maid. She carried out tasks that weren’t performed by the many other automatic conveniences that filled the Jetsons’ home. She had two manipulator arms and an internally stored vacuum that could be deployed on demand.

She was very useful around the house, carrying out tasks to save the family time.

And this story? Well, it’s about our own Rosie – which is also very space-age.

Below: A Rosie the robot publicity cel, signed by show creators William Hanna and Joseph Barbera. The cel was auctioned in 2018; image by Heritage Auctions


THE ROSIE STORY

 

So. What is Rosie? We asked Head of R&D Sales Luke Corbeth for a snapshot.

“Rosie is a dual arm mobile manipulation robot designed for pick and place in an industry 4.0 setting,” he says. In other words, it has two arms and manoeuvres on a wheeled platform, and is capable of moving objects from one location to another or even manipulating a single object with both end effectors.

And Rosie has a few tricks up her sleeve. Or, more accurately, sleeves.

“The actual robot is very unique because it has six mounting points for the arms. So you can mount the arms on top, high on the side or low on the side to access shelving of different heights. In fact, you could actually mount one arm directly on the top right, for example, and then mount the second one on the bottom left. So you could grab something from the top of the shelf and from the floor at the same time, which is kind of cool, right?”

Yes, indeed.

Rosie’s home is not with the Jetsons (she has no vacuum cleaner) but in a new lab that hasn’t yet been officially launched at Polytechnique Montréal. It’s called the Intelligent-Cyber Physical System Lab, or I-CPS. So we contacted Lionel Birglen, a professor with the Department of Mechanical Engineering. We wanted to learn more about what the lab does, what he does – and what plans he has for Rosie (which InDro built and shipped in 2023).

Dr. Birglen holds a PhD in mechanical engineering, with a specialisation in robotics. He’s particularly interested in – and an expert on – manipulators and end effectors, which he has designed and built himself. He’s written two books, holds three patents, and is the author or contributing author of at least 94 research papers. He’s also – get this – been listed among the top two per cent of most-cited scientists in the world in his area of specialisation.

So it kinda goes without saying, but he’s a pretty big deal in this field.

Dr. Birglen has a deep interest in the role robotics will play in the future of industry. And, within that realm, he’s intensely interested in ensuring that robots, particularly those that will be sharing space with human beings on a factory or warehouse floor, will be safe.

And – he emphasises – he doesn’t trust simulations for important work like this.

“Because simulations lie. They lie all the time,” he says. “You have to understand that reality is infinitely more complex than anything you can have in simulation – so actual experiments are absolutely essential to me. They are essential to my work, to my understanding of what robotic manipulation is.”

“I believe in math, but I know that reality is different. It’s more complex, more complicated, and includes so many un-modelled phenomena.”

 

ROSIE’S JOURNEY

 

Dr. Birglen knew he wanted a new robot for use in the new lab (which we’ll get to shortly). And he knew he wanted a robot with two manipulator arms.

“Dual-arm robots are, in my opinion, the future for industry applications,” he says.

And while humanoid bipeds grab a lot of attention, they’re far more complex (and expensive) than wheeled robots. Plus, he says, most factory applications take place on a single level and don’t require climbing stairs.

“From a factory perspective, a wheeled platform makes a lot of sense because typically in factories you don’t have, say, five levels connected by stairs.”

So he knew he wanted an autonomous, wheeled, dual-arm robot. And he started, initially, to think of a company other than InDro for the build.

“I came across InDro almost by accident,” he explains. “Giovanni Beltrame told me about you because he has purchased many, many robots from you. He said: ‘Those guys can build and assemble the robot for you. They’re close and they do a great job.’ So that’s how I came in contact with you.” (We’ve written previously about the amazing work Dr. Beltrame is doing involving robots and space. You can find that here.)

And so, after a number of calls with Luke Corbeth and the engineering team to settle on design and performance parameters, work on Rosie began.

Below: Technologist Tirth Gajera (‘T’) puts the finishing touches on Rosie in 2023


THE LAB

 

Polytechnique Montréal’s Intelligent-Cyber Physical System Lab (I-CPS) is set up as a highly connected Industry 4.0 factory. Faculty from four different departments – computer engineering, electrical engineering, industrial engineering and mechanical engineering (Dr. Birglen) – are involved with the lab. Interns and students, under supervision, also work in the facility.

“So we have four departments involved in this lab and the idea is to build a small scale factory of the future, meaning that everything is connected. We are building a mini-factory inside this lab,” he says.

So think of cameras that can track objects on shelves – and people and robots within the environment. Think of smart tools like a CNC machine, which will eventually be operated by Rosie. And perhaps just as important as the connectivity within the lab is the connectivity to other research institutes in Quebec, including Université Laval, Université de Sherbrooke and École de technologie supérieure (ÉTS). All of those institutes are working with similar mini-factories, and they’re all connected. There’s even a relationship (and connectivity) with manipulator manufacturer Kinova. Funding came via a significant grant from the Canada Foundation for Innovation, or CFI.

“So think of our lab as like one node of this network of mini-factories around Quebec,” explains Dr. Birglen. That connectivity of all components is still a work-in-progress, but “ultimately the goal is that there is a cyber-connection between these different mini-factories, these different laboratories around Quebec, so that one part of one node can work in collaboration with another node in realtime.”

Plus, of course, a lot of learnings will take place within the individual labs themselves.

“We want to bring collaborative robots to work in tandem with humans,” he says. “We want our robots to safely move around people, we want robots to help people. And we also want robots to learn how to work from people.”

 

SAFETY, SAFETY, SAFETY

 

As mentioned earlier, there’s a huge emphasis on safety. And while there are international safety standards for collaborative robots, even a ‘safe’ cobot can pose a threat.

“All the collaborative robots that you have currently on the market more or less follow this technical standard and they are more or less safe, but they’re still dangerous,” explains Dr. Birglen. “And the classical example that we’ve all heard, and which is true, is that if a safe cobot has a knife in its hand and is moving around – it is very dangerous.”

So safety in the lab(s) is paramount – and that means safety at multiple levels. There must be safety:

  • At the task level – tasks must not endanger people
  • At the control level
  • In collision detection, mitigation and obstacle avoidance
  • At the data security level

Plus – and this really interests Dr. Birglen – you must ensure safety with any additional mechanical innovations that are introduced.

“What you develop, any mechanical system you develop, must be as much as possible intrinsically safe. And actually one of the topics I’m currently working on is developing end effectors and tooling that are intrinsically safe.”

Below: A LinkedIn post from Luke Corbeth shows Rosie, using both arms, inside the I-CPS lab

THE FUTURE

 

And why is research like this so important? What difference will it make to have robots and humans working safely together, with safe manipulators and end effectors that might even be able to, for example, lift an object in concert with a human being? And why the focus on interconnectedness between all of these facilities?

Well, there’s obviously the value of the research itself – which will lead to greater efficiencies, improved manipulators, gripping technologies, new algorithms and AI enhancements – as well as enhanced safety down the road. But there’s a much bigger picture, says Dr. Birglen, especially if you can get your head around thinking about the future from a global perspective.

China, he says, is no longer a developing nation. The days when the words “Made in China” meant poor quality are – with rare exceptions – gone. The country is, in fact, highly developed – and working at breakneck speed when it comes to innovation and adoption of robotics at scale. A revolution is underway that has massive implications for competitive advantage that simply cannot be ignored. So the research at I-CPS is not merely important from an academic perspective, it’s strategic when viewed through a global economic lens.

“We as a country – meaning Canada – are in competition with other countries for manufacturing, for producing goods and services. China is a developed country and it is very, very, very good in robotics,” he states. “You know how in the past we saw China as producing low quality goods, low quality robots? That’s over, man. That’s finished.”

And?

“If they are investing in robotics like mad and we are not, we’re going to be a leftover – Canada is going to sink as a rich country. If you want to produce wealth in the 21st Century, you need robots, you need automation, you need integration. In short, you need to be the leader of the pack or you’re going to be eaten.”

It’s a stark warning – and it’s true.

I’ll step outside my role as author for a moment: I lived in China back when it was still a developing country in the late 1980s – and have returned several times since. The transformation has been nothing short of astonishing. How, you might ask, did it achieve all this?

The answer has its genesis with former Chinese leader Deng Xiaoping, who led the country from 1978 to 1989. He didn’t merely open the door to reform; he created policies that began sending waves of students from what had been a xenophobic country abroad to study. There was an emphasis on careers that could help modernise the nation, including all aspects of engineering, aerospace, construction, transportation, architecture, etc. That’s where all this began.

Thankfully (and with credit to federal funding agencies like CFI), there are projects like I-CPS underway – and academics like Dr. Lionel Birglen with the vision to push the needle safely forward.

Below: “Baxter” – the original dual-arm robot. Baxter is still at Polytechnique Montréal, but Rosie is the mobile future. Photo by Luke Corbeth


INDRO’S TAKE

 

We’re obviously pleased Polytechnique Montréal selected InDro to build Rosie. And we’re particularly pleased to see that she’s being deployed at I-CPS, as part of an integrated and networked research project that has such potentially profound implications for the future.

“I believe Dr. Birglen is correct in his assessment of the importance of robotics and automation in the future,” says InDro Robotics Founder and CEO Philip Reece. “And when you throw innovations with drones and even autonomous Uncrewed Aerial Vehicles capable of carrying large cargo loads and passengers into the mix, we are actually heading into a Jetsons-like future,” he adds.

“I think there’s a growing understanding of the implications of this kind of future from not only the private sector, but also federal regulators and funding agencies. At InDro our mission will always focus on continued innovation. Sometimes those innovations are our own inventions, but a key piece of the puzzle is R&D work carried out by academics like Lionel Birglen. We’re confident that Rosie’s arms are in the right hands.”

Interested in learning more about a custom robotics solution? Feel free to contact us here.

InDro clients Polytechnique Montréal featured on CNN with swarm research on ‘Mars’

By Scott Simmie

There’s nothing quite as satisfying as seeing really good R&D in the field.

And when that research gets coverage from CNN? Well, that’s even better.

The news network just profiled some cutting-edge work being carried out by students at Polytechnique Montréal. Specifically, students who work in the MIST Lab – where that acronym stands for Making Innovative Space Technology.

We’ve profiled the work being carried out there before (you can find it here). Essentially, students are working on innovative robotics research they hope will one day prove useful on the moon and Mars.

“What we want to do is to explore environments including caves and surfaces on other planets or satellites using robotics,” explained Dr. Giovanni Beltrame, a full professor at Polytechnique’s Departments of Computer Engineering and Software Engineering, during our earlier interview. “Caves and lava tubes can be ideal places for settlement: They can be sealed and provide radiation shielding. There’s also a chance of finding water ice in them.”

The research certainly caught our attention – partly because the MIST Lab is an InDro client. We’ve supplied them with platforms and robots which they’ve enhanced with “backpacks” enabling swarm robotics research. Recently, they took a fleet of those connected robots to the Canadian Space Agency’s Mars Yard. The site has been built to replicate the surface on Mars – what’s known as a Planetary Analogue Terrain.

The mission? To have these interconnected robots autonomously map that surface in high-resolution.

Below: The Mars Yard. Photo by the Canadian Space Agency, followed by a pic of some of the robots InDro modified and supplied to Polytechnique Montréal


SWARM ROBOTICS

 

Fundamental to this research is deploying the robots in a swarm – where the robots carry out tasks autonomously while communicating with each other. In this experiment, they’re mapping that Planetary Analogue Terrain and compiling the data into a high-resolution digital twin.

“We absolutely believe that swarm robotics is the future of space exploration,” PhD student Riana Gagnon Souleiman told CNN. “It’s more efficient to have more robots and you’re less reliant on a single agent failing.”

We’ve written about swarm robotics before (and recently shipped a swarm to a US academic client). But this CNN story provides a full look at what the MIST Lab team has accomplished, modifying the robots with their own “backpack” for creating a local area network and meshing all that data.

In the video, which we’ll link to in a moment, you’ll see several of the 18 platforms InDro can supply. At the Mars Yard, you’ll see a Scout Mini, two Bunker Minis (seen in the photo above) and one Scout 2.0 – all working collaboratively.

The MIST Lab team has done an incredible job with modifying these robots and pulling off what we know is a very difficult mission. Kudos also to CNN for doing an exemplary job in explaining this story.

All set? You can watch the video here.

Below: Some of the MIST Lab researchers in a screen grab from the CNN story

Robosense sets new bar for affordable, powerful LiDAR sensors

By Scott Simmie

 

Building or modifying a robot?

Specifically, are you working on something with autonomy that needs to understand an unfamiliar environment? Then you’re likely looking at adding two key sensors: a depth camera and a LiDAR unit.

LiDAR, as most of you likely know, scans the surrounding environment with a continuous barrage of eye-safe laser beams. It measures what’s known as the “Time of Flight” – meaning the time it takes for the photons to be reflected off surrounding surfaces and return to the LiDAR unit. The closer that surface is, the shorter the Time of Flight. LiDARs calculate the time of each of those reflected beams and convert that into distance. Scatter enough of those beams in a short period of time (and LiDARs do), and you get an accurate digital representation of the surrounding environment – even while the robot is moving through it.
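The arithmetic behind Time of Flight is simple enough to sketch. Here’s an illustrative snippet (our own, not vendor code): because the measured time covers the pulse’s round trip to the surface and back, the one-way distance is half the round trip at the speed of light.

```python
# Illustrative Time-of-Flight calculation (not vendor code).
# The pulse travels out to the surface and back, so divide by two.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_to_distance(round_trip_seconds: float) -> float:
    """Return the one-way distance in metres for a reflected pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after roughly 66.7 nanoseconds reflects off a
# surface about 10 metres away.
print(round(tof_to_distance(66.7e-9), 2))
```

Scale that to the hundreds of thousands of pulses per second a real unit emits, and the point cloud falls out of exactly this calculation repeated per beam.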

This is particularly useful for autonomous missions and especially for Simultaneous Localisation and Mapping, or SLAM. That’s where a LiDAR-equipped robot can be placed in a completely unfamiliar (and even GPS-denied) environment and produce a point-cloud map of its surroundings while avoiding obstacles. Quality LiDARs are also capable of producing 3D precision scans for a wide variety of use-cases.

All great, right? Except for one thing: LiDAR sensors tend to be very expensive. So expensive, they can be out of reach for an R&D team, academic institution or startup.

There is, however, a solution: Robosense.

The company produces LiDAR sensors (both mechanical and solid-state) that rival the established players in the market. And they do so for about one-third of the cost of the industry heavyweights.

“The performance of Robosense is outstanding – absolutely on par with its main competitors in North America,” says InDro Account Executive Callum Cameron. “We have been integrating Robosense LiDAR on our products for about two years, and their performance is exceptional.”

Below: A fleet of four robots, equipped with Robosense LiDAR, which recently shipped to an academic client.

 


ROBOSENSE

 

The company might not yet be a household name (unless your household has robots), but as of May 2024 the firm had sold 460,000 LiDAR units. Its sensors power a large number of autonomous cars, delivery vehicles and other robots – and it’s the first company to achieve mass production of automotive-grade LiDAR units with its own in-house developed chip.

The company was founded in 2014, with some A-level engineering talent – and it’s been on a stellar trajectory ever since. One reason is that Robosense produces all three core technologies behind its products: the actual chipsets, the LiDAR hardware, and the perception software. We’ll let the company itself tell you more:

“In 2016, RoboSense began developing its R Platform mechanical LiDAR. One year later, in 2017, we introduced our perception software alongside the automotive-grade M Platform LiDAR sensors tailored for advanced driver assistance and autonomous driving systems. We achieved the start-of-production (SOP) of the M1 in 2021, becoming the world’s first LiDAR company to mass-produce automotive-grade LiDAR equipped with chips developed in-house,” says its website.

The company now has thousands of engineers. And it didn’t take long before the world noticed what they were producing.

“As of May 17, 2024, RoboSense has secured 71 vehicle model design wins and enabled 22 OEMs and Tier 1 customers to start mass production of 25 models. We serve over 2,500 customers in the robotics and other non-automotive industries and are the global LiDAR market leader in cumulative sales volume.”

The company has also received prestigious recognition for its products, including two CES Innovation awards, the Automotive News PACE award, and the Audi Innovation Lab Champion prize.

“This company has standout features, including Field of View, point cloud density and high frame rates,” says Cameron. “If you look at that fleet of four robots we recently built, with the competition those LiDAR units alone would have come close to $80,000. The Robosense solution cost roughly one-quarter of that with similar capabilities.”

And the factories? State of the art. Though this video focuses on its solid-state LiDAR, Robosense uses the same meticulous process for its mechanical units:

LiDAR FOR EVERY APPLICATION

 

Robosense produces many different LiDAR sensors. But what particularly appeals to us is that the company has (excuse the pun) a laser-like focus on the robotics industry. Its Helios multi-beam LiDAR units have been designed from the ground up for robots and intelligent vehicles. There are customisable fields of view, depending on application, and a near-field blind spot of ≤ 0.2 metres. In addition, Helios LiDAR comes in 16- and 32-beam options depending on point-cloud density and FOV requirements. Both are capable of functioning in temperatures as low as -40°C or on a scorching day in the Sahara Desert. There’s also protection against multi-radar interference and strong light (which can be an issue with LiDAR). You can learn more about its features here.

Its Bpearl unit proves that very good things can indeed come in small packages. With a 360° horizontal and 90° vertical hemispherical FOV, it’s been designed for near-field blind spots, capable of detection at ≤10 cm. That’s why we selected it for a robot designed to inspect cycling lanes for hazards (while avoiding cyclists, of course). We actually have two Bpearls on that robot (one on each side), since detecting blind spots and avoiding other obstacles is so critical to this application.

“We’ve integrated both the Bpearl and Helios LiDAR units into multiple different robots and the performance has been excellent, even under adverse conditions,” says Cameron. “Obstacle avoidance has been outstanding, and SLAM missions are a snap.”

Below: This InDro robot features two 32-beam Robosense Bpearl LiDAR units. You can see one of them – that tiny bubble on the side (and there’s another one on the opposite side):


THE THREE “D”s

 

You’ve likely heard this before, but robots are perfect for jobs that are Dirty, Dull or Dangerous – because they remove humans from those scenarios. Robots, particularly inspection robots, are often subjected to extremes in terms of weather and other conditions.

So this is a good place to mention that if a Robosense LiDAR encounters fog, rain, dust or snow it has a de-noising function to ensure it’s still capturing accurate data and that your point cloud isn’t a representation of falling snow. All of the Robosense LiDAR sensors have outstanding Ingress Protection ratings.

Because adverse conditions are quite likely to occur at some point during a robotic mission, Robosense puts its products through absolutely gruelling tests. Hopefully your robot won’t encounter the scenarios seen below, but if it does – the LiDAR will keep working:

INDRO’S TAKE

 

We take pride in putting only the highest quality sensors into our products.

Prior to adopting Robosense as our “go-to” LiDAR about two years ago, we were using big-name products. But those products also came with a big price tag. When we discovered the quality and price of Robosense LiDAR units, it was an obvious choice to make the switch. We have shipped multiple Robosense-enabled robots to clients, saving them thousands of dollars – in one case, tens of thousands – while still capturing every bit of data they require. Robosense is now our go-to, even on our flagship products. (We recently did a demonstration of one of our newer Helios-equipped autonomous quadrupeds to a high-profile client; they were amazed with the results.)

“Robosense is every bit the equal of the heavyweight LiDAR manufacturers, without the downside of the high cost,” says InDro Robotics CEO Philip Reece. “The field-of-view, point cloud density and quality of construction are all state-of-the-art, as are the manufacturing facilities. What’s more, Robosense continues to push the envelope with every new product it releases.”

Interested in learning more, including price and options? Contact Account Executive Callum Cameron right here, and he’ll give you all the info you need.

George Mason U. researchers enable robots to intelligently navigate challenging terrain

By Scott Simmie

 

Picture this: You’re out for a drive and in a hurry to reach your destination.

At first, the road is clear and dry. You’ve got great traction and things are going smoothly. But then the road turns to gravel, with twists and turns along the way. You know your vehicle well, and have navigated such terrain before.

And so, instinctually, you slow the vehicle to navigate the more challenging conditions. By doing so, you avoid slipping on the gravel. Your experience with driving, and in detecting the conditions, has saved you from a potential mishap. Yes, you slowed down a bit. But you’ll speed up again when the conditions improve. The same scenario could apply to driving on grass, ice – or even just a hairpin corner on a dry paved road.

For human beings, especially those with years of driving experience, such adjustments are second-nature. We have learned from experience, and we know the limitations of our vehicles. We see and instantly recognize potentially hazardous conditions – and we react.

But what about if you’re a robot? Particularly, a robot that wants to reach a destination at the maximum safe speed?

That’s the crux of fascinating research taking place at George Mason University: Building robots that are taught – and can subsequently teach themselves – how to adapt to changing terrain to ensure stable travel at the maximum safe speed.

It’s very cool research, with really positive implications.

Below: You don’t want this happening on a critical mission…


“XX”

 

Those are the initials of Dr. Xuesu Xiao, an Assistant Professor at George Mason University. He holds a PhD in Computer Science, and runs a lab that plays off his initials, called the RobotiXX Lab. Here’s a snippet of the description from his website:

“At RobotiXX lab, researchers (XX-Men) and robots (XX-Bots) perform robotics research at the intersection of motion planning and machine learning with a specific focus on robustly deployable field robotics. Our research goal is to develop highly capable and intelligent mobile robots that are robustly deployable in the real world with minimal human supervision.”

We spoke with Dr. Xiao about this work.

It turns out he’s particularly interested in making robots that are useful to First Responders, carrying out those dull, dirty and dangerous tasks. Speed in such situations can be critical, but comes with its own set of challenges. A robot that makes too sharp a turn at speed on a high-friction surface can easily roll over – effectively becoming useless in its task. Plus, there are the difficulties previously flagged with other terrains.

This area of “motion planning” fascinates Dr. Xiao. Specifically, how to take robots beyond traditional motion planning and enable them to identify and adapt to changing conditions. And that involves machine vision and machine learning.

“Most motion planners used in existing robots are classical methods,” he says. “What we want to do is embed machine learning techniques to make those classical motion planners more intelligent. That means I want the robots to not only plan their own motion, but also learn from their own past experiences.”

In other words, he and his students have been focussing on pushing robots to develop capabilities that surpass the instructions and algorithms a roboticist might traditionally program.

“So they’re not just executing what has been programmed by their designers, right? I want them to improve on their own, utilising all the different sources of information they can get while working in the field.”

 

THE PLATFORM

 

The RobotiXX Lab has chosen the Hunter SE from AgileX as its core platform for this work. That platform was supplied by InDro Robotics, and modified with the InDro Commander module. That module enables communication over 5G (and 4G) networks for high-speed data throughput. It comes complete with multiple USB slots and the Robot Operating System (ROS) library onboard, allowing the easy addition (or removal) of multiple sensors and other modifications. It also has a remote dashboard for controlling missions, plotting waypoints, etc.

Dr. Xiao was interested in this platform for a specific reason.

“The main reason is because it’s high speed, with a top speed of 4.8 m per second. For a one-fifth/one-sixth scale vehicle that is a very, very high speed. And we want to study what will happen when you are executing a turn, for example, while driving very quickly.”

As noted previously, people with driving experience instinctively get it. They know how to react.

“Humans have a pretty good grasp on what terrain means,” he says. “Rocky terrain means things will get bumpy, grass can impede motion, and if you’re driving on a high-friction surface you can’t turn sharply at speed. We understand these phenomena. The problem is, robots don’t.”

So how can we teach robots to be more human in their ability to navigate and adjust to such terrains – and to learn from their mistakes?

As you’ll see in the diagram below, it gets *very* technical. But we’ll do our best to explain.


THE APPROACH

 

The basics here are pretty clear, says Dr. Xiao.

“We want to teach the robots to know the consequences of taking some aggressive maneuvers at different speeds on different terrains. If you drive very quickly while the friction between your tires and the ground is high, taking a very sharp turn will actually cause the vehicle to roll over – and there’s no way the robot by itself will be able to recover from it, right? So the whole idea of the paper is trying to enable robots to understand all these consequences; to make them ‘competence aware.'”

The paper Dr. Xiao is referring to has been submitted for scientific publication. It’s pretty meaty, and is intended for engineers/roboticists. It’s authored by Dr. Xiao and researchers Anuj Pokhrel, Mohammad Nazeri, and Aniket Datar. It’s entitled: CAHSOR: Competence-Aware High-Speed Off-Road Ground Navigation in SE(3).

That SE(3) term is used to describe how objects can move and rotate in 3D space. Technically, it stands for Special Euclidean group in three dimensions. It refers to keeping track of an object in 3D space – including position and orientation.
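To make that concrete, here’s a rough illustration (our own sketch, not the paper’s code): an SE(3) pose can be stored as a 3×3 rotation matrix plus a translation vector, and poses then compose by applying the first pose’s rotation to the second pose’s translation.

```python
# Toy SE(3) pose composition (illustrative only, not the CAHSOR code).
# A pose is (R, t): R a 3x3 rotation matrix, t a 3-element translation.

def compose(pose_a, pose_b):
    """Compose two SE(3) poses: rotate pose_b's frame by pose_a's
    rotation, then offset by pose_a's translation."""
    Ra, ta = pose_a
    Rb, tb = pose_b
    R = [[sum(Ra[i][k] * Rb[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    t = [sum(Ra[i][k] * tb[k] for k in range(3)) + ta[i] for i in range(3)]
    return R, t

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
YAW_90 = [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]  # 90° about z

# Turn 90° left, then drive 1 m "forward" along the vehicle's own x-axis:
# in world coordinates the robot ends up displaced along y.
_, position = compose((YAW_90, [0.0, 0.0, 0.0]), (IDENTITY, [1.0, 0.0, 0.0]))
print(position)  # [0.0, 1.0, 0.0]
```

The same bookkeeping extends to roll and pitch – which is exactly why SE(3), rather than a flat 2D pose, matters for a vehicle that can tip and roll on rough ground.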

We’ll get to more of the paper in a minute, but we asked Dr. Xiao to give us some help understanding what the team did to achieve these results. Was it just coding? Or were there some hardware adjustments as well?

Turns out, there were both. Yes, there was plenty of complex coding. There was also the addition of an RTK GPS unit so that the robot’s position in space could be measured as accurately as possible. Because the team soon discovered that intense vibration over rough surfaces could loosen components, threadlock was used to keep things tightly in place.

But, as you might have guessed, machine vision and machine learning are a big part of this whole process. The robot needs to identify the terrain in order to know how to react.

We asked Dr. Xiao if an external data library was used and imported for the project. The answer? “No.”

“There’s no dataset out there that includes all these different basic catastrophic consequences when you’re doing aggressive maneuvers. So all the data we used to train the robot and to train our machine learning algorithms were all collected by ourselves.”

 

SLIPS, SLIDES, ROLLOVERS

 

As part of the training process, the Hunter SE was driven over all manner of demanding terrain.

“We actually bumped it through very large rocks many times and also slid it all over the place,” he says. “We actually rolled the vehicle over entirely many times. This was all very important for us to collect some data so that it learns to not do that in the future, right?”
 
And while the cameras and machine vision were instrumental in determining what terrain was coming up, the role of the robot’s Inertial Measurement Unit was also key.

“It’s actually multi-modal perception, and vision is just part of it. So we are looking at the terrain using camera images and we are also using our IMU. Those inertial measurement unit readings sense the acceleration and the angular velocities of the robot so that it can better respond,” he says.

“Because ultimately it’s not only about the visual appearance of the terrain, it is also about how you drive on it, how you feel it.”
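As a loose illustration of that multi-modal idea (the names, shapes, and statistics here are ours for explanation, not the paper's actual TRON architecture), a terrain representation might combine features from a vision encoder with summary statistics of a recent window of IMU readings – how rough the ride currently feels:

```python
from statistics import mean, pstdev

def terrain_features(image_embedding, imu_window):
    """Fuse visual and inertial cues into one terrain feature vector.

    image_embedding: floats from a vision encoder (hypothetical).
    imu_window: recent (vertical_accel, angular_velocity) samples;
    their mean and spread summarise current ride roughness.
    """
    accel = [a for a, _ in imu_window]
    gyro = [w for _, w in imu_window]
    inertial = [mean(accel), pstdev(accel), mean(gyro), pstdev(gyro)]
    return list(image_embedding) + inertial

# Two visual features plus three IMU samples -> six fused features.
feats = terrain_features([0.2, -0.1], [(9.7, 0.01), (10.4, 0.03), (8.9, 0.02)])
print(len(feats))
```

In the actual system, a learned model maps representations like this to predicted consequences of a manoeuvre, rather than hand-picked statistics.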

 

THE RESULTS

 

Well, they’re impressive.

The full details are outlined in this paper, but here’s the headline: Regardless of whether the robot was operating autonomously heading to defined waypoints, or whether a human was controlling it, there was a significant reduction in incidents (slips, slides, rollovers etc.) with only a small reduction in overall speed.

Specifically, “CAHSOR (Competence-Aware High-Speed Off-Road Ground Navigation) can efficiently reduce vehicle instability by 62% while only compromising 8.6% average speed with the help of TRON (visual and inertial Terrain Representation for Off-road Navigation).”

That’s a tremendous reduction in instability – meaning the likelihood that these robots will reach their destination without incident is greatly improved. Think of the implications for a First Responder application, where without this system a critical vehicle rushing to a scene carrying medical supplies – or even simply for situational awareness – might roll over and be rendered useless. The slight reduction in speed is a small price to pay for greatly enhancing the odds of an incident-free mission.

“Without using our method, a robot will just blindly go very aggressively over every single terrain – while risking rolling over, bumps and vibrations on rocks, maybe even sliding and rolling off a cliff.”

What’s more, these robots continue to learn with each and every mission. They can also share data with each other, so that the experience of one machine can be shared with many. Dr. Xiao says the learnings from this project, which began in January of 2023, can also be applied to marine and even aerial robots.

For the moment, though, the emphasis has been fully on the ground. And there can be no question this research has profound and positive implications for First Responders (and others) using robots in mission-critical situations.

Below: The Hunter SE gets put through its paces. (All images courtesy of Dr. Xiao.)


INDRO’S TAKE

 

We’re tremendously impressed with the work being carried out by Dr. Xiao and his team at George Mason University. We’re also honoured to have played a small role in supplying the Hunter SE, InDro Commander, as well as occasional support as the project progressed.

“The use of robotics by First Responders is growing rapidly,” says InDro Robotics CEO Philip Reece. “Improving their ability to reach destinations safely on mission-critical deployments is extremely important work – and the data results are truly impressive.

“We are hopeful the work of Dr. Xiao and his team is adopted in future beyond research and into real-world applications. There’s clearly a need for this solution.”

If your institution or R&D facility is interested in learning more about InDro’s stable of robots (and there are many), please reach out to us here.

InDro builds, delivers custom robot to global client


By Scott Simmie

 

We’ve built a new robot we’d like to tell you about.

It’s for a highly specialised use-case scenario for a global client. (And when we say global client, it’s a household name.)

This isn’t the first project where we’ve been tapped by a heavy-hitting company to design and build custom robots. We have ongoing contracts with others, where unfortunately NDAs prohibit us from disclosing pretty much anything. (We can tell you that one of the ground robots we’re building for one of those clients is pretty big.)

In this case, the client has agreed to let us tell you a fair bit about the product, providing we don’t reveal their name. We think this is a really intriguing robot, so we’re going to share some details – including images of the final product.

Here it is. And, by the way, it’s as tall as the average person. The sensor poking out on the right near the top of the cylindrical portion is positioned at eye-level.


NOT A PIZZA OVEN

 

With that stretching, stovepipe-like neck, it might look like a pizza oven on wheels. But it’s not. It’s designed that way so that sensors can be roughly at the head height of human beings. The box at the bottom could be thought of as a computer on steroids.

That’s because the client wanted this robot for a very specific purpose: To be able to navigate complex crowds of people.

“The client wants to use Vision SLAM (Simultaneous Localisation and Mapping) to essentially detect humans and pathways through chaotic environments,” says Arron Griffiths, InDro’s Engineering Manager. Arron works out of our Area X.O location, where the robot was fabricated.

“Think malls, shopping centres, and stuff like that where humans are mingling to navigate around. And there’s no really defined path, the robot must organically move around people. Yes, you’d have an overall predetermined path with a desired destination, but once the chaos of humans comes in the robot would safely meander its way through crowds.”

 

LOTS OF TECHNOLOGY

 

That’s not a simple task. The client is going to supply its own autonomy software, but InDro had to work closely with them on the robot’s design and capabilities.

We mentioned earlier that this robot is SLAM-capable. That means it can map its surroundings in real time and decide, while it’s moving, where in the ever-changing environment it makes sense to go next. Two ZED depth cameras provide a detailed look at those surroundings (one close to the ground, the other at human eye level), so the robot is constantly scanning, mapping, and making decisions on the fly.
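As a toy illustration of the mapping side (not the client's actual software, which is their own), SLAM-style systems typically fold each batch of depth readings into a map such as an occupancy grid as the robot moves:

```python
def update_grid(grid, robot_xy, depth_points, cell_size=0.1):
    """Mark grid cells observed as occupied by depth-camera points.

    grid: dict mapping (col, row) -> occupied flag.
    robot_xy: robot position in metres, in the map frame.
    depth_points: obstacle points relative to the robot, in metres.
    """
    rx, ry = robot_xy
    for dx, dy in depth_points:
        cell = (int((rx + dx) / cell_size), int((ry + dy) / cell_size))
        grid[cell] = True  # a fuller system would track occupancy probability
    return grid

# Robot at (1.0, 1.0) m sees two obstacle points; both land in the grid.
grid = update_grid({}, (1.0, 1.0), [(0.5, 0.0), (0.0, 0.3)])
print(sorted(grid))
```

A path planner then searches the free cells of a map like this for the next safe move – which, in a crowd, changes from moment to moment.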

This is a data-dense task that requires a *lot* of onboard computing power.

“It’s basically a really powerful desktop computer on wheels,” says InDro Account Executive Luke Corbeth. “It’s outfitted with serious computational power, including the same graphic cards that people use to mine bitcoin.”

And that posed another challenge for our engineering team. The client wanted the robot to be able to operate for several hours at a time. But that advanced computing capability really puts a drain on power. 

“Once you stick these high-end computers into a battery powered robotic system, your run time drops like a stone,” explains Griffiths. “It’s a bit of a beast on power. That’s why we had to put a second battery into the unit. This is an exercise in finding a balance point, and producing a robot that will do a high-end deployment with all of this high-end technology.”
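The trade-off Griffiths describes is, at its core, simple arithmetic: endurance is roughly battery watt-hours divided by average power draw, so adding a GPU-class compute load slashes run time until you add battery capacity back. The numbers below are purely illustrative, not the robot's actual specifications:

```python
def runtime_hours(battery_wh, avg_draw_w):
    """Rough endurance estimate: energy on board / average power draw."""
    return battery_wh / avg_draw_w

base = runtime_hours(960, 150)         # drivetrain and sensors alone
with_gpu = runtime_hours(960, 450)     # add a power-hungry compute box
doubled = runtime_hours(2 * 960, 450)  # a second battery claws run time back
print(round(base, 1), round(with_gpu, 1), round(doubled, 1))
```

Real endurance also depends on duty cycle, terrain, and battery chemistry, but the shape of the problem – and why a second battery was the fix – is captured here.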


CLIENT-CENTRED PROCESS

 

This wasn’t the first custom robot the client has requested. The international company has a longer-term research project focussed on enabling a robot to navigate when surrounded by unpredictable human beings. It has developed, and will continue to tweak, its own autonomy software to carry out this task in conjunction with this robot.

InDro worked closely with the client on the design – both the technical requirements in terms of processors, sensors, graphic cards, run time – as well as the physical appearance. Because the client had some very tight timelines, InDro designed and built this robot in a very short period of time: Seven weeks from outset until the product was shipped.

“That’s extremely fast,” says Griffiths. “That’s the fastest custom robot I’ve seen in my working profession. You’ve got to think design cycles, manufacturing, outsourcing, testing. From this being nothing, to being shipped out in less than two months is incredible.”

 

SOLUTION-FOCUSED

 

But there’s a difference between carrying out an expedited task – and doing a rush job. The focus always had to remain on ensuring that the capabilities, design, build and testing of this machine would meet or exceed the client’s rigorous standards. And that meant even the tiniest details counted.

For example, we’d discovered with a previous robot using the same locomotion platform that there could be an issue on rough surfaces. Specifically, if you were turning a tight corner or accelerating while turning, the wheels could shudder and jump. This was especially an issue on asphalt and concrete.

InDro’s engineering team knew that with this robot any such shudders would be amplified due to the height of the machine; a minor shudder at the base would translate into significant wobbling at the robot’s top. That wasn’t something we wanted happening.

And so we created a solution. We covered the individual wheels with a 3D-printed wrap. This provides a barrier between the sticky rubber and ground, allowing the robot to slightly slide during such manoeuvres and avoiding those troubling vibrations.

 

Below: Detail of the wheels, with their new coating


CLIENT REACTION

 

When we pack up and ship a custom build, the client always gets in touch after they’ve received the product. That’s the moment of truth – and the feedback we eagerly await.

Not long after the robot arrived, an email from the client landed. It included the following:

“The robot is fantastic,” they wrote. “The craftsmanship is superb; the power on the base is enabling; the intricate way in which the computer fits in the base housing is incredible; the compute box + mast feels ‘just right’ (there’s no template for social robot design, but I feel like we got very close).

“All these things make me really confident that, with the right algorithms (my responsibility) we can safely and efficiently navigate through crowds. It’s a really special robot that I can’t wait to put in the field.  Your team deserves a raise!”

This robot, though it can’t cook pizzas, is one of the most powerful Uncrewed Ground Vehicles InDro has built, at least in terms of raw onboard computational power. Engineering lead Griffiths believes its capabilities could make a variation of this machine suitable for other clients, as well.

“I think it’s a very good platform for clients who want very high computing power in a small form factor that actually has some range, some longevity to it,” he says.

Below: Even when they’re under the gun, our engineering team takes it all in stride


INDRO’S TAKE

 

We’re often working on projects like this. In fact, this isn’t the first major global client to tap InDro for custom builds. As our tagline states: “Invent. Enhance. Deploy.” That’s what we do.

“This was an expedited design, build and test of a completely new and computationally powerful robot,” says InDro Robotics CEO Philip Reece. “We know that InDro’s reputation rides on every product we ship and every service we provide. So we’re delighted to hear the client is as pleased with this robot as we are – and look forward to building more for them.”

Interested in what a powerhouse machine like this might do for you? Feel free to explore the possibilities by setting up a conversation with Account Executive Luke Corbeth.

 

CBC HIGHLIGHTS YOW DRONE DETECTION SYSTEM


By Scott Simmie

 

If you follow InDro Robotics, you’ll likely be aware that we were a co-founder and core technology partner of the YOW Drone Detection Pilot Project.

The system has been operating since the fall of 2020, and detects drone intrusions not only at the Ottawa International Airport, but as far as 40 kilometres away in the National Capital Region. Data from the project helps to inform airport protocols and is shared on a regular basis with Transport Canada and law enforcement.

Back during the “Freedom Convoy” protests in downtown Ottawa, the system got onto the mainstream radar after we published this story, which outlined the high number of unauthorised drone flights taking place in downtown Ottawa. The Ottawa Citizen covered that story here and it was also a cover story for WINGS Magazine.

Now, the system is back in the news for a different reason: The recent visit of US President Joe Biden to Ottawa.


AIR FORCE ONE

 

Prior to the actual visit, advance teams from the Secret Service and Air Force One wanted to check out security and logistics at the Ottawa International Airport. And one of the first questions? Whether YOW had a drone detection system.

The answer, as you know, is Yes. We interviewed Michael Baudette, YOW’s VP of Security, Emergency Management and Customer Transportation. The resulting post garnered a lot of attention, including a lengthy interview by CBC Ottawa.

To view the segment on the Drone Detection Pilot project, check out the video below.